Dear all,
I am currently working on a data model that has to fulfill the following requirements.
- 1.3 billion records are stored.
- 14 million records are uploaded daily.
- Data are extracted from SD (Sales and Distribution).
My current thinking is:
- Use the BI Accelerator (rolling up 14 million records to the BIA index takes less than 5 minutes).
- Delta data are loaded directly into the InfoCube (not via a DataStore object).
- Data are loaded in parallel.
- Data older than 5 years are deleted monthly, so DB partitioning is required (see the sketch after this list).
- DB indexes are not required for BIA reporting or for the roll-up to the BIA index, and load times will be shorter without creating the DB indexes.
- However, it is better to create the DB indexes when executing compression.
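To make the partitioning point concrete, here is a minimal sketch (plain Python, not SAP BW / ABAP; the class and method names are purely illustrative assumptions) of why monthly range partitioning keeps the deletion window short: removing data older than five years becomes a drop of whole monthly partitions instead of a row-by-row delete over billions of records.

```python
from collections import defaultdict
from datetime import date

class MonthlyPartitionedFactTable:
    """Illustration only: one bucket ("partition") per calendar month."""

    def __init__(self):
        self.partitions = defaultdict(list)  # keyed by (year, month)

    def insert(self, calday: date, row: dict) -> None:
        self.partitions[(calday.year, calday.month)].append(row)

    def drop_older_than(self, cutoff: date) -> int:
        # Dropping whole partitions is analogous to a DB "drop partition":
        # it removes each old month's bucket as a unit instead of deleting rows.
        old_keys = [k for k in self.partitions if k < (cutoff.year, cutoff.month)]
        for k in old_keys:
            del self.partitions[k]
        return len(old_keys)

# Usage: keep only the last 5 years; everything before the cutoff month is dropped.
store = MonthlyPartitionedFactTable()
store.insert(date(2002, 1, 15), {"amount": 100})
store.insert(date(2008, 1, 15), {"amount": 200})
print(store.drop_older_than(date(2003, 1, 1)))  # 1 partition dropped
```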
In this case, I wonder:
- Whether it is better to use partitioning to shorten the data deletion time.
- Whether it is better to execute compression (if not, the fact table ends up split into too many small partitions, by request as well as by the time characteristic, but compression takes a long time...).
- If so, whether it is better to create the DB indexes before compression (creating the indexes takes some time, but without them, compression would take a long time).
- The BIA delta index should be merged with the main index periodically; is it possible to run reports during this merge process?
Kind regards,
Masaaki