
DB index requirement when using BI accelerator

Dear all,

I am currently designing a model that must meet the following requirements.

- 1.3 billion records are stored

- 14 million records are uploaded daily

- data is extracted from SD

My current thinking:

- use the BI accelerator (rolling up 14 million records to the BIA index takes less than 5 minutes)

- delta data is loaded into the InfoCube directly (not via a DataStore object)

- data is loaded in parallel

- data older than 5 years is deleted monthly, so DB partitioning is required

- a DB index is not required for the BIA or for rolling up to the BIA, and load times will be shorter without creating DB indexes

- but it is better to create the DB indexes when executing compression
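To picture why monthly deletion of data older than 5 years pairs well with time-based partitioning, here is a conceptual Python sketch (not SAP's actual implementation; the 0CALMONTH-style keys and function names are purely illustrative). The point is that dropping whole partitions is far cheaper than row-by-row deletes:

```python
from datetime import date

def month_key(d: date) -> str:
    """Partition key in the 0CALMONTH style, e.g. '200701' (illustrative)."""
    return f"{d.year:04d}{d.month:02d}"

def partitions_to_drop(partitions: dict[str, int], today: date, years: int = 5) -> list[str]:
    """Return keys of monthly partitions entirely older than the retention window.

    Deleting old data then reduces to dropping these partitions instead of
    scanning and deleting 1.3 billion rows selectively."""
    cutoff = month_key(date(today.year - years, today.month, 1))
    return sorted(k for k in partitions if k < cutoff)

# Example: partitions mapping month key -> row count
parts = {"200101": 1_000_000, "200206": 2_000_000, "200612": 3_000_000}
drop = partitions_to_drop(parts, today=date(2007, 2, 26))  # → ['200101']
```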

My questions for this scenario:

- Is it better to use partitioning to shorten the data deletion time?

- Is it better to execute compression? (If not, the DB table is partitioned too finely, by request and by time characteristic; but compression takes a long time...)

- If so, is it better to create the DB indexes before compression? (Creating the indexes takes some time, but without them compression must take much longer.)

- Periodically the BIA delta index has to be merged with the main index. Is reporting possible during this merge process?

Kind regards,

Masaaki


1 Answer

  • Former Member
    Posted on Feb 26, 2007 at 07:58 PM

    I will let SAP do most of the talking here. However, regarding compression and deleting data, here is the rationale:

    <b>Compression -</b>

    With BI accelerator indexes you do not have to compress after rolling up data packages. The data on the BI accelerator server already exists in a read-optimized format.

    However, in the following case it may be useful to rebuild the BI accelerator index, although this is not strictly necessary.

    A BI accelerator index is created for an InfoCube that has not been compressed, or a large number of data packages are later loaded to this InfoCube. If you then compress this InfoCube, more data is contained in the BI accelerator index than in the InfoCube itself, and the data in the BI accelerator index is more granular. If compression results in a large aggregation factor (>1.5), it may be useful to rebuild the BI accelerator index. This ensures that the dataset is reduced in the BI accelerator index too.
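As a small illustration of the rule above (only the 1.5 threshold comes from the text; the row counts are made up), the aggregation factor is the ratio of fact rows before compression to rows after, and a factor above 1.5 suggests rebuilding the BIA index so that its dataset shrinks too:

```python
def aggregation_factor(rows_before: int, rows_after: int) -> float:
    """Ratio of fact rows before compression to rows after."""
    return rows_before / rows_after

def should_rebuild_bia_index(rows_before: int, rows_after: int,
                             threshold: float = 1.5) -> bool:
    """Rebuild when compression collapsed the data strongly (factor > 1.5)."""
    return aggregation_factor(rows_before, rows_after) > threshold

# Example: compression collapses 1.3 billion uncompressed rows to 600 million
print(should_rebuild_bia_index(1_300_000_000, 600_000_000))  # → True (factor ≈ 2.17)
```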

    <b>Deleting Data -</b>

    If you delete data from the InfoCube selectively, the BI accelerator index has to be rebuilt. When you execute selective deletion, the system automatically deletes the affected BI accelerator index.

    When you delete a data package (that is not aggregated) from an InfoCube, the index for the package dimension table is deleted and rebuilt. The facts in the fact index remain but are "hidden" because they are no longer referenced by an entry in the package dimension table. Therefore, more entries exist in the index than in the table of the InfoCube. If you regularly delete data packages, the number of unused records increases, increasing memory consumption. This can have a negative effect on performance. In this case you should consider rebuilding the BI accelerator index regularly.
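A rough back-of-the-envelope model of that accumulation, using the poster's figures of 1.3 billion stored rows and 14-million-row loads (the deletion schedule is assumed, not from the thread):

```python
def unused_share(table_rows: int, hidden_rows: int) -> float:
    """Fraction of fact-index entries no longer referenced by the
    package dimension, i.e. memory wasted on 'hidden' facts."""
    return hidden_rows / (table_rows + hidden_rows)

# Assume 12 package deletions, each orphaning one 14-million-row package,
# against a fact table holding 1.3 billion rows.
hidden = 12 * 14_000_000                       # 168 million orphaned index entries
share = unused_share(1_300_000_000, hidden)    # ≈ 0.11, i.e. ~11% of the index unused
```

Once that share grows noticeably, a regularly scheduled BIA index rebuild reclaims the memory.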

    Hope it Helps

    Chetan

    @CP..

