
Active Table of a Write-Optimized DSO Became Too Big - How to Deal with It?


I have a Write-Optimized DSO with more than 1 billion records in its Active table. This is the POS Analytics: Retail Control CM DSO, which is filled by the 0RT_PA_TRAN_CONTROL datasource.
The BW system runs on a HANA database.
When the POS DM process chain is running, the BW system is almost unusable. The BASIS team says that when the process chain updates this DSO, the entire Active table is loaded into the RAM of the BW application server, and this amount of data almost reaches the server's RAM limit.
They want me to find a way to restrict the amount of data that is loaded into RAM on every DSO update.

My question is what is the proper way to deal with such big tables?

Is it possible to copy the original DSO into a Semantically Partitioned DSO that is partitioned on a yearly basis - on 0CALYEAR, for example - and then fill the SP DSO from the original DSO?
Or are there other possibilities?
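Before deciding on repartitioning, it is worth measuring how big the active table actually is in HANA memory, per partition. A minimal sketch, assuming the active table follows the usual /BIC/A&lt;DSO name&gt;00 naming convention (the table name below is a placeholder, not the real name of this DSO's active table):

```sql
-- Memory footprint and row count of the DSO's active table, per partition.
-- '/BIC/AMYDSO00' is a placeholder - replace it with your active table name.
SELECT table_name,
       part_id,
       record_count,
       ROUND(memory_size_in_total / 1024 / 1024 / 1024, 2) AS mem_gb
FROM   m_cs_tables
WHERE  table_name = '/BIC/AMYDSO00'
ORDER  BY part_id;
```

If a handful of partitions hold most of the gigabytes, partitioning the data model (e.g. an SP DSO on 0CALYEAR) so that loads only touch the current slice is the kind of restructuring that would help.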



1 Answer

  • Nov 24, 2016 at 02:57 PM

    Hello Boyan,

    This is surprising. The whole table should not be reloaded into memory (or at least this should be avoided).

    Have a look at SAP Note 1767880 (chapter "Implementation of non-active data in BW"), which explains this mechanism - especially the "CAUTION" part about access via the partition ID. If you access this DSO without a partition ID: "This would load the complete table to the main memory".
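    You can verify whether the non-active data concept is actually in effect for this table. A hedged sketch - the table name is a placeholder, and I am assuming here that the note's mechanism is reflected in the standard HANA unload priority and per-partition load state:

    ```sql
    -- Is the active table flagged for early unload? With the non-active data
    -- concept a high unload priority is expected for PSA/write-optimized tables.
    -- '/BIC/AMYDSO00' is a placeholder for your active table's name.
    SELECT table_name, unload_priority
    FROM   sys.tables
    WHERE  table_name = '/BIC/AMYDSO00';

    -- Which partitions are currently resident? Only recently written partitions
    -- should show LOADED = 'FULL'; older ones should be 'NO' or 'PARTIALLY'.
    SELECT part_id, loaded, record_count
    FROM   m_cs_tables
    WHERE  table_name = '/BIC/AMYDSO00'
    ORDER  BY part_id;
    ```

    If every partition shows LOADED = 'FULL' right after the process chain runs, some access in the load is bypassing the partition ID, which matches the behavior the note warns about.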


