on 11-24-2016 12:09 PM
Hello,
I have a Write-Optimized DSO with more than 1 billion records in its Active table. This is the POS Analytics: Retail Control CM DSO, which is filled by the 0RT_PA_TRAN_CONTROL datasource.
The BW system runs on a HANA database.
When the POS DM process chain is running, the BW system is almost unusable. According to the BASIS team, when the process chain updates this DSO, the entire contents of the Active table are loaded into the RAM of the BW application server, and this amount of data nearly reaches the server's RAM limit.
They have asked me to find a way to restrict the amount of data that is loaded into the server's RAM on every DSO update.
My question is: what is the proper way to deal with such large tables?
Is it possible to copy the original DSO into a Semantically Partitioned DSO, partitioned on a yearly basis (on 0CALYEAR, for example), and then fill the SP DSO from the original DSO?
Or are there other possibilities?
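The intuition behind partitioning on 0CALYEAR can be sketched in plain Python. This is only a conceptual illustration of why partition pruning helps, not SAP code; the record layout and function names are invented for the example:

```python
from collections import defaultdict

def partition_by_year(records):
    """Split records into per-year 'partitions', as a DSO
    semantically partitioned on 0CALYEAR would."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec["calyear"]].append(rec)
    return partitions

def lookup(partitions, calyear, store):
    """When the partition key is part of the filter, only one
    partition has to be scanned -- the other years are never
    touched (or loaded into memory)."""
    return [r for r in partitions[calyear] if r["store"] == store]

# Hypothetical POS transaction records
records = [
    {"calyear": 2014, "store": "S01", "amount": 10.0},
    {"calyear": 2015, "store": "S01", "amount": 12.5},
    {"calyear": 2016, "store": "S02", "amount": 7.0},
    {"calyear": 2016, "store": "S01", "amount": 9.0},
]
parts = partition_by_year(records)
print(len(parts))                  # 3 yearly partitions (2014, 2015, 2016)
print(lookup(parts, 2016, "S01"))  # scans only the 2016 partition
```

Without the partition key in the selection, every partition would still have to be scanned, which is exactly the full-table-load behaviour the BASIS team is observing.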
Hello Boyan,
This is surprising. The whole table should not be reloaded into memory (or at least this should be avoided).
Have a look at SAP Note 1767880 (chapter "Implementation of non-active data in BW"), which explains this mechanism, especially the "CAUTION" part about access via the partition ID. If you access this DSO without a partition ID: "This would load the complete table to the main memory".
Regards,
Frederic