Former Member

Performance issues on write optimized DSO

Hi experts,

we are using SAP NetWeaver BI 7.0 and implemented the data model as an Enterprise Data Warehouse (EDW). The EDW layer consists of write-optimized DSO objects. Unfortunately, the loading performance into these objects has deteriorated over the last months. In January, loading 1.1 million records took approx. 1 hour. Now, 10 months later, loading 1.1 million records into this write-optimized DSO takes 12 hours! That means we are down to roughly 8% of the loading performance we had in January.

Does anybody know how this is possible? We have created indexes on all DSO objects on the first layer (Data Acquisition layer) to increase performance, but the opposite happened: it got worse. Do write-optimized DSO objects have a maximum number of records they can hold? Or is it a database issue, for example outdated statistics?

Any help is welcome.

With regards,

Daniel

1 Answer

  • Former Member
    Posted on Oct 31, 2008 at 10:32 AM

    There can be various reasons for this.

    If there is a huge amount of data in the DSO and subsequent loads also fetch a huge number of records, then the loads into the write-optimized DSO will definitely take time.

    Try partitioning, creating indexes/secondary indexes, or archiving old data.
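
    Just as a rough sketch of what that could look like on the database level (Oracle is assumed here; the DSO name ZSALES, its active table /BIC/AZSALES00 and the index fields are placeholders, and in practice you would define the index in the DSO maintenance or SE11 rather than with raw SQL):

    ```sql
    -- Illustrative only: ZSALES, the table name and the fields are placeholders.
    -- The active table of a write-optimized DSO ZSALES would be /BIC/AZSALES00.

    -- Secondary index on the fields used for lookups/reporting
    CREATE INDEX "/BIC/AZSALES00~Z01"
      ON "/BIC/AZSALES00" ("DOC_NUMBER", "CALDAY");

    -- Every insert also has to maintain each secondary index, so a common
    -- pattern for large loads is to drop the index before the load and
    -- recreate it afterwards.
    DROP INDEX "/BIC/AZSALES00~Z01";
    -- ... run the data load ...
    CREATE INDEX "/BIC/AZSALES00~Z01"
      ON "/BIC/AZSALES00" ("DOC_NUMBER", "CALDAY");
    ```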

    Check these links for further reference:

    https://blogs.sap.com/?p=44555

    /message/2987899#2987899 [original link is broken]

    Regards,

    Dhanya

    • Former Member

      Hi,

      thank you for your answer.

      Unfortunately that doesn't help us very much. We already have primary and secondary indexes on our tables, and the table is partitioned by request ID (as mentioned in your weblog). Archiving data that is only 10 months old is not an option. Besides, 10 million rows in a DSO are not very much; other customers have more than 100 million records and their data loads are much faster.

      I cannot believe that a write-optimized DSO cannot handle more than 10 million records; there must be another cause. The database statistics are refreshed weekly. What about log files? Can they affect the data load process?
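
      Just to illustrate what I mean by refreshed statistics (Oracle is assumed as the underlying database; the table name and schema owner are only placeholders for the DSO's active table), the check and the refresh look roughly like this:

      ```sql
      -- Placeholder table and schema names; Oracle assumed.
      -- Check when the optimizer statistics were last gathered:
      SELECT table_name, num_rows, last_analyzed
        FROM dba_tables
       WHERE table_name = '/BIC/AZSALES00';

      -- Refresh table and index statistics:
      BEGIN
        DBMS_STATS.GATHER_TABLE_STATS(
          ownname          => 'SAPBW',          -- schema owner (placeholder)
          tabname          => '/BIC/AZSALES00', -- active table (placeholder)
          estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
          cascade          => TRUE);            -- include the indexes
      END;
      /
      ```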

      Any help is welcome.

      With regards,

      Daniel
