
Improve data load performance

Former Member
0 Kudos

Hi Gurus,

My question: other than using a line item dimension, is there any way to improve data load performance when a dimension table grows larger than 10% of the fact table? I have a cube with five dimension tables, one of which holds 5 characteristics and is now 33% of the size of the fact table; the data model is already built and running in production. I know the standard fix for this kind of problem is to turn that dimension into a line item dimension. Can anyone suggest another way to improve performance when the dimension has more than one characteristic and the cube is already in production? I greatly appreciate your time and will gladly assign points.
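For reference, the dimension-to-fact ratio can be checked with the standard report SAP_INFOCUBE_DESIGNS (run it via SE38); the rough sketch below does the same count by hand for one dimension. The /BIC/ table names are only placeholders, not the real tables from my cube.

REPORT z_check_dim_ratio.

* Rough sketch: compare the row count of one dimension table against the
* F fact table. The names are placeholders; the real /BIC/D... and
* /BIC/F... table names can be looked up in LISTSCHEMA or SE11.
DATA: lv_dimtab    TYPE tabname VALUE '/BIC/DZSALESC1',  "dimension table
      lv_facttab   TYPE tabname VALUE '/BIC/FZSALESC',   "F fact table
      lv_dim_rows  TYPE i,
      lv_fact_rows TYPE i,
      lv_ratio     TYPE p DECIMALS 2.

SELECT COUNT(*) FROM (lv_dimtab)  INTO lv_dim_rows.
SELECT COUNT(*) FROM (lv_facttab) INTO lv_fact_rows.

IF lv_fact_rows > 0.
  lv_ratio = lv_dim_rows * 100 / lv_fact_rows.  "dimension size as % of fact table
ENDIF.

WRITE: / 'Dimension rows:', lv_dim_rows,
       / 'Fact rows     :', lv_fact_rows,
       / 'Ratio (%)     :', lv_ratio.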

Thanks

Baba

Accepted Solutions (1)


Former Member
0 Kudos

Hi,

I think a line item dimension alone won't help you much in improving data load performance.

You can check the following:

1) ABAP routines while loading to the ODS/cube. Check the start routine and any other routines (see the sketch further down in this reply).

2) User exits in the source system.

3) Data package size

4) Number range buffering

5) Check dependencies in process chains.

6) Delete indexes -> load -> rebuild indexes (see the sketch right after this list).

7) If the data volume is very large, drop the aggregates before the load and rebuild them afterwards.
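For point 6, the usual way is the "Delete Index" and "Generate Index" steps in the process chain (or the Performance tab of the cube in RSA1). If you want to script it, a sketch is below; the function module names RSDU_INFOCUBE_INDEXES_DROP / RSDU_INFOCUBE_INDEXES_REPAIR and their I_INFOCUBE parameter are quoted from memory, so please verify them in SE37 for your release, and 'ZSALESCUBE' is just a placeholder cube name.

REPORT z_drop_rebuild_cube_indexes.

CONSTANTS: gc_cube TYPE rsinfocube VALUE 'ZSALESCUBE'.  "placeholder InfoCube

* Drop the secondary indexes on the fact table before the load starts.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
  EXPORTING
    i_infocube = gc_cube.

* ... the InfoPackage load / process chain step runs here ...

* Rebuild the indexes once the load has finished.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
  EXPORTING
    i_infocube = gc_cube.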

These are just the usual suspects. Above all, you need to see where most of the time is being consumed and fix that specific issue.

It's like being a doctor writing a prescription: you have to find out where the problem actually is first, rather than prescribing every medication at once and ignoring what the real issue was.
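Coming back to point 1 (ABAP routines): the most common time sink in a start routine is a SELECT inside the loop over the data package. A rough sketch of the buffered pattern is below; all the /BIC/Z... object and field names are made up for illustration, and the exact routine frame depends on whether you are on 3.x update rules or 7.x transformations.

* Sketch of a start routine fragment (3.x update rule style). Replace the
* made-up /BIC/Z... names with your own objects.
TYPES: BEGIN OF ty_mat,
         material(18) TYPE c,
         matgrp(9)    TYPE c,
       END OF ty_mat.

DATA: lt_mat TYPE SORTED TABLE OF ty_mat WITH UNIQUE KEY material,
      ls_mat TYPE ty_mat.

* 1) Throw away records you will never need as early as possible.
DELETE DATA_PACKAGE WHERE /bic/zstatus = 'X'.

* 2) One selection for the whole package instead of a SELECT SINGLE per row.
IF DATA_PACKAGE[] IS NOT INITIAL.
  SELECT /bic/zmaterial /bic/zmatgrp
    FROM /bic/pzmaterial
    INTO TABLE lt_mat
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE /bic/zmaterial = DATA_PACKAGE-/bic/zmaterial.
ENDIF.

* 3) Cheap read from the sorted internal table inside the loop.
LOOP AT DATA_PACKAGE.
  READ TABLE lt_mat INTO ls_mat
       WITH TABLE KEY material = DATA_PACKAGE-/bic/zmaterial.
  IF sy-subrc = 0.
    DATA_PACKAGE-/bic/zmatgrp = ls_mat-matgrp.
    MODIFY DATA_PACKAGE.
  ENDIF.
ENDLOOP.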

Hope it helps

--SA

Answers (3)


Former Member
0 Kudos

Hi Ashok,

Would you please explain in detail how we optimize the schema, in particular the clustering and the buffering of the table spaces and pools? If you have any good documentation on that, please forward it to akr2754@yahoo.com. Thanks for the quick response.

Regards

Baba

Former Member
0 Kudos

Hi Baba,

I don't have a document on this, but by indexing and clustering I mean creating the normal and complex (composite) indexes, and the buffer area is what we can allocate in the table spaces in the database, as we are using Oracle as the database. Both of these can improve loading performance.

Regards,

ashok

Former Member
0 Kudos

hi,

Apart from making it a line item dimension, you can improve the loading through the following factors:

1. Switch off aggregates when loading.

2. Delete the indexes before loading and recreate them after loading.

3. Create secondary indexes.

Hope this helps you.

regards

harikrishna N

Former Member
0 Kudos

Hi Baba,

As you said, a line item dimension mainly affects query performance, not loading performance. Creating B-tree or bitmap indexes will improve the loading performance, and this is controlled by specifying the cardinality (the high cardinality flag switches the fact table index for that dimension from bitmap to B-tree). Dropping the indexes before the load and maintaining proper indexing will improve loading performance; most of the other measures affect query performance.

By specifying a line item dimension you avoid the join between the two biggest tables, which matters a lot for query performance, while the loading performance is much the same for a normal or a line item dimension. The cardinality setting, on the other hand, affects loading considerably. There are only a few ways of improving loading performance, such as maintaining proper indexing and a good schema design, i.e. dimensions across which the characteristics are sensibly distributed.

People from SAP Germany came to tune our system, and they did the tuning at the database level: schema optimization, maintaining proper indexing and clustering, and checking and tuning the buffers of the table spaces and pool areas.

assign points if it helps...

Regards,

ashok.