on 06-25-2015 5:45 PM
Dear all,
There are some planning applications built on a particular InfoCube in my project.
We are facing an issue while opening, refreshing, or retrieving data from the cube. We suspect the applications are slow because of the huge data volume in the cube.
For a particular record we have different versions, indicated by a key figure "Flag".
Out of all the records for a particular combination, we need the latest value of this Flag key figure and need to aggregate all the other key figures.
We cannot compress the cube because one of the key figures is used as a flag to indicate the status of a particular dimension; even standard zero elimination will not work.
We are considering the following approaches:
1) Move the previous year's data to a different cube, create a MultiProvider over both cubes, and migrate all applications from the present cube to the MultiProvider.
2) Create a DSO and load the data into it from the existing cube. Later, in the start routine, we will write ABAP code to aggregate the records.
Kindly advise which one to choose to improve the performance of the cube.
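If option 2 is chosen, the start routine of the DSO transformation might look roughly like the sketch below. This is only an illustration: the characteristic fields (/BIC/ZCHAR1, /BIC/ZCHAR2), the version field (/BIC/ZVERSION), and the flag key figure (/BIC/ZFLAG) are placeholder names, not the real fields of the cube.

```abap
* Sketch only - field names below are placeholders for the real
* characteristics, version field, and flag key figure of the cube.
DATA: lt_agg  TYPE STANDARD TABLE OF _ty_s_sc_1,
      ls_prev TYPE _ty_s_sc_1.

FIELD-SYMBOLS: <ls_src> TYPE _ty_s_sc_1.

* Latest version first within each characteristic combination
SORT SOURCE_PACKAGE BY /bic/zchar1 ASCENDING
                       /bic/zchar2 ASCENDING
                       /bic/zversion DESCENDING.

LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
* Keep the flag only from the latest version of a combination
  IF <ls_src>-/bic/zchar1 = ls_prev-/bic/zchar1 AND
     <ls_src>-/bic/zchar2 = ls_prev-/bic/zchar2.
    CLEAR <ls_src>-/bic/zflag.
  ENDIF.
  ls_prev = <ls_src>.
* Neutralize fields that would prevent records from collapsing
  CLEAR: <ls_src>-/bic/zversion, <ls_src>-record.
* COLLECT sums the numeric key figures of records whose
* character-type fields (the combination) are identical
  COLLECT <ls_src> INTO lt_agg.
ENDLOOP.

SOURCE_PACKAGE = lt_agg.
```

The COLLECT approach relies on all non-key-figure fields of the structure being identical across the versions you want to merge, so any other technical fields that differ between versions would also need to be cleared first.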
Hi Praveen,
I would suggest moving the data into different InfoProviders based on the time dimension, and then implementing the MultiProvider hint option using RRKMULTIPROVHINT. Please note there are prerequisites to implementing it. Refer to the document http://scn.sap.com/docs/DOC-16126
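For reference, the hint is maintained as entries in the customizing table RRKMULTIPROVHINT (via SM30), pairing a MultiProvider with the characteristic(s) the OLAP processor should use to prune part providers. A typical entry, assuming a MultiProvider named ZMP_PLAN whose part providers are split by calendar year (both names are examples only; check the linked document for the exact table layout on your release), would look like:

```
Table RRKMULTIPROVHINT (maintain via SM30):

MultiProvider   Characteristic
ZMP_PLAN        0CALYEAR
```

With such an entry, a query restricted to a given year reads only the part provider(s) that can contain that year, instead of all cubes under the MultiProvider.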
Suman Chakravarthy K explained it clearly.
Hope it helps! Don't forget to mark the helpful and correct answers; it will help others (like you) get help quickly.
Thanks,
Umashankar
Hi Praveen,
1. Copy this cube's data to another cube and partition that cube based on time characteristics. Then create a MultiProvider on top of all the InfoCubes and execute the query against it; performance will be good.
2. If you go the custom-code route, performance will not be good.
Thanks,
Swapna Jain
Praveen,
How many requests do you have in the cube? Each request is a partition in itself.
Also, have you tried aggregates? You can get away with not compressing the cube by using aggregates instead. (Aggregates are compressed by default, and you can always rebuild them.)
Did you study the run schedule of the query in RSRT? That will tell you what is taking time. (You will find this in the Execute and Debug section of RSRT.)
You can move historical data into a separate cube (or, alternatively, compress just the historical data, and make sure your cube is partitioned).
If you are using fiscal period etc. for reporting, partition the cube on those characteristics to get better performance.