on 01-31-2005 7:49 AM
Hi All,
The InfoCubes in our production environment are partitioned on the basis of 0CALMONTH. During the design phase we partitioned them for 2 years (i.e. 26 partitions).
We now have to increase the number of partitions, but the cubes already contain the past 2 years of data. Can anyone please suggest how to overcome this issue?
Note : We are on BW3.0b and database is Oracle 9i.
Thanks in advance
Ajay
Hi Ajay,
I have an idea:
1. Keep the cube (say A) as it is, since you cannot increase its partitions.
2. Create another cube, say B, in which all data from now on will be stored.
3. Create a MultiCube (C) over A and B.
4. Have the reports use an ABAP routine (before or after aggregation, depending on the business requirement) that, based on the selection criteria, determines whether the data sits in A or B.
Advantages:
- no need to create a new cube or copy the cube data
Disadvantages:
- you need logic for the look-up, i.e. how to find the right request or cube
This has been implemented before, for example in SAP PT Proto.
Take care
Kumar Vuppala
I think you'll find copying the cube a fairly fast process. Not sure how many millions of rows you're talking about, and it is system dependent, but unless you're talking 50 million+, I bet you could do it in a day.
You can also look at dropping indexes and enabling number range buffering when loading to the copy; both will help.
The reload from the copy to the prod cube will probably be even faster.
Also, if you have not compressed the current prod cube, compress it first (specifying zero elimination, depending on the data, can also cut volume) and cut the time down for this process.
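To make the compression advice above concrete: compression ("collapse") aggregates the request-level rows of the F fact table into the E fact table, and zero elimination additionally drops result rows where every key figure is zero. A rough SQL sketch of the effect, with hypothetical table and column names (BW does all of this internally; this is only an illustration, not something to run against a real system):

```sql
-- Hypothetical, simplified fact tables: dimension IDs plus one key figure.
-- Compression aggregates F-table rows across all requests into the E table:
INSERT INTO e_fact (dim_time, dim_prod, amount)
SELECT dim_time, dim_prod, SUM(amount)
  FROM f_fact
 GROUP BY dim_time, dim_prod
HAVING SUM(amount) <> 0;   -- "zero elimination": skip all-zero result rows
```

On a 300-million-row cube this can cut the row count substantially, which directly shortens the copy/reload and statistics runs discussed here.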
Hi All,
Thanks for the great responses.
The workarounds you have provided will help me re-analyse the different ways of solving the problem.
It does seem that SAP provides no way to increase the partitions of an existing cube.
The cubes we are discussing contain more than 300 million rows (E table), and on average around 15 million rows are loaded into the cube per month.
While running a DB statistics job we found that it analyses each and every partition of the cube, which was really time consuming.
With these aspects in mind we partitioned for 2 years (i.e. 26 partitions) during the design phase itself, and now we face the problem of extending that.
Thanks and best regards
Ajay
Considering the amount of data to be managed over two years, let me suggest splitting the data across different InfoCubes viewed through a MultiCube: DB statistics and compression should be faster and can be run in parallel (different InfoCubes...).
Modifying the InfoCubes will also be easier (adding a characteristic, etc.).
The price to pay is the maintenance effort at the back end, but with a better result at the front end.
For more detailed information, check out the "Performance" documents at http://service.sap.com/bw under InfoIndex.
Best regards
GFV
I don't know what release of Oracle you are running, but look at turning on monitoring. You can set a threshold for the percentage change in the number of rows in a table (or partition, in this case), and it will only be analyzed when the number of rows added exceeds that threshold. That way the old partitions will not keep being analyzed. Another option would be to develop your own analyze script/process and only analyze the active partitions.
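A sketch of both options above, assuming the E-fact table is called /BIC/EZAJAYC and lives in schema SAPR3 (both names hypothetical; check your system, and clear any DB changes with SAP/Basis first):

```sql
-- Option 1: let Oracle track DML volume, then re-analyze only stale objects.
ALTER TABLE "/BIC/EZAJAYC" MONITORING;

BEGIN
  -- In Oracle 9i an object counts as stale once roughly 10% of its rows
  -- have changed, so unchanged old partitions are skipped.
  DBMS_STATS.GATHER_SCHEMA_STATS(
    ownname => 'SAPR3',
    options => 'GATHER STALE',
    cascade => TRUE);
END;
/

-- Option 2: a home-grown job that analyzes only the active partition.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname     => 'SAPR3',
    tabname     => '/BIC/EZAJAYC',
    partname    => 'PART_200501',   -- current month's partition (assumed name)
    granularity => 'PARTITION',
    cascade     => TRUE);
END;
/
```

Either way, the statistics run stops touching all 26 partitions every time, which was the time sink described earlier in the thread.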
Do you compress the 300-million-row table? And do you use zero elimination when you do?
IMHO, creating a MultiCube in this case only makes sense if:
- queries regularly run across years (assuming you would create cubes by year(s)), so that they take advantage of parallel processing;
- the volume of data returned to the OLAP processor is modest, since if more than 30,000 rows are transferred, the OLAP processor terminates parallel processing anyway (there is some change to this in 3.1 SP19);
- you don't have a boatload of queries to redo. MultiCubes from the beginning do provide some query flexibility for tolerating cube changes, but most shops don't have the time to go back and redo lots of queries that already exist.
I would create a copy of the InfoCube, fill it from the original cube via an export DataSource, clear the original cube, increase the partitions, and load the data back. As far as I know there is no standard way to expand the partitions, and I would advise against meddling with table entries or DB commands.
Best regards
Dirk
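For background on why the copy-and-reload route is the safe one: at the database level, BW generates a range-partitioned E-fact table keyed on the 0CALMONTH value, roughly like the simplified, hypothetical sketch below. Extending the range by hand would mean splitting the catch-all partition month by month, outside BW's control, which is exactly the DB-level meddling advised against above:

```sql
-- Simplified stand-in for a BW E-fact table partitioned by 0CALMONTH.
CREATE TABLE e_fact_demo (
  sid_0calmonth NUMBER(10),
  amount        NUMBER(17,2)
)
PARTITION BY RANGE (sid_0calmonth) (
  PARTITION p200412 VALUES LESS THAN (200501),
  PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);

-- Extending the range manually: split the catch-all partition so that
-- January 2005 gets a partition of its own.
ALTER TABLE e_fact_demo SPLIT PARTITION pmax AT (200502)
  INTO (PARTITION p200501, PARTITION pmax);
```

The split works at the Oracle level, but BW's data dictionary knows nothing about it, which is why the supported path is to rebuild the cube with the partitioning you want and reload.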
Even if you could get around the problem (and in my view there is no standard way, so it would be hard to get support from SAP in case of disaster), you would still be working with "low-level" partitioning. Combining this technique with "high-level" partitioning (a MultiCube) may give better results (e.g. queries running in parallel). You could also combine two partitioning criteria: one for the low level and one for the high level.
Did you consider such an option?
Hope it helps
GFV
Hi Ajay,
Another idea: create a copy of the cube, partition it for the coming years, and join both cubes under the same MultiProvider. Then copy the queries to the MultiProvider and switch the update rules to the new cube. From then on, all data is loaded into the new cube and the old one is left as it is.
Best regards
Dirk