
Increasing the partitions of an InfoCube

Former Member

Hi All,

The InfoCubes in our production environment are partitioned on the basis of 0CALMONTH. During the design phase we partitioned them for 2 years (i.e. 26 partitions).

We now have to increase the number of partitions. The cubes already contain data for the past 2 years. Can anyone please suggest how we can overcome this issue?

Note: We are on BW 3.0B and the database is Oracle 9i.
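For reference, the current partition layout of the E fact table can be checked directly in the Oracle dictionary. A minimal sketch, assuming the schema owner is SAPR3 and the cube's E fact table is /BIC/EZIC_SALES (both placeholder names, substitute your own):

-- Lists the existing range partitions of the E fact table with their
-- boundary values and row counts (placeholder owner/table names).
SELECT partition_name,
       high_value,
       num_rows,
       last_analyzed
  FROM dba_tab_partitions
 WHERE table_owner = 'SAPR3'
   AND table_name  = '/BIC/EZIC_SALES'
 ORDER BY partition_position;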

Thanx in advance

Ajay

Accepted Solutions (0)

Answers (3)


Former Member

Hi Ajay,

I have an idea:

1. Keep the existing cube (say A) as it is, since you cannot increase its partitions.

2. Create another cube (say B) in which all data from now on will be stored.

3. Create a MultiCube (C) over A and B.

4. In the reports, use an ABAP routine (running before or after the query, depending on the business requirement) that decides, based on the selection criteria, whether the data has to be read from A or from B (a rough sketch follows below).

Advantages:

No need to copy or reload the existing cube's data.

Disadvantages:

You need logic for deciding which cube (or which request) to look up.

Who has implemented this before: for example, an SAP PT prototype.
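To make the routing idea a bit more concrete, here is a toy SQL sketch. The table and column names (cube_a_fact, cube_b_fact, calmonth, amount) are made up and this is not the SQL BW actually generates; it only shows that, with A holding history and B holding new data, the 0CALMONTH selection already determines which part delivers rows:

-- Toy model: two stand-in fact tables, one per cube.
-- Assumed split: cube_a_fact holds data up to 200512,
-- cube_b_fact holds everything from 200601 onwards.
-- For this selection the first branch (CUBE_A) contributes no rows,
-- the second branch (CUBE_B) returns the requested months.
SELECT 'CUBE_A' AS part_provider, calmonth, SUM(amount) AS amount
  FROM cube_a_fact
 WHERE calmonth BETWEEN '200601' AND '200612'
 GROUP BY calmonth
UNION ALL
SELECT 'CUBE_B' AS part_provider, calmonth, SUM(amount) AS amount
  FROM cube_b_fact
 WHERE calmonth BETWEEN '200601' AND '200612'
 GROUP BY calmonth;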

take care

Kumar Vuppala

Former Member

I think you'll find copying the cube a fairly fast process. I'm not sure how many millions of rows you're talking about, and it is system dependent, but unless you're talking 50 million+, I bet you could do it in a day.

You can also look at dropping indexes and turning on number range buffering when loading to the copy; both will help.

The reload from the copy to the prod cube will probably be even faster.

Also - if you have not compressed the current prod cube, compress it first (specifying zero elimination, depending on the data, can also cut volume) and cut down the time for this process.

Former Member

Hi All,

Thanx for great responses.

The workarounds you have provided will help me re-analyse the different ways of solving the problem.

So it does mean that SAP does not provide any way to increase the partitions of existing cubes.

The cubes we are discussing contain more than 300 million records (E table). On average, around 15 million records are loaded into the cube per month.

While running the DB statistics activity we found that it analyses each and every partition of the cube, which is really time consuming.

Considering these aspects, we partitioned for 2 years (i.e. 26 partitions) during the design phase itself, and now we are facing a problem extending it.

Thanx and Best Regards

Ajay

GFV
Active Contributor

Considering the amount of data to be managed in two years, let me suggest splitting the data into different InfoCubes that are reported on through a MultiCube: DB statistics and compression should be faster and can be run in parallel (different InfoCubes ...).

Modification of the InfoCubes will also be easier (adding a characteristic ...).

The price to pay is the maintenance effort at the back-end level, but with a better result at the front-end level.

For more detailed information, check out the "Performance" documents at http://service.sap.com/bw under InfoIndex.

Best regards

GFV

Former Member

I don't know what release of Oracle you are running, but look at turning on monitoring. You can set a threshold for a percentage change in the number of rows in a table (or partition in this case) and it will only be analyzed when the number of rows added exceeds the threshold. That way the old partitions will not keep being analyzed. Another option would be to develop your own analyze script/process and only analyze the active partitions.
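To illustrate at the Oracle level (a sketch only; in a BW system statistics are normally handled by the standard statistics jobs, and the owner SAPR3, the table /BIC/EZIC_SALES and the partition P200503 below are placeholder names):

-- 1) Switch on modification monitoring so Oracle tracks DML per (sub)partition.
ALTER TABLE "SAPR3"."/BIC/EZIC_SALES" MONITORING;

-- 2) The tracked changes can be inspected before deciding what to analyze.
SELECT table_name, partition_name, inserts, updates, deletes
  FROM dba_tab_modifications
 WHERE table_owner = 'SAPR3'
   AND table_name  = '/BIC/EZIC_SALES';

-- 3) Refresh statistics only for objects whose row counts changed
--    significantly (roughly 10% or more in this Oracle release).
BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(
    ownname => 'SAPR3',
    options => 'GATHER STALE',
    cascade => TRUE);
END;
/

-- 4) Or, as a "do it yourself" variant, analyze just one active partition.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SAPR3',
    tabname          => '/BIC/EZIC_SALES',
    partname         => 'P200503',
    granularity      => 'PARTITION',
    estimate_percent => 10,
    cascade          => TRUE);
END;
/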

Do you compress the 300 million row table? Do you use zero elimination when you do?

IMHO, creating a MultiCube in this case only makes sense if:

- the queries regularly run across years (assuming you would create cubes by year(s)), so that they can take advantage of parallel processing

- the volume of data returned to the OLAP processor is modest, since if more than 30,000 rows are transferred, the OLAP processor will terminate parallel processing anyway (some change to this in 3.1 SP19)

- you don't have a boatload of queries to redo. MultiCubes from the beginning do provide some query flexibility for tolerating cube changes, but most shops don't have the time to go back and redo lots of queries that already exist.

Former Member

I would create a copy of the InfoCube, fill it from the original cube via the export DataSource, clear the original one, increase the partitions and load the data back. As far as I know there is no standard way to expand the partitions, and I would frown upon meddling with table entries or DB commands directly.

Best regards

Dirk

Former Member

Thanks Dirk,

We came across this solution but would keep it as the last option, as our cubes currently contain millions of records and hence this activity would be time consuming.

Please let me know whether we can do this kind of activity on the same cube.

Thanks again

Ajay

GFV
Active Contributor

Even if you could get around the problem (and in my opinion there is no standard way, so it would be hard to get support from SAP in case of disasters), you would still be working only with "low" level partitioning. Maybe combining this technique with "high" level partitioning (MultiCube) will give better results (e.g. queries running in parallel ...). You could also combine the two partitioning criteria: one for the low level and one for the high level ...

Did you consider such an option?
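To make the terminology concrete: "low" level here means the Oracle range partitioning of the fact table itself, roughly like the simplified sketch below (made-up table/column names, not the exact DDL BW generates), while "high" level means splitting the data across several InfoCubes under a MultiCube:

-- Simplified illustration of "low" level partitioning: a range-partitioned
-- fact table on the 0CALMONTH partitioning column, one partition per month
-- plus an overflow partition (made-up names).
CREATE TABLE cube_a_fact (
  calmonth VARCHAR2(6),
  amount   NUMBER
)
PARTITION BY RANGE (calmonth) (
  PARTITION p200401 VALUES LESS THAN ('200402'),
  PARTITION p200402 VALUES LESS THAN ('200403'),
  -- ... one partition per month ...
  PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);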

Hope it helps

GFV

Former Member

Hi Ajay,

Another idea: create a copy of the cube, partition it for the next years and join both cubes under the same MultiProvider. Then copy the queries to the MultiProvider and switch the update rules to the new cube. From then on you fill all new data into the new cube and leave the old one as it is.

Best regards

Dirk