What are the best practices of the weight / maximum records to a BW cube

Former Member
0 Kudos

Dear experts,

The data in our cubes will grow significantly, so we need to know the best practices for BW cube sizing: size on disk, maximum record count, number of dimensions, etc.

Our current configuration is:

  • BW: 7.31
  • Database: SQL Server 2008
  • Cubes: 50 million rows, 10 dimensions, around 10 GB

We have found only one piece of documentation: http://help.sap.com/saphelp_nw73/helpdata/en/d1/468956248e4d9ca351896d54ab3a78/frameset.htm

What are the best practices? And if we upgrade to NetWeaver 7.4 or 7.5 with SQL Server 2016, do the same recommendations apply?

Regards,

Goulwen

Accepted Solutions (1)

FCI
Active Contributor
0 Kudos

As I said, there are no real best practices here. I've seen cubes with more than 100 million records. It all depends on the performance of your queries (which is linked to the query structure, the data volume, your hardware, your system parameters, etc.) and the expectations of your users.

Is query performance currently acceptable?

That said, a volume of 50 million records is a good point at which to start thinking about an archiving/deletion/partitioning strategy.
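The thread's own figures (50 million records today, roughly 150 million expected in two years) make it easy to sketch when such a strategy becomes urgent. A minimal sketch, assuming compound annual growth; the 100-million review threshold below is only an illustration, not an SAP recommendation:

```python
# Hypothetical sizing sketch using the figures from this thread:
# 50M records today, ~150M expected in two years.

def years_to_threshold(current, annual_growth_rate, threshold):
    """Return the number of whole years until `current` records,
    growing at `annual_growth_rate` per year, exceed `threshold`."""
    years = 0
    while current < threshold:
        current *= 1 + annual_growth_rate
        years += 1
    return years

# 50M -> 150M in 2 years implies (150/50)**(1/2) - 1, about 73% compound
# growth per year.
growth = (150_000_000 / 50_000_000) ** 0.5 - 1

# Years until an illustrative 100M-record review threshold is crossed:
print(years_to_threshold(50_000_000, growth, 100_000_000))
```

At that growth rate the illustrative threshold is crossed within two years, which supports planning the partitioning/archiving strategy now rather than after performance degrades.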

Regards,

Frederic

Former Member
0 Kudos

I understand: there are no fixed best practices; it depends on several parameters, such as hardware.

Yes, query performance is still acceptable for now.

Regards,

Goulwen

Answers (3)

FCI
Active Contributor
0 Kudos

In that case, I would monitor the performance of the queries on this cube and take action if a degradation occurs or is expected to occur soon.

If you ever need to make modifications to your cube, that would be a good time to think about partitioning it. If you have a MultiProvider on top of it, this should not be too difficult.

Regards,

Frederic

Former Member
0 Kudos

OK, that's clear.

Thanks,

Goulwen

former_member778252
Discoverer
0 Kudos

Hi Frederic,

But if this 7.3 cube is migrated to 7.5 and reaches 150 million records, how will partitioning affect query performance, especially if someone runs a query that reads across all years or partitions?

Thanks in advance,

Alberto

FCI
Active Contributor
0 Kudos

A MultiProvider read is executed on each of its components in parallel.
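Outside BW, the effect of this fan-out can be sketched with a toy concurrent read: a query crossing all years costs roughly the time of the slowest component, not the sum of all of them. The partition names and timings below are invented purely for illustration:

```python
# Illustration only: a MultiProvider-style read fans out to each component
# (e.g. one yearly partition per cube) and runs them concurrently, so a
# query crossing all years costs about one partition read, not four.
import time
from concurrent.futures import ThreadPoolExecutor

def read_partition(year):
    """Stand-in for reading one yearly partition (names/timings made up)."""
    time.sleep(0.1)  # pretend each partition read takes ~100 ms
    return f"rows for {year}"

years = [2014, 2015, 2016, 2017]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(years)) as pool:
    results = list(pool.map(read_partition, years))
elapsed = time.perf_counter() - start

print(results)        # one result set per yearly partition
print(elapsed < 0.4)  # well under the 4 x 100 ms a sequential read would take
```

This is why a cross-year query over a logically partitioned model is usually not the worst case one might fear: the per-partition reads overlap instead of queuing up.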

FCI
Active Contributor
0 Kudos

Hello Goulwen,

In the end, it all depends on the current performance of your queries, on your expectations, and on the estimated growth of your cube.

Regards,

FCI

Former Member
0 Kudos

Hello Frédéric,

Yes, I think so too. To be precise, today the cube holds 50 million records, and we expect it to hold 150 million in two years. Isn't that too much? We are looking for feedback from other customers: how many rows can we load into a single cube?

Thanks,

Goulwen

shanthi_bhaskar
Active Contributor
0 Kudos
Former Member
0 Kudos

Hello Shanthi,

We already know that procedure.

What we need to know now is the record limit per cube.

Do you know of any best practices on that, for example X million records per cube?

Thanks,

Goulwen