
Huge data volume for a BPC cube on HANA

Sep 28, 2017 at 10:33 AM


Hi experts,

We have BPC classic (10.1 SP10) on HANA DB.

On one of our cubes we currently have about 1.5 billion data volume, and we cannot find anyone who can tell us whether there is a data volume limit.

Do you know the maximum number of records for one model, and is it possible in BPC classic to use MultiProviders to split our model into multiple semantic models?

Many thanks for your answers

Stéphane


6 Answers

Best Answer
Vadim Kalinin Sep 28, 2017 at 01:29 PM

"We do not have any issues at this time" - then keep it as it is!

"5 years data" - older data can be stored in the backup environment!

Gersh Voldman Sep 30, 2017 at 04:21 PM

Hi Stéphane,

There is no BPC limitation on the number of records in a model, but there is a HANA limitation of 2 billion records in a single table. If somebody is monitoring the HANA statistics, they would see a warning when this table gets dangerously close to that limit.
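If you want to check that count directly rather than wait for an alert, here is a minimal sketch using the hdbcli Python client and the M_CS_TABLES monitoring view; the connection details and the schema/fact-table names are placeholders, since the real names depend on your environment and model:

from hdbcli import dbapi  # SAP HANA Python client (pip install hdbcli)

HANA_ROW_LIMIT = 2_147_483_648  # 2^31 rows per column-store table (or per partition)

# Placeholder connection details - adjust for your system.
conn = dbapi.connect(address="hanahost", port=30015, user="MONITOR_USER", password="***")
cur = conn.cursor()

# Hypothetical schema and fact-table names.
cur.execute(
    "SELECT SUM(RECORD_COUNT) FROM M_CS_TABLES "
    "WHERE SCHEMA_NAME = ? AND TABLE_NAME = ?",
    ("SAPBPC", "/B28/FZBPCSALES"),
)
total_rows = cur.fetchone()[0] or 0
print(f"{total_rows:,} rows = {total_rows / HANA_ROW_LIMIT:.0%} of the 2 billion row limit")
cur.close()
conn.close()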

Do you need to plan on all of those 1.5 billion records, or are some of them there just for reporting? Records used for reporting only can be moved to a separate cube under the same MultiProvider and reported with BW tools.

Regards,

Gersh


Hi Gersh,

Thanks for your answer. As Bilen said below, we have partitioned our fact table to bypass the HANA limitation of 2 billion records, and with that plus a HANA resizing we hope to keep working under good conditions.

Kind regards

Stéphane

Bilen Cekic Oct 03, 2017 at 02:55 AM

After about 2.1 billion rows of data in the model, you have to partition the fact table or you will get an error when new rows are created. My client has 32 billion rows in their sales model for planning data and everything is working fine.
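For reference, the repartitioning itself boils down to a HANA hash-partitioning statement like the sketch below; the table and column names are made up for illustration, and in a live BW/BPC system you would normally let the BW administration tools generate and run this rather than issuing raw SQL yourself:

from hdbcli import dbapi  # SAP HANA Python client

# Placeholder connection and object names - purely illustrative.
conn = dbapi.connect(address="hanahost", port=30015, user="DBA_USER", password="***")
cur = conn.cursor()

# Hash-partition the (hypothetical) fact table on one of its dimension key
# columns so that each partition stays well under the ~2.1 billion row limit.
cur.execute(
    'ALTER TABLE "SAPBPC"."/B28/FZBPCSALES" '
    'PARTITION BY HASH ("KEY_ZBPCSALESP") PARTITIONS 4'
)
cur.close()
conn.close()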


Thanks for your answer. We performed the partitioning and we are now going to resize our HANA server, so I think it is going to be fine.

Just out of curiosity, could you tell me what size of HANA DB you have for 32 billion records?

Kind regards


Hi Stephane,

Disk size is 4.7 TB but only half is used. Memory size is 1.7 TB, of which 1.2 TB is used. I did some maintenance on the DB by moving unnecessary data to archive tables and unloading those tables from memory. Currently sales has 10+ billion records, and I can see from the DB02 tcode which partition has how many records.
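For anyone without GUI access, roughly the same per-partition breakdown that DB02 shows can be read from the M_CS_TABLES monitoring view; a small sketch, again with placeholder connection details and object names:

from hdbcli import dbapi

conn = dbapi.connect(address="hanahost", port=30015, user="MONITOR_USER", password="***")
cur = conn.cursor()

# M_CS_TABLES returns one row per partition (PART_ID 0 = not partitioned).
cur.execute(
    "SELECT PART_ID, RECORD_COUNT, MEMORY_SIZE_IN_TOTAL FROM M_CS_TABLES "
    "WHERE SCHEMA_NAME = ? AND TABLE_NAME = ? ORDER BY PART_ID",
    ("SAPBPC", "/B28/FZBPCSALES"),
)
for part_id, rows, mem_bytes in cur.fetchall():
    print(f"partition {part_id}: {rows:,} rows, {mem_bytes / 1024**3:.1f} GiB in memory")
cur.close()
conn.close()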


Hi Bilen,

Many thanks for this helpful answer ;)

Vadim Kalinin Sep 28, 2017 at 12:33 PM

"1,5 billion data volume" - billion of what?

"possibility on BPC classic to use the multi-providers to cut our model in multi semantic models." - not possible!

Do you have any issues?

Do you think about archiving some old data?

Stéphane GASA Sep 28, 2017 at 01:26 PM

Hi Vadim,

Thanks for your response; it is 1.5 billion records.

We do not have any issues at this time, but our monthly data volume is about 80 to 100 million records and our top management wants to keep 5 years of data, so I don't know whether 5 billion records is possible in only one fact table?
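As a quick back-of-the-envelope check of that projection (pure arithmetic, assuming the upper end of our monthly volume and the 2 billion row per table/partition limit mentioned in this thread):

import math

monthly_rows = 100_000_000           # upper end of 80-100 million records per month
months = 5 * 12                      # 5 years of history
projected_rows = monthly_rows * months          # 6,000,000,000 rows

per_partition_limit = 2_147_483_648  # 2^31 rows per table or per partition in HANA
partitions_needed = math.ceil(projected_rows / per_partition_limit)

print(f"projected volume: {projected_rows:,} rows")
print(f"hash partitions needed at minimum: {partitions_needed}")  # 3

If the 2 billion per-partition figure is right, a single unpartitioned fact table would not be enough on its own.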

Many thanks for your help.

Stéphane

Stéphane GASA Sep 28, 2017 at 01:39 PM

Yes, but I saw someone saying there is a limitation at 2 billion records, so I prefer to verify before crashing our system.

5 years of data is the lower limit for our customers, so it is difficult to tell them that they can only access 1 or 2 years of data, because when we chose BPC on HANA it was first of all because SAP told us there is no problem with huge volumes of data :/


No issues with huge data volumes; you just need to keep the hardware sized for the volume required.


Thanks for your help. A sizing service is planned under our SAP MaxAttention contract, so I think this point is going to be checked.

Many thanks for your answers.
