I have joined a BPC project that a partner has implemented.
The BPC version is BPC 10.0 NW.
Their design stores about 3 billion records in a single BPC model.
We have serious performance concerns about this design: we have delivered several BPC projects in Japan, but
we have never handled data volumes this large.
We are investigating ways to improve performance without splitting the BPC model.
I would like to ask three questions:
1. Is an SPO (Semantically Partitioned Object) effective and useful in this case?
(Can an SPO be used for a BPC model?)
2. Are there any other methods to improve performance (e.g., logical partitioning)?
3. Are there any past projects in which a single model held 3 billion records?
In general, should the cube (model) be divided in such cases?