I'm running a BPC 7.5 MS 3-server environment: DB, MSAS, and Web, all on Windows Server 2008 R2 / SQL Server 2008 R2.
We have over 700 million rows in our fact table. Performance was suffering, so we looked at the OLAP partitions. To reduce each partition's row count to around 20 million, we have to slice the data by year, dept, and sub-dept, which results in 300 partitions.
After creating the 300 partitions, processing the Retail application has now run for over 72 hours. Prior to this change we had 15 partitions, sliced by dept, and processing took 8 hours or less.
I have the following questions:
I welcome any additional guidance and experience related to the number of partitions, processing performance, and partitioning approaches for larger data sets.
Are 300 partitions too many?
Can I partition the fact table? Currently there is only one long-term storage table: tblFact&lt;AppSet&gt;. Can I create tblFact&lt;AppSet&gt;2010, tblFact&lt;AppSet&gt;2011, etc., and manage optimizations and data loads against the appropriate fact table for each year's data?
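For context on the last question: rather than maintaining separate yearly tables by hand, SQL Server 2008 R2 also supports native table partitioning, which keeps a single logical table while physically splitting rows by a key. Below is a minimal, hypothetical sketch only, not a recommendation specific to BPC. The object names, the integer date key, and the single-filegroup layout are all assumptions; a real BPC fact table keys time through a TIMEID member column, so the actual column and type would differ, and any change to a BPC-managed table should be validated against SAP's guidance first.

```sql
-- Hypothetical sketch: native partitioning by year instead of separate
-- tblFact<AppSet>2010, tblFact<AppSet>2011, ... tables.
-- All names and the int date key are placeholders.

-- 1. Partition function: one boundary value per year.
CREATE PARTITION FUNCTION pfFactByYear (int)
AS RANGE RIGHT FOR VALUES (20100101, 20110101, 20120101);

-- 2. Partition scheme; ALL TO one filegroup keeps the sketch simple,
--    but separate filegroups per year are common in practice.
CREATE PARTITION SCHEME psFactByYear
AS PARTITION pfFactByYear ALL TO ([PRIMARY]);

-- 3. Rebuild the clustered index on the scheme so rows are physically
--    split by year while queries still see one table.
CREATE CLUSTERED INDEX IX_tblFactAppSet_DateKey
ON tblFactAppSet (DateKey)
WITH (DROP_EXISTING = ON)
ON psFactByYear (DateKey);
```

With this in place, loads and index maintenance can target individual partitions (e.g. via partition switching), which gives much of the benefit of per-year tables without changing the table name the application queries.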