I am new to this forum and new to BW as well. I just started on a project and have been assigned the task of designing a data model for loading historic data containing more than 500 million records, spanning more than five years. As I understand it, following the normal data load procedure could take as long as a week. Please give me some detailed guidance on how I should proceed in this scenario. My PM says I have to address all data load and query performance issues in the design of the data warehouse. I have heard that partitioning the cube by year can help with performance, but I don't know for sure where I should start.
Any help is appreciated.