Can someone explain one thing? When we import data into a BPC cube (for example, via Import from an NW InfoProvider), BPC generates far too many unnecessary records. For example, there is one record in BPC:
and one record in the BW cube:
After we run the import package, it generates 2 records in BPC:
I mean, even if the new record is equal to the existing one in all fields, BPC first generates a negative record to zero out the value in the cube, and then writes the new record. An identical record!
10 imports of the same data produce 21 records in the cube, yet the net value is still 1000. So ideally we need to compress the data in the cube (run Light Optimize) after every import into BPC? Otherwise it dramatically degrades performance.
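To make the record growth concrete, here is a small sketch (not BPC's actual code, just a simulation of the observed behavior) where each import posts a reversal record plus a new record instead of overwriting in place. The cube is modeled as a plain list of (key, value) tuples; all names are hypothetical.

```python
def import_record(cube, key, value):
    """Simulate one import: post a reversal of the current net value,
    then post a new record with the imported value."""
    current = sum(v for k, v in cube if k == key)
    if current != 0:
        cube.append((key, -current))   # reversal record to zero out the key
    cube.append((key, value))          # new record, even if the value is identical

cube = [("ACCOUNT1", 1000)]            # one existing record in the cube
for _ in range(10):                    # import the same value ten times
    import_record(cube, "ACCOUNT1", 1000)

print(len(cube))                                    # 21 physical records
print(sum(v for k, v in cube if k == "ACCOUNT1"))   # net value still 1000
```

This reproduces the numbers above: 1 original record plus 2 per import gives 21 physical records, while the queryable net value never changes. Compression (Light Optimize) would collapse these back to a single record per key.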
Is that normal? How can we avoid this?