At the moment I have a bigger problem with the DTP package size. I have some large PSA requests; one request has a size of around 250 million records. I know that is very large and that it is recommended to reduce the size before extraction in SBIW, but in this case I can't change that. I don't think this is the core problem, though. The individual data packages have a natural size of 80k to 100k records. After the PSA, the data target (a classic InfoCube) is filled via DTP. The ETL process goes through a longer transformation chain with 3 InfoSources and corresponding transformations. These steps are my core problem, because I get a temporary multiplication of data records per package.
This means one package with 80k to 100k records grows to around 700k records because of the transformation logic; after all the transformation logic is done, the package shrinks back to about 20k records, which are then written into the InfoCube.
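To illustrate why the temporary blow-up hurts, here is a rough back-of-the-envelope estimate of the internal-table memory per package. The record counts are from above; the 200-byte row width is purely an assumption for illustration, not a measured value:

```python
# Rough memory estimate for the temporary record multiplication.
# ROW_BYTES is an assumed average record width, not a measured value.
ROW_BYTES = 200

def package_memory_mb(records, row_bytes=ROW_BYTES):
    """Approximate memory needed to hold one package in an internal table."""
    return records * row_bytes / 1024 / 1024

incoming = package_memory_mb(100_000)  # package as delivered from the PSA
peak     = package_memory_mb(700_000)  # after the multiplying transformation
final    = package_memory_mb(20_000)   # what is actually written to the cube

print(f"incoming ~{incoming:.0f} MB, peak ~{peak:.0f} MB, final ~{final:.0f} MB")
```

So even under a modest row-width assumption, the peak is roughly seven times the incoming package, and with several packages processed in parallel the roll area fills up quickly.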
The problem is that I now get a short dump in ST22 and a red request: TSV_TNEW_PAGE_ALLOC_FAILED.
I can restart the request manually, but it doesn't run without problems. The fact is that I can't build another ETL process. This enlargement exists because the data transferred from the source system was reduced in the source, and on the BW side we need to expand it again to get monthly values for customer reporting. In the source system we aggregate the values from month to calendar year so that we don't transfer so much data into BW.
Setting the DTP package size has no effect because of the source package size in the PSA (so whether I set a 10k or 50k package size in the DTP, it is ignored). Now I'm looking for a possible solution to reduce the package size with coding, or for other ways to reduce the memory allocation.
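Since the package size handed to the transformation is dictated by the PSA request, the general pattern I am thinking about is to stream the expanded records in fixed-size slices instead of holding the full ~700k-record table at once. Sketched here in Python purely for illustration; in BW this logic would live in an ABAP routine, and the chunk size and function names are assumptions, not any SAP API:

```python
# Generic sketch: process an inflated package in fixed-size slices so the
# whole expanded table never sits in memory at once. CHUNK_SIZE and
# process() are illustrative placeholders, not SAP APIs.
from itertools import islice

CHUNK_SIZE = 50_000  # assumed slice size; would be tuned to available memory

def in_chunks(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def process(records):
    """Placeholder for the aggregating transformation step."""
    return records  # e.g. aggregate values back down before the final write

total = 0
for chunk in in_chunks(range(700_000), CHUNK_SIZE):
    total += len(process(chunk))

print(total)  # all records processed, but at most CHUNK_SIZE held at once
```

Is something along these lines feasible in a transformation routine, or is there a better lever (DTP settings, process chain design) to keep the peak allocation down?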
Does anybody have an idea for this case? I work with BW 7.4 on a HANA database.