
[Urgent] Session terminates during DS enhancement due to big data

Former Member
0 Kudos

Hi experts,

I have a DataSource enhancement on 0FI_GL_4. The business requirement is complicated, but we have already fully optimized the ABAP code. Our R/3 side has 2.5 million records in BSEG, and if we start the BW initial extraction, the session uses more than 2 GB of memory (checked via ST06 - Top CPU) after extracting 0.5 million records, at which point the system automatically terminates it!

The problem is that I cannot write better code for this requirement. Is there any workaround to split the initialization?

Regards,

Aaron

Accepted Solutions (1)


Former Member
0 Kudos

That was a good one from Anil.

Also, you may want to look at changing the size of the data package so that each extraction package carries less data. This can be set up in the source system via SBIW -> Maintain Control Parameters for Data Transfer. This might help you reduce the memory needed per package.
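The reasoning behind the package-size setting can be sketched as follows. This is a minimal, hypothetical Python illustration (not SAP code; `extract_in_packages` is an invented name) of why peak memory is bounded by the package size rather than by the total number of records:

```python
def extract_in_packages(rows, package_size):
    """Yield records in fixed-size packages so that only one
    package is held in memory at a time. Illustrative sketch of
    paginated extraction; the real BW extractor does this
    internally based on the SBIW control parameters."""
    package = []
    for row in rows:
        package.append(row)
        if len(package) == package_size:
            yield package
            package = []
    if package:  # flush the final, possibly smaller package
        yield package

# Peak memory is proportional to package_size, not total row count:
# 25 records with package_size=10 gives packages of 10, 10 and 5.
packages = list(extract_in_packages(range(25), 10))
```

The same total volume still flows through, but the working set at any moment is one package.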

Former Member
0 Kudos

Hi Avinash,

Thank you, but I don't quite get it: is there any difference between 10,000 records/package * 100 packages and 20,000 records/package * 50 packages? Also, I'm not familiar with these IMG settings; I would really appreciate more guidance or links.

Regards,

Aaron

Former Member
0 Kudos

Yes, indeed. When you set this value, the extractor sends only about 10,000 KB of data per package, so a single process would not consume 2 GB of memory. One more recommendation: load into the PSA first and then update the data targets subsequently.

You can refer to OSS Note 417307 for more details.

Former Member
0 Kudos

Hi Avinash,

Could you explain the relationship between package size and performance on both the R/3 and BW sides? Should I enlarge or reduce this value? Is it set per DataSource, or does it apply to all of them?

Also, what's the difference between parallel and subsequent updates of the PSA and data targets? I have 150 packages to load into an ODS in my case.

Thanks for your help; points will be given.

Regards,

Aaron

Former Member
0 Kudos

Package sizing depends on many factors; reducing the size can also cause performance problems.

This change will affect all DataSources. My advice would be to change it for the init transfer and then change it back to its earlier value afterwards, but you will need to sit with Basis to size the data packets properly.

With subsequent processing, the transfer happens as:

R/3 -> PSA -> Data Target

With "PSA and Data Target in Parallel", it would be:

R/3 -> Data Target

R/3 -> PSA

So both would be updated in parallel.

Avinash
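The difference between the two update modes above can be sketched in plain Python. This is an illustrative model only (the function names and list-based "PSA"/"target" are invented for the example, not SAP APIs):

```python
def load_subsequently(packages, psa, target):
    """Subsequent processing: all packages land in the PSA first,
    then the data target is filled from the PSA in a second step."""
    for pkg in packages:
        psa.extend(pkg)
    for rec in psa:
        target.append(rec)

def load_in_parallel(packages, psa, target):
    """Parallel update: each incoming package is written to the PSA
    and to the data target in the same pass."""
    for pkg in packages:
        psa.extend(pkg)
        target.extend(pkg)

# Both modes end with the same data in PSA and target; they differ
# in when the target is updated relative to the PSA.
psa_a, target_a = [], []
load_subsequently([[1, 2], [3]], psa_a, target_a)

psa_b, target_b = [], []
load_in_parallel([[1, 2], [3]], psa_b, target_b)
```

The practical difference is timing and error handling: with subsequent processing, a failed target update can be repeated from the PSA without re-extracting from R/3.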

Answers (2)


Former Member
0 Kudos

Thank you all. I have gotten through the hard time. Maybe I will write a blog about this case.

Former Member
0 Kudos

Hi,

You had better go with multiple delta inits instead of one delta init.

With rgds,

Anil Kumar Sharma .P
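The idea of multiple delta inits is to partition the data source into disjoint selection sets (for example by company code), each with its own init request, so no single init has to carry the full 2.5 million records. A minimal sketch, assuming a simple round-robin partition (the function name and partitioning scheme are invented for illustration; in practice the selections are maintained in the InfoPackage):

```python
def split_init_selections(company_codes, num_inits):
    """Partition company codes into disjoint selection sets, one
    per init request. Together the sets must cover every value and
    must not overlap, otherwise deltas would be missed or doubled."""
    buckets = [[] for _ in range(num_inits)]
    for i, cc in enumerate(sorted(company_codes)):
        buckets[i % num_inits].append(cc)
    return buckets

# Five company codes split across two init requests.
selections = split_init_selections(
    ["1000", "2000", "3000", "4000", "5000"], 2)
```

Disjointness and full coverage are the essential invariants: every record must match exactly one init's selection.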

Former Member
0 Kudos

Hi Anil,

Thanks for your advice! Here are my concerns about multiple inits/deltas:

1. What condition would you suggest for dividing the whole GL4 DataSource? If we divide by fiscal year, the initial data set is still that big; if we divide by company code, do we have to maintain a new init whenever a single new company code is added in R/3 (which is highly probable in our project)?

2. If we go with multiple inits/deltas, is the update method from the GL4 ODS to its downstream data providers still the same as before? I mean, will the ODS generate several deltas for them, or will it contain only one delta package that we don't need to take special care of?

Regards,

Aaron