ABAP - Processing 40 million records to an outbound file

Dec 18, 2016 at 07:16 PM


Dear Folks,

I would like to pick your brains on best practices and performance tips for processing around 40 million records into an outbound file. We will be running the report for a posting period against BKPF and BSEG. BKPF would have around 8 million records for a posting period, and BSEG could have around 40 million records for those 8 million BKPF entries. We will need to consolidate the file based on document type; the second level of consolidation will be on the G/L account and profit center combination.

With this huge amount of data, the possibility of memory issues is high. The option I am leaning towards is OPEN CURSOR / FETCH NEXT CURSOR with a package size of maybe 100,000, along the lines of the sketch below.
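Just a sketch of what I have in mind (p_bukrs, p_gjahr and p_monat stand for the selection-screen parameters, and PROCESS_PACKAGE for the consolidation logic; all names are illustrative):

DATA: lt_bkpf   TYPE STANDARD TABLE OF bkpf,
      lv_cursor TYPE cursor.

* WITH HOLD keeps the cursor open across implicit database commits,
* e.g. while the file is being written between packages.
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM bkpf
    WHERE bukrs = p_bukrs
      AND gjahr = p_gjahr
      AND monat = p_monat.

DO.
* Each FETCH replaces the contents of lt_bkpf with the next package.
  FETCH NEXT CURSOR lv_cursor
    INTO TABLE lt_bkpf
    PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

* Read BSEG for this package, consolidate, append to the file.
  PERFORM process_package USING lt_bkpf.
ENDDO.

CLOSE CURSOR lv_cursor.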

Please provide your suggestions.

Thanks,

Raj


Working on small packages sounds like the correct approach. Their size will depend on the memory requirements of every package (which additional SELECTs you'll run for every package, etc.) and on the total memory available.


Raj,

Open/Fetch/Close cursor with proper primary key or index usage will yield positive results. You can explore the primary keys and existing indexes to get an idea of how far they will be able to help you in terms of performance; if required, as a last option, you can consider creating a new index to meet your requirement.
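For example, reading BSEG for a package of BKPF keys with the leading primary key fields BUKRS, BELNR and GJAHR supplied lets the database use the primary index (just a sketch; the field list is illustrative):

DATA lt_bseg TYPE STANDARD TABLE OF bseg.

IF lt_bkpf IS NOT INITIAL.
* FOR ALL ENTRIES on the leading part of the BSEG primary key.
  SELECT bukrs belnr gjahr buzei hkont prctr shkzg dmbtr
    FROM bseg
    INTO CORRESPONDING FIELDS OF TABLE lt_bseg
    FOR ALL ENTRIES IN lt_bkpf
    WHERE bukrs = lt_bkpf-bukrs
      AND belnr = lt_bkpf-belnr
      AND gjahr = lt_bkpf-gjahr.
ENDIF.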

K.Kiran.


What Sandra said, plus: what do you expect to do with such a file? Forget the ABAP program; even opening such a large file or sending it anywhere would be problematic. What exactly is the requirement? Why do you have to read so many records, and why write them into a file?

Jelena Perfiljeva

Finance is a shadow system in our environment, so we need to feed the SAP data for all SD and MM transactions in that posting period to the other system. The file is going to be segregated by FI document type, so ideally there would be only about 30 header records, and we are consolidating the line items based on G/L account and profit center. The file itself will not have that many records, since we are consolidating all the FI documents as described above. It's the manipulation of the data at run time in the background and the filling of the file that I am worried about.
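Per package, the consolidation I have in mind is roughly the following (just a sketch; I'm assuming the local-currency amount DMBTR here, and the table names match the package loop above):

TYPES: BEGIN OF ty_sum,
         blart TYPE bkpf-blart,  " document type
         hkont TYPE bseg-hkont,  " G/L account
         prctr TYPE bseg-prctr,  " profit center
         dmbtr TYPE bseg-dmbtr,  " amount in local currency
       END OF ty_sum.

DATA: lt_sum TYPE STANDARD TABLE OF ty_sum,
      ls_sum TYPE ty_sum.

FIELD-SYMBOLS: <ls_bkpf> TYPE bkpf,
               <ls_bseg> TYPE bseg.

SORT lt_bkpf BY bukrs belnr gjahr.

LOOP AT lt_bseg ASSIGNING <ls_bseg>.
  READ TABLE lt_bkpf ASSIGNING <ls_bkpf>
       WITH KEY bukrs = <ls_bseg>-bukrs
                belnr = <ls_bseg>-belnr
                gjahr = <ls_bseg>-gjahr
       BINARY SEARCH.
  CHECK sy-subrc = 0.

  ls_sum-blart = <ls_bkpf>-blart.
  ls_sum-hkont = <ls_bseg>-hkont.
  ls_sum-prctr = <ls_bseg>-prctr.
  ls_sum-dmbtr = <ls_bseg>-dmbtr.
* COLLECT sums DMBTR for identical BLART/HKONT/PRCTR, so lt_sum
* stays small even though it accumulates across all packages.
  COLLECT ls_sum INTO lt_sum.
ENDLOOP.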


Raj,

In addition to the performance optimisation of the SELECT query, you can also explore using field-symbols while looping over the respective internal tables, as in the sketch below.
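ASSIGNING avoids copying every row into a work area, which adds up over millions of lines (the SHKZG handling here is just an illustration):

FIELD-SYMBOLS <ls_bseg> TYPE bseg.

* The field-symbol points directly into the table line:
* no row copy on LOOP and no MODIFY statement needed.
LOOP AT lt_bseg ASSIGNING <ls_bseg> WHERE shkzg = 'H'.
  <ls_bseg>-dmbtr = - <ls_bseg>-dmbtr.  " flip the sign of credits
ENDLOOP.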

K.Kiran.


1 Answer

Jelena Perfiljeva
Dec 20, 2016 at 05:59 PM

In case of a "shadow FI" I believe some standard interface solutions already exist, possibly using IDocs. Try Google -> "external FI system site:sap.com", or maybe ask in the FI forum... err, tag.

Either way, a design that SELECTs the whole G/L into memory, when you actually need either consolidated data or a small percentage of the records, seems flawed. This question is about a specific issue caused by that design, but the design itself is the problem here, IMHO. If the receiving system really only needs the consolidated figures, push the aggregation down to the database instead of hauling 40 million rows into ABAP; see the sketch below.
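A sketch, assuming a release where BSEG is a transparent table (on older releases BSEG is a cluster table and cannot be joined, so you would aggregate from secondary index tables such as BSIS/BSAS instead); p_bukrs, p_gjahr and p_monat are illustrative selection parameters and lt_sum a target table with matching columns:

* One aggregated row per document type / G/L account / profit center.
SELECT k~blart s~hkont s~prctr SUM( s~dmbtr )
  FROM bkpf AS k
  INNER JOIN bseg AS s
    ON  s~bukrs = k~bukrs
    AND s~belnr = k~belnr
    AND s~gjahr = k~gjahr
  INTO TABLE lt_sum
  WHERE k~bukrs = p_bukrs
    AND k~gjahr = p_gjahr
    AND k~monat = p_monat
  GROUP BY k~blart s~hkont s~prctr.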
