
ABAP - Processing 40 million records to an outbound file

Dear Folks,

I would like to pick your brains on best practices and performance tips for processing around 40 million records into an outbound file. We will be running the report based on posting periods against BKPF and BSEG entries. BKPF would have around 8 million records for a posting period, and BSEG could have around 40 million records for the 8 million fetched from BKPF. We will need to consolidate the file based on document type; the second level of consolidation will be on the G/L account and profit center combination.

With this huge amount of data, the possibility of memory issues is high. The option we are considering is OPEN CURSOR / FETCH with a package size of perhaps 100,000.
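The block-wise processing described above could look roughly like the sketch below. The selection fields, package size, and processing step are illustrative assumptions, not the actual report; WITH HOLD keeps the cursor open across the implicit database commits that background work (e.g. writing to the application server file) may trigger.

```abap
DATA: lv_cursor TYPE cursor,
      lt_bkpf   TYPE STANDARD TABLE OF bkpf.

* Open a held cursor over the posting period (selection is an example)
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM bkpf
    WHERE gjahr = '2016'
      AND monat = '12'.

DO.
  " Fetch the next block of at most 100,000 header records
  FETCH NEXT CURSOR lv_cursor
    INTO TABLE lt_bkpf
    PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.  " no more data
  ENDIF.

  " For this package: read the matching BSEG items (FOR ALL ENTRIES
  " or a join), consolidate them, append to the output file...
  " ...then release the memory before the next package.
  CLEAR lt_bkpf.
ENDDO.

CLOSE CURSOR lv_cursor.
```

Because each package is released before the next fetch, peak memory stays bounded by the package size rather than by the full 8 million headers.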

Please provide your suggestions.




  • What Sandra said, plus: what do you expect to do with such a file? Forget the SAP program; even opening such a large file or sending it anywhere would be problematic. What exactly is the requirement? Why do you have to read so many records, and why write them into a file?

  • Finance is a shadow system in our environment, so we need to feed the SAP data for all SD and MM transactions in that posting period to another system. The file is going to be segregated by FI document type, so ideally there would be only about 30 header records, and the line items are consolidated by G/L account and profit center. The file itself will not have that many records, since we are consolidating all the FI documents as above. It is the manipulation of the data at runtime in the background, and the filling of the file, that I am worried about.
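    The runtime consolidation described above is a natural fit for COLLECT, which sums the numeric fields of rows sharing the same non-numeric key. A hedged sketch, assuming a work table lt_items that already holds the joined header/item fields (the structure and table names are hypothetical):

    ```abap
    TYPES: BEGIN OF ty_consol,
             blart TYPE bkpf-blart,   " document type (key)
             hkont TYPE bseg-hkont,   " G/L account  (key)
             prctr TYPE bseg-prctr,   " profit center (key)
             dmbtr TYPE bseg-dmbtr,   " amount (summed by COLLECT)
           END OF ty_consol.

    DATA: lt_consol TYPE STANDARD TABLE OF ty_consol,
          ls_consol TYPE ty_consol.

    LOOP AT lt_items ASSIGNING FIELD-SYMBOL(<ls_item>).
      CLEAR ls_consol.
      ls_consol-blart = <ls_item>-blart.
      ls_consol-hkont = <ls_item>-hkont.
      ls_consol-prctr = <ls_item>-prctr.
      ls_consol-dmbtr = <ls_item>-dmbtr.
      COLLECT ls_consol INTO lt_consol.  " sums dmbtr per key combination
    ENDLOOP.
    ```

    Run once per fetched package, lt_consol accumulates the small consolidated result across all packages, so only the aggregate ever stays in memory.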

  • Kiran K Nagarajan Kumarappan


    In addition to performance optimisation of the SELECT query, you can also explore using field symbols while looping over the respective internal tables.
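    For illustration, a minimal sketch of the field-symbol technique mentioned above (lt_bseg is assumed to hold the fetched line items). ASSIGNING avoids copying each wide BSEG row into a work area on every iteration, which adds up over millions of rows:

    ```abap
    DATA lv_total TYPE bseg-dmbtr.

    LOOP AT lt_bseg ASSIGNING FIELD-SYMBOL(<ls_bseg>).
      " <ls_bseg> points at the table row itself; no row copy is made,
      " and changes through the field symbol need no MODIFY statement.
      lv_total = lv_total + <ls_bseg>-dmbtr.
    ENDLOOP.
    ```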



1 Answer

  • Dec 20, 2016 at 05:59 PM

    In the case of a "shadow FI" I believe some standard interface solutions already exist, possibly using IDocs. Try Google -> "external FI system", or maybe ask in the FI forum... err, tag.

    Either way, a design that SELECTs the whole GL into memory when you actually need either consolidated data or a small percentage of the records seems flawed. This question is about a specific issue caused by that design, but the design itself is the problem here, IMHO.
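    One way to act on that point is to let the database do the aggregation instead of fetching raw rows. A sketch only, under stated assumptions: the field names and selection are illustrative, and on classic (pre-HANA) releases BSEG is a cluster table that cannot be joined in Open SQL, so there this would have to go through an index table such as BSIS/BSAS instead:

    ```abap
    SELECT k~blart, p~hkont, p~prctr, SUM( p~dmbtr ) AS dmbtr
      FROM bkpf AS k
      INNER JOIN bseg AS p
        ON  p~bukrs = k~bukrs
        AND p~belnr = k~belnr
        AND p~gjahr = k~gjahr
      WHERE k~gjahr = @lv_gjahr
        AND k~monat = @lv_monat
      GROUP BY k~blart, p~hkont, p~prctr
      INTO TABLE @DATA(lt_consol).
    ```

    Only the consolidated result set (document type / G/L / profit center totals) crosses the network, which matches what the file actually needs.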
