I would like to pick your brains on best practices and performance tips for writing around 40 million records to an outbound file. The report will select by posting period from BKPF and BSEG: BKPF would have around 8 million header records for a posting period, and BSEG around 40 million line items for those 8 million headers. We need to consolidate the file first by document type, and then at a second level by G/L account and profit center combination.
With this huge amount of data, the risk of memory issues is high. The option we are considering is OPEN CURSOR / FETCH NEXT CURSOR with a package size of maybe 100,000.
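To make the idea concrete, here is a rough sketch of the cursor-based packaged read we have in mind. It is only an illustration: the selection screen parameters (p_gjahr, p_monat) and the PROCESS_PACKAGE routine are hypothetical placeholders, and the BSEG read and consolidation logic would live inside that routine (e.g. a FOR ALL ENTRIES read per package, aggregating by BLART and then by G/L account / profit center before appending to the file).

```abap
DATA: lt_bkpf   TYPE STANDARD TABLE OF bkpf,
      lv_cursor TYPE cursor.

* WITH HOLD keeps the cursor open across any intermediate
* database commits (e.g. if the file is written in chunks).
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT bukrs belnr gjahr blart
    FROM bkpf
    WHERE gjahr = p_gjahr      "hypothetical selection parameters
      AND monat = p_monat.

DO.
* Fetch one package of headers instead of loading all 8M rows.
  FETCH NEXT CURSOR lv_cursor
    INTO CORRESPONDING FIELDS OF TABLE lt_bkpf
    PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.                      "no more data
  ENDIF.

* Hypothetical routine: read matching BSEG lines for this package,
* consolidate by document type, then G/L + profit center, and
* append the result to the outbound file.
  PERFORM process_package USING lt_bkpf.

  CLEAR lt_bkpf.               "release memory before next package
ENDDO.

CLOSE CURSOR lv_cursor.
```

One caveat with this pattern: if consolidation keys span package boundaries, the aggregation either has to be held in a (smaller) summary table across packages or merged in a final pass, otherwise the same document type / G/L / profit center combination can appear more than once in the file.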
Please provide your suggestions.