Batch Job: Export, Import & Free Shared Buffer by Batch

Mar 20, 2017 at 10:11 AM

Hi,

I tried processing more than 80,000 records through a background job, which resulted in a time-out dump. To work around this, I split the records into batches and scheduled one background job per batch using the function modules 'JOB_OPEN' and 'JOB_CLOSE'. My program looks something like the code below.

My problem is that I'm unable to process the data batch by batch as intended. The data keeps accumulating in the shared buffer (as if it were being appended). If I free the shared buffer right after each SUBMIT, then by the time the job actually runs and executes IMPORT FROM SHARED BUFFER, the data is no longer there. I've put a rough sketch of one idea for this after the code below.

If I check the job status (for example with 'GET_JOB_STATUS') and clear the shared buffer only after the job has finished, will the process then run end to end? Or could waiting on the job queue cause time-out dumps again?
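To make that second question concrete, the kind of check I have in mind is roughly the following. This is only a sketch: it assumes a direct read of the job status table TBTCO is acceptable, and the simple wait loop is my own choice.

DATA lv_status TYPE tbtco-status.

* Wait until the submitted job is no longer waiting or running,
* then clean up its shared buffer entry.
DO 60 TIMES.
  SELECT SINGLE status FROM tbtco
    INTO lv_status
    WHERE jobname  = lv_jobname
      AND jobcount = lv_jobcount.
  IF lv_status = 'F' OR lv_status = 'A'.   "finished or aborted
    EXIT.
  ENDIF.
  WAIT UP TO 5 SECONDS.                    "gives up after roughly 5 minutes
ENDDO.

DELETE FROM SHARED BUFFER indx(st) ID 'HEADDATA'.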

Or is there a better, more optimized way to meet this requirement?

LOOP AT itab INTO wa.
* Consolidate data into it_headdata
* When 2000 records have been collected:
*------------------------------------------------------------*
  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = lv_jobname
    IMPORTING
      jobcount = lv_jobcount.

  EXPORT it_headdata TO SHARED BUFFER indx(st) ID 'HEADDATA'.

*------------------------------------------------------------*
  SUBMIT zco_auto USER sy-uname
                  VIA JOB lv_jobname
                  NUMBER  lv_jobcount
                  AND RETURN.

*------------------------------------------------------------*
* Inside the submitted program ZCO_AUTO:
*   IMPORT it_headdata FROM SHARED BUFFER indx(st) ID 'HEADDATA'.
*   BAPI call
*   -------> DELETE FROM SHARED BUFFER indx(st) ID 'HEADDATA'.
*------------------------------------------------------------*
  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount  = lv_jobcount
      jobname   = lv_jobname
      strtimmed = 'X'.   "start immediately
*------------------------------------------------------------*
ENDLOOP.
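The idea I mention above for the accumulation problem would be to key each export with its own buffer ID instead of the fixed 'HEADDATA', so that each job imports and then deletes only its own batch. A rough sketch of what I mean follows; lv_id and the p_id parameter of ZCO_AUTO are things I would have to add, and building the key from the job count is just one option.

DATA lv_id TYPE indx-srtfd.   "per-batch buffer key (hypothetical)

* In the calling program, after JOB_OPEN:
CONCATENATE 'HEADDATA' lv_jobcount INTO lv_id.
EXPORT it_headdata TO SHARED BUFFER indx(st) ID lv_id.

SUBMIT zco_auto WITH p_id = lv_id        "p_id would be a new parameter of ZCO_AUTO
                USER sy-uname
                VIA JOB lv_jobname
                NUMBER  lv_jobcount
                AND RETURN.

* Inside ZCO_AUTO:
*   IMPORT it_headdata FROM SHARED BUFFER indx(st) ID p_id.
*   BAPI call
*   DELETE FROM SHARED BUFFER indx(st) ID p_id.   "each job frees only its own batch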

0 Answers