Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Multiple jobs writing to a single file

Former Member
0 Kudos

Hi,

My program requires me to create 20 jobs. Each job needs to write into a single shared file on the application server. I have tried the OPEN DATASET, TRANSFER and CLOSE DATASET approach, but it doesn't work.

I open the dataset before submitting the report via jobs. Each job then executes the TRANSFER statement. When all jobs are completed, the CLOSE DATASET statement is triggered.

When I run this code, the jobs give a dump at the TRANSFER statement stating that the dataset is not open.

Can anyone suggest a different approach to achieve this? I need to create a single file containing all the processing details retrieved from the jobs.

PS: I have also tried using EXPORT LIST TO MEMORY while submitting, but it gives me an error that this cannot be used when submitting the report via jobs.

1 ACCEPTED SOLUTION

ThomasZloch
Active Contributor
0 Kudos

Any chance you could write to 20 separate files and merge them into one file after the parallel processing is through?

Thomas
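
For illustration, a minimal sketch of that merge step. The file names and the `/tmp` path are placeholders; it assumes each job writes its own part file following a numbered naming pattern:

```abap
DATA: lv_part  TYPE string,
      lv_line  TYPE string,
      lv_index TYPE n LENGTH 2.

" open the final merged file once
OPEN DATASET '/tmp/merged.dat' FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

DO 20 TIMES.
  " build the part file name, e.g. /tmp/job_01.dat ... /tmp/job_20.dat
  lv_index = sy-index.
  CONCATENATE '/tmp/job_' lv_index '.dat' INTO lv_part.

  OPEN DATASET lv_part FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  DO.
    READ DATASET lv_part INTO lv_line.
    IF sy-subrc <> 0.
      EXIT. " end of this part file reached
    ENDIF.
    TRANSFER lv_line TO '/tmp/merged.dat'.
  ENDDO.
  CLOSE DATASET lv_part.
ENDDO.

CLOSE DATASET '/tmp/merged.dat'.
```

Since each job writes its own file, there is no concurrent access to handle; the merge runs once, after all 20 jobs have finished.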

4 REPLIES

Former Member
0 Kudos

Hi,

In order to write a file on the application server you have to use OPEN DATASET one way or another.

The following logic might help you; try it out. Please note that opening the dataset in one program and using it from another will not work in any case - each job must open the file itself.



DATA lv_times TYPE i VALUE 99999. " bounded retry count - set it depending on your requirement

DO lv_times TIMES.
  TRY.
      OPEN DATASET <filename> FOR APPENDING..... " write the rest of the syntax depending on your requirement
      EXIT. " the file is now open - leave the retry loop
    CATCH cx_sy_file_open. " this catches the case where the file is already opened
      " do nothing, retry on the next pass
  ENDTRY.
ENDDO.

TRANSFER ..... TO <filename>.

" ... rest of the code logic ...

CLOSE DATASET <filename>.

Hope this helps.

Regards,

R

raymond_giuseppi
Active Contributor
0 Kudos

The batch job cannot share the OPEN DATASET of the caller...

You must open the DATASET in each job, and only one job at any time should do this if you don't want unpredictable results.

Lock

Manage the error code from OPEN in a retry loop (DO/ENDDO) if another job is currently filling the dataset - hold the lock for the duration of OPEN/TRANSFER/CLOSE.

Locks

The database interface does not have an integrated lock mechanism to ensure that only one ABAP program accesses a file at any one time. If several programs simultaneously gain write access to a file, this will have unpredictable results.

To avoid this situation, SAP locks can be assigned, or unique file names such as GUIDs can be used.

Look for an existing lock object or create a new one to ensure this.

OPEN DATASET Syntax

Perform OPEN/TRANSFER/CLOSE at the end of each job to minimize lock time. Use FOR [APPENDING|http://help.sap.com/abapdocu_70/en/ABAPOPEN_DATASET_ACCESS.htm#&ABAP_ALTERNATIVE_2@2@] in the [OPEN DATASET|http://help.sap.com/abapdocu_70/en/ABAPOPEN_DATASET.htm] statement.

Regards,

Raymond
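
A sketch of the per-job sequence described above. It assumes a lock object has been created in SE11 (here hypothetically named EZ_FILELOCK, giving the generated ENQUEUE_EZ_FILELOCK / DEQUEUE_EZ_FILELOCK function modules); the file path is a placeholder:

```abap
DATA lv_line TYPE string.

" retry until the SAP lock is granted
DO.
  CALL FUNCTION 'ENQUEUE_EZ_FILELOCK' " generated FM of the hypothetical lock object
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2.
  IF sy-subrc = 0.
    EXIT. " lock obtained
  ENDIF.
  WAIT UP TO 1 SECONDS. " another job holds the lock - wait and retry
ENDDO.

" keep the OPEN/TRANSFER/CLOSE window as short as possible
OPEN DATASET '/tmp/result.dat' FOR APPENDING IN TEXT MODE ENCODING DEFAULT.
lv_line = 'processing details of this job'.
TRANSFER lv_line TO '/tmp/result.dat'.
CLOSE DATASET '/tmp/result.dat'.

" release the lock so the next job can write
CALL FUNCTION 'DEQUEUE_EZ_FILELOCK'.
```

Serializing only the short OPEN/TRANSFER/CLOSE window keeps the jobs running in parallel for the actual processing and queues them only for the write.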


Former Member
0 Kudos

Hi Guys,

This requirement is no longer needed, hence I am closing the thread.

The closest feasible solution I found for my scenario was to open the file with OPEN DATASET at the beginning, before triggering the jobs, then in each job reopen the file with OPEN DATASET ... FOR APPENDING and transfer the contents. When all the jobs are completed, the file can be closed with CLOSE DATASET.
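
For later readers, the per-job appending step described above can be sketched as follows (the file path is a placeholder):

```abap
" inside each submitted job: reopen the shared file in appending mode
DATA lv_line TYPE string.

OPEN DATASET '/tmp/result.dat' FOR APPENDING IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  lv_line = 'processing details from this job'.
  TRANSFER lv_line TO '/tmp/result.dat'.
  CLOSE DATASET '/tmp/result.dat'.
ENDIF.
```

Note that without a lock, as pointed out above, concurrent appends from several jobs can still interleave unpredictably.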

Thanks for all your responses.