Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

BAPI_CUSTOMER_CREATEFROMDAT1 issue with parallel processing

Former Member

Hi Team,

We are creating lakhs of customer records (1 lakh = 100,000) at a time with BAPI BAPI_CUSTOMER_CREATEFROMDAT1, using a parallel-job approach.

Below are the details:

1. Fetch the records from the HANA schema table using an ADBC query.

2. Split the records into 4 parts (e.g., 25,000 records per job).

3. Call the BAPI from SUBMIT programs, passing the records through a database index:

EXPORT itab TO DATABASE indx(zb).

4. Use asynchronous updates: BAPI_TRANSACTION_COMMIT without WAIT.
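The export/submit pattern in steps 3 and 4 can be sketched as follows. The worker report name ZCUST_WORKER, the job name, and the cluster ID scheme are assumptions for illustration, not the poster's actual objects:

```abap
* Hedged sketch of the "export to INDX, then SUBMIT" pattern.
* ZCUST_WORKER and the ID scheme are hypothetical names.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZCUST_CREATE',
      lv_jobcount TYPE tbtcjob-jobcount,
      lv_id       TYPE indx-srtfd.

lv_id = |CUST_{ sy-uname }_{ iv_chunk_no }|.

" Hand the chunk to the background job via the INDX(ZB) cluster
EXPORT customers = lt_chunk TO DATABASE indx(zb) ID lv_id.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT zcust_worker
  WITH p_dbid = lv_id
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = abap_true.
```

Each worker would then IMPORT its chunk from the same cluster ID before calling the BAPI.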

Issue:

Extra records are created on the ECC side, e.g., 101,000 records when we process 100,000 records from the HANA schema table.

Tried SET UPDATE TASK LOCAL with BAPI_TRANSACTION_COMMIT without wait.

Tried SET UPDATE TASK LOCAL with BAPI_TRANSACTION_COMMIT with wait = 'X'.
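For reference, the per-record pattern being described looks roughly like the sketch below; the work-area fields mapped into the BAPI are abbreviated assumptions, not the poster's actual structures:

```abap
* Sketch: one record per LUW, local update task, synchronous commit.
LOOP AT lt_customers INTO DATA(ls_cust).
  " Run the update modules in this work process instead of UPD
  SET UPDATE TASK LOCAL.

  CALL FUNCTION 'BAPI_CUSTOMER_CREATEFROMDAT1'
    EXPORTING
      pi_copyreference = ls_cust-ref        " reference customer
      pi_personaldata  = ls_cust-personal   " abbreviated mapping
    IMPORTING
      customerno       = lv_kunnr
      return           = ls_return.

  IF ls_return-type CA 'AE'.
    CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
  ELSE.
    " wait = 'X' blocks until the update has actually finished
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = abap_true.
  ENDIF.
ENDLOOP.
```

With wait = 'X' and a rollback on error messages, a chunk should never commit more records than it reads, which helps isolate where the extras come from.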



6 REPLIES

che_eky
Active Contributor

For parallel processing, use CALL FUNCTION ... STARTING NEW TASK and pass the data as an input parameter rather than via memory export/import.
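A minimal sketch of that suggestion: Z_CREATE_CUSTOMERS_RFC is a hypothetical RFC-enabled wrapper around the BAPI, and 'PARALLEL_GRP' is an assumed RZ12 server group:

```abap
* Sketch: parallel RFC, data passed as a parameter instead of
* EXPORT/IMPORT. Names are assumptions for illustration.
DATA(lv_task) = |TASK{ lv_chunk_no }|.

CALL FUNCTION 'Z_CREATE_CUSTOMERS_RFC'
  STARTING NEW TASK lv_task
  DESTINATION IN GROUP 'PARALLEL_GRP'
  CALLING lo_handler->on_finished ON END OF TASK
  EXPORTING
    it_customers = lt_chunk.
```

The ON END OF TASK handler collects results and lets the caller know when all chunks are done.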

Former Member

If we use a single program it takes around 17 hours for 1,000,000 (1 million) records, so to bring it down to 4 to 5 hours we created 4 parallel jobs.

Not sure why the BAPI buffers are causing the issue; internally it is creating multiple customers.

Regards

Raja

che_eky
Active Contributor

You may have better luck using CALL FUNCTION ... STARTING NEW TASK. IDocs would be another option, but you may require several different IDoc types (ADRMAS, CREMAS, DEBMAS).

I don't think this would be smart: theoretically you would be starting 100,000 new tasks in parallel (one for each new customer). My guess is you'll bring down the whole system.

che_eky
Active Contributor

Maybe if you did it your way it would not be smart. Custom server groups can be used to control the maximum number of work processes.
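The standard way to throttle parallel RFC against a server group is SPBT_INITIALIZE, sketched below; the group name 'PARALLEL_GRP' is an assumption and must exist in RZ12:

```abap
* Sketch: query the server group's capacity before starting tasks,
* so the number of parallel tasks never exceeds available WPs.
DATA: lv_max_wps  TYPE i,
      lv_free_wps TYPE i.

CALL FUNCTION 'SPBT_INITIALIZE'
  EXPORTING
    group_name                     = 'PARALLEL_GRP'
  IMPORTING
    max_pbt_wps                    = lv_max_wps
    free_pbt_wps                   = lv_free_wps
  EXCEPTIONS
    invalid_group_name             = 1
    internal_error                 = 2
    pbt_env_already_initialized    = 3
    currently_no_resources_avail   = 4
    no_pbt_resources_found         = 5
    cant_init_different_pbt_groups = 6
    OTHERS                         = 7.
```

Combined with handling the RESOURCE_FAILURE exception on CALL FUNCTION ... STARTING NEW TASK, this keeps the task count bounded instead of flooding the system.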

roberto_forti
Contributor

Hi Rajasrikanth,

Consider reviewing your steps and testing as below.

2. Delete adjacent duplicates before splitting the records into 4 parts (e.g., 25,000 records each).

2.a. Save those records, comma-separated, into 4 flat files (e.g., client01.txt, client02.txt, etc.) on the SAP application server (e.g., \\...\inbound\client01.txt):

https://help.sap.com/http.svc/rc/abapdocu_752_index_htm/7.52/en-US/abenfile_interface_statements.htm

3. Run a separate program/report for each file created above, as 4 different jobs (background mode), calling the BAPI directly and using BAPI_TRANSACTION_COMMIT with wait = 'X'.
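One worker job in step 3 could be sketched as below; the path variable and the comma split are assumptions based on the file layout suggested above:

```abap
* Sketch: one background job reads its flat file from the
* application server and commits each record synchronously.
DATA lv_line TYPE string.

OPEN DATASET lv_path FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET lv_path INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.  " end of file
    ENDIF.
    SPLIT lv_line AT ',' INTO TABLE DATA(lt_fields).
    " map lt_fields to the BAPI structures, call
    " BAPI_CUSTOMER_CREATEFROMDAT1, check RETURN, then:
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = abap_true.
  ENDDO.
  CLOSE DATASET lv_path.
ENDIF.
```

Since each job owns its own file, no two jobs can process the same record, which also rules out one source of duplicates.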

Regards,