
BAPI_CUSTOMER_CREATEFROMDAT1 issue with parallel processing

Apr 12 at 12:13 PM

Former Member

Hi Team,

We are creating customer records in the hundreds of thousands (lakhs) at a time with BAPI BAPI_CUSTOMER_CREATEFROMDAT1, using a parallel-job approach.

Below are the details:

1. Fetch the records from a HANA schema table using an ADBC query.

2. Split the records into 4 parts (e.g. 25,000 records per job).

3. Call the BAPI from SUBMIT programs, passing the records via a database index:

EXPORT itab TO DATABASE indx(zb).

4. Use asynchronous updates: BAPI_TRANSACTION_COMMIT without WAIT.
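For reference, steps 3 and 4 above can be sketched roughly like this (the report name ZCUST_CREATE_JOB, the INDX ID, and the table type ZCUST_DATA are assumptions, not from the original post):

```abap
* Parent program: hand one 25,000-record package to a background job.
DATA: lt_part     TYPE STANDARD TABLE OF zcust_data,  " hypothetical structure
      lv_jobcount TYPE btcjobcnt.

" Step 3: stage the package in the INDX-like table ZB
EXPORT itab = lt_part TO DATABASE indx(zb) ID 'CUST_PART_01'.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = 'ZCUST_CREATE_01'
  IMPORTING
    jobcount = lv_jobcount.

SUBMIT zcust_create_job
  WITH p_id = 'CUST_PART_01'
  VIA JOB 'ZCUST_CREATE_01' NUMBER lv_jobcount
  AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = 'ZCUST_CREATE_01'
    jobcount  = lv_jobcount
    strtimmed = abap_true.    " start immediately

* Inside ZCUST_CREATE_JOB: read the package back, then loop over it
* calling the BAPI (step 4: commit without WAIT after each call).
IMPORT itab = lt_part FROM DATABASE indx(zb) ID p_id.
```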

Issue:

Extra records are created on the ECC side, e.g. 101,000 records when we process 100,000 records from the HANA schema table.

Already tried:

- SET UPDATE TASK LOCAL with BAPI_TRANSACTION_COMMIT without WAIT
- SET UPDATE TASK LOCAL with BAPI_TRANSACTION_COMMIT with WAIT = 'X'
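One defensive variant worth testing (a sketch, not the original code; the parameter names below should be checked against the BAPI's interface in SE37, and the data mapping is hypothetical) is to commit each customer synchronously and roll back explicitly on error, so no pending update task can create a record a second time:

```abap
SET UPDATE TASK LOCAL.

LOOP AT lt_part INTO DATA(ls_cust).
  CLEAR: lv_kunnr, ls_return.
  CALL FUNCTION 'BAPI_CUSTOMER_CREATEFROMDAT1'
    EXPORTING
      pi_copyreference = ls_ref            " reference customer (assumed filled)
      pi_personaldata  = ls_cust-personal  " hypothetical mapping
    IMPORTING
      customerno       = lv_kunnr
      return           = ls_return.

  IF ls_return-type CA 'EA'.               " error or abort: undo this record
    CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
  ELSE.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.                        " wait for the update task to finish
  ENDIF.
ENDLOOP.
```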




3 Answers

ROBERTO Forti Santos Apr 12 at 09:35 PM

Hi Rajasrikanth,

Consider reviewing your steps and testing as below.

2. Delete adjacent duplicates before splitting the records into 4 parts (e.g. 25,000 records each).

2.a. Save those records, comma-separated, into 4 flat files (e.g. client01.txt, client02.txt, etc.) on the SAP application server (e.g. \\...\inbound\client01.txt), using the ABAP file-interface statements:

https://help.sap.com/http.svc/rc/abapdocu_752_index_htm/7.52/en-US/abenfile_interface_statements.htm

3. Run a separate program/report for each file created above, each as its own background job, calling the BAPI directly and using BAPI_TRANSACTION_COMMIT with WAIT = 'X'.
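Step 2.a can be sketched with the file-interface statements from the linked documentation (the server path and the field names are assumptions):

```abap
DATA lv_file TYPE string VALUE '/interface/inbound/client01.txt'.  " assumed path

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
LOOP AT lt_part INTO DATA(ls_cust).
  " one comma-separated line per customer; fields are hypothetical
  TRANSFER |{ ls_cust-name1 },{ ls_cust-city },{ ls_cust-country }| TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.
```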

Regards,

Che Eky Apr 12 at 03:48 PM

For parallel processing, use CALL FUNCTION ... STARTING NEW TASK and pass the data as an input parameter rather than via memory EXPORT/IMPORT.
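A minimal sketch of that approach (the wrapper function module Z_CREATE_CUSTOMERS, its IT_CUSTOMERS parameter, and the package table are assumptions; the wrapper must be RFC-enabled):

```abap
DATA: lv_task TYPE char8,
      gv_open TYPE i.

LOOP AT lt_packages INTO DATA(lt_part).    " e.g. 4 packages of 25,000 records
  lv_task = |TASK{ sy-tabix }|.
  CALL FUNCTION 'Z_CREATE_CUSTOMERS'       " hypothetical RFC-enabled wrapper
    STARTING NEW TASK lv_task
    PERFORMING on_task_done ON END OF TASK
    EXPORTING
      it_customers = lt_part.              " data passed directly, no INDX detour
  gv_open = gv_open + 1.
ENDLOOP.

WAIT UNTIL gv_open = 0.                    " block until every task reported back

FORM on_task_done USING p_task TYPE clike.
  RECEIVE RESULTS FROM FUNCTION 'Z_CREATE_CUSTOMERS'.
  gv_open = gv_open - 1.
ENDFORM.
```

The key difference from the SUBMIT/INDX approach is that each package travels through the RFC interface, so there is no shared database index that two jobs could read twice.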

Former Member

With a single program it takes around 17 hours for 1,000,000 (1 million) records, so to bring that down to 4-5 hours we created 4 parallel jobs.

Not sure why the BAPI buffering is causing the issue: internally it is creating multiple customers.

Regards

Raja

Che Eky Apr 12 at 03:58 PM

You may have better luck using CALL FUNCTION ... STARTING NEW TASK. IDocs would be another option, but you may need several different IDoc types (ADRMAS, CREMAS, DEBMAS).


I don't think this would be smart: theoretically you would be starting 100,000 new tasks in parallel (one for each new customer). My guess is you'll bring down the whole system.


Maybe if you did it your way it would not be smart. A custom server group can be used to control the maximum number of work processes.
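For context, a sketch of the server-group variant (the group name 'PARALLEL_GEN' and the wrapper FM are assumptions; RFC server groups are maintained in transaction RZ12):

```abap
DATA: lv_max  TYPE i,
      lv_free TYPE i.

* Check the group's work-process budget before dispatching tasks.
CALL FUNCTION 'SPBT_INITIALIZE'
  EXPORTING
    group_name   = 'PARALLEL_GEN'          " assumed RZ12 group
  IMPORTING
    max_pbt_wps  = lv_max
    free_pbt_wps = lv_free
  EXCEPTIONS
    OTHERS       = 1.

CALL FUNCTION 'Z_CREATE_CUSTOMERS'         " hypothetical RFC-enabled wrapper
  STARTING NEW TASK lv_task
  DESTINATION IN GROUP 'PARALLEL_GEN'      " dispatcher picks a free server
  PERFORMING on_task_done ON END OF TASK
  EXPORTING
    it_customers = lt_part
  EXCEPTIONS
    resource_failure = 1.                  " no free work process: wait and retry
```

With DESTINATION IN GROUP, the system throttles task creation to the group's quota instead of flooding every available work process.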
