Application Development Discussions

Huge data uploads

Former Member

I have a requirement to upload customers and materials. Each of them has around half a million records. Is there any method or procedure that will considerably reduce the runtime? (I am using BAPIs.)
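Roughly, the current coding is a single-record loop like the sketch below (simplified; BAPI_MATERIAL_SAVEDATA is just shown as an example of the kind of call):

REPORT z_material_upload_sketch.

TYPES: BEGIN OF ty_input,
         headdata    TYPE bapimathead,
         clientdata  TYPE bapi_mara,
         clientdatax TYPE bapi_marax,
       END OF ty_input.

DATA: gt_materials TYPE STANDARD TABLE OF ty_input,
      gt_errors    TYPE STANDARD TABLE OF bapiret2,
      ls_input     TYPE ty_input,
      ls_return    TYPE bapiret2.

* ... fill gt_materials from the upload file ...

LOOP AT gt_materials INTO ls_input.
  CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
    EXPORTING
      headdata    = ls_input-headdata
      clientdata  = ls_input-clientdata
      clientdatax = ls_input-clientdatax
    IMPORTING
      return      = ls_return.

  IF ls_return-type CA 'EA'.          " error or abort message
    APPEND ls_return TO gt_errors.    " keep errors for reprocessing
  ENDIF.

* One database commit per record - a large part of the total runtime.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.
ENDLOOP.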

8 REPLIES

Former Member

I think not. The best way is probably to use an LSMW project with IDocs. You will then have an easy way to maintain any errors that occur.

BR, Jacek


Hi!

BAPI or IDoc - in general they need the same time (no discussion about +/- 10%).

With IDocs you get an easy way to create the data in SAP in parallel -> the best way to use all resources and reduce the total runtime.

Deactivate change documents and the application log. If the change documents can't be deactivated by an import parameter, then make an extension in the xxx_document_write function module. This might give you about 20% better performance (depending on the total size of the change document / application log tables).

Refresh the database statistics (e.g. a fresh RUNSTATS) before starting -> all necessary check tables should then have good index access. A new statistics run after half of the load can also help.
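As a rough sketch of the parallel dispatch via asynchronous RFC (Z_UPLOAD_PACKET is a hypothetical RFC-enabled wrapper around your own posting logic, and 'parallel_generators' is only an example server group name, maintained in transaction RZ12):

TYPES: BEGIN OF ty_record,
         raw TYPE string,                  " placeholder for the real structure
       END OF ty_record,
       ty_t_records TYPE STANDARD TABLE OF ty_record WITH DEFAULT KEY.

DATA: gt_packets TYPE STANDARD TABLE OF ty_t_records WITH DEFAULT KEY,
      gt_packet  TYPE ty_t_records,
      gv_started TYPE i,
      gv_done    TYPE i,
      gv_task(8) TYPE c.

LOOP AT gt_packets INTO gt_packet.
  gv_started = gv_started + 1.
  gv_task = gv_started.
  CONDENSE gv_task.

  CALL FUNCTION 'Z_UPLOAD_PACKET'         " hypothetical RFC wrapper
    STARTING NEW TASK gv_task
    DESTINATION IN GROUP 'parallel_generators'
    PERFORMING packet_done ON END OF TASK
    TABLES
      it_records = gt_packet
    EXCEPTIONS
      system_failure        = 1
      communication_failure = 2
      resource_failure      = 3.
  IF sy-subrc <> 0.
    " no free work process etc. - in real code: wait and dispatch again
  ENDIF.
ENDLOOP.

WAIT UNTIL gv_done >= gv_started.         " let all tasks come back

FORM packet_done USING p_task TYPE clike.
  RECEIVE RESULTS FROM FUNCTION 'Z_UPLOAD_PACKET'.  " pick up the result
  gv_done = gv_done + 1.
ENDFORM.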

Regards,

Christian

Former Member

You may want to consider turning off change documents.

Rob

Former Member

Hi Sashi,

Right now I am working on the same kind of task, and I am using LSMW direct input methods, so I find it very comfortable and easy to upload tons of data.

For customers it is program RFBIDE00 (object 0050),

and for materials it is RMDATIND (object 0020).

Any more questions, please ask.

Sony


I am looking at 2 million records, and direct input doesn't have the simulate functionality. How do we know how much data an internal table can hold during execution?

Thanks

sasi


I don't think you can do all of this at one time. Create a number of sessions; you can then process some of them in parallel. You will probably get locking errors, but re-processing the sessions should fix them.
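If you build the sessions yourself (rather than letting LSMW create them), a sketch like this opens a new session for every n records, so that several sessions can later run in parallel in SM35 (the session names, transaction XD01, and the size of 100,000 are only examples):

DATA: gt_records  TYPE STANDARD TABLE OF string, " placeholder for the input
      ls_record   TYPE string,
      lt_bdcdata  TYPE STANDARD TABLE OF bdcdata,
      gv_count    TYPE i,
      gv_sessions TYPE n LENGTH 3,
      gv_group    TYPE apqi-groupid.

CONSTANTS gc_per_session TYPE i VALUE 100000.

LOOP AT gt_records INTO ls_record.
  IF gv_count = 0.                         " start a new session
    gv_sessions = gv_sessions + 1.
    CONCATENATE 'CUST' gv_sessions INTO gv_group.
    CALL FUNCTION 'BDC_OPEN_GROUP'
      EXPORTING
        client = sy-mandt
        group  = gv_group
        user   = sy-uname
        keep   = 'X'.
  ENDIF.

* ... fill lt_bdcdata for this record ...

  CALL FUNCTION 'BDC_INSERT'
    EXPORTING
      tcode     = 'XD01'
    TABLES
      dynprotab = lt_bdcdata.

  gv_count = gv_count + 1.
  IF gv_count >= gc_per_session.
    CALL FUNCTION 'BDC_CLOSE_GROUP'.
    gv_count = 0.
  ENDIF.
ENDLOOP.

IF gv_count > 0.
  CALL FUNCTION 'BDC_CLOSE_GROUP'.         " close the last session
ENDIF.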

Rob


Using LSMW direct input is generally the preferred option for a master data upload. I am not really sure why you would need to simulate the upload for every record.

You can split the records into several sets and upload the sets one by one. The log of the first upload will show you which records were not uploaded, and you can correct the remaining sets accordingly, which will reduce the further processing time.
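For example, a sketch like this splits the full input into fixed-size sets before the upload (the set size of 50,000 and the UPLOAD_SET routine are only placeholders):

TYPES: BEGIN OF ty_record,
         raw TYPE string,                  " placeholder for the real structure
       END OF ty_record,
       ty_t_records TYPE STANDARD TABLE OF ty_record WITH DEFAULT KEY.

DATA: gt_records TYPE ty_t_records,        " full upload file, already loaded
      lt_set     TYPE ty_t_records,
      ls_record  TYPE ty_record.

CONSTANTS gc_set_size TYPE i VALUE 50000.  " example value - tune per system

LOOP AT gt_records INTO ls_record.
  APPEND ls_record TO lt_set.
  IF lines( lt_set ) >= gc_set_size.
    PERFORM upload_set USING lt_set.       " e.g. write one input file per set
    CLEAR lt_set.
  ENDIF.
ENDLOOP.

IF lt_set IS NOT INITIAL.
  PERFORM upload_set USING lt_set.         " last, partially filled set
ENDIF.

FORM upload_set USING pt_set TYPE ty_t_records.
  " here: write the set to a file / feed it to the direct input program
ENDFORM.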

Hope this will help.


Hi - as Yogesh said, you have to split the file and load it in parts. I am doing so, and it is working fine with the two programs I mentioned earlier.

Sony