12-05-2018 8:12 AM
Hi all,
I need to read 2 million records from a single file in AL11. I will perform some validations and then update the SAP tables. I will capture the data in an internal table before the update. So, will these millions of records fit in an internal table? If not, how can I achieve parallel processing for this? Do I have to ask Basis to increase the runtime limit before I proceed?
12-05-2018 9:35 AM
Hi Madhuri,
With such a data volume, I'd organize the work differently rather than loading all 2 million records into an internal table at once.
I'd do something like this (quite rough flow), "slicing & mincing" the update into smaller chunks (I put 1000, but calibrate it to your needs):
DO.
  READ DATASET lv_file INTO ls_record. " lv_file = your AL11 file, opened with OPEN DATASET
  IF sy-subrc <> 0.
    EXIT.                              " end of file reached
  ENDIF.
  PERFORM controls.                    " your validations
  APPEND ls_record TO int_table.
  IF lines( int_table ) >= 1000.
    PERFORM update_db.                 " update the SAP tables from int_table
    COMMIT WORK.
    CLEAR int_table[].
  ENDIF.
ENDDO.
IF int_table[] IS NOT INITIAL.         " flush the last partial chunk
  PERFORM update_db.
  COMMIT WORK.
ENDIF.
12-05-2018 9:42 AM
PERFORM?
Surely you mean me->validate_contents( ... ).
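For what it's worth, a minimal OO sketch of that idea (the class, method, and field names here are my own illustration, not from the original post):

```abap
CLASS lcl_validator DEFINITION.
  PUBLIC SECTION.
    TYPES: BEGIN OF ty_record,       " hypothetical record structure
             key   TYPE char10,
             value TYPE char50,
           END OF ty_record.
    METHODS validate_contents
      IMPORTING is_record    TYPE ty_record
      RETURNING VALUE(rv_ok) TYPE abap_bool.
ENDCLASS.

CLASS lcl_validator IMPLEMENTATION.
  METHOD validate_contents.
    " example check: the key field must not be empty
    rv_ok = boolc( is_record-key IS NOT INITIAL ).
  ENDMETHOD.
ENDCLASS.
```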
12-05-2018 9:54 AM
To add to Simone's suggestion to do this in "packages":
Whenever I have to do something like this, I put a parameter on the selection-screen with a suitable default-value but open for input. That way, the number can be tweaked dependent on the number of entries actually available which can differ quite a bit between development, test and production systems. Testing the logic in a dev-system with few entries can then happen with e.g. 10 as there may not even be 1,000 relevant entries available there.
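Sketched in ABAP (the parameter and routine names are illustrative), that could look like:

```abap
" Chunk size as a selection-screen parameter: defaulted, but open for input per run
PARAMETERS p_chunk TYPE i DEFAULT 1000.

" ... then, inside the read loop, replace the hard-coded 1000 with:
IF lines( int_table ) >= p_chunk.
  PERFORM update_db.
  COMMIT WORK.
  CLEAR int_table[].
ENDIF.
```

In a dev system you would then simply run the report with p_chunk = 10 instead of changing the code.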
12-05-2018 1:17 PM
As you are committing after every 1000 records, there is no way to roll the update back if any of the records fail.
12-05-2018 1:22 PM
gayatri.bokar what is the other option? To write all 2 million records to your DB in one go?
Are you sure you want to test and stress your DB with that?
And do all 2 million records really need to be written or discarded together?
Choose your chunk size: you could work record by record, but that stresses the DB too (2 million commits)!
As 8b889d0e8e6f4ed39f6c58e35664518f suggested, use a parameter and tune your run with some tests.
12-05-2018 1:34 PM
Yes, I will try it out with some tests. Thanks, all.