Process 2 million records from AL11

former_member478603
Participant

Hi All ,

I need to read 2 million records from AL11 in a single file, perform some validations, and then update the SAP tables. I plan to capture the records in an internal table before the update. Can an internal table hold that many millions of records? If not, how can I achieve parallel processing for this? Do I have to ask Basis to increase the runtime limit before I proceed?

1 ACCEPTED SOLUTION

SimoneMilesi
Active Contributor

Hi Madhuri,

With such a data volume, I'd organize the work differently rather than uploading all 2 million records into an internal table.

I'd do something like this (quite a rough flow), "slicing & mincing" the update into smaller chunks (I put 1,000, but calibrate it to your needs):

DO.
  READ DATASET lv_file INTO ls_record.
  IF sy-subrc <> 0.
    EXIT. " end of file reached
  ENDIF.
  PERFORM controls.                       " your validations on the record
  APPEND ls_record TO int_table.
  IF lines( int_table ) >= 1000.          " chunk size: calibrate it
    MODIFY ztarget FROM TABLE int_table.  " update db (ztarget as placeholder)
    COMMIT WORK.
    CLEAR int_table[].
  ENDIF.
ENDDO.
IF int_table[] IS NOT INITIAL.            " flush the last partial chunk
  MODIFY ztarget FROM TABLE int_table.
  COMMIT WORK.
ENDIF.
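
The flow above assumes the dataset is already open; a minimal open/close frame around it might look like this (path and variable names hypothetical):

DATA lv_file   TYPE string VALUE '/usr/sap/trans/data/input.txt'. " AL11 path, hypothetical
DATA ls_record TYPE string.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Could not open file' TYPE 'E'.
ENDIF.

" ... DO / READ DATASET loop from above ...

CLOSE DATASET lv_file.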
7 REPLIES


matt
Active Contributor

PERFORM?

Surely you mean me->validate_contents( ... ).
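
For illustration, a rough sketch of what that OO validation hook could look like (class and method names hypothetical):

CLASS lcl_validator DEFINITION.
  PUBLIC SECTION.
    METHODS validate_contents
      IMPORTING iv_record       TYPE string
      RETURNING VALUE(rv_valid) TYPE abap_bool.
ENDCLASS.

CLASS lcl_validator IMPLEMENTATION.
  METHOD validate_contents.
    " field checks for one file record go here
    rv_valid = boolc( iv_record IS NOT INITIAL ).
  ENDMETHOD.
ENDCLASS.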

matthew.billingham "perform" as a verb 🙂

To add to Simone's suggestion to do this in "packages":

Whenever I have to do something like this, I put a parameter on the selection screen with a suitable default value that is still open for input. That way, the package size can be tweaked depending on the number of entries actually available, which can differ quite a bit between development, test, and production systems. The logic can then be tested in a dev system with e.g. 10, as there may not even be 1,000 relevant entries available there.
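
A minimal sketch of that idea (parameter name hypothetical):

PARAMETERS p_pksize TYPE i DEFAULT 1000. " package size, open for input

" inside the read loop, compare against the parameter instead of a literal:
IF lines( int_table ) >= p_pksize.
  " write the chunk, commit, clear int_table
ENDIF.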


As you are committing after every 1,000 records, there is no way to roll everything back if any of the records fail.


gayatri.bokar the other option is...? To write all 2 million records to your DB together?

Are you sure you want to test and stress your DB with that?

And do all those 2 million records need to be written or discarded together?

Choose your chunk: you can work record by record, but that stresses the DB too (2 million commits)!

As 8b889d0e8e6f4ed39f6c58e35664518f suggested, use a parameter and adapt your run with some tests.
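
If per-chunk atomicity is enough, the error handling could be sketched like this (ztarget again a placeholder):

MODIFY ztarget FROM TABLE int_table.  " write the current chunk
IF sy-subrc = 0.
  COMMIT WORK.                        " keep the chunk
ELSE.
  ROLLBACK WORK.                      " undo only this chunk
  " log or reprocess the failed records here
ENDIF.
CLEAR int_table[].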


Yes, I will try it out with some tests. Thanks, all!