
Huge data itab and table update

Hello Experts,

I have a requirement where I need to fetch records from files in two FTP paths into SAP with my ABAP program and update a custom table. The data is later picked up from that custom table by a sequence of background jobs, all of those records are archived, and after archiving they are deleted from the custom table. (Only 9,500 records can be picked up by the archiving jobs at once.)
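
For the FTP download itself, one common approach is the standard SAPFTP function modules (FTP_CONNECT, FTP_SERVER_TO_R3, FTP_DISCONNECT). Below is a minimal sketch of that pattern; the host, user, password, and file name are placeholders, the scramble key is the one commonly documented for SAPFTP (verify on your system), and all sy-subrc / error handling is omitted:

TYPES ty_line TYPE c LENGTH 255.

DATA: lv_handle TYPE i,
      lv_pass   TYPE c LENGTH 40 VALUE 'secret',   " placeholder
      lv_len    TYPE i,
      lt_text   TYPE TABLE OF ty_line.

* FTP_CONNECT expects the password scrambled via HTTP_SCRAMBLE
lv_len = strlen( lv_pass ).
CALL FUNCTION 'HTTP_SCRAMBLE'
  EXPORTING
    source      = lv_pass
    sourcelen   = lv_len
    key         = 26101957            " scramble key used with SAPFTP
  IMPORTING
    destination = lv_pass.

CALL FUNCTION 'FTP_CONNECT'
  EXPORTING
    user            = 'ftpuser'             " placeholder
    password        = lv_pass
    host            = 'ftp.example.com'     " placeholder
    rfc_destination = 'SAPFTP'
  IMPORTING
    handle          = lv_handle.

* Download one file in character mode into an internal table
CALL FUNCTION 'FTP_SERVER_TO_R3'
  EXPORTING
    handle         = lv_handle
    fname          = '/path/file1.txt'      " placeholder
    character_mode = 'X'
  TABLES
    text           = lt_text.

CALL FUNCTION 'FTP_DISCONNECT'
  EXPORTING
    handle = lv_handle.

The same download call would simply be repeated for the second file, appending to the same internal table.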

The two text files in the FTP paths contain around 170,000 records in total.

The issue is: once those records from the FTP files are in my internal table:

1. It is not ideal to loop through all those records at once. So how many records should I pick at a time when updating the custom table, to avoid a memory overflow? Is there a limit? (A sketch of one possible batching pattern follows below.)

Picking up these files from FTP is a daily activity, and the volume of data in those files will be roughly the same as mentioned above each day. The data size is huge.
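
On the batching question: there is no fixed record limit, but a common pattern is to write to the custom table in packages and commit after each package, so that a single LUW never has to carry all 170,000 rows. A minimal sketch, assuming a custom table ZARCH_QUEUE whose line type matches the file records (the table name and the package size of 5,000 are placeholders / tuning choices, not given in the question):

CONSTANTS c_package TYPE i VALUE 5000.

DATA: lt_records TYPE TABLE OF zarch_queue,   " filled from the FTP files
      lt_package TYPE TABLE OF zarch_queue,
      ls_record  TYPE zarch_queue,
      lv_lines   TYPE i.

LOOP AT lt_records INTO ls_record.
  APPEND ls_record TO lt_package.
  DESCRIBE TABLE lt_package LINES lv_lines.
  IF lv_lines >= c_package.
*   Array insert of one package, then commit to close the LUW
    INSERT zarch_queue FROM TABLE lt_package ACCEPTING DUPLICATE KEYS.
    COMMIT WORK.
    CLEAR lt_package.
  ENDIF.
ENDLOOP.

* Flush the last, partial package
IF lt_package IS NOT INITIAL.
  INSERT zarch_queue FROM TABLE lt_package ACCEPTING DUPLICATE KEYS.
  COMMIT WORK.
ENDIF.

ACCEPTING DUPLICATE KEYS is used here so a re-run of the daily job does not dump on already-loaded keys; whether that is the right duplicate handling depends on the actual key design.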

Thanks in advance.


1 Answer

  • Best Answer
    Former Member
    Posted on Sep 22, 2011 at 01:13 PM

    170,000 records shouldn't be too large for an internal table. But you haven't said how large each record is.

    As for looping, you have to read each record eventually, so why not process them all at once?

    Rob
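
    To put rough numbers on that point: if each record is around 200 bytes wide (an assumption, since the record length isn't given), 170,000 records come to about 170,000 × 200 bytes ≈ 34 MB, which is well below typical ABAP memory quotas (the abap/heap_area_* profile parameters are usually in the hundreds of MB to GB range). Holding the full file content in one internal table is therefore normally unproblematic.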

