Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Upload PGM File to ZTable

Former Member

Hi,

I am trying to load a file (20 columns) that can have nearly 80 million records.

An ABAP program needs to load all of these records into a table. Is this achievable?

Will parallel processing help?

I am reading the file in chunks of 5,000 rows and inserting into the database. After I have inserted nearly 0.6 million records, I get the error message

"The database returned a value containing an error"

1. Why do I get the above error? The data seems to be fine for the insert.

2. Is it advisable to put 80 million records into a ztable?
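The load loop is essentially this pattern (simplified sketch; the actual table name and the chunk-reading routine are placeholders):

DATA: lt_chunk TYPE STANDARD TABLE OF ztable,
      lv_eof   TYPE abap_bool VALUE abap_false.

WHILE lv_eof = abap_false.
  " read the next 5,000 rows of the file into lt_chunk
  PERFORM read_next_chunk CHANGING lt_chunk lv_eof.
  INSERT ztable FROM TABLE lt_chunk.
  CLEAR lt_chunk.
ENDWHILE.
" the INSERT fails with the error above after roughly 0.6 million rows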

The 80 million records will have:

KUNNR(Customer)

MATNR(Material)

purchase sales history 1 year

purchase sales history 2 years

.....

....

sales history last 1 month.

We have an existing RFC that pulls a large amount of data to the front end (a .NET UI); in addition, a join query against the above table will be used to display results in the front end.

Is this achievable, or are we on a wild goose chase?

Regards,

Suresh.

5 REPLIES

Former Member

Hi,

On the one hand, uploading one huge file will take a lot of time, much more than the sum of the upload times for separate files (if the file is split).

If you want to upload one file anyway, then for performance reasons it would be better to load the text file from the application server.

As far as I understand, you have a client-side file, most likely in some text format. In that case, I'd prefer to import the file into Access, concatenate the fields into a CR-delimited text file, then transfer it to the server and load it into the ztable.
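For illustration, reading the file from the application server and inserting it in chunks could look roughly like this (a sketch only; the file path, the tab delimiter, and the names lv_file, lt_chunk, and ztable are assumptions):

DATA: lv_file  TYPE string VALUE '/usr/sap/trans/data/cust_sales.txt',
      lv_line  TYPE string,
      ls_rec   TYPE ztable,
      lt_chunk TYPE STANDARD TABLE OF ztable.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open file on application server' TYPE 'E'.
ENDIF.

DO.
  READ DATASET lv_file INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.                                   " end of file
  ENDIF.
  SPLIT lv_line AT cl_abap_char_utilities=>horizontal_tab
        INTO ls_rec-kunnr ls_rec-matnr.     " ... plus the remaining 18 columns
  APPEND ls_rec TO lt_chunk.

  IF lines( lt_chunk ) >= 5000.             " write in chunks
    INSERT ztable FROM TABLE lt_chunk.
    COMMIT WORK.
    CLEAR lt_chunk.
  ENDIF.
ENDDO.

IF lt_chunk IS NOT INITIAL.                 " last partial chunk
  INSERT ztable FROM TABLE lt_chunk.
  COMMIT WORK.
ENDIF.

CLOSE DATASET lv_file.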

Regards,

Ravi

Former Member

Please split the file into smaller parts and pieces to upload to the Z table. That is the best way.

0 Kudos

It can be done. I would suggest the following:

1. Split the file up into many smaller files.

2. Store the files on the application server so the jobs can run in the background.

3. Make sure your BASIS group gives you enough tablespace to load all of these records. (I assume this is the cause of the error you got.)

4. Make sure your program does not keep accumulating records in memory. Once committed to the DB, the records should be freed from memory (this could also be the issue you are having: running out of memory). See the sketch below.
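A minimal sketch of point 4 (table and variable names are placeholders):

INSERT ztable FROM TABLE lt_chunk.   " one chunk of file records
IF sy-subrc = 0.
  COMMIT WORK.
ENDIF.
FREE lt_chunk.                       " release the memory before reading the next chunk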

Other than that, it will work. It will just take some time to finish.

thanks.

JB

alex_campbell
Contributor

One additional thought: make sure you are doing a COMMIT WORK after you write each chunk of data to the database. I can imagine that you might get an error if the unit of work gets too large. If a COMMIT WORK is undesirable, I've heard of an FM 'DB_COMMIT' that you could look into, but I have never used it.
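For the COMMIT WORK approach, the pattern would roughly be (placeholder names):

INSERT ztable FROM TABLE lt_chunk.   " one 5,000-row chunk
COMMIT WORK.                         " keep the database unit of work small
CLEAR lt_chunk.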

0 Kudos

If a COMMIT WORK is undesirable, I've heard of an FM 'DB_COMMIT' that you could look into, but I have never used it.

The DB_COMMIT function module triggers a database-only commit, without the SAP-internal processing (update task calls, PERFORM ON COMMIT routines, and so on) that usually happens on COMMIT WORK.
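The call itself is simply:

CALL FUNCTION 'DB_COMMIT'.   " database commit only, without the COMMIT WORK processing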