Former Member

Upload PGM File to ZTable

Hi,

I am trying to load a file with 20 columns, and the file can contain nearly 80 million records.

An ABAP program needs to load all of these records into a table. Please tell me whether this is achievable.

Will parallel processing help?

I am reading the file in chunks of 5,000 rows and inserting them into the database. After I have inserted nearly 0.6 million records, I get the error message:

"The database returned a value containing an error"

1. Why do I get the above error? The data appears to be valid for the insert.

2. Is it advisable to put 80 million records into a Z table?

The 80 million records will contain:

KUNNR(Customer)

MATNR(Material)

purchase sales history 1 year

purchase sales history 2 years

.....

....

sales history for the last 1 month.

We have an existing RFC that pulls a large amount of data into the front end (a .NET UI); along with this, a join query against the above table will be used to display results in the front end.

Is this achievable, or are we on a wild goose chase?

Regards,

Suresh.



4 Answers

  • Former Member
    Posted on Dec 13, 2011 at 06:00 AM

    Hi,

    From one point of view, uploading one huge file will take a lot of time, much more than the sum of the upload times for separate files (if the file is split).

    If you still want to upload one file, then for performance reasons it would be better to load the text file from the application server.

    As far as I understood, you have a client-side file, most likely in some text format. In this case, I would prefer to upload the file into Access, concatenate the fields into a CR-delimited text file, then download it onto the server and load it into the Z table.

    Regards,

    Ravi


  • Former Member
    Posted on Dec 31, 2011 at 03:59 AM

    Please split the file into parts and pieces and upload them to the Z table. That is the best way.


  • Former Member
    Posted on Jan 13, 2012 at 01:53 PM

    It can be done. I would suggest the following be done:

    1. Split the file into many smaller files.

    2. Store the files on the application server so the jobs can run in the background.

    3. Make sure your BASIS group gives you enough table space to load all of these records. (I assume this is the error you got.)

    4. Make sure your program does not keep accumulating records in memory. Once committed to the DB, they should be freed from memory. (Running out of memory could also be the issue you are having.)

    Other than that, it will work. It will just take some time to finish.
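    Putting the points above together, a minimal sketch of a server-side chunked load might look like this. The table name ZSALES_HIST, its structure, and the file path are placeholders for illustration, not names from the poster's system:

    ```abap
    REPORT z_load_sales_hist.

    " Placeholder path on the application server and chunk size.
    CONSTANTS gc_path  TYPE string VALUE '/tmp/sales_hist.txt'.
    CONSTANTS gc_chunk TYPE i      VALUE 5000.

    DATA: lt_chunk TYPE STANDARD TABLE OF zsales_hist,
          ls_row   TYPE zsales_hist,
          lv_line  TYPE string.

    OPEN DATASET gc_path FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Cannot open file on application server' TYPE 'E'.
    ENDIF.

    DO.
      READ DATASET gc_path INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.  " end of file
      ENDIF.

      " Split lv_line into the 20 target fields of ls_row here.
      APPEND ls_row TO lt_chunk.

      IF lines( lt_chunk ) >= gc_chunk.
        INSERT zsales_hist FROM TABLE lt_chunk.  " one array insert per chunk
        COMMIT WORK.                             " keep the unit of work small
        CLEAR lt_chunk.                          " free memory after commit
      ENDIF.
    ENDDO.

    " Flush the final partial chunk.
    IF lt_chunk IS NOT INITIAL.
      INSERT zsales_hist FROM TABLE lt_chunk.
      COMMIT WORK.
    ENDIF.

    CLOSE DATASET gc_path.
    ```

    Clearing the internal table after each commit keeps memory flat, which addresses point 4 above; running the program as a background job on the server-side file addresses point 2.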

    Thanks.

    JB


  • Posted on Jan 13, 2012 at 04:11 PM

    One additional thought: make sure you are doing a COMMIT WORK after you write each chunk of data to the database. I can imagine that you might get an error if the unit of work becomes too large. If a COMMIT WORK is undesirable, I have heard of a function module 'DB_COMMIT' that you could look into, but I have never used it.
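    As a sketch, the two commit variants mentioned here would sit right after each chunked insert (the table and internal table names are placeholders; use one variant or the other, not both):

    ```abap
    INSERT zsales_hist FROM TABLE lt_chunk.

    " Variant 1: full ABAP commit; ends the SAP LUW and the DB transaction.
    COMMIT WORK.

    " Variant 2: database commit only, without the SAP LUW processing.
    " (Mentioned above as an alternative; untested by the poster.)
    CALL FUNCTION 'DB_COMMIT'.
    ```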

