Hi,
I am trying to load a file (20 columns) that can contain nearly 80 million records.
An ABAP program needs to load all of these records into a table. Please tell me if this is achievable.
Will parallel processing help?
I am reading the file in chunks of 5,000 rows and inserting them into the database. After I have inserted roughly 0.6 million records, I get the error message:
"The database returned a value containing an error"
1. Why do I get the above error? The data looks good for the insert.
2. Is it advisable to put 80 million records into a Z-table?
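For reference, my load loop looks roughly like the sketch below (ZTABLE, lv_file, and the parsing step are placeholders, not the actual names in my program):

```abap
DATA: lt_chunk TYPE STANDARD TABLE OF ztable, " placeholder target table
      ls_line  TYPE ztable,
      lv_line  TYPE string,
      lv_file  TYPE string VALUE '/tmp/input.txt'. " placeholder path

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET lv_file INTO lv_line.
  IF sy-subrc <> 0.
    EXIT. " end of file
  ENDIF.
  " ... split lv_line into the 20 fields of ls_line here ...
  APPEND ls_line TO lt_chunk.
  IF lines( lt_chunk ) >= 5000.
    INSERT ztable FROM TABLE lt_chunk. " array insert of one package
    COMMIT WORK.                       " commit per package
    CLEAR lt_chunk.
  ENDIF.
ENDDO.
IF lt_chunk IS NOT INITIAL.
  INSERT ztable FROM TABLE lt_chunk.   " flush the final partial package
  COMMIT WORK.
ENDIF.
CLOSE DATASET lv_file.
```

The error occurs after many such packages have already been inserted successfully.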
The 80 million records will contain:
KUNNR (customer)
MATNR (material)
purchase/sales history, 1 year
purchase/sales history, 2 years
.....
....
sales history for the last 1 month.
We have an existing RFC that pulls a large volume of data to the front end (a .NET UI); in addition, a join query against the above table will be made to display results in the front end.
Is this achievable, or are we on a wild goose chase?
Regards,
Suresh.