Former Member
Jun 02, 2008 at 05:07 PM

Process files with huge amount of data using parallel processing


Hi,

My requirement is to process a file with a huge amount of data and update a database table (standard or Z-table). Each file contains more than 5 million records.

I think parallel processing is one of the best approaches, but my question is how to handle 5 million records at the internal table level. I want to split the data, at the internal table or file level, into packages of 50-75k records each and process those packages in parallel.
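Roughly what I have in mind is the asynchronous RFC pattern below. This is only a minimal sketch: `Z_PROCESS_CHUNK` (an RFC-enabled function module that would receive one package and do the database update), the table type `zmytab`, and the package size of 50,000 are placeholders I made up, not existing objects.

```abap
CONSTANTS gc_package_size TYPE i VALUE 50000.

DATA: gt_data    TYPE STANDARD TABLE OF zmytab,  " full file content
      gt_package TYPE STANDARD TABLE OF zmytab,  " one 50k slice
      gv_from    TYPE i VALUE 1,
      gv_task    TYPE char12,
      gv_open    TYPE i.                         " number of running tasks

WHILE gv_from <= lines( gt_data ).
  " Cut the next package out of the big internal table.
  CLEAR gt_package.
  APPEND LINES OF gt_data FROM gv_from
         TO gv_from + gc_package_size - 1 TO gt_package.

  gv_task = |TSK{ gv_from }|.
  CALL FUNCTION 'Z_PROCESS_CHUNK'        " hypothetical RFC-enabled FM
    STARTING NEW TASK gv_task
    DESTINATION IN GROUP DEFAULT
    PERFORMING on_task_done ON END OF TASK
    TABLES
      it_package            = gt_package
    EXCEPTIONS
      communication_failure = 1
      system_failure        = 2
      resource_failure      = 3.
  IF sy-subrc = 0.
    gv_open = gv_open + 1.
    gv_from = gv_from + gc_package_size. " advance to the next package
  ELSE.
    " No free work process: wait until a task returns, then retry
    " the same package.
    WAIT UNTIL gv_open < 10.
  ENDIF.
ENDWHILE.

WAIT UNTIL gv_open = 0.                  " let all tasks finish

FORM on_task_done USING p_task TYPE clike.
  RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_CHUNK'
    EXCEPTIONS communication_failure = 1
               system_failure        = 2.
  gv_open = gv_open - 1.
ENDFORM.
```

Is something along these lines the right direction, and is there a better way to manage the free work processes and the package size?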

Please suggest the best approach.