Hello Friends,
I have a query. The scenario is that I want to process a large database table, say the BUT000 table, and change some non-key field values.
A simple approach would be to:
- execute a SELECT statement and retrieve the records into an internal table,
- change the values in the internal table,
- then update the database table from the internal table (a minimal sketch follows below).
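For illustration, a minimal sketch of this simple approach. It assumes BU_SORT1 as an example non-key field; in a real system BUT000 would normally be changed through the business partner maintenance APIs rather than a direct UPDATE, so this is only to show the mechanics:

DATA lt_but000 TYPE STANDARD TABLE OF but000.
FIELD-SYMBOLS <ls_but000> TYPE but000.

" Read all records into an internal table
SELECT * FROM but000 INTO TABLE lt_but000.

" Change a non-key field in every row (BU_SORT1 is just an example)
LOOP AT lt_but000 ASSIGNING <ls_but000>.
  <ls_but000>-bu_sort1 = 'NEW VALUE'.
ENDLOOP.

" Write the changed rows back and commit
UPDATE but000 FROM TABLE lt_but000.
COMMIT WORK.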
However, for large tables, performance issues will occur and the program might time out.
So one solution would be to run reports in the background in batches, each report processing 100 entries of the table.
The algorithm would be roughly like this (please forgive the loose syntax; a fuller sketch follows below):

OPEN CURSOR WITH HOLD c1 FOR SELECT * FROM <db_table>.
DO. "until all records have been fetched
  FETCH NEXT CURSOR c1 INTO TABLE lt_tab PACKAGE SIZE 100.
  IF sy-subrc <> 0. EXIT. ENDIF.
  "one background job per package; the data itself is handed over
  "via EXPORT/IMPORT (see below), since SUBMIT cannot pass an internal table
  SUBMIT change_report WITH p_id = lv_id VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.
ENDDO.
CLOSE CURSOR c1.
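To make the scheduling side concrete, here is a rough sketch of what the driver part could look like. This is only a sketch under assumptions: the submitted report name z_change_report, its parameter p_id and the package ID pattern 'BUT000_PKG_*' are placeholders I made up, and the data handoff uses the EXPORT-to-INDX mechanism described right after this block:

DATA: c1          TYPE cursor,
      lt_tab      TYPE STANDARD TABLE OF but000,
      lv_id       TYPE indx-srtfd,
      lv_seq      TYPE n LENGTH 10,
      lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_CHANGE_BUT000',
      lv_jobcount TYPE tbtcjob-jobcount.

" WITH HOLD so that database commits occurring inside the loop
" (e.g. during job scheduling) do not close the cursor
OPEN CURSOR WITH HOLD c1 FOR SELECT * FROM but000.

DO.
  FETCH NEXT CURSOR c1 INTO TABLE lt_tab PACKAGE SIZE 100.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " Build a unique ID for this package and store the data in INDX
  lv_seq = sy-index.
  CONCATENATE 'BUT000_PKG_' lv_seq INTO lv_id.
  EXPORT itab FROM lt_tab TO DATABASE indx(ar) CLIENT sy-mandt ID lv_id.

  " Schedule one background job per package
  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = lv_jobname
    IMPORTING
      jobcount = lv_jobcount.

  SUBMIT z_change_report
    WITH p_id = lv_id
    VIA JOB lv_jobname NUMBER lv_jobcount
    AND RETURN.

  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount  = lv_jobcount
      jobname   = lv_jobname
      strtimmed = 'X'.   " start immediately
ENDDO.

CLOSE CURSOR c1.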
The data for each package would be handed over to the submitted report via the INDX cluster table, i.e.:

EXPORT itab FROM itab TO DATABASE indx(ar) CLIENT sy-mandt ID job_number.
IMPORT itab TO itab FROM DATABASE indx(ar) CLIENT sy-mandt ID job_number.
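And a matching sketch of the receiving side. Again this is only illustrative: z_change_report, p_id and the BU_SORT1 change are placeholder assumptions, and productive changes to BUT000 would normally go through the business partner maintenance APIs rather than a direct UPDATE:

REPORT z_change_report.

PARAMETERS p_id TYPE indx-srtfd OBLIGATORY.

DATA lt_tab TYPE STANDARD TABLE OF but000.
FIELD-SYMBOLS <ls_but000> TYPE but000.

" Read the package that the driver exported under this ID.
" The data sits in the database table INDX, not in application
" server memory, so it can be read from any server.
IMPORT itab TO lt_tab FROM DATABASE indx(ar) CLIENT sy-mandt ID p_id.
IF sy-subrc <> 0.
  MESSAGE 'Package not found in INDX' TYPE 'E'.
ENDIF.

" Change the non-key field(s)
LOOP AT lt_tab ASSIGNING <ls_but000>.
  <ls_but000>-bu_sort1 = 'NEW VALUE'.
ENDLOOP.

" Write back and clean up the cluster record
UPDATE but000 FROM TABLE lt_tab.
COMMIT WORK.
DELETE FROM DATABASE indx(ar) CLIENT sy-mandt ID p_id.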
However, I am not sure whether this will work across application servers. My guess is that this solution should hold even if the background reports run on different application servers of the system.
Could anyone guide me as to whether my understanding of the above three points (batching via background jobs, passing the data through INDX, and the cross-application-server behaviour) is correct? Any comments on the third point in particular?
Are there any more solutions to this scenario?
With Kind regards,
Sameer.