03-11-2009 11:20 AM
Hi Experts,
I have an internal table that gets filled with between 100,000 and 500,000 records. Since the system restrictions cannot be changed, I need to reduce the memory load, because filling the table always ends in a memory error and the process cannot be finished.
Is there a way to do something like packaging/splitting this internal table? For additional context: the internal table, or rather its reference, needs to be exported with the entire set of records.
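To illustrate what I mean by packaging, here is a rough sketch; the INDX area zz, the CHUNK IDs, the VBAP row type and the slice size of 50,000 are all made up for the example, not our actual objects:

DATA: lt_all    TYPE STANDARD TABLE OF vbap, " placeholder row type
      lt_chunk  LIKE lt_all,
      wa_indx   TYPE indx,
      lv_id     TYPE indx-srtfd,
      lv_seq(4) TYPE n,
      lv_lines  TYPE i,
      lv_from   TYPE i VALUE 1.

DESCRIBE TABLE lt_all LINES lv_lines.

* write the table to the INDX data cluster in slices of 50,000 rows,
* each slice under its own ID
WHILE lv_from <= lv_lines.
  APPEND LINES OF lt_all FROM lv_from TO lv_from + 49999 TO lt_chunk.
  lv_seq = lv_seq + 1.
  CONCATENATE 'CHUNK' lv_seq INTO lv_id.
  EXPORT tab = lt_chunk TO DATABASE indx(zz) FROM wa_indx ID lv_id.
  CLEAR lt_chunk.
  lv_from = lv_from + 50000.
ENDWHILE.

That is the direction I am thinking in, but I am not sure it fits, since the consumer expects the complete table behind a single reference.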
Many thanks in advance.
Best regards
Tobias
03-11-2009 11:22 AM
Hi,
try using the EXTRACT statement with field groups. An extract dataset is written sequentially and can be paged out to disk by the system, so it does not have to fit into memory the way an internal table does. Note that NODES and GET only work in a program that is linked to a logical database; the table SPFLI below is only a placeholder:

NODES: spfli.

FIELD-GROUPS: header.

INSERT spfli-carrid spfli-connid INTO header.

START-OF-SELECTION.

GET spfli.
  EXTRACT header.
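Once all records are extracted, the dataset is usually read back after END-OF-SELECTION with SORT and a LOOP. A sketch, continuing the placeholder names from above:

END-OF-SELECTION.

  SORT.

* a LOOP over the extract reads the records sequentially,
* so only one record is held in memory at a time
  LOOP.
    WRITE: / spfli-carrid, spfli-connid.
  ENDLOOP.

Inside the loop the extracted fields are addressed through their original field names.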
If you still have doubts, please let me know.
03-11-2009 12:06 PM
Extracts are certainly a good way to do this.
Another way of handling a large volume of data is to offer a package size option on the selection screen (as a user parameter) and then use that value in the SELECT statement.
If you specify the PACKAGE SIZE addition, the result set of the SELECT is processed in packages inside a loop, which must be closed with ENDSELECT.
ex:

DATA: it_vbap    TYPE STANDARD TABLE OF vbap,
      wa_vbap    TYPE vbap,
      wf_package TYPE i VALUE 10000.

SELECT * FROM vbap
         INTO TABLE it_vbap
         PACKAGE SIZE wf_package.

* it_vbap now holds the next block of 10,000 records; it is
* replaced on each pass, so no explicit REFRESH is needed
  LOOP AT it_vbap INTO wa_vbap.
*   process the current record here
  ENDLOOP.
ENDSELECT.
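A variant of the same idea is an explicit database cursor, which gives you the packages without the implicit SELECT loop. A sketch, reusing the VBAP example and the package size of 10,000 from above:

DATA: lv_cursor TYPE cursor,
      it_vbap   TYPE STANDARD TABLE OF vbap.

* WITH HOLD keeps the cursor open across database commits
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM vbap.

DO.
* fetch the next 10,000 rows; sy-subrc <> 0 once no rows are left
  FETCH NEXT CURSOR lv_cursor
        INTO TABLE it_vbap
        PACKAGE SIZE 10000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
* process it_vbap here
ENDDO.

CLOSE CURSOR lv_cursor.

This is handy if the processing of each package triggers a database commit, which would otherwise invalidate the implicit cursor of the SELECT ... ENDSELECT loop.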
Mathews