04-29-2020 6:59 AM
Hi,
My custom table has more than 900 million records, and I need to delete records from it without performance issues.
I tried reading the records based on date and time and then deleting them, but the delete ends in a memory dump:
DELETE FROM (tablename) WHERE zone = 'X'.
Could you please suggest the best solution for this?
04-29-2020 7:15 AM
Hi
You can delete in packages, like this:
DO.
  SELECT *
    FROM (tablename)
    WHERE zone = @abap_true
    INTO TABLE @DATA(deletes)
    UP TO 10000 ROWS.
  IF sy-subrc <> 0.
    EXIT. "DO
  ENDIF.
  DELETE (tablename) FROM TABLE @deletes.
  COMMIT WORK.
ENDDO.
Or, with release 7.51, you can use something like this:
DO.
  DELETE FROM (tablename) WHERE zone = @abap_true UP TO 10000 ROWS.
  IF sy-subrc <> 0.
    EXIT. "DO
  ENDIF.
  COMMIT WORK.
ENDDO.
https://help.sap.com/doc/abapdocu_751_index_htm/7.51/en-US/abapdelete_where.htm
regards
Domi
04-29-2020 7:02 AM
Do you need to delete ALL the records, or only those matching a condition?
04-29-2020 10:35 AM
Hi @dominik.bigl2,
I will try to implement your solution in my system. I have another query: what is the maximum package size we can give? Is 10,000 the maximum, or can I try 50,000?
Also, I have tried another solution.
Could you please advise whether it will work or not?
Example :
I have 900 million records in my custom table.
Each month it gets approximately 3 crore (30 million) new records.
Solution :
OPEN CURSOR WITH HOLD s_cursor FOR
  SELECT * FROM zupload
  WHERE date BETWEEN ls_date-low AND ls_date-high.
DO.
  FETCH NEXT CURSOR s_cursor APPENDING TABLE lt_pegg
    PACKAGE SIZE 50000.
  IF sy-subrc EQ 0.
    DELETE zupload FROM TABLE lt_pegg.
  ENDIF.
  CLEAR: lt_pegg.
  ELSE.
    EXIT.
  ENDIF.
ENDDO.
CLOSE CURSOR: s_cursor.
I have tried this solution and it is also working.
04-29-2020 2:00 PM
Hi rajeshkumar01
The package size depends on your DB, the available memory, and the runtime/timeout system parameters. You have to find suitable values for your landscape!
THIS solution will not work: IF...ENDIF...ELSE...ENDIF is not a valid structure. (Please use the Insert Code function!)
Did you actually try your solution? Why not?
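Since the right package size differs per system, one option is to make it a selection parameter and experiment. A minimal sketch of that idea (the report name, table name ZUPLOAD, and the zone = 'X' condition are assumptions for illustration, not the asker's actual objects):

```abap
* Sketch only: ZDELETE_PACKAGES, ZUPLOAD and P_PKG are hypothetical names.
REPORT zdelete_packages.

PARAMETERS p_pkg TYPE i DEFAULT 10000. " tune per DB, memory and timeout

DO.
  " Read one package of rows to delete
  SELECT * FROM zupload
    WHERE zone = 'X'
    INTO TABLE @DATA(lt_del)
    UP TO @p_pkg ROWS.
  IF sy-subrc <> 0.
    EXIT. " nothing left to delete
  ENDIF.
  DELETE zupload FROM TABLE @lt_del.
  COMMIT WORK. " releases locks and resets the runtime counter
ENDDO.
```

Start with a conservative value such as 10,000 and increase it only while memory consumption and commit times stay acceptable on your landscape.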
regards
Domi
04-29-2020 4:15 PM
Sorry Domi,
I forgot to add the "not initial" condition. After FETCH NEXT CURSOR, it checks whether the internal table lt_pegg has values. If values are present, it deletes those records from the table; otherwise it exits from the DO...ENDDO.
I have tried this solution in my system.
Solution :
OPEN CURSOR WITH HOLD s_cursor FOR
  SELECT * FROM zupload
  WHERE date BETWEEN ls_date-low AND ls_date-high.
DO.
  FETCH NEXT CURSOR s_cursor APPENDING TABLE lt_pegg
    PACKAGE SIZE 50000.
  IF lt_pegg IS NOT INITIAL.
    DELETE zupload FROM TABLE lt_pegg.
    COMMIT WORK.
    CLEAR: lt_pegg.
  ELSE.
    EXIT.
  ENDIF.
ENDDO.
CLOSE CURSOR s_cursor.
Also, I have tried your solution below with a date range and it is working fine; it does not cause any dump.
DO.
  SELECT *
    FROM (tablename)
    WHERE zone = @abap_true
    INTO TABLE @DATA(deletes)
    UP TO 10000 ROWS.
  IF sy-subrc <> 0.
    EXIT. "DO
  ENDIF.
  DELETE (tablename) FROM TABLE @deletes.
  COMMIT WORK.
ENDDO.
Thanks,
Rajesh kumar.
04-29-2020 5:32 PM
rajeshkumar01, in the forum, if the spaces of your pasted ABAP code get removed, please paste it again so that the spaces are kept. Thank you.
04-29-2020 7:27 AM
Dear Rajesh,
Write your delete logic in a program and run it in the background.
It will be done without any dump.
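A rough sketch of scheduling such a program as a background job via the standard job API (the report name ZDELETE_ZUPLOAD and the job name are made-up examples; alternatively, just use SM36):

```abap
* Sketch only: ZDELETE_ZUPLOAD is a hypothetical delete report.
DATA: lv_jobname  TYPE btcjob VALUE 'ZDELETE_JOB',
      lv_jobcount TYPE btcjobcnt.

" Open a new background job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.

IF sy-subrc = 0.
  " Add the delete report as a job step
  SUBMIT zdelete_zupload
    VIA JOB lv_jobname NUMBER lv_jobcount
    AND RETURN.

  " Close the job and start it immediately
  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobname   = lv_jobname
      jobcount  = lv_jobcount
      strtimmed = abap_true
    EXCEPTIONS
      OTHERS    = 1.
ENDIF.
```

Note that running in background only removes the dialog timeout; it does not help against a memory dump, so the delete report itself should still work in packages with intermediate commits.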
Regards,
Venkat
04-29-2020 7:43 AM
If the dump is related to the memory area, it can occur again even in background.
Only the timeout has no effect on a background job.
The second method, to bypass the timeout, is to use the COMMIT WORK statement, because it resets the runtime counter. So use, as dominik.bigl2 proposed, a block-wise (packaged) delete.
04-29-2020 1:27 PM
In foreground it ends in a timeout dump. So we scheduled it in background mode, but in background mode we also get a memory dump.
Per day the custom table gets 3 crore (30 million) records.
04-29-2020 10:37 AM
Your previous question was really about how to "delete custom table records without performance issues".
But now your question is how to "delete custom table records without a database error". Are you talking about the size of the rollback segment (segment size not big enough to store all deletions and to allow committed reads)?
Can you post the dump/message?
04-29-2020 10:54 AM
Hi
If you can provide the ST22 dump message,
we can find out whether it is related to memory, timeout, or something else.