Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Delete custom table records without performance issue

0 Kudos

Hi,

My custom table has more than 900 million records. I need to delete records from it without causing performance issues.

I have tried to read the records based on date and time and then delete them, but it ends in a memory dump.

DELETE FROM (tablename) WHERE zone = 'X'.

Could you please suggest the best solution for this?

1 ACCEPTED SOLUTION

Domi
Contributor

Hi

you can do a packaged deletion like this:

DO.
  SELECT * 
         FROM (tablename)
         WHERE zone = @abap_true
         INTO TABLE @data(deletes)
         UP TO 10000 ROWS.
  IF sy-subrc <> 0.
   EXIT. "DO
  ENDIF.
  DELETE (tablename) FROM TABLE @deletes.
  COMMIT WORK.
ENDDO.

or with release 7.51 you can use something like this:

DO.
  DELETE FROM (tablename) WHERE zone = @abap_true UP TO 10000 ROWS.
  IF sy-subrc <> 0.
   EXIT. "DO
  ENDIF.
  COMMIT WORK.
ENDDO.
 

https://help.sap.com/doc/abapdocu_751_index_htm/7.51/en-US/abapdelete_where.htm

regards

Domi

11 REPLIES

FredericGirod
Active Contributor
0 Kudos

Do you need to delete ALL the records, or only those matching a condition?

0 Kudos

Hi @dominik.bigl2,

I will try to implement your solution in my system. I also have another question: what is the maximum package size we can give? Is 10,000 the maximum, or can I try 50,000?

Also, I have tried another approach myself.

Could you please advise whether it will work or not?

Example:

I have 900 million records in my custom table.

Each month, approximately 3 crore (30 million) records are added to the table.

Solution :

OPEN CURSOR WITH HOLD s_cursor FOR
  SELECT * FROM zupload INTO TABLE lt_pegg
           WHERE date BETWEEN ls_date-low AND ls_date-high.
DO.
  FETCH NEXT CURSOR s_cursor APPENDING TABLE lt_pegg
        PACKAGE SIZE 50000.
  IF sy-subrc EQ 0.
    DELETE zupload FROM TABLE lt_pegg.
  ENDIF.
  CLEAR: lt_pegg.
  ELSE.
    EXIT.
  ENDIF.
ENDDO.
CLOSE CURSOR: s_cursor.

I have tried this solution, and it is also working.

0 Kudos

Hi rajeshkumar01

The package size depends on your database, the available memory, and the runtime/timeout system parameters. You have to find suitable values for your landscape.
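
As an illustration only, a minimal sketch of such a packaged delete with the package size as a selection-screen parameter, so different values can be tested per landscape (the report name is made up; the table name ZUPLOAD and the ZONE field are borrowed from the snippets in this thread and may not match the real table):

REPORT zdelete_packaged.

PARAMETERS p_size TYPE i DEFAULT 10000. "rows deleted per COMMIT - tune per system

DO.
  "read one package of rows to be deleted
  SELECT *
         FROM zupload
         WHERE zone = @abap_true
         INTO TABLE @DATA(deletes)
         UP TO @p_size ROWS.
  IF sy-subrc <> 0.
    EXIT. "nothing left to delete
  ENDIF.
  DELETE zupload FROM TABLE @deletes.
  COMMIT WORK. "keeps each database transaction small
ENDDO.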

THIS solution will not work! The IF ... ENDIF ... ELSE ... ENDIF structure is wrong. (Please use the Insert Code function!)

Did you try your solution? Why not?

regards

Domi

0 Kudos

Sorry Domi,

I forgot to add the "not initial" condition. After FETCH NEXT CURSOR, it checks whether the internal table lt_pegg has values or not. If values are present, it deletes those records from the table; otherwise it exits the DO...ENDDO loop.

I have tried this solution in my system:

Solution :

OPEN CURSOR WITH HOLD s_cursor FOR
  SELECT * FROM zupload
           WHERE date BETWEEN ls_date-low AND ls_date-high.
DO.
  FETCH NEXT CURSOR s_cursor APPENDING TABLE lt_pegg
        PACKAGE SIZE 50000.
  IF lt_pegg IS NOT INITIAL.
    DELETE zupload FROM TABLE lt_pegg.
    COMMIT WORK.   "WITH HOLD keeps the cursor open across the commit
    CLEAR: lt_pegg.
  ELSE.
    EXIT.          "no more packages
  ENDIF.
ENDDO.
CLOSE CURSOR s_cursor.


Also, I have tried your solution below with a date range and it is working fine; it does not cause any dump.

DO.
  SELECT *
         FROM (tablename)
         WHERE zone = @abap_true
         INTO TABLE @DATA(deletes)
         UP TO 10000 ROWS.
  IF sy-subrc <> 0.
    EXIT. "DO
  ENDIF.
  DELETE (tablename) FROM TABLE @deletes.
  COMMIT WORK.
ENDDO.
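
For reference, a rough sketch of the same packaged delete restricted to a date range; the date field name (zdate here) and the boundary values are placeholders, not the real definitions:

DATA: BEGIN OF ls_date,
        low  TYPE d,
        high TYPE d,
      END OF ls_date.

ls_date-low  = '20230101'. "example boundaries only
ls_date-high = '20231231'.

DO.
  SELECT *
         FROM zupload
         WHERE zdate BETWEEN @ls_date-low AND @ls_date-high
         INTO TABLE @DATA(lt_del)
         UP TO 10000 ROWS.
  IF sy-subrc <> 0.
    EXIT. "no more rows in the date range
  ENDIF.
  DELETE zupload FROM TABLE @lt_del.
  COMMIT WORK.
ENDDO.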

Thanks,

Rajesh kumar.

Sandra_Rossi
Active Contributor

rajeshkumar01 In the forum, if the spaces in your pasted ABAP code are removed, please paste it again so that the spaces are kept. Thank you.

venkateswaran_k
Active Contributor
0 Kudos

Dear Rajesh

Write your delete logic in a program and run it in the background.

It will then run without any dump.
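
A minimal sketch of scheduling such a report as a background job from ABAP, using the standard function modules JOB_OPEN and JOB_CLOSE (the report name zdelete_packaged is only an assumed example):

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZDELETE_PACKAGED',
      lv_jobcount TYPE tbtcjob-jobcount.

"create the job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

"add the report as a job step
SUBMIT zdelete_packaged
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

"close the job and release it for immediate start
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = abap_true.

Alternatively, the same program can simply be scheduled via transaction SM36.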

Regards,

Venkat

0 Kudos

If the dump is related to the memory area, it can occur again even in background.

Only the timeout has no effect on a background job.

A second method to bypass the timeout is the COMMIT WORK statement, because it resets the timeout counter. So use a block-wise (packaged) process, as dominik.bigl2 proposed.

0 Kudos

In the foreground it ends in a timeout dump, so we scheduled it in background mode, but in background mode we are also getting a memory dump.

The custom table receives about 3 crore (30 million) records per day.

Sandra_Rossi
Active Contributor
0 Kudos

Your previous question was actually really how to "Delete custom table records without performance issue".

But now, your question is how to "Delete custom table records without database error". Are you talking about the issue with the size of the rollback segment (segment size not big enough to store all the deletions and to allow committed reads)?

Can you post any dump/message?

venkateswaran_k
Active Contributor
0 Kudos

Hi

If you can provide the ST22 dump message,

we can find out whether it is related to memory, a timeout, or something else.