03-23-2009 11:52 AM
Hi Everybody,
We have a report that deletes entries from large tables on a regular basis. While deleting, the system writes a log to table DBTABLOG. During the delete I get a dump because the INSERT into DBTABLOG fails for the large number of records. Is there any way to bypass this logging?
Thanks in advance.
Regards,
Anil
03-23-2009 12:09 PM
This cannot be recommended; functional correctness always has higher priority than performance.
The recommended solution has been mentioned here several times: work in blocks of 10,000 or at most 50,000 records. There is no relevant performance difference compared to doing it all in one step.
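A minimal sketch of this block-wise delete, assuming a placeholder table ZTAB with a date field ERDAT (neither name is from the thread):

```abap
" Delete in packages of 50,000 rows, committing after each package,
" so the DBTABLOG inserts are spread across many small LUWs instead
" of one huge one.
DATA: lt_rows   TYPE STANDARD TABLE OF ztab,
      lv_cutoff TYPE sy-datum.

lv_cutoff = '20080101'.

DO.
  SELECT * FROM ztab
    INTO TABLE lt_rows
    UP TO 50000 ROWS
    WHERE erdat < lv_cutoff.
  IF sy-subrc <> 0.
    EXIT.  " nothing left to delete
  ENDIF.
  DELETE ztab FROM TABLE lt_rows.
  COMMIT WORK.  " ends the LUW; logging happens per package
ENDDO.
```

Because the deleted rows are gone after each COMMIT, the next SELECT simply picks up the remaining rows; no cursor bookkeeping is needed.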
Siegfried
03-23-2009 12:18 PM
Hi Sie,
Thanks for your reply, but we are not using a SELECT statement; we delete directly from the table based on a date field in the WHERE clause. Note that the tables have millions of records, which is why we are avoiding a SELECT.
Regards,
Anil
03-23-2009 1:07 PM
So, what... just add another condition to the WHERE clause which splits the result set:
WHERE .....
AND first_key_field BETWEEN 'A*' AND 'J*'
WHERE .....
AND first_key_field BETWEEN 'JA*' AND 'R*'
WHERE .....
AND first_key_field BETWEEN 'RA*' AND 'ZZZZZ*'
Or even better, just schedule your regular job to run more often.
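A sketch of the range-split idea as concrete statements. Note that BETWEEN compares plain values lexically, so no '*' wildcard is needed; the table, date field, and key field names below are placeholders, not from the thread:

```abap
" Each DELETE handles one slice of the key space, with a commit
" after each slice to keep the LUW (and the DBTABLOG volume) small.
DELETE FROM ztab WHERE erdat < lv_cutoff
                   AND keyfield BETWEEN 'A' AND 'J'.
COMMIT WORK.

DELETE FROM ztab WHERE erdat < lv_cutoff
                   AND keyfield BETWEEN 'K' AND 'R'.
COMMIT WORK.

DELETE FROM ztab WHERE erdat < lv_cutoff
                   AND keyfield BETWEEN 'S' AND 'ZZZZZZZZZZ'.
COMMIT WORK.
```

The slice boundaries are arbitrary; they only need to cover the key space without overlapping.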
Siegfried
03-23-2009 1:26 PM
I would also investigate whether table logging is really required for this table from a functional standpoint. I'm not recommending to deactivate it just for the deletion process, rather check if it is needed altogether, and if not, switch it off in the technical settings of the table.
Block processing still applies.
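To check whether logging is actually active for a table, the flag in its technical settings (transaction SE13) can also be read directly from the dictionary table DD09L; the table name below is a placeholder:

```abap
" DD09L-PROTOKOLL = 'X' means 'Log data changes' is switched on
" for the active version (AS4LOCAL = 'A') of the table.
DATA lv_log TYPE dd09l-protokoll.

SELECT SINGLE protokoll FROM dd09l
  INTO lv_log
  WHERE tabname  = 'ZTAB'
    AND as4local = 'A'.
IF lv_log = 'X'.
  WRITE: / 'Table logging is active'.
ENDIF.
```

Note that even with this flag set, logging only actually occurs when the profile parameter rec/client enables it for the client in question.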
Thomas