
How to delete many rows in a big table, but not all the rows (truncate table can't be used in this case)

Hi All,

I want to delete many rows in a big table, but not all the rows (truncate table can't be used in this case).

The table has to remain available for updates during the operation (because it's used by the application).

That means I can't copy all the data into a new table, truncate the original table, and then insert back only the data that's needed.

So, is it possible to delete many rows from a big table without logging?

The database options in effect are: select into/bulkcopy/pllsort, trunc log on chkpt, abort tran on log full.

Sybase version: 15.7 SMP SP101

Thanks in advance

Regards,

Claude


4 Answers

  • Posted on Nov 28, 2013 at 04:04 PM

    Unless this is an in-memory database (IMDB) you cannot disable logging.

    Based on your other comments, and assuming you don't have a maintenance window of time where there's no activity against the table, and assuming this isn't part of a warm standby replication environment (where you could switch your application to the standby), your best bet is to set up a script that runs a loop, deleting a small batch of rows at a time until there are no more rows to delete.

    You haven't mentioned a) how many rows are in the table, b) how many rows need to be deleted, or c) how you determine which rows need to be deleted. So, a few recommendations ...

    Deleting too many rows with a single DELETE statement can cause your locks to escalate to a table-level exclusive lock, which would block your application users. So you'll want to delete a small number of rows at a time to minimize the blocking of application users.

    If you delete too many rows in too short a period of time you can fill your log, which would lead to a suspension of application user activity until the checkpoint process runs (and truncates the log). You can alleviate this problem by periodically putting your looping process to sleep (waitfor delay "HH:MM:SS"), thus giving the checkpoint process time to come around and truncate the log. Alternatively, since the database is marked with 'trunc log on chkpt', you could have your process periodically issue a 'dump tran ... with truncate_only' to clear out the log. The primary concern is to make sure you don't fill up your log (and block your application user activity) before the log can be truncated.
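
    For example, a minimal sketch of both options (your_db is just a placeholder for the actual database name), placed inside the delete loop:

    -- pause so the checkpoint process has time to come around and truncate the log
    waitfor delay "00:00:30"

    -- or, explicitly truncate the log between batches
    dump tran your_db with truncate_only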

    You may also want to look at how you plan on finding the rows that need to be deleted. If you have to perform a table scan every time you delete a few rows then you'll want to consider using a table-scanning cursor to reduce the volume of IOs (ie, scan the table once instead of thousands/millions of times). The cursor's WHERE clause will determine which rows to look for (to DELETE), while the select/projection list will contain the PK column(s) to be captured into local @variables. You can then use these local @variables in the WHERE clause of the DELETE statement to zero in on the desired row(s) to delete.
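
    A rough sketch of that cursor approach (the table, column, and cursor names are made up, and it assumes the code runs inside a stored procedure):

    declare @id int

    declare del_cur cursor for
        select pk_id from big_table where ...   -- criteria for rows to delete
        for read only

    open del_cur
    fetch del_cur into @id

    while (@@sqlstatus = 0)      -- 0 = fetch succeeded
    begin
        delete from big_table where pk_id = @id
        fetch del_cur into @id
    end

    close del_cur
    deallocate cursor del_cur

    In practice you would still combine this with the small-batch commit pattern from the other answers so that each transaction stays short.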


  • Former Member
    Posted on Dec 09, 2013 at 07:18 PM

    This is one of the most frequently asked questions for ASE. Answer: use a small batch size.


    Here is the sample code.

    -- delete in small batches so each transaction stays short and the log
    -- can be truncated between batches
    declare @errorStatus int, @rowsProcessed int

    set rowcount xxx    -- xxx = batch size (number of rows deleted per loop)

    select @rowsProcessed = 1

    while (@rowsProcessed != 0)
    begin
        begin tran

        delete ...
          from ...
         where ...

        -- capture both values in one statement, before they are reset
        select @rowsProcessed = @@rowcount, @errorStatus = @@error

        if (@errorStatus != 0)
        begin
            raiserror ...
            rollback tran
            return -1    -- meaningful when run inside a stored procedure
        end

        commit tran

        -- please add a "dump tran ... with truncate_only" here if the
        -- "trunc log on chkpt" option is not on
    end

    set rowcount 0
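
    The important points in the pattern above are that each batch is its own short transaction (so exclusive locks are released and the log can be truncated between batches), and that @@rowcount and @@error are captured in a single select immediately after the delete, before any other statement resets them.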


  • Former Member
    Posted on Dec 09, 2013 at 10:15 PM

    Hank's example shows good practice for deleting a large number of records in general. But it's probably not answering your question -- do some reading on this command:

    set dml_logging minimal

    It may be what you're looking for, but you'll need a good understanding of the implications of doing so.
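
    A minimal sketch of how it might be used (big_table and the where clause are placeholders; there are a number of restrictions on when minimal DML logging is allowed, and rows changed this way cannot be recovered from the transaction log, so read the 15.7 documentation first):

    set dml_logging minimal

    delete from big_table
    where ...

    -- return the session to its normal logging level afterwards;
    -- I believe "set dml_logging default" does this, but verify against the docs
    set dml_logging default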


  • Posted on Dec 16, 2013 at 02:46 PM

    How about selecting only the few rows you want to keep and inserting them into a new table (exact same DDL but a different name), then dropping the old table and renaming the new one to the original table name?
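
    A rough sketch of that approach (big_table / big_table_new and the where clause are placeholders; select into requires the 'select into/bulkcopy' option, which is already enabled here, and the steps need a window where nothing is updating the table):

    select * into big_table_new
      from big_table
     where ...            -- only the rows you want to keep

    -- recreate any indexes, triggers, and permissions on big_table_new here

    drop table big_table

    exec sp_rename 'big_table_new', 'big_table'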

