Former Member

Tables with Old Statistics

Hello guys,

In my database I checked the log of "Check and Update Optimizer Statistics" (DB13) and I saw that stats_change_threshold is 50 days. Then I ran a query and realized that 99% of the tables have old statistics.

So my questions are:

Why doesn't this process update the statistics of the tables that have old statistics?

Are there other conditions for the DB13 process besides the stats_change_threshold parameter?

Thanks in advance.

1 Answer

  • Best Answer
    Dec 01, 2010 at 08:50 PM

    >

    > Hello guys,

    >

    > In my database I checked the log of "Check and Update Optimizer Statistics" (DB13) and I saw that stats_change_threshold is 50 days. Then I ran a query and realized that 99% of the tables have old statistics.

    WRONG!

    The stats_change_threshold is not about the age of the statistics.

    It's about the amount of data that has been changed since the last statistics gathering.
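    To make the mechanics concrete, here is a rough sketch of a change-based staleness check in plain Python. This is a simplified illustration, not SAP's actual DB13 implementation; the table sizes and modification counts are invented for the example.

```python
# Simplified sketch of a change-based staleness check, in the spirit of
# what stats_change_threshold controls: statistics are flagged for refresh
# when the share of rows modified since the last analyze exceeds the
# threshold. NOT SAP's actual DB13 logic - just an illustration.

def stats_are_stale(rows_at_last_analyze: int,
                    inserts: int, updates: int, deletes: int,
                    stats_change_threshold: float = 0.5) -> bool:
    """True when the fraction of changed rows reaches the threshold."""
    if rows_at_last_analyze == 0:
        return True  # never analyzed (or empty then): refresh
    changed = inserts + updates + deletes
    return changed / rows_at_last_analyze >= stats_change_threshold

# A table that merely *aged* 50 days with no changes is NOT stale:
print(stats_are_stale(1_000_000, 0, 0, 0))            # False

# A big table losing 20% of its rows stays below the 50% threshold:
print(stats_are_stale(50_000_000, 0, 0, 10_000_000))  # False

# A small table that doubled in size crosses it:
print(stats_are_stale(10_000, 10_000, 0, 0))          # True
```

    In other words: age alone never triggers the refresh; only the volume of inserts, updates, and deletes relative to the table size at the last analyze does.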

    What is it with the documentation? Too difficult? Too few pictures?

    Try and read it [here|http://help.sap.com/saphelp_nw04/helpdata/en/02/0ae0c6395911d5992200508b6b8b11/frameset.htm].

    Not knowing what this parameter does may be harmless - but there are many buttons to push that can cause your whole system to fail. So the docu really is your friend.

    PLUS: you can use the SDN search or Google to search for it.

    Now you owe me a beer for reading the docu to you...

    cheers,

    Lars


    • So in my opinion this method isn't efficient, because in Table B's case, if I delete 10.000.000 rows the statistics won't be updated, even though the table has had a change of 20% of its rows. What do you think about this? Is there any way to indicate these exceptions to SAP?

      Sure, there is 😉

      Look for the DBSTATC table (maintenance transaction DB21) in SAP Notes.

      You'll find a full-fledged explanation of what can be done with this table...

      However, your example is a bit flawed (just as the whole heuristic here is).

      Basically, two key factors influence whether or not new statistics are required:

      1. the amount of change in data volume

      and

      2. the change of the data distribution.

      For 1., the 50% rule is usually more than sufficient.

      If I have a rather small table and add more than 50% of data volume, then the new statistics might lead to a more efficient access plan from the optimizer (e.g. using an index instead of doing a quick full table scan).

      For 2., the rule is not too well suited, as even a small amount of changed data could change the data distribution in a way that would lead to very different access plans. However, changes of this kind usually need to be covered with special attention anyhow, e.g. using histograms and literals instead of bind variables.
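      A toy illustration of point 2 (all column values and row counts are invented, not taken from any SAP system): a tiny fraction of changed rows can still ruin a cardinality estimate that is based on stale value frequencies.

```python
from collections import Counter

# Toy model: frequency "statistics" for a status column, gathered once.
old_rows = ['DONE'] * 999_000 + ['OPEN'] * 1_000   # state at last analyze
stats = Counter(old_rows)                           # the stored statistics

def estimated_rows(value: str, statistics: Counter) -> int:
    """Optimizer-style cardinality estimate from stored frequencies."""
    return statistics.get(value, 0)

# Now only 2% of the rows change: 20,000 'DONE' rows flip to 'ERROR',
# a value the old statistics have never seen.
new_rows = ['DONE'] * 979_000 + ['OPEN'] * 1_000 + ['ERROR'] * 20_000

actual = Counter(new_rows)['ERROR']                 # 20,000 real rows
estimate = estimated_rows('ERROR', stats)           # 0 - stats are blind

print(estimate, actual)   # prints: 0 20000
```

      A 2% change is nowhere near the 50% threshold, yet the stale statistics estimate zero rows for a predicate that actually hits 20,000 - exactly the kind of case where histograms and literals (rather than bind variables) need special handling.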

      Based on the experience with SAP systems, the 50% rule is not too bad.

      There's even a note by Jürgen Kirschner describing a different approach to statistics handling.

      In this note the statistics are seen more as table models for the CBO. In this view the ultima ratio is: get your statistics to a point where your system performs as you like, and then freeze the statistics (or just stop collecting them).

      That way there is no risk of having changed execution plans due to new statistics (bad Monday morning syndrome...).

      And for several tables SAP also releases custom-tailored statistics that should be implemented and frozen, just because of the special nature of the table usage (famous example: the TRFCQ... tables).

      As I already wrote: make sure to check the notes!

      All the stuff I mentioned here is fully explained in them - far better than I can do it here.

      Thanks for all.

      PS: If we ever meet, we'll drink a couple of beers together!

      Looking forward to that 😉

      Cheers,

      Lars
