Former Member

Checking the size of EVERY MaxDB table

Good morning,

We have an SAP NetWeaver 04s installation based on MaxDB 7.6.10.

Over the last month we have been experiencing very strong DB growth (about 10 GB a day).

XI message deletion is working fine, both for the Adapter Framework and the Integration Engine, so there must be something we're missing.

Checking table sizes and growth (for both the ABAP and the Java schemas) in transaction DB50 gives a total size of about 70 GB (including indexes), but the reported DB space usage is way ahead of this: 205 GB.

The fastest-growing tables show a combined growth of about 500 MB/day but, as said, the DB grows by 10 GB a day.

Now we are trying to identify where in the DB structure (system tables?) those extra 130 GB are contained, so that we have a clue about where to act to bring the DB back down to a more reasonable size.

The problem is that so far we have not found a way to check the size of EVERY object contained in the DB, either from the SAP interfaces (SAP GUI/NWA/RWB) or via SQL Studio.

This problem is quickly becoming very urgent, because the maximum space we can allocate to the DB without adding more hardware is about 300 GB, and at this growth rate we'll soon be stuck 😔

Thank you in advance for your kind help.

Carlo Ferrari


2 Answers

  • Best Answer
    Jun 01, 2008 at 01:14 PM

    What you see by default in DB50 is only the size of the ABAP part; the Java part is not included there.

    You can try the following:

    - DB59

    - Add your J2EE schema (SAP<SID>DB)

    - Execute report RSADAT6M

    Markus


    • Hi Carlo,

      Usually the system tables don't require much space in the database.

      Thus it's more likely that one of your application tables occupies the space or that there is a huge demand for temporary space.

      Also one possible reason for a huge space demand is the usage of snapshots. Do you use snapshots for that database?

      Please post a readable version of the output from these statements:

      select * from sysinfo.datastatistics
      
      select * from superdba.userstatistics
       (attention: this one runs for a long time!)
      

      Once you know the overall stats, you can dig deeper:

      SELECT r.owner, r.tablename, r.indexname, f.type,
             (f.treeleavessize + f.treeindexsize + f.lobsize) AS KB
      FROM roots r, files f
      WHERE r.root = f.root
      ORDER BY KB DESC
      

      This statement needs to be run for each schema user, as the system tables only provide access to catalog information for tables on which the user has at least SELECT privileges.
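      Building on the statement above, a per-table aggregation can make the biggest space consumers stand out immediately, by summing data, index, and LOB sizes into one row per table. This is only a sketch that reuses the ROOTS/FILES views and size columns from the query above, assumed unchanged:

```sql
-- Sketch: aggregate total size per table from the ROOTS/FILES catalog views.
-- View and column names are taken from the query above and assumed unchanged.
SELECT r.owner,
       r.tablename,
       SUM(f.treeleavessize + f.treeindexsize + f.lobsize) AS total_kb
FROM roots r, files f
WHERE r.root = f.root
GROUP BY r.owner, r.tablename
ORDER BY total_kb DESC
```

      As with the original query, rows are only returned for objects on which the connected user has at least SELECT privileges, so repeat it for both the ABAP and the Java schema users.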

      If you don't figure it out yourself, you may also open a support message to get help directly on your system.

      KR Lars

    Former Member
    Jun 03, 2008 at 10:04 AM

    Good morning, I have an update:

    Yesterday, as the production environment was mostly unused because it was a national holiday in Italy, I tried the query suggested by Lars:

    select * from superdba.userstatistics

    I submitted it via SQL Studio at about 4 PM. After 7 hours the query was still running...

    As I was connected from home via VPN, I decided to close SQL Studio and shut down my PC.

    Immediately after I closed SQL Studio, a SAP GUI connection that I had open on the desktop terminated with an error saying the "connection to the DB was lost".

    A quick check showed that after SQL Studio was closed, the DB had gone OFFLINE.

    I immediately put it ONLINE again and everything seemed to be working fine.

    Then I finally went to sleep, leaving the DB with a size of 210 GB.

    This morning at my workplace I was prepared to deal with the problem again but, to my great surprise, the DB size had decreased to 160 GB (and it is still regaining space right now)...

    My diagnosis is that some kind of lock must have been stuck on some tables, preventing the deletion of data. Restarting the DB probably cleared those locks and let the backlog be flushed.

    Is this actually a possible explanation?

    Warmest regards,

    Carlo


    • Former Member Lars Breddemann

      Thank you anyway, Lars.

      Actually, last Friday we opened a support request with high priority but, as by Monday we still hadn't received an answer, I decided to also post here on the forum.. you know, being a Linux guy I am more prone to trust the help of the "community" 😉

      Yesterday, anyway, we closed the support request as the problem was solved.

      Now my idea is this: we have a backup of the database from when it was 200 GB, and I'm planning to restore it on another MaxDB instance on another server (without any SAP software on it), so that I can at least try to understand where the data was contained.
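      For the restore itself, a minimal dbmcli sequence might look like the sketch below. Everything in it is a placeholder: the instance name (TST), the DBM user/password, the medium name, and the backup path are assumptions, and it presumes the target instance already exists with a matching MaxDB version and large enough data volumes:

```shell
# Sketch: restore a MaxDB data backup into an existing instance via dbmcli.
# TST, control/secret, fullmedium and /backup/PRODDATA are all placeholders.
dbmcli -d TST -u control,secret <<'EOF'
db_admin
medium_put fullmedium /backup/PRODDATA FILE DATA
recover_start fullmedium DATA
db_online
EOF
```

      db_admin puts the instance into ADMIN mode, medium_put registers the backup file as a data medium, recover_start restores the complete data backup from it, and db_online brings the restored instance up for querying.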

      Thank you so much again,

      Carlo