
Create indices before or after compression

Former Member

Hello

I usually delete and recreate indices before big uploads. Do you recommend creating indices before or after compression?

Accepted Solutions (0)

Answers (6)

Former Member

From a performance standpoint you should create the indices after compression (data manipulation with an index in place is slower and can lead to index inefficiency/corruption), but SAP note 407260 says that the 'P' index, i.e. the primary index '0' on the E fact table, is needed for compression, so there seems to be a tradeoff.
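As a general, non-SAP-specific illustration of why bulk inserts are cheaper without extra indices in place, here is a small sketch using SQLite from Python; the table and index names are made up for the demo.

```python
import sqlite3
import time

def timed_bulk_insert(with_index: bool) -> float:
    """Insert rows into a throwaway table, optionally with a secondary index in place."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE facts (request_id INTEGER, customer INTEGER, amount REAL)")
    if with_index:
        # A secondary index present during the load has to be maintained for every row.
        conn.execute("CREATE INDEX idx_customer ON facts (customer)")
    rows = [(i % 100, i % 5000, float(i)) for i in range(200_000)]
    start = time.perf_counter()
    conn.executemany("INSERT INTO facts VALUES (?, ?, ?)", rows)
    conn.commit()
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

print(f"load with index   : {timed_bulk_insert(True):.2f}s")
print(f"load without index: {timed_bulk_insert(False):.2f}s")
```

The indexed run is usually measurably slower, which is the same effect the drop-index step in a process chain exploits.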

Hope this helps

Former Member

As mentioned, any time you are inserting into or deleting from a table, the more indices exist, the harder the DB has to work to keep everything updated, so yes, compression should benefit from having fewer indices in place when it runs. I have to believe the impact varies a good bit depending on the DB.

The P index that is needed to support compression is NOT dropped as part of the drop index function, so you don't have to worry about that - only the secondary indices that are created to improve querying are dropped and rebuilt.

Former Member

May I add to my previous post that we do not compress the last 15 loads. It gives us flexibility in case one or more loads fail. Creating the indexes before compression has never given us any problems. Is there a performance increase to be gained by turning them around?
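As a purely illustrative aside, the "keep the newest requests uncompressed" rule is simple to express; the request list and the threshold below are made up.

```python
# Hypothetical sketch: keep the newest 15 requests uncompressed so a bad load
# can still be deleted by request ID; only the older requests get compressed.
KEEP_UNCOMPRESSED = 15

def requests_to_compress(request_ids):
    """request_ids is assumed to be sorted oldest -> newest."""
    if len(request_ids) <= KEEP_UNCOMPRESSED:
        return []
    return request_ids[:-KEEP_UNCOMPRESSED]

loaded = list(range(1001, 1041))        # 40 made-up request IDs
print(requests_to_compress(loaded))     # everything except the newest 15
```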

Eduard

Former Member

Hi, delete the indices before loading the data and then create the indices after compression.

Regards,

Mahantesh

Former Member

We compress after creating the indexes. We're actually in the process of removing all compression steps from the daily process chains and moving them to a special process chain which runs once a week in the quiet hours of the weekend. We compress all our cubes there.

Removing the compression in the daily loads shortens the total process chain times every night, so we can keep all our daily loads in the 0:00-7:00 time interval.
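A rough sketch of that split, with made-up step names (the real scheduling of course lives in the process chains themselves): compression only joins the run during the weekend quiet hours.

```python
from datetime import datetime

# Hypothetical step lists: the daily chain carries no compression step, and
# compression of all cubes is deferred to the weekend quiet hours.
DAILY_STEPS = ["drop_indices", "load_data", "create_indices", "refresh_statistics"]
WEEKEND_STEPS = ["compress_all_cubes"]

def steps_for(now: datetime) -> list:
    """Daily steps always; compression only on Saturday/Sunday before 07:00."""
    weekend_quiet_hours = now.weekday() >= 5 and now.hour < 7
    return DAILY_STEPS + (WEEKEND_STEPS if weekend_quiet_hours else [])

print(steps_for(datetime(2024, 6, 1, 3)))  # Saturday 03:00 -> includes compression
print(steps_for(datetime(2024, 6, 3, 3)))  # Monday 03:00 -> daily steps only
```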

cheers,

Eduard

Former Member

Hi,

For once I disagree with Pizzaman. The sequence would be as follows:

1. Drop indices.

2. Load the cube.

3. Compress the cube.

4. Create indices.

5. Create/refresh statistics.

The primary reason why I think we should do compression before creating the indices is that the system would then know for which table, the E or the F table, the indices need to be created.
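To make that ordering concrete, here is a minimal orchestration sketch in Python; every function here is a hypothetical placeholder for the corresponding BW process chain step, not a real SAP API.

```python
# Minimal sketch of the load cycle discussed above. All helpers passed in are
# hypothetical placeholders for the corresponding process chain steps.

def run_load_cycle(cube, drop_indices, load_cube, compress_cube,
                   create_indices, refresh_statistics):
    drop_indices(cube)          # 1. secondary indices gone -> cheaper inserts
    load_cube(cube)             # 2. new requests land in the F fact table
    compress_cube(cube)         # 3. requests are rolled up into the E fact table
    create_indices(cube)        # 4. indices are built once, on the final tables
    refresh_statistics(cube)    # 5. optimizer statistics reflect the new data

# Trivial stand-ins, just to show the call order:
step = lambda name: (lambda cube: print(f"{name}: {cube}"))
run_load_cycle("SALES_CUBE", step("drop indices"), step("load"),
               step("compress"), step("create indices"), step("refresh statistics"))
```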

We were following the above steps in my previous project.

Bye

Dinesh

Former Member

Dropping the indices before the load allows the load to complete quicker since all the extra indices do not need to be maintained during the load process.

Drop Indices

Load Data

Create Indices

Refresh Statistics

Run Compression

Former Member

After thinking about this a little more, I think Dinesh has got the sequence correct. I haven't revisited the issue in a while since, generally, we don't drop the indices: we support a 24-hour query environment and like to keep the secondary indices in place.

One thing you should take away from this thread is that for load performance you would drop the indices before a large load, load and compress the data, then build the indices. There is no point in dropping and rebuilding before the load, which is what I think your original post suggests you are doing.

For some good info on this issue, including when you might not want to drop indices, or techniques to drop just the E fact table indices rather than both the E and F fact table indices, check note 407260.