
Infocube Index Problem

ssurampally
Active Contributor
0 Kudos

Hi,

I have loaded an InfoCube with a huge number of records. The load is from a standard DSO to the cube. There were 900 requests in the DSO, so rather than doing the delta load request by request, I loaded the data from the active data table. This ended up putting 500 million records into a single request and took 9 hours.

After that, I started compression of the request. It ran for 3 days until Basis finally cancelled the job. Since then, when I try to go to the Manage tab of the cube, it does not respond, and I cannot display the cube data via LISTCUBE either; it spins for a long time without responding.

I have performed the RSRV check 'Database indices of an InfoCube', and it failed with the error that the ****/BIC/E fact table index is not valid. I tried to correct the error by running the repair, but it ran for a long time without completing and eventually produced a dump because it timed out.

I asked the Basis team to regenerate the index, but they were not able to, because they get a message that the index is not created and therefore cannot be rebuilt.

Could you please let me know how to correct this issue?

Thanks

Sreekanth

Accepted Solutions (0)

Answers (5)


ssurampally
Active Contributor
0 Kudos

Thank you all. The problem is corrected with the help of the Basis team.

What happened?

After the cube load, compression of the request was scheduled. While it was running for a long time, a system shutdown occurred and the job was cancelled, but it still held locks on the tables, and the DB statistics job also did not complete. This resulted in the index problem. The Basis team ran the statistics first, and then the indexes were rebuilt from the Manage tab of the cube. Now it is fine.

former_member210571
Participant
0 Kudos

Hi Sreekanth,

Use SE14 to create the indexes, or ask the DBA team to create them, or run BRTools for it.

But yes, if you are going to compress 500 million records, it is going to take days to complete. It is better to load the data into the cube in smaller chunks.

You can also check out the notes below; hope they might help:

Thanks,

Mirza

Former Member
0 Kudos

Hi Sreekanth,

I wonder, is this the first time you have loaded this cube?

Regards

Bill

former_member186053
Active Contributor
0 Kudos

Hi Sreekanth,

This might be due to an inconsistency in the cube. Check the consistency of the E and F fact tables in SE14 and fix it if you have SE14 authorization, or ask Basis for help with it.

It is not a good idea to load a huge amount of data in a single request. If you are reloading the data, use filters in the DTP to load it into the cube in slices (for example, the first 3 months in one request, then 3 more months, and so on). Or, if you are loading the cube for the first time, create a delta DTP and do the init delta with the request-by-request option.
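The slicing idea above can be sketched generically. The snippet below is only an illustration (plain Python, not BW/ABAP; the function name and the 3-month default are my own) of how you could derive the date windows to use as DTP filter values, one window per load request:

```python
from datetime import date, timedelta

def month_windows(start, end, months_per_chunk=3):
    """Split the period [start, end] into consecutive calendar-month
    windows of months_per_chunk months each.

    Each (first_day, last_day) pair could serve as the date filter
    for one load request, instead of loading everything in one go.
    """
    windows = []
    y, m = start.year, start.month
    while date(y, m, 1) <= end:
        w_start = date(y, m, 1)
        # advance months_per_chunk calendar months
        total = y * 12 + (m - 1) + months_per_chunk
        y, m = divmod(total, 12)
        m += 1
        # the window ends the day before the next window starts
        w_end = date(y, m, 1) - timedelta(days=1)
        # clip the first and last windows to the requested period
        windows.append((max(w_start, start), min(w_end, end)))
    return windows
```

For example, `month_windows(date(2020, 1, 1), date(2020, 12, 31))` yields four quarterly windows, i.e. four smaller requests instead of one huge one.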

Regards,

Vengal.

Former Member
0 Kudos

I think you have a disconnect between the SAP DDIC and the physical dictionary, because the process was aborted midway. A real solution would mean fixing the physical layer (how? is it even possible?) and manually finding all the SAP tables involved in order to transfer the settings and status (request status, compression status, and more).

On the face of it, it might look like just an index problem, but there may be more to it. That will take more time.

I would just delete the data and redo the load. Before that, I would look into the cube model:

1. Should we split it in to multiple cubes?

2. Create more line item dimensions?

3. Reduce the size of the DIM tables?

4. and more...