on 05-01-2015 6:44 PM
Hi,
I have loaded an InfoCube with a huge number of records. The load is from a standard DSO to the cube. There were 900 requests in the DSO, so rather than running the delta load request by request, I loaded from the active data table. This ended up putting 500 million records into a single request and took 9 hours.
After that, I started compression of the request. It ran for 3 days before Basis finally cancelled the job. Since then, the Manage tab of the cube does not respond, and I cannot display the cube's data via LISTCUBE either; it just spins for a long time without responding.
I ran the RSRV check 'Database indices of an InfoCube', and it failed with the error that the ****/bic/E fact table index is not valid. I tried to correct the error by running the repair, but it ran for a long time, never completed, and ended in a dump when it timed out.
I asked the Basis team to regenerate the index, but they were unable to, because they get a message that the index has not been created and therefore cannot be built.
Could you please let me know how to correct this issue?
Thanks
Sreekanth
Thank you all, the problem has been corrected with the help of the Basis team.
What happened?
After loading the cube, compression of the request was scheduled. While it was running for a long time, the system shut down and the job was cancelled, but it still held locks on the tables, and the DB statistics job also did not complete. This is what caused the index problem. The Basis team ran the statistics first, and then the indexes were built from the Manage tab of the cube. Now it is fine.
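For readers hitting the same state, the sequence the Basis team ran can be sketched at the database level. This is a hedged illustration only: it assumes an Oracle back end, and the schema name SAPSR3, cube name ZSALES, and index name are invented placeholders, not values from this thread.

```sql
-- Illustrative sketch only (Oracle assumed); schema SAPSR3,
-- cube ZSALES, and the index name are hypothetical placeholders.

-- Step 1: refresh optimizer statistics on the E fact table,
-- since the interrupted DB stats job left them incomplete
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SAPSR3',
                                tabname => '/BIC/EZSALES');
END;
/

-- Step 2: rebuild the invalid index on the E fact table
-- (in practice this is what the Manage tab's index build does)
ALTER INDEX "SAPSR3"."/BIC/EZSALES~010" REBUILD;
```

Running the statistics before the rebuild mirrors the order described above; in BW itself the equivalent actions are the "Refresh Statistics" and "Create Index" buttons on the cube's Manage tab.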
Hi Sreekanth,
Use SE14 to create the indexes, ask the DBA team to create them, or run BRTools for it.
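Before recreating anything, it can help to confirm which index is actually broken. A minimal check, assuming Oracle as the underlying database (the table name /BIC/EZSALES is a placeholder for your cube's E fact table):

```sql
-- Hedged sketch (Oracle assumed); /BIC/EZSALES is a placeholder.
SELECT index_name, status
  FROM dba_indexes
 WHERE table_name = '/BIC/EZSALES';
-- STATUS = 'UNUSABLE' means the index exists but must be rebuilt;
-- if no rows come back, the index genuinely does not exist and
-- must be created first (e.g. via SE14 or the cube's Manage tab).
```

This distinction matters here: "not valid" in RSRV can mean either an unusable index or a missing one, and the fix is different in each case.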
But yes, if you are going to compress 500M records, it is going to take days to complete. It is better to load the data into the cube in smaller chunks.
You can also check out the notes below; I hope they help:
Thanks,
Mirza
Hi Sreekanth,
I wonder, is this the first time you have loaded this cube?
Regards
Bill
Hi Sreekanth,
This might be due to an inconsistency in the cube. Check the consistency of the E and F tables in SE14 and fix it, if you have authorization for SE14; otherwise ask Basis for help.
It is not a good idea to load a huge amount of data in a single request. If you are reloading data, use filters in the DTP to load the cube in slices (for example, the first 3 months in one request, then 3 more months, and so on). If you are loading the cube for the first time, create a delta DTP and do the init delta with the request-by-request option.
Regards,
Vengal.
I think you have a disconnect between the SAP DDIC and the physical database dictionary, because the process was aborted partway through. Fixing this would mean repairing the physical layer (how, and is that even possible?) and manually finding all the SAP tables where the settings and statuses (request status, compression status, and more) need to be corrected.
On the face of it this may look like an index problem, but there may be more to it, and it will take more time.
I would just delete the data and redo the load. Before that, I would review the cube model:
1. Should we split it into multiple cubes?
2. Create more line item dimensions?
3. Reduce the size of the dimension tables?
4. and more...