on 11-11-2013 3:39 PM
We changed our extractor 0FI_GL_14 to run hourly about 6 months ago. Up until now we didn't have any issues. Users are noticing that the data does not match what they see in ECC. We are not sure if the cause is the change to hourly extraction or our design.
We have 0FI_GL_14 loading directly to a DSO and this looks good. We have a cube on top of it and this is where the report sits. Could this be causing the issue? It isn't all records, just a handful, and only for one month.
Any ideas??
We ended up running the report at both the DSO level and the cube level. The data in the DSO is fine, but the cube's is not. We didn't want to do full loads due to the amount of data that would need to be reloaded.
Just check a few points:
1. Does the BIA index need to be rebuilt?
2. Check whether a selective deletion in the cube is required for a certain fiscal year/period; then you can do a full repair load.
Thanks and regards
Kiran
Hi All,
I am facing the same issue. How was it permanently solved?
thanks,
Ricardo
We moved the report to the DSO and it matches. Not sure why the cube stopped working though.
thanks for your help
Hi,
As you say the DSO and ECC data match (which is actually good news), the issue is definitely with the DSO-to-cube load.
I believe document number and item are part of the key fields in your DSO; check whether the same holds in the cube as well.
I have worked on this flow, and the DSO holds much more detailed information (I remember some 200 fields) compared to the cube (about 60 fields), so the cube holds aggregated data.
As long as the 0BALANCE, 0AMOUNT and 0QUANTITY key figures match at company code and GL account level for the various fiscal periods, you should be good.
I would suggest you just get the aggregated value from the cube by FISCYEAR, company code and GL account for all three key figures.
Then get the same info from the DSO too.
And finally from the FAGLFLEXA table.
Compare all three to find the issue, which can then be drilled down to fiscal period and finally to document number.
Thank-You.
Regards,
VB
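The three-way comparison VB outlines (cube vs. DSO vs. FAGLFLEXA, aggregated by fiscal year, company code and GL account) can also be scripted outside BW, e.g. on flat-file extracts of the three sources. A minimal sketch, assuming each source is available as a list of dicts; the field names here are illustrative, not the actual InfoObject names:

```python
from collections import defaultdict

def aggregate(rows, keys=("fiscyear", "comp_code", "gl_account"),
              kfs=("balance", "amount", "quantity")):
    """Sum the key figures per (fiscal year, company code, GL account)."""
    totals = defaultdict(lambda: [0.0] * len(kfs))
    for row in rows:
        k = tuple(row[f] for f in keys)
        for i, kf in enumerate(kfs):
            totals[k][i] += row[kf]
    return {k: tuple(v) for k, v in totals.items()}

def compare(a, b, tol=0.005):
    """Return the aggregation keys whose key-figure totals differ."""
    diffs = {}
    for k in set(a) | set(b):
        va = a.get(k, (0.0, 0.0, 0.0))
        vb = b.get(k, (0.0, 0.0, 0.0))
        if any(abs(x - y) > tol for x, y in zip(va, vb)):
            diffs[k] = (va, vb)
    return diffs
```

Any key that shows up in the diff can then be re-aggregated at fiscal-period level and finally listed per document number, exactly as described above.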
Hi MM,
What is the source of comparison in ECC? Are you using FBL3N, i.e. comparing as-on-date or based on posting date?
Can you check the data based on yesterday's date? In some cases the entry date may be today but the posting date may be last month; in that case the data will not match.
My advice is to take out that ONE particular record and start investigating. If you find something, post it here so that we can suggest further steps.
rgds
SVU123
Now I see. It looks like you do an overwrite into your DSO and a delta to your cube. The already existing records in the cube won't be updated with the new GL account; only the new records contain it, and therefore your reporting is off. You probably have to switch to a full load from DSO to cube.
hope that helps
Martin
I suspect the issue is your hourly loads to BW versus the time when users cross-check in ECC. The timing should be exactly the same: they can compare BW against ECC up to the last load, which happened within the last hour. If they check a little later (crossing the hour), then obviously ECC will not match BW.
Your data flow design is OK, and I hope you are using the standard flow.
If the data does not match for one month only, take some documents, do some unit tests in BW, and reconcile them with ECC. If you find the reason for one document, it is probably the reason for all the other cases too.
Hi,
We are loading from the active table without archive.
That setting is for the delta init from DSO to cube, i.e. it is only used the very first time you run the delta. Subsequent deltas are loaded from the change log.
I have asked you to take one document and start checking. Have you done that in your DSO and Cube?
Why don't you just drop your cube contents and reload again from DSO?
Regards,
Suman
Hi,
Delta loads are always pulled from the DSO's change-log table into the further targets,
because in the active table the same records are simply overwritten.
Do a test case in the dev system:
1. Change the DTP extraction setting to the change-log table.
2. Ask your source team to change 4 records.
3. Load to PSA, to the DSO, and then to the InfoCube.
4. Check how the records look at the InfoCube level.
That should make it clear why the DTP setting needs the change-log option.
Thanks
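The test case above boils down to the difference between before/after images in the change log and plain overwrites in the active table. A toy model of that mechanism, not real BW code (in BW the record mode field 0RECORDMODE drives this, and before-images carry reversed key figures):

```python
def update_dso(active, change_log, record):
    """Overwrite the active table; write before/after images to the change log."""
    key = record["doc_no"]
    old = active.get(key)
    if old is not None:
        before = dict(old)
        before["amount"] = -before["amount"]  # before-image reverses the old value
        change_log.append(before)
    change_log.append(dict(record))           # after-image carries the new value
    active[key] = dict(record)

def load_cube_from_change_log(cube, change_log):
    """Additive delta: before/after images net out to the corrected value."""
    for img in change_log:
        k = (img["gl_account"],)
        cube[k] = cube.get(k, 0) + img["amount"]
    change_log.clear()

active, change_log, cube = {}, [], {}
update_dso(active, change_log, {"doc_no": 1, "gl_account": "100", "amount": 50})
load_cube_from_change_log(cube, change_log)

# The document is corrected in the source: GL account 100 -> 200.
update_dso(active, change_log, {"doc_no": 1, "gl_account": "200", "amount": 50})
load_cube_from_change_log(cube, change_log)
# cube is now {("100",): 0, ("200",): 50} -- the before-image reversed the old posting.
```

A delta taken from the active table instead would deliver only the current record, leaving the stale 50 on account 100 in the cube, which is exactly the symptom described in this thread.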
Hi, if you have any code at the cube transformation or start routine level, please check it. If you know the missing documents for a particular month, check those documents at the DSO level: are the documents missing, or are the values wrong?
Which type of DSO are you using? At the DTP extraction level, have you selected active data or change log? If you selected active data, please change it to change log
and load the data from the DSO to the cube.
Hi
The frequency of your load does not cause data issues. SAP has a built-in safety delta that prevents extracting any data newer than (by default) 30 minutes, precisely so that you don't run into data consistency issues. Double-check where the data does not match and when the missing document was created.
hope that helps
Martin
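The safety delta Martin describes can be illustrated with a small timestamp sketch. The 30-minute value is just the default he mentions (it is configurable in the source system), and the window logic below is a simplification of how the FI extractors cut off very recent postings:

```python
from datetime import datetime, timedelta

SAFETY_INTERVAL = timedelta(minutes=30)  # assumed default, per the post above

def extraction_window(last_upper_limit, run_time, safety=SAFETY_INTERVAL):
    """A delta run at run_time only picks up documents with a creation
    timestamp in the interval (last_upper_limit, run_time - safety]."""
    return last_upper_limit, run_time - safety

# Hourly load at 15:00; the previous run's upper limit was 14:00.
lower, upper = extraction_window(datetime(2013, 11, 11, 14, 0),
                                 datetime(2013, 11, 11, 15, 0))
doc_entered = datetime(2013, 11, 11, 14, 45)
in_this_load = lower < doc_entered <= upper  # False: comes with the next delta
```

So a user comparing ECC against BW right after a load can legitimately see documents in ECC that BW will only receive with the next delta, which matches the hourly-timing suspicion raised earlier in the thread.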