
Will changing how often 0FI_GL_14 runs cause data issues

Former Member

We changed our extractor 0FI_GL_14 to run hourly about 6 months ago. Up until now we didn't have any issues. Users are noticing that the data does not match what they are seeing in ECC. We are not sure if the cause is the change to hourly extraction or our design.

We have 0FI_GL_14 loading directly to a DSO, and that looks good. We have a cube on top of it, and that is where the report sits. Could this be causing the issue? It isn't all records, just a handful, and only for one month.

Any ideas??

Accepted Solutions (0)

Answers (8)


Former Member

We ended up reporting at both the DSO level and the cube. The data in the DSO is fine, but not the cube. We didn't want to do full loads because of the amount of data that would need to be reloaded.

former_member188080
Active Contributor

Just check a few points below:

1. Does the BIA index need to be rebuilt?

2. Check whether a selective deletion in the cube is required for certain fiscal year/periods; after that you can run a full repair data load.

Thanks and regards

Kiran

Former Member

Hi All,

I am facing the same issue. How was it permanently solved?

thanks,

Ricardo

Former Member

We moved the report to the DSO and it matches. Not sure why the cube stopped working, though.

thanks for your help

MGrob
Active Contributor

I hope you don't have too much of a performance impact. You might just want to do a full load to the cube to ensure the data is correct.

Martin

former_member182346
Active Contributor

Hi,

As you say, the DSO and ECC data are matching (which is actually good news), so the issue is definitely with the DSO-to-cube load.

I believe document number and item are part of the key fields in your DSO; check whether the same holds in the cube.

I have worked on this flow, and the DSO holds much more detailed information (I remember some 200 fields) compared to the cube (around 60 fields), so the cube holds aggregated data.

As long as the 0BALANCE, 0AMOUNT and 0QUANTITY key figures match at company code and GL account level for the various fiscal periods, you should be good.

I suggest the following (see the sketch below): get the aggregated values from the cube by fiscal year, company code and GL account for all three key figures; then get the same figures from the DSO; and finally from the FAGLFLEXA table.

Compare all three to find the issue, which can then be drilled down to fiscal period and finally to document number.
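For the ECC side of that comparison, something along these lines should work (a sketch only; it assumes new GL line items sit in FAGLFLEXA under the leading ledger 0L, and the fiscal year is a placeholder):

* Sketch: aggregate FAGLFLEXA by company code and GL account for one
* fiscal year, to compare against the same aggregation from DSO and cube.
TYPES: BEGIN OF ty_total,
         rbukrs TYPE faglflexa-rbukrs,   " company code
         racct  TYPE faglflexa-racct,    " GL account
         hsl    TYPE faglflexa-hsl,      " amount in local currency
         msl    TYPE faglflexa-msl,      " quantity
       END OF ty_total.
DATA: lt_totals TYPE STANDARD TABLE OF ty_total.

SELECT rbukrs racct SUM( hsl ) SUM( msl )
  FROM faglflexa
  INTO TABLE lt_totals
  WHERE rldnr = '0L'        " leading ledger - adjust if you use another
    AND ryear = '2013'      " placeholder fiscal year
  GROUP BY rbukrs racct.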

Thank-You.

Regards,

VB

former_member202718
Active Contributor

Hi MM,

What is the source of the comparison in ECC? Are you using FBL3N for the comparison, i.e. as-on-date / based on posting date?

Can you check the data based on yesterday's date? In a few cases the entry date may be today while the posting date is last month; in that case the data will not match.
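For instance, a quick way to find such documents in ECC (a sketch; BKPF-CPUDT is the entry date and BKPF-BUDAT the posting date; company code and dates are placeholders):

* Sketch: documents entered this month but posted in an earlier period -
* these will look like mismatches in an as-on-date comparison.
DATA: lt_docs TYPE STANDARD TABLE OF bkpf.

SELECT * FROM bkpf
  INTO TABLE lt_docs
  WHERE bukrs = '1000'                            " placeholder company code
    AND cpudt BETWEEN '20130901' AND '20130930'   " entered this month
    AND budat < '20130901'.                       " posted in a prior period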

My advice is to take out that ONE particular record and start investigating. If you find something, post it here so that we can suggest further steps.

rgds

SVU123

Former Member

The users are comparing to FAGLL03. The pattern we are seeing is that documents are assigned to the wrong GL account when they roll into the cube. They are fine in the DSO but not in the cube. For some reason the records are not updating when they are DTP'd to the cube.

MGrob
Active Contributor

Now I see. It looks like you do an overwrite into your DSO and a delta to your cube. The old, already existing records won't update with the new GL account, while the new records contain the new GL account, and therefore your reporting is off. You probably have to switch to a full load from DSO to cube.
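To illustrate the mechanics with a hypothetical document (the figures are invented for the example):

* When a GL account is corrected on an existing DSO record, the change
* log writes a pair of images: RECORDMODE 'X' is the before image with
* reversed key figures, ' ' is the after image.
*
*   DOC_NUMBER  GL_ACCOUNT  AMOUNT   RECORDMODE
*   0100000123  0000400100  150.00-  'X'   cancels the old posting
*   0100000123  0000471000  150.00   ' '   posts the corrected account
*
* An additive cube needs both rows to net the old account to zero. A
* delta DTP reading only the active table never sees the before image,
* so the cube keeps the amount on the old GL account as well.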

hope that helps

Martin

former_member182470
Active Contributor

I suspect the issue is your hourly loads to BW versus the time at which users cross-check in ECC. The timing should be exactly the same: they can compare ECC with BW only up to the last completed load, i.e. within the last hour. If they check a little later (crossing one hour), then obviously ECC will not match BW.

Your data flow design is OK, and I assume you are using the standard flow.

If the data is not matching for one month only, take some documents and do some unit tests in BW, reconciling against ECC. If you find the reason for one document, the same reason will likely explain all the other cases.

Former Member

The documents are in ECC and in the BW DSO, but when loading up to the cube they are not loading correctly for some reason.

RamanKorrapati
Active Contributor

Hi,

Check your DTP between the DSO and the InfoCube, and look at the InfoCube load logs. I am assuming you are loading deltas from the change log table of the DSO.

Also re-activate your transformations and DTP via the activation programs and check again.

Thanks

Former Member

We are loading from the active table without archive. It has been set this way for a long time and we have not had any issues. Should it be the change log?

former_member182470
Active Contributor

Hi,

We are loading from the active table without archive. 

This setting applies to the delta init from DSO to cube; i.e., it is used when you run the delta load for the very first time. Subsequent deltas are loaded from the change log.

I asked you to take one document and start checking. Have you done that in your DSO and cube?

Why don't you just drop your cube contents and reload again from DSO?

Regards,

Suman

Former Member

The setting is on the DTP from DSO to cube. Yes, I did look at each document and there is no pattern; a record is fine in the DSO but not in the cube. We can reload, but that is time-consuming, and we were hoping to avoid reloads.

Former Member

What I found on a few is that two new images are being sent at the same time. In the DSO both are there and look good, but when going to the cube only one makes it over. The key fields of the DSO are in dimensions in the cube.

RamanKorrapati
Active Contributor

Hi,

Delta loads should always be pulled from the change log table of the DSO to further targets, because in the active table the same records are overwritten.

Do a test case in the dev system (see the sketch below):

1. Change the DTP extraction setting to the change log table.

2. Ask your source team to change 4 records.

3. Load to PSA, to the DSO, and to the InfoCube.

4. Check how the records look at the InfoCube level.

That will give you the clarity to switch the DTP setting to the change log option.
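To verify the images for those test records, you can read the change log table directly (a sketch; '/BIC/B0000123000' and the document number are placeholders, and the real change log table name is shown on the DSO maintenance screen):

* Sketch: list the before/after images for one test document.
DATA: lr_log   TYPE REF TO data,
      lv_where TYPE string.
FIELD-SYMBOLS: <lt_log> TYPE STANDARD TABLE.

CREATE DATA lr_log TYPE STANDARD TABLE OF ('/BIC/B0000123000').
ASSIGN lr_log->* TO <lt_log>.

lv_where = `DOC_NUMBER = '0100000123'`.   " placeholder field/value
SELECT * FROM ('/BIC/B0000123000')
  INTO TABLE <lt_log>
  WHERE (lv_where).
* Each change should show a before image (RECORDMODE 'X', reversed key
* figures) followed by an after image (RECORDMODE ' ').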

Thanks

former_member182470
Active Contributor

You should worry about the KPI values in the cube, not about the number of records. The cube can hold any number of records, as long as the KPIs are correct.

Former Member

The values are incorrect; it is not just the record count.

Former Member

Hi, did you write any code at the cube transformation level or in a start routine? Please check. If you know the missing documents for a particular month, check those documents at the DSO level: are the documents missing, or are the values wrong?

Which type of DSO are you using? At DTP level, is extraction set to the active table or the change log? If the active table is selected, please change it to the change log and load the data from DSO to cube.

MGrob
Active Contributor

Hi

The frequency of your load does not cause data issues. There is a built-in safety delta in SAP that prevents loading any data newer than (by default) 30 minutes, precisely so that you don't run into data consistency issues. Double-check where the data is not matching and when the missing document was created.

hope that helps

Martin

Former Member

It is actually happening when loading to the cube. For the documents with issues there are two records in the cube and they are not being updated correctly. It isn't widespread, though.

MGrob
Active Contributor

So your issue is from DSO to cube, not related to the DataSource. How do you load to the cube with delta?

Martin

Former Member

We have a DSO layer then we have a DTP to load to the cube

MGrob
Active Contributor

You might want to check whether there is a known issue with DTP selection for your current BW release, but it does not have anything to do with the frequency of your load.