Former Member

BI transformation / aggregation before end routine - strange behaviour

Hi dudes,

I load data from one DSO to another DSO using a transformation.

I am facing a system behaviour which I have not noticed before. At the end of the start routine I have, let's say, 20 records. At the beginning of the end routine I have only 14 records. This means the system has compressed the data. I am not sure, but I believe it is some kind of "delete adjacent duplicates".

Question 1: Has this always been the case? I am not sure, but I think it has not. The reason I think so is that one of our data models no longer works correctly. It has been running for over a year now, and suddenly (a few days after applying BW patch 20) we have a bug in the transformation. Debugging showed me that, given the current system behaviour, the transformation could never have produced the correct result in the past. But it is a report that a lot of people use as a base for R/3 adjustments, so I simply doubt it was wrong all along without anybody noticing...

Question 2: Is there any way to disable this function?

The problem is that we cannot use the primary key of our source DSO but have to derive it from other information (the reference document). The mapping creates duplicate entries, which we take care of in the end routine. Unfortunately, the system does not delete the records we need it to... Even stranger, I can only reproduce the problem when loading lots of data (20,000 records per package). When I load only that single business case, it works perfectly...
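To illustrate the suspected behaviour: the effect resembles what ABAP's DELETE ADJACENT DUPLICATES does when it compares only a reduced key. This is a minimal sketch, not the actual generated transformation code; the field names DOC_REF and AMOUNT and the type TY_SOURCE are hypothetical.

```abap
* Hypothetical illustration: if the system implicitly deduplicates
* on the (smaller) derived key, several source records collapse
* into one. TY_SOURCE, DOC_REF, AMOUNT are made-up names.
DATA: lt_data TYPE STANDARD TABLE OF ty_source.

SORT lt_data BY doc_ref.
* Only one record per derived key survives this; any records that
* differ in fields outside the comparison list are silently lost.
DELETE ADJACENT DUPLICATES FROM lt_data COMPARING doc_ref.
```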

Any help very much appreciated.

Hendrik



2 Answers

  • Former Member
    Posted on Sep 23, 2009 at 06:07 AM

    Hi Hendrik,

    1.

    Will you be able to find the missing record ?

    You said it works fine when you debug with a single record; what result did it give when you tried debugging with the missing record?

    2.

    For a key field combination of the base DSO there may be, for example, 2 or 3 records, but with the key field combination of the target DSO there is a chance that those records get overwritten and stored as a single record.

    This might be the reason why, let's say, 20 records end up stored as 14 records in the target DSO.

    Please check this and let us know.


  • Posted on Dec 04, 2014 at 03:32 PM

    Hi Hendrik

    I just recognized the same behaviour.

    Did you find a way to disable the aggregation between the start and end routine?

    Thank you

    Roger


    • Former Member

      Dear Roger,

      Actually, I no longer remember the scenario I wrote about in this thread. It was a long time ago.

      But I noticed it again a few weeks ago in BW 7.3. There we use a query as a data source and push the information into a cube. Again, a few records got "lost" through some "delete adjacent duplicates" functionality.

      After my ongoing tests, I believe this is some internal logic in the SAP standard. To me it seems like a bug, but I have not yet reported it to SAP.

      In other words: the transformation I am talking about now has an end routine but no start routine. When debugging it, I see that the system first reads all information from the source, but right before the end routine (and also at its very beginning) I am already missing a few records.

      What I did to solve it:

      1. I created a start routine with an internal table (same structure as the source fields).
      2. I loop over SOURCE_PACKAGE and COLLECT into my own table.
      3. Then I delete the content of SOURCE_PACKAGE and write my own data back.
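      The three steps above can be sketched roughly as follows. This assumes the key figures are additive (COLLECT sums all numeric fields for rows whose character-like fields match); SOURCE_PACKAGE comes from the generated transformation program, and the exact line type depends on your source structure.

```abap
* Sketch of the start-routine workaround, assuming additive
* key figures. SOURCE_PACKAGE is provided by the generated
* transformation program.
DATA: lt_collect LIKE SOURCE_PACKAGE,
      ls_source  LIKE LINE OF SOURCE_PACKAGE.

LOOP AT SOURCE_PACKAGE INTO ls_source.
  " COLLECT sums the numeric fields of rows whose
  " non-numeric (key-like) fields are identical
  COLLECT ls_source INTO lt_collect.
ENDLOOP.

* Replace the package content with the pre-aggregated data
CLEAR SOURCE_PACKAGE.
SOURCE_PACKAGE[] = lt_collect[].
```

      This way the aggregation happens under your control in the start routine, before the system applies any implicit deduplication of its own.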

      That solved the problem.

      To me it seems that the system deletes all duplicate entries by comparing them against the target structure. So if the target has fewer fields than the source, data gets deleted.

      I have noticed this behaviour only when writing into a DSO. Personally, I would expect the transformation itself not to aggregate the information. If you use a standard DataSource like 2LIS for sales orders, one package can also contain multiple records for the same key, and the last one "wins", overwriting the previous record's values. I assumed that if I do not want that, I need to aggregate myself with the ABAP COLLECT statement. But obviously I am missing something and do not understand the BI behaviour correctly, or it is a bug.
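      For anyone unfamiliar with COLLECT, here is a minimal, self-contained illustration of the aggregation it performs (the type and values are made up for the example):

```abap
* Minimal COLLECT illustration; TY_REC and its values are
* hypothetical. COLLECT treats all non-numeric fields as the
* key and sums the numeric fields.
TYPES: BEGIN OF ty_rec,
         matnr TYPE c LENGTH 10,   " character field -> key
         menge TYPE p DECIMALS 2,  " numeric field   -> summed
       END OF ty_rec.

DATA: lt_tab TYPE STANDARD TABLE OF ty_rec,
      ls_rec TYPE ty_rec.

ls_rec-matnr = 'MAT1'. ls_rec-menge = '10.00'.
COLLECT ls_rec INTO lt_tab.
ls_rec-matnr = 'MAT1'. ls_rec-menge = '5.00'.
COLLECT ls_rec INTO lt_tab.
* lt_tab now holds a single line for MAT1 with menge = 15.00,
* instead of the last record overwriting the first.
```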

      Anyhow, with the "pre-aggregation" in the start routine I was able to get the correct result.

      Hope this answer helps you to solve yours!

      Cheers,

      Hendrik
