
Issue with DTP loading

amine_lamkaissi
Active Contributor
0 Kudos

Hello BW experts,

I'm facing an issue with a DTP in BW while loading data.

This DTP extracts data from a DataSource into an InfoCube.
FYI, this DataSource is built on a table that resides in BW (not in ECC or an external system).

The problem is the following:

If I take the example of the last test request:

The package size is very high (the default value proposed by BW when building the DTP), and the number of inserted records is wrong.

In fact, the DTP reads 306 records from the BW table but inserts 536 lines (the difference being wrong data).

Logically, this kind of issue would be caused by an error in the transformation. But the system detects no errors, and the transformation logic looks sound.

Does anyone have an idea that could help me resolve this issue?

Thanks for your help.

Amine

Accepted Solutions (1)


Former Member
0 Kudos

Hello,

From the above screenshots, we can say that your table contains 536 + 306 = 842 records.

Of these, 306 are error records that were moved to the error stack, and 536 records were posted to the target in this delta.

To get all the records, correct the errors in your error stack and run the error-stack DTP; that should solve your issue.

Regards,

Vishnu

amine_lamkaissi
Active Contributor
0 Kudos

Hi Vishnu,

There are no errors; all the data is accepted into the PSA.

But when inserting the data at the cube level, BW inserts more records (536 instead of 306).

It's as if there were an End Routine, which is not the case in my transformation.

I'm continuing my investigation.

Thanks.

Amine

Former Member
0 Kudos

Hi Amine,

How many records were added to the InfoCube? Is it 536?

Br,

H

amine_lamkaissi
Active Contributor
0 Kudos

Exactly, Harish.

You can see it in my second screenshot.

Thanks.

Amine

Former Member
0 Kudos

Hi Amine,

Then the remaining data must have gone through the error stack. Check whether they really are error records.

P.S.: Please do not mark every suggestion you receive as a helpful answer; only mark the ones that really helped you. In the above case, I just asked a question, which, in my humble opinion, is not a helpful answer.

Br,

H

Former Member
0 Kudos

Hi,

Please expand the transformation arrow in the above screen so that we can understand what happened in your transformation and why the number of records increased.

One more question: is this the error-stack DTP or the standard DTP?

Regards,

Vishnu

amine_lamkaissi
Active Contributor
0 Kudos

Ok.

No, the data was correctly inserted into the InfoCube (536 lines).

So no errors in the PSA.

amine_lamkaissi
Active Contributor
0 Kudos

No, it's a standard DTP.

Please find below the screenshot of my transformation (sorry for the picture quality; it's a slightly bulky transformation).

Former Member
0 Kudos

Hi Amine,

One thing you should understand is that the PSA won't detect error records. It is just a temporary staging area used for data-quality checks.

Triggering the InfoPackage pulls all the data available in the source into the BW system. Only after triggering the DTP will you know whether there are any error records.

You have set up an error stack, so the DTP run didn't fail; instead, the erroneous records were pushed to the error stack.

This is consistent with the fact that there are no routines or rule types in your transformation.

Check your error stack for erroneous records, correct them, and update them to the target.

Correct me if anything above is wrong.

Br,

H

amine_lamkaissi
Active Contributor
0 Kudos

I agree with all your remarks, Harish.

FYI, I created a small data set in development to focus on.

And I discovered something weird. Maybe I'm getting close to identifying the issue.

As you can see, the fiscal year is wrong: it's 2005 instead of 2012. The error stack didn't detect this error; the record was accepted and inserted into the InfoCube.

Here is my transformation regarding this point.

Normally, this should be an automatic conversion, but it is not working at all.

Amine

amine_lamkaissi
Active Contributor
0 Kudos

I resolved this issue by implementing a routine.

Here's the code; it might help other people:

*---------------------------------------------------------------------*
*       CLASS routine IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_transform IMPLEMENTATION.

  METHOD compute_0FISCYEAR.
*   IMPORTING
*     request               TYPE rsrequest
*     datapackid            TYPE rsdatapid
*     SOURCE_FIELDS-FISCPER TYPE /BI0/OIFISCPER
*   EXPORTING
*     RESULT                TYPE _ty_s_TG_1-FISCYEAR

    DATA: MONITOR_REC TYPE rsmonitor.

*$*$ begin of routine - insert your code only below this line        *-*

*   0FISCPER is of the form YYYYPPP, so the first four characters
*   are the fiscal year.
    RESULT = SOURCE_FIELDS-FISCPER(4).

*$*$ end of routine - insert your code only before this line         *-*

  ENDMETHOD.                    "compute_0FISCYEAR
ENDCLASS.                       "routine IMPLEMENTATION

It seems to be OK: no more additional records (with the small data set I created).

And the fiscal year is now correct.
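For context, the offset notation in the routine relies on the structure of 0FISCPER: it is a 7-character value of the form YYYYPPP (four-digit fiscal year followed by the three-digit period). A minimal standalone sketch with a hypothetical value:

```abap
* Hypothetical illustration of the offset logic used in the routine.
* 0FISCPER holds YYYYPPP, e.g. '2012001' = period 001 of fiscal year 2012.
DATA: lv_fiscper  TYPE c LENGTH 7 VALUE '2012001',
      lv_fiscyear TYPE c LENGTH 4.

* Taking the first four characters yields the fiscal year.
lv_fiscyear = lv_fiscper(4).   " lv_fiscyear = '2012'
```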

Thanks to everyone for your ideas.

Amine

Answers (2)


former_member210253
Contributor
0 Kudos

Hi,

Check the transformation:

Do you have more than one rule group in the TRFN?

Have you written any code at the DTP level?

Regards,

Babu

amine_lamkaissi
Active Contributor
0 Kudos

Hi Babu,

In fact, I'm pretty sure the issue is caused by my transformation.

I have many rule groups, but they're all one-to-one (direct assignment), since the complex processing is done at the lower level.

There is no code and no filters at the DTP level.

Thanks.

Amine

Former Member
0 Kudos

Hi Amine,

I assume there are 306 records in the PSA table.

How many records were added to the InfoCube?

Check whether any routines are present in the transformation.

Also, while creating the DTP, set a definite package size, such as 50,000, instead of the default value.

This will enable proper data processing.

Br,

H

amine_lamkaissi
Active Contributor
0 Kudos

Hi Harish,

Absolutely, 306 records in the PSA table, which corresponds to the number of rows in my BW table (checked via SE11/SE16).

There are no routines in my transformation.

OK, I agree with you. I will change the size to 50,000. The current number is very high (since I wanted to keep the default values proposed by BW as much as possible). I will let you know if this changes anything about my issue.

Thanks anyway.

Amine