
Data arrived in ODS new data - all data fields are cleared

Former Member
0 Kudos


Hello Experts

We have been loading data into a DSO every day for three years. There is a simple start routine.

Issue:

Suddenly one record out of 12'000 records loses its data fields; in other words, the data fields are cleared. The key part of the record still exists. The new data table shows the record with all data fields cleared. So the data is already lost somewhere during the update, before activation.

We don't use expert routines, end routines, etc. It is a simple 1:1 transformation with a simple start routine.

We debugged the start routine, and the record still exists correctly at the end of the start routine, with all entries in the data fields.
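For context, a minimal BW 7.x start routine of the kind described usually only filters or sorts SOURCE_PACKAGE and never writes to the data fields; this is a hypothetical sketch (field names invented, not taken from this thread):

```abap
* Hypothetical sketch of a minimal BW 7.x start routine.
METHOD start_routine.
*   A plain filter on the source package - the data fields of the
*   remaining records are passed unchanged to the 1:1 mapping.
    DELETE SOURCE_PACKAGE WHERE recordmode = 'D'.
*   Sorting does not change any field values either.
    SORT SOURCE_PACKAGE BY record ASCENDING.
ENDMETHOD.
```

If the record is intact at ENDMETHOD, the clearing must happen later in the update, which matches what the debugging showed.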

So it is quite tricky, and we hope you may have some good ideas to help us.

Thanks a lot for your input and help

Christian

Our system is SAP BW 7.01 on DB6

Accepted Solutions (1)


RafkeMagic
Active Contributor
0 Kudos

do I understand correctly that you have 12000 records in 1 source_package?

is that 1 record located at the beginning, middle, or end of your source_package table?

did something "suddenly" change, like OSS notes implemented, a support package upgrade, a database upgrade, an increase in data package size, ...? obviously "something" triggered this change in behaviour; the (hard) task is to figure out what

Former Member
0 Kudos

Hello Raf

   

The data request has been split in 2 x data packages.

And the erroneous record was exactly the last data record in data package 1.

I then increased the data request size from 10'000 records to 50'000 records (the default), and it worked perfectly, without clearing any data columns.

For me it is absolutely not understandable why this can happen.

Do you have an idea?

  Best regards

Christian

RafkeMagic
Active Contributor
0 Kudos

so, it was record 10000 then...

is this an "old" (3.x) flow or a new one? also, is record 10000 linked to record 10001 (does it have the same key)? if so, are you using semantic grouping?

not sure why your values are cleared, but... if those two records are somehow linked, you may want to use semantic grouping so they are treated in the same data package (just thinking out loud here)

Former Member
0 Kudos

Hello Raf

It is a 7.x data flow. In case it happens again, I will open an incident with SAP Support to investigate this strange behaviour.

Thanks a lot for your ideas.

Christian


Answers (2)


abhishek_shanbhogue2
Contributor
0 Kudos

Hi Christian,

If the data in the new table is getting cleared, then I guess you haven't mapped the data fields correctly, so please take another look at the mapping. Maybe you can use a constant for dummy fields if they are not used, but unless you map them, they won't appear.

Thanks

Abhishek Shanbhogue

Former Member
0 Kudos

Hello Abhishek

Why do you think it is a mapping issue? It is just one record out of 12'000 records. All other records are correct.

Best regards

Christian

Former Member
0 Kudos

Which version and SP level are you on?

Former Member
0 Kudos

Hello Suyash

We are on BW 7.31 on SP 12

Best regards

Christian


fcorodriguezl
Contributor
0 Kudos

Hi Christian!

1) If you use the FOR ALL ENTRIES instruction, I think it has a limit of around 1'000-1'100 entries. Try reducing the packet size.

2) If you use a SORT instruction, maybe an InfoObject is missing from it, so that a following READ instruction cannot find the record.
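For what it's worth, the usual FOR ALL ENTRIES problems in a routine look like the following; this is a hedged illustration with invented table and field names, and as far as I know the database interface splits the driver table into blocks itself (profile parameter rsdb/max_blocking_factor) rather than enforcing a hard 1'000-row limit:

```abap
* Hypothetical illustration - table and field names are invented.
* Pitfall 1: if the driver table is empty, FOR ALL ENTRIES ignores
* that part of the WHERE clause and the SELECT reads the whole table.
IF lt_keys IS NOT INITIAL.
  SELECT matnr werks /bic/zamount
    FROM /bic/azmydso0200
    INTO TABLE lt_lookup
    FOR ALL ENTRIES IN lt_keys
    WHERE matnr = lt_keys-matnr.
ENDIF.
* Pitfall 2: FOR ALL ENTRIES removes duplicate rows from the result,
* so records can silently disappear if the selected columns do not
* form a full key.
```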

I hope this helps you.

Former Member
0 Kudos

Hello Francisco

As I wrote, the record is still correct at the end of the start routine and no data is lost.

So why do you think that 'FOR ALL ENTRIES' has an impact?

Thanks for your input

Christian.