on 08-22-2014 3:56 PM
Hello Experts,
We have been loading data into a DSO every day for three years. There is a simple start routine.
Issue:
Suddenly one record out of 12,000 loses its data fields; in other words, the data fields are cleared. The key part of the record still exists. The table of new data shows the record with all data fields cleared, so the data is already lost somewhere during the update, before activation.
We don't use expert routines, end routines, etc. It is a simple 1:1 transformation with a simple start routine.
We debugged the start routine, and at the end of the start routine the record still exists correctly, with all data fields filled.
So it is quite tricky, and we hope you may have some good ideas to help us.
Thanks a lot for your input and help
Christian
Our system is SAP BW 7.01 on DB6
Do I understand correctly that you have 12,000 records in one SOURCE_PACKAGE?
Is that one record located at the beginning, middle, or end of your SOURCE_PACKAGE table?
Did something "suddenly" change, like OSS Notes implemented, a support package upgrade, a database upgrade, an increase in data package size, ...? Obviously "something" triggered this change in behaviour; the (hard) task is to figure out what.
Hello Raf
The data request was split into 2 data packages, and the erroneous record was exactly the last data record in data package 1.
I then increased the data package size from 10,000 records to 50,000 records (the default), and it then worked perfectly without clearing any data columns.
For me it is absolutely not understandable why this can happen. Do you have an idea?
Best regards
Christian
So, it was record 10,000 then...
Is this an "old" (3.x) flow or a new one? Also, is record 10,000 linked to record 10,001 (does it have the same key)? If so, are you using semantic grouping?
I'm not sure why your values are cleared, but if those two records are somehow linked, you may want to use semantic grouping so they are processed in the same data package (just thinking out loud here).
Hi Christian,
If the data in the new table is getting cleared, then I guess you haven't mapped the data fields correctly, so please re-check the mapping. Maybe you can use a constant for dummy fields if they are not used, but unless you map them they won't appear.
Thanks
Abhishek Shanbhogue
Hi Christian!
1) If you use the FOR ALL ENTRIES instruction, I think there is a limit of about 1,000-1,100 entries. Try reducing the packet size.
2) If you use a SORT instruction, maybe an InfoObject is missing from the sort key, so a following READ instruction cannot find the record.
I hope this helps.
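Point 2 describes a classic start routine pitfall. As a minimal sketch (the lookup table lt_lookup and the fields matnr/werks/price are hypothetical, not taken from this thread): if the internal table is sorted by fewer fields than the READ ... BINARY SEARCH key uses, the read can fail for some records, and if sy-subrc is not checked, the cleared work area overwrites the data fields.

```abap
* Hypothetical lookup inside a start routine. lt_lookup is sorted by
* MATNR only, but the READ below searches by MATNR + WERKS with
* BINARY SEARCH, so the read can fail for some records.
FIELD-SYMBOLS: <source_fields> TYPE tys_sc_1.

SORT lt_lookup BY matnr.          " WERKS is missing from the sort key

LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
  CLEAR ls_lookup.
  READ TABLE lt_lookup INTO ls_lookup
       WITH KEY matnr = <source_fields>-matnr
                werks = <source_fields>-werks
       BINARY SEARCH.
  IF sy-subrc <> 0.
    CONTINUE.                     " without this check the record would
  ENDIF.                          " receive the cleared ls_lookup values
  <source_fields>-price = ls_lookup-price.
ENDLOOP.
```

A failing lookup like this can also depend on where a record falls in the data package, which would fit the symptom of exactly one record at a package boundary being cleared.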