Mar 24, 2009 at 06:22 PM

Changing an InfoObject that is part of the key in very large DSO


Hi BI Experts,

I wonder whether anyone out there has come across a similar problem and solved it in a way that did not involve creating a copy, loading the data into that copy, emptying the original DSO, changing it, and then loading the data back. Here is the "challenge" we are in:

- We have an InfoObject characteristic 0ITEM_NUM, defined as NUMC with length 3; it contains line item numbers such as 001, 002, and so on, up to 999

- 0ITEM_NUM is used in 10 DSOs as part of the key, together with the FI document number; all of these DSOs hold FI-SL (Special Ledger) line item data

- Unfortunately we have data in R/3 with item numbers larger than 999 (such as 1013) within the same FI document, so data was overwritten in these 10 DSOs in the past (line 013 got overwritten by line 1013, because 1013 was truncated to 013)

- To prevent this from happening in the future, we would like to extend the InfoObject 0ITEM_NUM to length 6 so that it matches what can come through from R/3

- As the existing data in the InfoObject and its SID table /BI0/SITEM_NUM would still be valid after the extension, we think there must be an easier way than the copy-and-reload approach described above, which would mean a lot of QA testing and reconciliation work
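To illustrate the collision described above, here is a minimal Python sketch (not ABAP or SAP code; the helper `to_numc` is hypothetical) of how a NUMC(3) key field truncates longer item numbers to their rightmost digits, so two different R/3 items end up with the same DSO key:

```python
# Hypothetical illustration of NUMC truncation behaviour:
# a NUMC field keeps only the rightmost `length` digits, zero-padded.

def to_numc(value: int, length: int) -> str:
    """Render an item number as a NUMC field of the given length."""
    return str(value).zfill(length)[-length:]

# With NUMC(3), items 13 and 1013 map to the same key value,
# so the record for item 13 gets overwritten by item 1013:
print(to_numc(13, 3))    # 013
print(to_numc(1013, 3))  # 013

# With NUMC(6), the two keys stay distinct:
print(to_numc(13, 6))    # 000013
print(to_numc(1013, 6))  # 001013
```

This is just the key-collision mechanics; it says nothing about how the length change itself should be deployed.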

In Oracle one could simply use an ALTER TABLE statement to extend the key field from 3 to 6 characters in the affected DSO, PSA and change log tables. But of course we don't want to do that if it corrupts our BI system. Would it?

The problem we get in the target system, after simply making the change in InfoObject maintenance (RSA1) and transporting it, is error message R7159 "Error/warning in dict. activator, detailed log" during the method execution of the transport. This forced us to use transaction SE14 to convert the tables manually, but the system appeared to try to back up the data, which would take a very long time in production, even if we had the space for it.

Reducing the current number of entries in the largest DSO (350 million records) is not an option; across all affected DSOs we have well over 1 billion records.

Any views, ideas and sharing of your approach to similar "challenges" would be much appreciated.