Duplicate records in Master data

Former Member

Hi all

We have an InfoSource 0FI_AR_3 that loads to the InfoObject 0BILL_NUM and to the ODS 0FI_AR_O03. Yesterday, the load to the ODS was fine, but the load to the InfoObject 0BILL_NUM failed with the following errors:

"128 duplicate record found. 4115 recordings used in table /BI0/XBILL_NUM"

"128 duplicate record found. 4115 recordings used in table /BI0/PBILL_NUM"

Is there a way I can fix this error? Could someone help me, please?

Regards,

S.P

Accepted Solutions (0)

Answers (6)


udayabhanupattabhiram_cha
Active Contributor

Hi SP:

This is just an idea: what if you try to load the master data from the ODS? That way, depending on your ODS keys, the ODS already aggregates the records, and your master data load should then be OK.

Of course, it all depends on the ODS key fields.

Good luck

Ram C.

Former Member

Hi Everybody,

Thank you for responding. I already have the setting in my InfoPackage as "PSA Only, Update subsequently into Data targets".

Since I have this infopackage loading to both Master Data and an ODS, I will not have the option of selecting "Ignore Double Data records" in the infopackage.

So this boils down to the problem I have in today's data load.

What other options do I have to fix this issue?

~S.P

Former Member

Hi,

Well, there are the two options suggested by Bhanu and me: set up error handling on the BW side, and investigate on the source side how the data is set up and why the duplicate data is appearing. You need to understand and eliminate the root cause of the duplicate data, and the best place to start is the source side.

The other option you can use is to write a start routine to filter out the duplicates.
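For the start routine option, a minimal sketch could look like the following (BW 3.x update rules syntax; the internal table name DATA_PACKAGE is the standard one in such routines, but the field name BILL_NUM is an assumption and would need to match your actual transfer structure):

```abap
* Sketch of a start routine that drops duplicate bill numbers from
* the data package before the master data update.
* Assumption: BILL_NUM is the key field of the transfer structure.
  SORT DATA_PACKAGE BY bill_num.
  DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
    COMPARING bill_num.
```

Note that this keeps an arbitrary one of each set of duplicates; if the duplicates differ in their attribute values, you would need additional logic to decide which record to keep.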

But I would recommend that this be fixed on the source side if the problem lies there.

Cheers,

Kedar

former_member188975
Active Contributor

Hi SP,

Since you are using a transactional DataSource to load master data, you will not have the "Ignore Duplicate Records" option. You can try to prevent such errors in the future by setting up error handling on the InfoPackage Update tab.

Hope this helps...

former_member188325
Active Contributor

Hi,

Did you try setting 'Ignore double data records' in the InfoPackage, under the Processing tab?

regards

Former Member

Choose these options in the InfoPackage:

-- PSA only

-- Update subsequently into the data targets

-- Ignore double data records

Hope it helps.

Regards

Former Member

Hi,

In the InfoPackage you can set the upload option to PSA and select "Ignore duplicate data".

Cheers,

Kedar

Former Member

Hi Kedar,

Thanks for responding. I am aware of the setting in the PSA. I wanted to know if there is an option to fix the master data instead of reloading.

~S.P

Former Member

Hi,

You can fix it on the source side itself, and it is better to do it there. Have the source system team check how the data is set up.

Cheers,

Kedar