on 10-23-2015 11:08 AM
Hello all,
I have a scenario where I need to load master data from a flat file, and the load has to happen quite frequently. One of the fields in the flat file has to be loaded into the ID of a dimension, but the file contains multiple records with the same ID. I used a transformation file to load the data, and the load throws an error saying that duplicate records were detected.
Does anyone have an idea how to skip the duplicate records using the transformation file options and the mapping section?
Thanks,
Thanuja Sekar
Hello,
BPC will not load duplicated values; it will simply reject them. I recommend testing and verifying it, but I think all duplicated records will be rejected.
Regards,
Leila
The question is: how do you get this file?
In general it's generated by some system, and the correct way is to generate a version without duplicates directly from the source system.
Other options are to process the file with duplicates:
1. Preprocess the text file with some script before uploading (VBScript or ...)
2. Use a routine BADI.
Vadim
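For option 1, here is a minimal preprocessing sketch, written in Python rather than VBScript for illustration. The file names, the comma delimiter, and the assumption that the ID sits in the first column are all placeholders; adjust them to your actual flat file layout.

```python
import csv

def dedupe_flat_file(in_path, out_path, id_column=0, delimiter=","):
    """Copy a delimited flat file, keeping only the first record per ID.

    id_column and delimiter are assumptions -- adjust to your layout.
    """
    seen = set()
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter=delimiter)
        writer = csv.writer(dst, delimiter=delimiter)
        for row in reader:
            key = row[id_column]
            if key not in seen:       # first occurrence of this ID wins
                seen.add(key)
                writer.writerow(row)

# Hypothetical usage, before uploading the cleaned file to BPC:
# dedupe_flat_file("master_data.csv", "master_data_clean.csv")
```

The same first-occurrence-wins logic can be done in VBScript with a `Dictionary` object; the point is simply to strip the duplicate IDs before BPC ever sees the file.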
Hi Vadim,
Thank you. When you say use routines, which routine is most suitable for this? Is it a start routine or an end routine?
My scenario is that I have to delete the adjacent duplicates from the source and then load the data into BPC. Since this has to happen before transformation and mapping, I tried a start routine. But there is a problem: my records are fetched line by line as strings, and DELETE ADJACENT DUPLICATES only works on internal tables, so in my start routine, where the records arrive row by row, I cannot delete the entries from a table.
What's the best way to implement this?
Thanks,
Thanuja
Vadim,
The scenario is that I post an entry using journals. After posting it, I need to take the posted journals and load the journal IDs into one dimension. My problem is that a single journal entry may have many transactions, so when I extract the posted journals, there will be duplicate journal IDs. To remove the duplicate entries, I need to use a routine.
And in the start routine, if I use the DELETE ADJACENT DUPLICATES statement, which works only on internal tables, how can I do this?
Thanks,
Thanuja
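Since DELETE ADJACENT DUPLICATES only operates on an internal table, the usual pattern is to first collect the line-by-line records into an internal table, SORT it by the key field, and only then deduplicate. The equivalent logic is sketched below in Python for illustration; the journal record layout and field positions are assumptions, not your actual structure.

```python
def delete_adjacent_duplicates(rows, key):
    """Buffer rows, sort by key, then drop adjacent duplicates --
    the SORT + DELETE ADJACENT DUPLICATES pattern, applied after the
    line-by-line records have been collected into one table."""
    table = sorted(rows, key=key)     # SORT itab BY journal_id
    result = []
    for row in table:                 # DELETE ADJACENT DUPLICATES COMPARING key
        if not result or key(result[-1]) != key(row):
            result.append(row)
    return result

# Hypothetical journal records: (journal_id, amount)
journals = [("JRN2", 50), ("JRN1", 100), ("JRN1", 200), ("JRN2", 75)]
unique = delete_adjacent_duplicates(journals, key=lambda r: r[0])
# → [("JRN1", 100), ("JRN2", 50)]: one record per journal ID
```

In the ABAP routine the same three steps apply: APPEND each incoming record to an internal table, then `SORT itab BY journal_id`, then `DELETE ADJACENT DUPLICATES FROM itab COMPARING journal_id` once all rows are buffered.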