on 06-13-2016 5:08 PM
Hi Gurus,
I know that the semantic group defines how data packages are formed with respect to key uniqueness: records with the same semantic key values are grouped into a single data package, and the semantic keys also serve as the key fields of the error stack.
My doubt:
For example: Material and Plant are the semantic keys and the data package size is 50k, but there are 70k unique values of this combination. How will the data be loaded? Will it bring only 50k records, since the package size is 50k? Then what about the remaining 20k records?
Maybe it seems a silly question to many, but please help me understand.
Thank you.
Kind Regards,
Venkatesh
No, it would bring all 70K in one package. The semantic group actually overrides your package size.
Generally, if you use a Semantic Group in the DTP, you will see more than 50K records per package (rather than exactly 50K) while the DTP is actually running.
Regards
Anindya
Hi Venkatesh
In the example below, my package size is 2 and BILL_ITEM is the semantic key. Since the same billing item occurs 3 times, the system ignored the package size and brought the records in 4 packages of 3 records each. If I change my package size to at least 12, it would bring all records in one package. So package size does matter, but only after the semantic group settings are applied.
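Just to illustrate the packaging behaviour described above (this is a sketch of the idea, not SAP's actual DTP implementation): records are first grouped by the semantic key, and a package is then filled up to the package size, but a key group is never split across packages, even if that pushes the package over the limit.

```python
from itertools import groupby

def build_packages(records, key, package_size):
    """Fill packages up to package_size, but never split a
    semantic-key group across two packages."""
    records = sorted(records, key=key)
    groups = [list(g) for _, g in groupby(records, key=key)]
    packages, current = [], []
    for grp in groups:
        # Start a new package if adding this whole group would exceed
        # the size limit. A single group larger than package_size still
        # goes into one package, overriding the limit.
        if current and len(current) + len(grp) > package_size:
            packages.append(current)
            current = []
        current.extend(grp)
    if current:
        packages.append(current)
    return packages

# 12 billing records: 4 distinct BILL_ITEM values, each occurring 3 times
records = [{"BILL_ITEM": i, "row": j} for i in (10, 20, 30, 40) for j in range(3)]

pkgs = build_packages(records, key=lambda r: r["BILL_ITEM"], package_size=2)
print([len(p) for p in pkgs])   # 4 packages of 3 records each: [3, 3, 3, 3]

pkgs = build_packages(records, key=lambda r: r["BILL_ITEM"], package_size=12)
print([len(p) for p in pkgs])   # one package with all 12 records: [12]
```

With package size 2 each group of 3 identical billing items exceeds the limit on its own, so you get 4 packages of 3 records, exactly as in the example; raising the size to 12 lets all groups fit into a single package.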
Regards
Anindya