Semantic group in DTP.

venkatesh_veera2
Participant

Hi Gurus,

I know that a semantic group defines how data packages are formed for key uniqueness: records with the same semantic key values are grouped into a single data package, and the semantic keys are also the key fields of the error stack.

My doubt:

For example: Material and Plant are the semantic keys and the data package size is 50k, but the records sharing the same values of this key combination add up to 70k. How will the data be loaded? Will it bring only 50k, since the data package size is 50k? Then what about the remaining 20k records?

It may seem a silly question to many, but please help me understand.

Thank you.

Kind Regards,

Venkatesh

Accepted Solutions (1)

anindya_bose
Active Contributor

No, it would bring all 70k in one package. It actually overrides your package size.

Generally, if you use a semantic group in the DTP, you will see packages with more than 50k records while actually running the DTP, instead of exactly 50k.

Regards

Anindya
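
To make this concrete, here is a minimal Python sketch of the packaging behaviour Anindya describes (my own illustration, not SAP's actual implementation): records sharing the same semantic key values are never split across packages, so one large key group, like the 70k example above, lands in a single oversized package.

from itertools import groupby
from operator import itemgetter

def build_packages(records, semantic_keys, package_size):
    """Group records into packages without ever splitting a
    semantic key group, so a package may exceed package_size."""
    key = itemgetter(*semantic_keys)
    packages, current = [], []
    # groupby requires the input sorted by the grouping key.
    for _, group in groupby(sorted(records, key=key), key=key):
        group = list(group)
        # Close the current package only at a group boundary,
        # never in the middle of a key group.
        if current and len(current) + len(group) > package_size:
            packages.append(current)
            current = []
        current.extend(group)
    if current:
        packages.append(current)
    return packages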

venkatesh_veera2
Participant

Oh! Thanks a lot, Anindya.

Will there be any limit for that overridden package size?

anindya_bose
Active Contributor

Not that I am aware of. The system will try to put everything in one package based on your semantic group selection.

Regards

Anindya

venkatesh_veera2
Participant

Anindya,

what is the use of defining the data package size, if it is going to be changed based on the records anyway?

Venkatesh

anindya_bose
Active Contributor

Hi Venkatesh

In the example below, my package size is 2 and I have BILL_ITEM as the semantic key. As each billing item occurs 3 times, the system ignored the package size and brought the records in 4 packages of 3 records each. If I change the package size to at least 12, it would bring all the records in one package. So package size does matter, but only after the semantic group settings have been applied.

Regards

Anindya
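
Running the numbers from this example through the build_packages sketch posted above (again, only an illustration) reproduces the behaviour:

# 4 distinct billing items, each occurring 3 times (12 records).
records = [{"BILL_NUM": n, "BILL_ITEM": item}
           for item in (10, 20, 30, 40) for n in range(3)]

# Package size 2: no group of 3 fits, so each key group becomes
# its own package -> 4 packages of 3 records each.
print([len(p) for p in build_packages(records, ["BILL_ITEM"], 2)])   # [3, 3, 3, 3]

# Package size 12: all groups fit together -> 1 package of 12.
print([len(p) for p in build_packages(records, ["BILL_ITEM"], 12)])  # [12]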

Answers (0)