We are sending demand data from IBP to ECC via a Target Web Service call to a wrapper around BAPI_REQUIREMENTS_CREATE. We run into problems in ECC whenever the input to the XML Map Batch transform exceeds the "Batch size (rows)" setting in the Transform Details. The batch size is currently set to 10000, and the "Input already sorted by batch key columns" flag is checked. If the input has 10001 rows and the batch size is 10000, the weekly planning data for a single material (product) can be split between Batch 1 and Batch 2. When that happens, Batch 2 replaces the valid data already posted in Batch 1 for the product that straddles the two batches.
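To illustrate what we mean, here is a small sketch (plain Python, not CPI-DS code; the row layout and function names are made up for illustration) contrasting a naive fixed-size split, which can sever one material's weekly rows across two batches, with a key-aware split that always cuts at a material boundary:

```python
# Hypothetical illustration of the batching problem -- not CPI-DS code.
# Rows are (material, week) tuples, already sorted by material.

def fixed_split(rows, size):
    """Naive split: cut every `size` rows, regardless of the key."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def key_aware_split(rows, size, key):
    """Split into batches of at most `size` rows, but never cut a key
    group in two: back the cut up to the last key boundary."""
    batches, start = [], 0
    while start < len(rows):
        end = min(start + size, len(rows))
        if end < len(rows):
            # Walk the cut back while the row after it shares a key
            # with the row before it (i.e. a group would be severed).
            while end > start and key(rows[end]) == key(rows[end - 1]):
                end -= 1
            if end == start:
                # A single group is larger than `size`; split unavoidable.
                end = min(start + size, len(rows))
        batches.append(rows[start:end])
        start = end
    return batches

# Three weekly rows per material, batch size 4: the fixed split puts
# MAT-B's rows in two batches; the key-aware split keeps them together.
rows = [("MAT-A", w) for w in (1, 2, 3)] + [("MAT-B", w) for w in (1, 2, 3)]
naive = fixed_split(rows, 4)              # MAT-B split across batches
safe = key_aware_split(rows, 4, key=lambda r: r[0])
```

In the naive case, the second batch carries only part of MAT-B's weeks, so posting it through the BAPI wrapper overwrites the complete MAT-B data that the first batch had already created, which matches the behavior we are seeing.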
What is the potential impact of setting the batch size very high (say 100000) in terms of memory and web service call performance?
Is there a switch or parameter in BAPI_REQUIREMENTS_CREATE on the ECC side that can prevent a later batch from overwriting data for a product split across batches?
Is there a best-practice preprocessing step that should be performed before the data flows into the XML Map Batch transform in CPI-DS, so that all rows for one material always land in the same batch?