We have a requirement to send data from an ACDOCA DSO in BW/4HANA to Azure Data Lake as CSV files via SAP Data Intelligence. The initial volume is around 115 million records.
SAP DI provides a built-in "Data Transfer" operator that can connect to the BW DSO as an ODP source, and in the target we can specify CSV as the file format.
The target offers 3 options: Append, Create, and Create based on data packages. The Append option creates one file and appends all data to it. Create and Create based on data packages generate multiple files of roughly 5-6 MB each, so for this data volume we would end up with 1000+ data files, which could be a nightmare for the Azure team to combine into a single file.
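For context, merging the generated part files on the Azure side is mechanically simple once they have been downloaded. A minimal Python sketch (file names and columns are invented for illustration, not the actual names SAP DI generates), assuming each part file carries its own header row:

```python
import csv
import os
import tempfile

def merge_csv_parts(part_paths, out_path):
    """Concatenate CSV part files into one file, keeping the header
    row from the first part only (assumes every part has a header)."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for i, path in enumerate(part_paths):
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)   # skip the per-part header
                if i == 0:
                    writer.writerow(header)
                writer.writerows(reader)

# Demo with two hypothetical part files.
tmp = tempfile.mkdtemp()
parts = []
for n, rows in enumerate([[["1", "100.00"]], [["2", "250.50"], ["3", "75.25"]]]):
    path = os.path.join(tmp, f"ACDOCA_part_{n:04d}.csv")
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["RECORD", "AMOUNT"])  # header repeated in every part
        w.writerows(rows)
    parts.append(path)

merged = os.path.join(tmp, "ACDOCA_full.csv")
merge_csv_parts(sorted(parts), merged)
with open(merged) as f:
    merged_lines = [line.strip() for line in f]
```

If the part files are written without header rows, a plain byte-level concatenation (or a server-side tool on the Azure side) would do the same job without parsing the CSV at all.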
Hence I am looking for a strategy to load such a large initial volume into CSV files via SAP DI without the job failing due to the data volume. Once the initial load is done, delta extraction will work based on the ODP subscription ID.