I have questions regarding the usage of big files in a BPM scenario. The functional requirements are as follows:
1. Pick up a large raw-data file (~1 MB) from an FTP server
2. Drop this file onto a second FTP server
3. After the file has been transmitted successfully (this is critical!), look into a DB, extract information using the filename of the transmitted file, and enrich the message (in a message mapping)
4. Send this data to ECC and update a custom table
My approach to realising this scenario would be to perform the DB lookup in a Java mapping. Are there any other options?
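For illustration, here is a minimal sketch of what the lookup inside a Java mapping could look like, assuming a plain JDBC connection is available to the mapping. The table name `custom_table`, the column names, and the filename convention `ORDER_<key>.dat` are placeholders of my own, not part of the actual requirement:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class FileLookup {

    // Extract the lookup key embedded in the filename,
    // e.g. "ORDER_4711.dat" -> "4711".
    // The naming convention is an assumption; adjust to your actual files.
    public static String keyFromFilename(String filename) {
        String base = filename;
        int dot = base.lastIndexOf('.');
        if (dot >= 0) {
            base = base.substring(0, dot); // strip the extension
        }
        int sep = base.lastIndexOf('_');
        return sep >= 0 ? base.substring(sep + 1) : base;
    }

    // Look up the enrichment data for one transmitted file.
    // Table and column names are hypothetical placeholders.
    public static String lookup(Connection con, String filename) throws SQLException {
        String sql = "SELECT info FROM custom_table WHERE filename_key = ?";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, keyFromFilename(filename));
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}
```

The result of `lookup` would then be written into the target message during the mapping. A prepared statement is used so the filename key is bound as a parameter rather than concatenated into the SQL.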
In addition, I am concerned about performance, because we will send about 200 files a day (up to 10 at a time) through that interface. Is there a way to avoid the integration process?
Kind regards and thanks in advance