
Processing Large file SFTP to Proxy (mapping involved)

Jun 05, 2017 at 09:12 AM



Dear All,

I have a scenario where I need to process a 6 to 8 GB file from SFTP to Proxy, and a mapping is involved in this scenario.

Can you please suggest the best way to process the file?

From https://blogs.sap.com/2015/04/03/sftp-adapter-handling-large-file/ I can see that chunk mode processing is the way to handle large files, but here I have a mapping as well.

Please suggest the best way to handle the large file for this scenario.

Regards

Ramesh.


5 Answers

Iñaki Vila Jun 05, 2017 at 11:42 AM

Hi Ramesh,

6 to 8 GB is far too large to treat as a single payload in any mapping runtime I can think of. I would talk with the sender system's developer about the possibility of generating n files of a smaller size.

Regards.


I would agree with Iñaki on this - 6-8 GB is far too big to process via mapping. Chunking is based purely on size in MB, so what you will get is messages with truncated or partial records. The sender should provide files in sizes that are reasonable for processing.
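As an aside, here is a minimal sketch (plain Python, illustrative only - the adapter's chunk size is configured in MB, not bytes) of why splitting by size alone produces partial records:

```python
# Illustrative sketch (not SAP adapter code): splitting a CSV payload at a
# fixed byte boundary, as size-based chunking does, can cut a record in half.
csv_data = b"1001,Alice,500.00\n1002,Bob,750.25\n1003,Carol,910.10\n"

chunk_size = 25  # bytes here, analogous to the adapter's chunk size in MB
chunks = [csv_data[i:i + chunk_size]
          for i in range(0, len(csv_data), chunk_size)]

for n, chunk in enumerate(chunks, 1):
    print(f"chunk {n}: {chunk!r}")
# The first chunk ends mid-record ("...1002,Bo"), so a mapping applied to
# that chunk alone would see a truncated row.
```

Only splitting at record (newline) boundaries avoids this, which is exactly what size-based chunking does not do.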

Regards,

Ryan Crosby

Manoj K Jun 05, 2017 at 10:26 AM

Ramesh,

Is the mapping applied to the complete set of records in the file, or is it individual to each record with no dependencies between records?

If the mapping is not dependent on the whole file, you may use chunk mode and then do the mapping.

What is the source format? If it is an XML file, chunk mode is not helpful. If it is text/CSV, you may use Recordset per Message and split it accordingly.
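A hypothetical sketch (plain Python, not PI configuration) of what Recordset per Message achieves for a CSV payload - each record becomes its own message, in the original order, which is what EOIO delivery would then preserve:

```python
# Hypothetical sketch: emulate "Recordset per Message" splitting of a CSV
# payload - one message per record, original sequence preserved.
csv_data = "1001,Alice,500.00\n1002,Bob,750.25\n1003,Carol,910.10\n"

# One record per message, keeping the original order (EOIO-style sequence).
messages = [line for line in csv_data.splitlines() if line]

for seq, msg in enumerate(messages, 1):
    print(f"message {seq}: {msg}")
```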

Br,

Manoj


Dear Manoj,

Thank you so much for your help.

The source file is CSV, and each row is to be posted as a single message via the message mapping.

Will chunk mode work in this case? Will a recordset count work here? We are expecting to post using EOIO - any suggestions for this case?

Please suggest further.

Regards

Ramesh


Ramesh,

Chunk mode breaks the file by size, not by number of records.

With Recordset per Message you can read each record as a single message, and you can enable EOIO there - this should work in your case.

Note that in this case you need to do the FCC via the standard message protocol provided, and not via the MessageTransformBean (MTB) module.

Br,

Manoj


Dear Manoj,

Thank you for your reply.

How does chunk mode break the file - will it break from the first row of the file up to the configured chunk size?

We are using an SFTP channel. I believe chunk mode becomes available in the sender SFTP channel once the updated TPZ is imported into the ESR - correct me if I'm wrong.

Regards

Ramesh


Ramesh,

As mentioned earlier, chunk mode breaks the file at the size you have configured, not at record boundaries, so there is always a risk of reading a partial record. Chunk mode is useful only for pure pick-and-drop (pass-through) scenarios, so I would suggest you do not go with the chunk mode option.

Did you try the Recordset per Message option?

Br,

Manoj

Rudra Singh Jun 06, 2017 at 07:05 AM

Hi Ramesh,

As it is a file-to-proxy scenario, I would suggest making it a pass-through scenario from PI to ECC and asking your ABAP team to handle the PI mapping logic in the proxy coding.

Regards,

Rudra

Patrick Weber Jun 06, 2017 at 08:15 AM

To be honest, I would drop the idea of handling the mapping in PI altogether, unless you can align with the sender to split the data into manageable packages. Could it be an option to simply use PI as a transport layer: drop the file on your backend's application server and have your ABAP team write a report that reads the file n lines at a time and posts the corresponding documents?
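For what it's worth, here is a minimal sketch (in Python rather than ABAP, purely to illustrate the idea) of reading a file n lines at a time so the full 6-8 GB never has to sit in memory at once:

```python
import io
from itertools import islice

def batched_lines(f, n=1000):
    """Yield successive batches of up to n lines from an open file object."""
    while True:
        batch = list(islice(f, n))
        if not batch:
            break
        yield batch  # post the documents for this batch here

# Demo on a small in-memory "file" of 5 rows, read in batches of 2:
demo = io.StringIO("row\n" * 5)
print([len(b) for b in batched_lines(demo, n=2)])  # prints [2, 2, 1]
```

The backend report would do the same thing: open the file, process a fixed number of lines per loop iteration, and commit after each batch.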
