Former Member
Jan 25, 2016 at 05:41 AM

Performance issues due to large data being sent to SAP by JDBC Sender


Hi Experts,

I have an existing interface running in production that picks up the daily changes to a particular set of timesheet data from an Oracle database using a JDBC Sender. The query normally returns 70,000 to 80,000 records per polling iteration, which is causing performance issues on the SAP proxy side and producing an ST22 dump in ECC with category "CALL_FUNCTION_SEND_ERROR" (short text " " (I/O error)) - an error occurred while making a Remote Function Call.

This dump seems to have started after the volume grew from about 50,000 to 80,000 records per polling iteration.

We are using subqueries to select data from two tables in the Oracle database. I understand that the number of rows picked can be restricted using ROWNUM. However, the SELECT statement is paired with an UPDATE statement that sets a processed flag. Since all records are currently picked in one shot, the UPDATE statement works without issues. If I reduce the number of rows picked, how can I ensure the UPDATE statement sets the flag for the same records?

My questions at this point are:

1. Is there a better option to reduce the number of rows picked in one polling iteration of a JDBC Sender that uses subqueries to pull data from an Oracle database?

2. If I reduce the number of rows picked using ROWNUM, how can I ensure the UPDATE statement acts on those same records to update the status flag?
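To make question 2 concrete, here is a sketch of the ROWNUM-limited pattern I have in mind, with placeholder names (TIMESHEET, ID, PROC_FLAG are assumptions, not our actual schema, and our real query uses subqueries over two tables rather than this single table). My concern is whether the UPDATE's subquery is guaranteed to hit exactly the rows the SELECT returned:

```sql
-- Placeholder schema: TIMESHEET(ID, ..., PROC_FLAG); PROC_FLAG 'N' = new, 'Y' = processed.

-- SELECT statement of the JDBC Sender channel: pick at most 10,000 unprocessed rows
-- in a deterministic order (ORDER BY inside the inline view, ROWNUM outside it).
SELECT *
  FROM (SELECT * FROM TIMESHEET WHERE PROC_FLAG = 'N' ORDER BY ID)
 WHERE ROWNUM <= 10000;

-- UPDATE statement of the same channel: repeat the same ordered, ROWNUM-limited
-- subquery on the key column so it addresses the same batch of rows.
UPDATE TIMESHEET
   SET PROC_FLAG = 'Y'
 WHERE ID IN (SELECT ID
                FROM (SELECT ID FROM TIMESHEET WHERE PROC_FLAG = 'N' ORDER BY ID)
               WHERE ROWNUM <= 10000);
```

My understanding is that this only works if the channel's transaction isolation is set so that both statements run in one transaction (e.g. serializable) and no other session touches PROC_FLAG in between; please correct me if that assumption is wrong.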

Please provide your valuable advice.

Regards,

Nick