
Event Based trigger on sender JDBC adapter

Hi all,

Is there any possibility of an event-based trigger on the sender JDBC adapter? I am working on PI 7.4 dual stack.

Scenario - The sender JDBC adapter is integrated with an SQL database; it runs at a scheduled time every day and pulls data from a staging table.

Issue - On the SQL side, a stored procedure (SP) ports data from the primary table to the staging table, and the start and completion of this porting job get delayed. The JDBC adapter starts picking up data as soon as its scheduled time arrives, so there is a data mismatch in the target system. The SP job's completion time is not predictable and differs each day, which also makes scheduling the JDBC adapter difficult.

Note - If the porting job completes on or before the JDBC adapter's scheduled time, the data is consistent and everything is fine.

To overcome this issue, is an event-based trigger or any other option available?

Regards

Kannan Selvakumar


1 Answer

  • May 22, 2017 at 04:57 PM

    Hi Kannan,

    Are you calling the SP or the staging table in your select query?

    1) SP - It is the responsibility of the source system to return records to PI only after the business process has been committed on their DB. This can easily be validated by the SQL developer inside the SP. Check whether you can add such a condition on the source side to avoid inconsistencies.

    2) Staging table - If you are reading directly from the staging table in the communication channel (CC), then you probably need another control table on the SQL side that indicates when to read and when not to. The flag in this table should be updated by the business process after the COMMIT completes on the source side; a minimal sketch follows below.
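
    For illustration, below is a minimal sketch of such a control table and of how the source job could set the flag. The table, column and status names (LOAD_CONTROL, INTERFACE_NAME, STATUS, 'READY', 'STAGING_FEED') are assumptions for the example only, not anything from your landscape.

        -- Hypothetical control table; all names are placeholders.
        CREATE TABLE LOAD_CONTROL (
            INTERFACE_NAME VARCHAR(30) PRIMARY KEY,
            STATUS         VARCHAR(10) NOT NULL   -- 'INITIAL' or 'READY'
        );

        -- Last step of the source porting job, executed only after its COMMIT:
        UPDATE LOAD_CONTROL
        SET    STATUS = 'READY'
        WHERE  INTERFACE_NAME = 'STAGING_FEED';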

    Thanks.


    • Hi Kannan,

      The inner join in the select query is a very feasible option, but once the data has been transferred to SAP successfully, you have to update this traffic-signal table back to its initial status so that the source job will load the next set of records into the staging table.

      The same logic applies on the source side: while PI is processing the records, the source team must not update the staging table. They have to check the traffic table first and update the staging table accordingly. This way the data never gets misaligned. Since PI cannot lock the table, you have to handle this externally in this way to avoid inconsistencies; a sketch of the two channel statements follows below.
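
      As a rough sketch, assuming the same hypothetical LOAD_CONTROL traffic-signal table and names used in the answer above, the two statements of the sender JDBC channel could look like this:

          -- "Select Query SQL Statement": read the staging rows only when the
          -- traffic signal says the source load is complete.
          SELECT s.*
          FROM   STAGING_TABLE s
          INNER JOIN LOAD_CONTROL c
                  ON c.INTERFACE_NAME = 'STAGING_FEED'
                 AND c.STATUS         = 'READY';

          -- "Update SQL Statement": reset the signal to its initial value so the
          -- source job knows it may load the next batch into the staging table.
          UPDATE LOAD_CONTROL
          SET    STATUS = 'INITIAL'
          WHERE  INTERFACE_NAME = 'STAGING_FEED'
          AND    STATUS         = 'READY';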

      The other, more precise design is to ask your DB team to create an SP that returns only the records that have been successfully committed to the DB. You then call that SP in the select query of the CC. This is the easier way to deal with it. In the UPDATE statement of the channel you leave <TEST>, since the SP already takes care of updating the status flags.
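
      A minimal sketch of that SP-based variant, with a hypothetical procedure name (the real procedure and its flag handling are up to your DB team):

          -- "Select Query SQL Statement": the SP returns only rows that are
          -- already committed and maintains the status flags itself.
          EXECUTE dbo.GET_COMMITTED_STAGING_RECORDS @InterfaceName = 'STAGING_FEED';

          -- "Update SQL Statement" of the channel is then left as <TEST>,
          -- which makes the adapter skip the update step.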

      Let me know if you still have any doubts.