on 10-23-2019 7:47 PM
In our product SAP Intelligent Asset Management we have OData APIs that can be used to export data to SAC. We are now noticing that for some of these APIs, when the customer has a lot of data, we are nearing performance bottlenecks and risk our backend system timing out.
The question is whether there are mechanisms by which we can automatically batch the data import. Say there are 1 million source records: can the OData import in SAC upload the first 10k records, pause, reset, and then upload the next 10k, and so on? Sorry if I describe this in a naive way.
Hi,
Yes. In SAC, for your import model (using the OData API connection), you can restrict the fetch of each data load using query filters, and then append each rowset as a 'chunk' into the model. Following that, you can set up a recurring schedule to fetch the delta since the last fetch date.
regards, H
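To illustrate the chunking idea from the answer above, here is a minimal Python sketch that generates the sequence of paged OData query URLs, assuming the service supports the standard `$top`/`$skip` system query options. The base URL, entity set, and the `LastModified` field used for the delta filter are hypothetical placeholders; substitute whatever filterable date field the actual IAM API exposes.

```python
def chunked_odata_queries(base_url, chunk_size, total, modified_since=None):
    """Yield OData query URLs that page through the source in fixed-size chunks.

    Each page requests at most chunk_size records, so no single request is
    large enough to hit the backend timeout. 'LastModified' is a hypothetical
    field name for the delta filter used by a recurring schedule.
    """
    for skip in range(0, total, chunk_size):
        query = f"$top={chunk_size}&$skip={skip}"
        if modified_since:
            # Delta load: only rows changed since the last scheduled run.
            query += f"&$filter=LastModified ge {modified_since}"
        yield f"{base_url}?{query}"


# Example: 1 million records in 10k-row pages -> 100 page URLs,
# from "...?$top=10000&$skip=0" up to "...?$top=10000&$skip=990000".
urls = list(chunked_odata_queries("https://host/odata/Assets",
                                  chunk_size=10_000, total=1_000_000))
```

In SAC itself you would not script this loop directly; the point is that each scheduled import run corresponds to one such filtered query, appending its rowset to the model rather than replacing it.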