"Batch" loads of data via OData

In our product SAP Intelligent Asset Management we have OData APIs that can be used to export data to SAC. We are now noticing that for some of these APIs, when the customer has a lot of data, we are approaching performance bottlenecks and risking timeouts in our backend system.

The question is whether there are mechanisms to batch the data import automatically. Say there are 1 million source records: can the OData import in SAC upload the first 10k records, pause, reset, and then upload the next 10k, and so on? Sorry if I am describing this in a naive way.
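
Conceptually, what I have in mind is something like the following rough sketch: a client reading the entity set page by page with OData's $top and $skip query options. The service URL, entity set name, page size, and response envelope are placeholders, and the Python client below is only meant to illustrate the pattern, not our actual implementation:

    import requests

    # Hypothetical OData entity set URL and page size -- placeholders only
    SERVICE_URL = "https://example.com/odata/v2/SourceRecords"
    PAGE_SIZE = 10_000

    def fetch_in_batches(session: requests.Session):
        """Read the entity set in fixed-size pages using $top/$skip."""
        skip = 0
        while True:
            resp = session.get(
                SERVICE_URL,
                params={"$top": PAGE_SIZE, "$skip": skip, "$format": "json"},
                timeout=300,
            )
            resp.raise_for_status()
            rows = resp.json().get("d", {}).get("results", [])  # OData V2-style envelope
            if not rows:
                break
            yield from rows  # hand this page over for upload, then fetch the next one
            skip += PAGE_SIZE

Something equivalent, driven from the SAC side, is what I am hoping already exists.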

1 Answer

  • Posted on Oct 24, 2019 at 07:22 AM

    Hi,

    Yes. In SAC, for your import model (using the OData API connection), you can restrict the fetch of each data load using query filters and then append each rowset as a 'chunk' into the model. After that, you can set up a recurring schedule to fetch only the delta since the last fetch date.
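
    To illustrate the idea outside of SAC, a filter-based delta fetch against the OData service could look roughly like the sketch below. The service URL and the 'ChangedOn' timestamp field are placeholders for whatever your entity set actually exposes; in SAC itself you would express the equivalent filter in the query builder of the import job rather than in code:

        import requests
        from datetime import datetime

        # Placeholders: the real entity set URL and change-timestamp field will differ
        SERVICE_URL = "https://example.com/odata/v2/SourceRecords"

        def fetch_delta(session: requests.Session, last_fetch: datetime):
            """Fetch only rows changed since the last successful load
            (the 'delta since last fetch date' a recurring schedule would pick up)."""
            filter_expr = (
                "ChangedOn gt datetime'"
                + last_fetch.strftime("%Y-%m-%dT%H:%M:%S")
                + "'"
            )
            resp = session.get(
                SERVICE_URL,
                params={"$filter": filter_expr, "$format": "json"},
                timeout=300,
            )
            resp.raise_for_status()
            return resp.json().get("d", {}).get("results", [])

    Each scheduled run would then pass in the timestamp of the previous successful load, so only new or changed rows are appended to the model.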

    Regards, H
