on 07-15-2020 2:02 PM
Hi!
The input limit for "Inference request objects" to classify is a maximum of 50.
Could you please tell me how I can classify 10,000 records with minimal effort?
Hi Tatiana,
let me expand on Evgeny's answer:
You can use the InferenceClient.do_bulk_inference() method to classify multiple items. To make an instance of InferenceClient, you can use the InferenceClient.construct_from_service_key method.
A general introduction to the SDK is also available, which includes a section on Inference.
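To make this concrete, here is a rough sketch of how a call could look, assuming the SDK is installed (`pip install data-attribute-recommendation-sdk`) and a service key JSON has been downloaded from SAP BTP. The model name, feature name, and file path below are placeholders for your own setup, not values from this thread:

```python
import json


def build_objects(values, feature_name="manufacturer"):
    """Build one inference request object per record.

    `feature_name` is a placeholder; it must match a feature of the
    dataset schema your model was trained on.
    """
    return [
        {"objectId": str(i), "features": [{"name": feature_name, "value": v}]}
        for i, v in enumerate(values)
    ]


def classify_records(service_key, model_name, objects):
    """Classify all objects via DAR bulk inference.

    do_bulk_inference() splits the list into API-sized batches
    internally, so you can pass thousands of objects at once.
    """
    # Imported lazily so the helpers above work without the SDK installed.
    from sap.aibus.dar.client.inference_client import InferenceClient

    client = InferenceClient.construct_from_service_key(service_key)
    return client.do_bulk_inference(
        model_name=model_name, objects=objects, top_n=1
    )


# Placeholder wiring -- replace the path and model name with your own:
# with open("dar_service_key.json") as f:
#     service_key = json.load(f)
# results = classify_records(service_key, "my-model", build_objects(my_values))
objects = build_objects(["ACME", "Contoso"])
```

The key point is that you hand the full list of records to `do_bulk_inference()` and let the SDK handle the 50-object-per-request limit.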
Let me know if this works for you.
Michael
Hi Michael,
Thanks for the links and instructions. Everything is working.
Regards, Tatiana
Hi Tatiana,
At the moment there is no API functionality to classify more than 50 records per request, so you would need to split your input records into batches of 50 and send them separately.
The good news is that we have a Python SDK for DAR that does this (splitting into batches and sending) for you: https://github.com/SAP/data-attribute-recommendation-python-sdk
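If you call the REST API directly instead of using the SDK, the splitting described above can be sketched in a few lines of plain Python (the batch size of 50 comes from the limit mentioned here; the records themselves are placeholders):

```python
def chunks(records, size=50):
    """Yield successive batches of at most `size` records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]


# Example: 120 dummy records split into API-sized batches of 50, 50, 20.
records = list(range(120))
batches = list(chunks(records, size=50))
```

Each batch would then go into one inference request; with the SDK this loop is done for you.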
Also please note that trial accounts have some limitations, so you cannot classify more than 2,000 records in total.
Hope it helps,
Evgeny