Best practices for interacting with models through an API


Dear all,

To get the most out of our modelling chains in SAP BO Predictive Analytics, we are looking for a way to interact with models that are updated automatically.

The furthest we have got in designing the solution is to use the Automated mode with the Factory (to update the models automatically), since the Expert mode does not support the Factory so far, does it?

In that case, we can open the model with the PA client, go to the 'Export as' panel, and export the model by clicking on 'HTML source', which gives us a JavaScript encoding of the current model. My point here is that this export is a manual step. Is there any other way, using the Automated mode, the Expert mode, or even the HANA PAL implementation, to create a model which is able at the same time
1. to be updated automatically, and
2. to be exported as code or served by an API endpoint automatically?

This way we would be able to create a forecast simulator that helps our clients drive their business by interacting seamlessly with the model (i.e. changing the values of additional features in the forecast).

Thanks for your advice

Accepted Solutions (1)

abdel_dadouche
Active Contributor

Hi Romain,

The APL is another option, which gives you a handle (via SQL) on the Automated API.

Note, however, that APL doesn't expose the entire Automated API.
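
To give a concrete idea: once a model has been trained and persisted with APL, applying it to fresh data is just another SQL call, so any scheduler or application that can execute SQL can serve up-to-date predictions without opening the PA client. The lines below are only a sketch, not a ready-to-run script: they assume an AFL wrapper procedure named APLWRAPPER_APPLY_MODEL and the table types and tables (OPERATION_CONFIG_T, APPLY_OUT_T, FUNC_HEADER, MODEL_TRAIN_BIN, APPLY_LOG, SUMMARY) have been created beforehand as in the APL sample scripts, and NEW_OBSERVATIONS / SCORED_OBSERVATIONS are placeholder table names; the exact parameter list depends on your APL version, so please check the APL reference guide.

-- Illustrative sketch only: the wrapper name, table types and dataset names below
-- are assumptions borrowed from the APL sample scripts; adapt them to your setup.
create table APPLY_CONFIG like OPERATION_CONFIG_T;    -- default apply options
create table SCORED_OBSERVATIONS like APPLY_OUT_T;    -- structure depends on the apply configuration
-- apply the previously trained model (MODEL_TRAIN_BIN) to a fresh dataset
call APLWRAPPER_APPLY_MODEL(FUNC_HEADER, MODEL_TRAIN_BIN, APPLY_CONFIG, NEW_OBSERVATIONS, SCORED_OBSERVATIONS, APPLY_LOG, SUMMARY) with overview;
-- the scored rows are now available to any client able to run SQL
select * from SCORED_OBSERVATIONS;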

Regards

Abdel

Answers (4)

robert_mcgrath
Employee

Hi Romain,

Could you provide a little more detail on your use case?

You mention chains; for your use cases, are you using expert chains to generate the predictions?

What is your architecture, and how are the predictions consumed by your users? Is this an on-premise or a cloud use case?

There are a number of options for putting your models into operational use; if we understood the use case a little better, we might be able to help further.

Rob.


Hi Robert,

Sorry for the (very) late reply. We continued our modelling with a workaround, but we now need to put the models into production.

Our workflow (in R) is as follows:

- we consume data from a DB

- we train models and make predictions from the best model (which could be a combination of several models),

- we store the predicted data and model performance metrics in a BO Universe to be explored by the end user.

To do so, we need to use the Expert mode (unfortunately without HANA) in an on-premise setup. My first attempt is to embed the R scripts we developed into the Expert mode. However, even assuming that works, I cannot use the SAP Factory to automate running the script, since even in SAP PA 3.1 the export of the chain does not work.

Am I doing something wrong, or do I need to use the Expert mode with the HANA platform to be able to use the Factory module?

Even then, I still would not have any option for making dynamic predictions with the freshly computed model (through a web service interaction, for instance).

Best regards,
R.

marc_daniau
Advisor

Hi Romain,

Using HANA Studio, you can call APL functions from a SQL script to create, train, retrain, and apply a predictive model.
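
For example, a minimal training sketch could look like the lines below. This is an illustration rather than the exact script: it assumes the AFL wrapper procedure APLWRAPPER_CREATE_MODEL_AND_TRAIN and the table types (FUNCTION_HEADER_T, OPERATION_CONFIG_T, VARIABLE_DESC_T, VARIABLE_ROLES_T, MODEL_BIN_T and the log/summary/indicator types) have already been generated as in the APL sample scripts, MY_SCHEMA.TRAINING_SET and the 'churn' target are placeholders, and the exact list of output tables depends on your APL version (see the user guide linked below).

-- Illustrative sketch only: the wrapper, table types and dataset names are assumptions
-- taken from the APL sample scripts; adapt them to your environment.
create table FUNC_HEADER like FUNCTION_HEADER_T;
create table TRAIN_CONFIG like OPERATION_CONFIG_T;
insert into TRAIN_CONFIG values ('APL/ModelType', 'regression/classification');  -- an Automated classification/regression model
create table VARIABLE_DESC like VARIABLE_DESC_T;        -- left empty so APL guesses the variable descriptions
create table VARIABLE_ROLES like VARIABLE_ROLES_T;
insert into VARIABLE_ROLES values ('churn', 'target');  -- 'churn' is a placeholder target column
create table MODEL_TRAIN_BIN like MODEL_BIN_T;          -- the trained model is persisted here
create table TRAIN_LOG like OPERATION_LOG_T;
create table SUMMARY like SUMMARY_T;
create table INDICATORS like INDICATORS_T;
-- train the model (re-running this call on fresher data retrains it)
call APLWRAPPER_CREATE_MODEL_AND_TRAIN(FUNC_HEADER, TRAIN_CONFIG, VARIABLE_DESC, VARIABLE_ROLES, MY_SCHEMA.TRAINING_SET, MODEL_TRAIN_BIN, TRAIN_LOG, SUMMARY, INDICATORS) with overview;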

To answer your question on Export to HTML, here is a sample SQL script to run after a model has been trained using APL.

-- output table for the generated code and configuration table for the export call
create table EXPORT_HTML like RESULT_T;
create table EXPORT_CONFIG like OPERATION_CONFIG_T;
-- request the HTML/JavaScript flavour of the apply (scoring) code
insert into EXPORT_CONFIG values ('APL/CodeType', 'HTML');
insert into EXPORT_CONFIG values ('APL/ApplyExtraMode', 'No Extra');
-- generate the code from the trained model held in MODEL_TRAIN_BIN
call APLWRAPPER_EXPORT_APPLY_CODE(FUNC_HEADER, MODEL_TRAIN_BIN, EXPORT_CONFIG, EXPORT_HTML) with overview;
-- the exported code is returned as key/value rows
select key, value from EXPORT_HTML;

Videos: http://bit.ly/hanaapl
Guide: http://service.sap.com/~sapidb/012002523100008699932016E/pa30_hana_apl_user_en.pdf

The APL library provides a number of functions for model management, but not for data preparation or data manipulation.

pierre_saurel
Advisor

Hi,

An alternative solution is “SAP HANA Cloud Platform predictive services”.

These are RESTful web services available in the SAP HANA Cloud Platform.

With a simple sequence of calls to the “forecasts” services, you can get the forecasted values for a given dataset stored in a table.

It relies on the predictive functionality that runs in-memory in HANA (an automated model is automatically created behind the scenes).

If you need to update the model, you programmatically re-initiate the service with new training data.

To know more, you can

Regards


Hi,

I am just starting to have a look at those, but at this point I am afraid of running into some design limitations.

I'll try to explain why:

- For a simple case (i.e. 1 training data set that is more or less already well prepared, 1 model, 1 output), the services sound easy to use, as you said, assuming that the models behind those services are as powerful as the ones from the SAP PA Automated mode, PAL, or whatever R model.

- But wouldn't it be too complex if I need to use multiple data sources, merge those data sets, manipulate their attributes/variables, use a chain of models or several models, and combine their outputs?

I feel that it is possible, but it would require me to dedicate a great amount of time to each project, with ad hoc development that would make the whole thing even more complex.
Or is it possible to create this data manipulation and model construction in a "design studio" and then package it up as a service, which is then available at your HCP endpoint?

Maybe the YouTube videos you pointed to will provide me with some answers!

R.

abdel_dadouche
Active Contributor

Hi Romain,

The "Automated" mode provides an API as well as a scripting capabilities to achieve your goal with no manual interaction.

If you look at the developer guides in the documentation (they will refer to the former InfiniteInsight product, but don't worry, you are in the right place), you will see that there is a Java API and something named the KxShell script.

A KxShell script can also be generated from the user interface, but only for the training and scoring tasks; that will give you an idea of what it looks like.

Every year we try to have a "scripting" course at TechEd; are you planning to attend? The course is ANP360 (which I will be delivering).

Regards

@bdel.


Hi Abdel,

Could you be more specific about the use of this Java API? I am not really sure I understand whether:

- I need to develop the whole data manipulation + model construction + simulation tool through this API, or

- (as I guess) I can create a modelling process that is automatically rebuilt with the Automated mode, and then, instead of clicking on 'export model' manually, use pre-built Java code that will interact with the (automatically updated) model.

Thanks !
R.

abdel_dadouche
Active Contributor

Hi Romain,

The "Automated Analytics" has been built using the Automated Java API.

So basically, everything you can do in the UI can be achieved with the Automated Java API: data manipulations, automated models, generating scoring functions.

The Automated Java API is really flexible.

I have built many Java programs that take an existing model, retrain it on fresher data, and then deploy the scoring function.

I have also built other programs that build the model from scratch.

I really encourage you to check session ANP360 from the TechEd document center.

Hope this helps clarify your questions.

Regards