
Transporting a Scalar Function to Quality Environment

anup_deshmukh4
Active Contributor
0 Kudos

Hello Experts,

I have a scalar function created in a package (snapshot attached). It needs to be transported to the quality environment. In our landscape, we do this using transaction SCTS_HTA.

The problem is that the transport request fails because the function contains a reference to the physical schema SAPCRI, which exists only in DEV.

Apart from running the scripts manually and adjusting the schema in each environment, is there any way to map the schema dynamically according to the environment? (Or, as with a scripted calculation view, can I set a default schema for functions as well?)

Attached is a snapshot of my M_SCHEMA_MAPPING content.

Accepted Solutions (1)


pfefferf
Active Contributor
0 Kudos

It is possible to define a default schema for user-defined functions too, and the schema mapping is applied to it. However, that does not solve your issue, because the default schema and the schema mapping are only applied to statements inside the function (e.g. queries on tables without explicit schema information). What you expect is that the schema mapping changes the schema in which the runtime object for the function itself is created, and that will not work. You have to define a fixed schema for your runtime objects (e.g. via an .hdbschema design-time object) which is the same on all your relevant systems.
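A minimal sketch of the fixed-schema approach described above, for HANA XS classic repository artifacts. The schema name MYSCHEMA, the package path, and the function body are illustrative assumptions, not taken from the original post:

```sql
-- MYSCHEMA.hdbschema: design-time schema definition.
-- The same schema is created on DEV and quality, so the transported
-- function's runtime object lands in an identical schema everywhere.
schema_name="MYSCHEMA";
```

```sql
-- double_value.hdbfunction: scalar function bound to the fixed schema.
-- Names are hypothetical; DEFAULT SCHEMA covers unqualified table
-- references inside the body.
FUNCTION "MYSCHEMA"."my.package::double_value" (in_val DECIMAL(15,2))
  RETURNS out_val DECIMAL(15,2)
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
  DEFAULT SCHEMA "MYSCHEMA" AS
BEGIN
  out_val := :in_val * 2;
END;
```

Because the .hdbschema artifact travels with the transport, the quality system creates the same MYSCHEMA container before activating the function, avoiding the reference to the DEV-only physical schema.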

anup_deshmukh4
Active Contributor
0 Kudos

Thanks a lot! It worked 🙂

Answers (0)