on 02-20-2019 1:47 AM
Hello Experts,
I have a scalar function created in a package (screenshot attached). It needs to be promoted to the quality environment; in our landscape we do this using the SCTS_HTA transaction code.
The problem is that the transport request fails because the function contains a reference to the physical schema SAPCRI, which exists in DEV only.
Apart from running the scripts manually and changing the schema in each environment, is there any way to map the schema dynamically according to the environment? (Or, as with a scripted calculation view, can I set a default schema for functions as well?)
Attached is a screenshot of my M_SCHEMA_MAPPING content.
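For readers who cannot see the attachment: the repository's authoring-to-physical schema mappings can be inspected directly with a query along these lines (a sketch; the exact view name and columns may differ by HANA revision, so verify against your system catalog):

```sql
-- Show the configured authoring-schema -> physical-schema mappings
-- (assumption: the mapping table is exposed as "_SYS_BI"."M_SCHEMA_MAPPING")
SELECT "AUTHORING_SCHEMA", "PHYSICAL_SCHEMA"
FROM "_SYS_BI"."M_SCHEMA_MAPPING";
```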
It is possible to define a default schema for user-defined functions as well, and the schema mapping is applied to it. However, that does not solve your issue: the default schema and the schema mapping are only applied inside the function body, e.g. to queries on tables that carry no schema information. What you expect is that the schema in which the runtime object for the function is created is changed by the schema mapping, and that will not work. You have to define a fixed schema for your runtime objects (e.g. via a .hdbschema design-time object) which is the same in all your relevant systems.
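To illustrate the suggestion above: a .hdbschema design-time object declares the schema name once, and activating it in each system creates (or binds to) that schema there, so the runtime objects land in the same schema everywhere. A minimal sketch, assuming the XS classic artifact format and a schema name of SAPCRI:

```
// File: SAPCRI.hdbschema (design-time object in your package)
// Assumption: XS classic syntax; on activation this ensures the
// schema "SAPCRI" exists in whichever system the package is deployed to.
schema_name = "SAPCRI";
```

With this artifact transported alongside the function, the schema name no longer needs to differ between DEV and quality, and the SCTS_HTA transport should no longer fail on the missing schema.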