
How to achieve dynamic mapping in BODS

Former Member

Hello,

This is about Dynamic Mapping in BODS.

My query is similar to an earlier post, but I do not see any further updates there:

BODS - Query Transformation - Dynamic Mapping

Is there a way to achieve dynamic mapping in BODS? If so, please shed some light on it. I am not looking for a complete worked-out solution; any high-level input would be greatly appreciated.

Thanks

Thenmugaselvan B

Accepted Solutions (1)


denise_meyer
Employee
0 Kudos

This is not possible through the Designer or the normal way of designing jobs. You could possibly do it by creating an ATL or XML file for the job on the fly and importing it into the repository using the al_engine command; however, this would require in-depth knowledge and consulting to implement.
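For illustration, a minimal sketch of that approach in Python, assuming a prepared ATL template exported from a model job; the template placeholder, file names, and the al_engine flags shown are assumptions to verify against your Data Services version.

import subprocess

# A full dataflow definition exported from a model job would go here, with a
# __COLUMNS__ placeholder we put in ourselves. Producing valid ATL is the
# genuinely hard, version-specific part.
ATL_TEMPLATE = "...__COLUMNS__..."

def derive_columns(header_line, sep=";"):
    # Column names for the generated mapping come from the file header.
    return [c.strip() for c in header_line.split(sep)]

def generate_atl(columns):
    # Splice the derived column list into the exported template.
    return ATL_TEMPLATE.replace("__COLUMNS__", ", ".join(columns))

def import_atl(atl_path, user, password):
    # al_engine can import an ATL file into the repository; verify the exact
    # flags (-U, -P, -f assumed here) and the repository connection options
    # for your installation before relying on this.
    subprocess.run(["al_engine", f"-U{user}", f"-P{password}", f"-f{atl_path}"],
                   check=True)

with open("source1.txt", encoding="utf-8") as f:
    cols = derive_columns(f.readline())
with open("generated_job.atl", "w", encoding="utf-8") as f:
    f.write(generate_atl(cols))
import_atl("generated_job.atl", "repo_user", "repo_password")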

Former Member
0 Kudos

Is there a way we could utilize the User-Defined Transform with Python code in it? Any chance of doing it that way?
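For reference, a UDT Python expression (per-collection mode) looks roughly like the sketch below. The catch remains: the transform's input and output schemas are fixed at design time, so dynamic columns only work if the whole line travels through a single wide varchar field (LINE is an assumed field name here). Collection, DataRecord, and the methods used follow the documented UDT API, but verify them against your Data Services version.

# Runs inside the User-Defined Transform's Python editor; Collection and
# DataRecord are provided by Data Services, not imported.
record = DataRecord()
for i in range(1, Collection.Size() + 1):
    Collection.GetRecord(record, i)               # record access is 1-based
    line = record.GetField(u'LINE')               # the whole row as one field
    values = [v.strip() for v in line.split(u';')]
    # ...apply per-value logic here, for however many columns arrived...
    record.SetField(u'LINE', u';'.join(values))
    Collection.SetRecord(record, i)
del record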

Answers (5)


Former Member
0 Kudos

As the answer, I will take the option provided by Denise Meyer. Implementing it would require in-depth knowledge of the al_engine command and consulting effort. Given that complexity, we will try the alternative solution provided by Andrey.

Former Member
0 Kudos

Hi,

Let's say someone at SAP were ill-advised enough to build functionality that automatically recognizes the columns, enabled with a tick box (which in fact would require reimporting the source metadata at every run). Would SAP then have to guess a default 1:1 mapping carried through to the target? Such automation would benefit perhaps one customer in a thousand. Adding new columns this often happens only during development, not after go-live. And if it is happening after go-live, the source is not reliable, and the changes should be made there, not in SAP Data Services.

Regards,

Bogdan

former_member220897
Participant
0 Kudos

And what about the target structure: is it just a 1:1 replication, or something else?

I still wonder what the business requirements behind this are, and why the source cannot simply be a format like

( Name;RollNo;SubjectNo;SubjectName ) .

But anyway, if you can assume a maximum number of columns in the file, say no more than 100 subjects, then it is possible to build a job that converts the file into ( Name;RollNo;SubjectNo;SubjectName ), uses that in your mapping, and converts it back to the source structure, although it is unlikely to remain a single simple dataflow.

So, true, in general you cannot add a column on the fly in Data Services, but try to review your business requirements and come up with assumptions that simplify the processing. If you give me more details, I can suggest more ideas on how to make it work.
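In plain Python terms, the conversion described above is a simple unpivot. A minimal sketch, with file names assumed for illustration:

import csv

# Unpivot ( Name;RollNo;Subject1..SubjectN ) into the static long format
# ( Name;RollNo;SubjectNo;SubjectName ), whatever N turns out to be.
with open("source1.txt", newline="", encoding="utf-8") as src, \
     open("long_format.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter=";")
    writer = csv.writer(dst, delimiter=";")
    next(reader)                                  # drop the variable header
    writer.writerow(["Name", "RollNo", "SubjectNo", "SubjectName"])
    for row in reader:
        row = [v.strip() for v in row]
        name, rollno, subjects = row[0], row[1], row[2:]
        for no, subject in enumerate(subjects, start=1):
            if subject:                           # ignore empty trailing cells
                writer.writerow([name, rollno, no, subject])

The reverse conversion (long back to wide) is a pivot over SubjectNo, which is where the single-dataflow simplicity gets lost.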

Former Member
0 Kudos

Hello Andrey, thanks for the response.

Let's consider it to be a 1:1 mapping. The target structure has to be defined dynamically based on the header row of the source file, with ';' as the separator. The target can be either a new table or a flat file. Our first goal is to stage the data by generating the target file/table dynamically from the given source file.

Once we generate the dynamic target file/table, we can then work through scripts for the further logic.
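A minimal sketch of that first goal, assuming sqlite3 as a stand-in for the staging database, all-varchar typing, and illustrative file/table names:

import csv, sqlite3

# Read the header to discover the columns, then the data rows.
with open("source1.txt", newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter=";")
    columns = [c.strip() for c in next(reader)]
    rows = [[v.strip() for v in row] for row in reader]

# Derive the staging DDL from the header: every column as plain varchar.
conn = sqlite3.connect("staging.db")
col_defs = ", ".join(f'"{c}" varchar(255)' for c in columns)
conn.execute("DROP TABLE IF EXISTS stg_source1")
conn.execute(f"CREATE TABLE stg_source1 ({col_defs})")

# Load 1:1, padding short rows so every insert matches the derived width.
placeholders = ", ".join("?" for _ in columns)
conn.executemany(f"INSERT INTO stg_source1 VALUES ({placeholders})",
                 (row + [""] * (len(columns) - len(row)) for row in rows))
conn.commit()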

The customer's aim is to avoid raising any more CRs to change the existing jobs whenever fields are added.

Thanks

former_member220897
Participant
0 Kudos

Once we generate the dynamic target file/table, we can then work through scripts for the further logic.

I don't get this part. The scripts for further logic will have to use the dynamic table structure as input, so you will still have to deal with the same problem of dynamic mapping in these scripts.

I would consider the following options:

1. Redesign the input structure in the first place to make it static, e.g.

( Name;RollNo;SubjectNo;SubjectName )

2. Implement the job under an assumption about the maximum possible number of columns, say 100 at most (see the sketch after this list)

3. Educate the customer and provide instructions on how to extend the solution with additional columns. At the end of the day, that is what SAP Data Services is meant to be: a platform for transparent and easily extensible ETL, in contrast to a "black box".
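A minimal sketch of option 2, assuming the file never exceeds a known maximum of subject columns (the limit and file names are illustrative): the variable-width file is normalized to a fixed-width one, so the file format and all downstream mappings can stay static.

import csv

MAX_SUBJECTS = 100
columns = ["Name", "RollNo"] + [f"Subject{i}" for i in range(1, MAX_SUBJECTS + 1)]

with open("source1.txt", newline="", encoding="utf-8") as src, \
     open("padded.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter=";")
    writer = csv.writer(dst, delimiter=";")
    next(reader)                                  # drop the variable header
    writer.writerow(columns)                      # emit the fixed header
    for row in reader:
        row = [v.strip() for v in row]
        writer.writerow(row + [""] * (len(columns) - len(row)))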

Former Member
0 Kudos

Hello Andrey, I will take your inputs and discuss this with the customer. Thanks a lot for the help.

Former Member
0 Kudos

Hello Andrey,

Let's take an example with a flat file as the source, with the structure shown below.

The number of columns is determined by the header row, with the semicolon as the separator.

Initial Source 1:

Name; RollNo; Subject1; Subject2; Subject3

ABC; 1234; Maths; Physics; Chemistry

BCD; 2345; Maths; Computers; Physics

CDE; 3456; Computers; Accounts; Economics

Now suppose a change to the existing job: a field Subject4 is added, as below. How do we dynamically generate the mapping and produce output in the same format as the source file's header row, without making any manual changes to the job?

Modified Source 1:

Name; RollNo; Subject1; Subject2; Subject3; Subject4

ABC; 1234; Maths; Physics; Chemistry; History

BCD; 2345; Maths; Computers; Physics; Geography

CDE; 3456; Computers; Accounts; Economics; History

At the end of the day, the job should not need to be modified manually, and it should be able to generate output (table/file) with whatever number of fields the flat file (source input) provides.
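Stated outside Data Services for a moment, that requirement is a header-preserving 1:1 copy. A minimal Python sketch, with file names assumed for illustration:

import csv

# Whatever header the source carries is propagated to the target unchanged,
# so adding Subject4 (or Subject40) needs no manual change here.
with open("source1.txt", newline="", encoding="utf-8") as src, \
     open("target1.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter=";")
    writer = csv.writer(dst, delimiter=";")
    for row in reader:
        writer.writerow([v.strip() for v in row])

The difficulty in Data Services is exactly that a dataflow fixes its column list at design time, which is what the suggestions in this thread work around.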

Former Member
0 Kudos

Hello Andrey, thanks for your input.

This seems a bit tricky. I am trying to take a source file with a varying number of fields in its structure as input, generate the field mapping dynamically, and produce the output.

former_member220897
Participant
0 Kudos

Could you give more details about your scenario? An example, perhaps?

former_member220897
Participant
0 Kudos

Ever wondered what the "Expression?" option in the Lookup_ext function wizard means? It allows you to treat a field value as a Data Services expression and calculate the result on the fly. Sounds close to your requirements, doesn't it? That is exactly how SAP Financial Information Management (FIM) works. So you could implement this technique in your custom job, or try FIM itself.
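As a rough analogy in Python (not the Data Services API): mapping rules live in a lookup table as text and are evaluated per row, so a new mapping means a new rule row rather than a job change. The rule strings and field names below are invented for illustration.

# Hypothetical rule table: target field -> expression over the source row.
rules = {
    "FULL_DESC": "row['Name'] + ' / ' + row['Subject1']",
    "ROLL_TAG": "'R-' + row['RollNo']",
}

def apply_rules(row):
    # eval() is confined to the row dict here; a production version would
    # need a proper, safe expression evaluator.
    return {target: eval(expr, {"__builtins__": {}}, {"row": row})
            for target, expr in rules.items()}

row = {"Name": "ABC", "RollNo": "1234", "Subject1": "Maths"}
print(apply_rules(row))   # {'FULL_DESC': 'ABC / Maths', 'ROLL_TAG': 'R-1234'}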