Append, delete and insert in SAP BODS

Aug 04, 2017 at 09:44 AM


Former Member

I am loading a HANA database using BODS. For this job I need to implement the following condition:

New records ----> append.
Existing records ----> delete and insert.


3 Answers

Best Answer
Ravi kiran Pagidi Aug 05, 2017 at 10:07 PM

Hi Jyoti deka,

You can achieve this in two ways. Here is the first way:

Create a table (name as PARAM_YEAR_PERIOD) with columns

1.Year

2.Period

Now you will have 3 data flows

1. Source files to PARAM_YEAR_PERIOD table.

2. PARAM_YEAR_PERIOD table to delete records from the target for the matching YEAR and PERIOD values.

3. Source files to your TARGET Table.

In the 1st dataflow: source file -----> Query -----> PARAM table

In the Query, map only the YEAR and PERIOD columns to the output schema and enable Select Distinct. Make YEAR and PERIOD the primary key columns in the Query, and in the target table options set Use Input Keys to YES, Auto Correct Load to YES and Allow Merge to YES. Enable Delete Data Before Loading in the target.

Here you will load only distinct YEAR and PERIOD in the PARAM table.

In the 2nd dataflow:

PARAM_YEAR_PERIOD ----> Query ----> Map_Operation ----> Target table

In the Query, map YEAR and PERIOD to the output schema, and in the Map_Operation transform set NORMAL to DELETE. This will delete records in the target table whose YEAR and PERIOD values coming from the PARAM table already exist in the target.

In the 3rd dataflow, load the source files into your TARGET table.
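Outside BODS, the delete-then-insert pattern of dataflows 2 and 3 is just plain SQL. Here is a minimal Python/sqlite3 sketch of the idea (the table layout and values are illustrative, not the actual schema from this thread):

```python
import sqlite3

# In-memory stand-in for the HANA target and the staged source rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (year INT, period INT, amount REAL)")
con.execute("CREATE TABLE source (year INT, period INT, amount REAL)")

# Existing target data: 2017/5 and 2017/6 already loaded.
con.executemany("INSERT INTO target VALUES (?,?,?)",
                [(2017, 5, 10.0), (2017, 6, 20.0), (2017, 6, 21.0)])
# New file: resends 2017/6 (corrected) and brings a new period 2017/7.
con.executemany("INSERT INTO source VALUES (?,?,?)",
                [(2017, 6, 99.0), (2017, 7, 30.0)])

def load_with_period_replace(con):
    """Delete target rows whose (year, period) appears in the source,
    then append every source row - mirroring dataflows 2 and 3."""
    con.execute("""DELETE FROM target
                   WHERE EXISTS (SELECT 1 FROM source s
                                 WHERE s.year = target.year
                                   AND s.period = target.period)""")
    con.execute("INSERT INTO target SELECT * FROM source")

load_with_period_replace(con)
rows = sorted(con.execute("SELECT year, period, amount FROM target"))
print(rows)  # 2017/5 untouched, 2017/6 replaced, 2017/7 appended
```

Periods that never reappear in a source file stay untouched, which is exactly the difference from a plain Delete Data Before Loading.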

Now the 2nd way:

Create a table (name as PARAM_YEAR_PERIOD) with columns

1.Row_num

2.Year

3.Period

4.Status

Now you will have 2 data flows

1. Source files to PARAM_YEAR_PERIOD table.

2. Source files to your TARGET Table.

In the 1st dataflow: source file -----> Query -----> PARAM table

In the Query, map only the YEAR and PERIOD columns to the output schema. Create a new column Row_num and map the gen_row_num() function to it, and create a column Status with datatype varchar(1) and map 'N' to it. Enable Select Distinct in the Query. Make YEAR and PERIOD the primary key columns in the Query, and in the target table options set Use Input Keys to YES, Auto Correct Load to YES and Allow Merge to YES. Enable Delete Data Before Loading in the target.

Here you will load only distinct YEAR and PERIOD in the PARAM table.

Now add a script and create global variables such as $G_ROW_NUM, $G_YEAR, $G_PERIOD and $G_COUNT.

Then write a script like the one below:

$G_ROW_NUM = 1;
$G_COUNT = sql('DataStore', 'select count(*) from PARAM_YEAR_PERIOD');
print('Number of records in PARAM table : ' || $G_COUNT);

while ($G_COUNT >= 1)
begin
    $G_YEAR = sql('DataStore', 'select Year from PARAM_YEAR_PERIOD where Row_num = {$G_ROW_NUM}');
    $G_PERIOD = sql('DataStore', 'select Period from PARAM_YEAR_PERIOD where Row_num = {$G_ROW_NUM}');
    sql('DataStore', 'delete from TARGET_TABLE where Year = {$G_YEAR} and Period = {$G_PERIOD}');
    sql('DataStore', 'update PARAM_YEAR_PERIOD set Status = \'Y\' where Row_num = {$G_ROW_NUM}');
    $G_COUNT = sql('DataStore', 'select count(*) from PARAM_YEAR_PERIOD where Status = \'N\'');
    if ($G_COUNT >= 1)
    begin
        $G_ROW_NUM = $G_ROW_NUM + 1;
    end
    else
    begin
        $G_COUNT = 0;
    end
end

Now the data for the YEAR and PERIOD values coming from the source will be deleted from the target table.
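To make the loop's control flow easier to follow, here is the same logic mirrored in Python with sqlite3 standing in for the DataStore (table names are from the script above; the data values are made up):

```python
import sqlite3

# Stand-in for the DataStore: PARAM rows flagged 'N' (not yet processed)
# and a target holding three periods.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE PARAM_YEAR_PERIOD "
            "(Row_num INT, Year INT, Period INT, Status TEXT)")
con.execute("CREATE TABLE TARGET_TABLE (Year INT, Period INT)")
con.executemany("INSERT INTO PARAM_YEAR_PERIOD VALUES (?,?,?,'N')",
                [(1, 2017, 6), (2, 2017, 7)])
con.executemany("INSERT INTO TARGET_TABLE VALUES (?,?)",
                [(2017, 5), (2017, 6), (2017, 7)])

row_num = 1
count = con.execute("SELECT count(*) FROM PARAM_YEAR_PERIOD").fetchone()[0]
while count >= 1:
    # Fetch the year/period for this PARAM row, delete it from the target,
    # then flag the PARAM row as processed - one pass per Row_num.
    year, period = con.execute(
        "SELECT Year, Period FROM PARAM_YEAR_PERIOD WHERE Row_num=?",
        (row_num,)).fetchone()
    con.execute("DELETE FROM TARGET_TABLE WHERE Year=? AND Period=?",
                (year, period))
    con.execute("UPDATE PARAM_YEAR_PERIOD SET Status='Y' WHERE Row_num=?",
                (row_num,))
    count = con.execute("SELECT count(*) FROM PARAM_YEAR_PERIOD "
                        "WHERE Status='N'").fetchone()[0]
    row_num += 1

remaining = con.execute("SELECT Year, Period FROM TARGET_TABLE").fetchall()
print(remaining)  # only (2017, 5) survives: its period was not resent
```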

Then your 2nd dataflow, which loads the source files into your TARGET table, will run.

Note: please test this thoroughly and implement accordingly. Let us know if you have any doubts.

Thanks,

Ravi kiran.

Former Member

Hi Ravi

Thank you so much. I was trying it a different way. I created the flow as you suggested and got the output.

Thanks

Jyoti

Ravi kiran Pagidi Aug 04, 2017 at 10:27 AM

Hi Jyoti deka,

Tick Delete Data Before Loading in the target table and load it. All existing records will be deleted and all records from the source will be inserted into the target table.

Former Member

Hi Ravi,

Thanks for your reply. I can't use Delete Before Load, because we are using 12 source tables for that. If I do that, I will lose all the source file data. So I need to check: if the data already exists, delete and insert; if it is new data, append.

If you want more details to clarify the requirement please let me know.


Hi Jyoti,

How are you loading these 12 source files into the HANA table? Is it a union of these 12 source files/tables? Are you doing this in a single data flow or in 12 data flows?

Former Member

Hi Ravi,

I am loading these files one by one using a single flow. Previously I was doing it with bulk load and the Append option, because the client told me I would get a completely new data set every time. But they sent six of the same files, which contain some new data along with old data. Now they have asked me to create a new flow for the same target table to load these six files. The scenario is: check whether the year and period are already available in the target; if so, delete all the data for that specific year and period and load the file. If data for a new year and period comes, append it directly.

Former Member

Hi Ravi,

I have some confusion about the Map_Operation transform. When should I change the map operation code, and how does it work with the different combinations? Please help me understand this concept. I will always be grateful to you.

Thanks

Jyoti.


Hi Jyoti,

All records that come from the source arrive with the NORMAL opcode. When you set:

normal (input) -----> normal (output) = it tries to insert the records

normal ----> insert = it tries to insert the records

normal ----> update = it sends the incoming records as updates to the target table

normal ----> delete = it deletes the incoming records if they already exist in the target

normal ----> discard = it ignores all normal records; they do not pass out of the Map_Operation transform

In your case your PARAM table has YEAR and PERIOD values, say YEAR=2017 and PERIOD=6. If you set Normal ----> Delete, all records with YEAR=2017 and PERIOD=6 will be deleted in the target.
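As a rough mental model of those opcode rules (purely illustrative Python, not a BODS API), the transform rewrites each row's opcode and a downstream loader then interprets it:

```python
# Hypothetical sketch: every incoming row carries the NORMAL opcode; the
# transform rewrites it, and the target loader acts on the result.

def map_operation(rows, normal_to):
    """Rewrite the opcode of NORMAL rows; DISCARD drops them entirely."""
    out = []
    for opcode, row in rows:
        if opcode == "NORMAL":
            if normal_to == "DISCARD":
                continue            # row does not pass out of the transform
            opcode = normal_to
        out.append((opcode, row))
    return out

def apply_to_target(target, ops):
    """Crude target loader keyed on (year, period)."""
    for opcode, row in ops:
        key = (row["year"], row["period"])
        if opcode in ("NORMAL", "INSERT", "UPDATE"):
            target[key] = row
        elif opcode == "DELETE":
            target.pop(key, None)   # delete only if it already exists
    return target

target = {(2017, 6): {"year": 2017, "period": 6, "val": 1}}
param_rows = [("NORMAL", {"year": 2017, "period": 6, "val": 1})]
# NORMAL -> DELETE removes the matching year/period from the target.
result = apply_to_target(target, map_operation(param_rows, "DELETE"))
print(result)  # {}
```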

For more explanation of Map_Operation, please refer to the links below:

http://sapsimplified.com/sap-bods/map-operation-transformation-in-sap-bods/

https://blogs.sap.com/2013/04/04/mapoperation-transformation-in-data-services/

Thanks,

Ravi kiran

Former Member Aug 05, 2017 at 01:30 AM

Hi Jyoti,

Based on your requirement, I would suggest SCD2 as the best option for you.

SCD2 - you can keep both old data (history, i.e. updates) and new data (inserts, i.e. appends) as data comes in from the files daily.
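For readers unfamiliar with SCD Type 2, here is a minimal sketch of the idea (illustrative Python only, not BODS's Table_Comparison or History_Preserving transforms): instead of overwriting a changed row, the current version is closed and a new current version is appended.

```python
from datetime import date

def scd2_upsert(history, key, attrs, today):
    """On change, close the current row and append a new current version,
    so all historical versions are preserved."""
    current = [r for r in history if r["key"] == key and r["current"]]
    if current:
        row = current[0]
        if row["attrs"] == attrs:
            return history              # no change: nothing to do
        row["current"] = False          # close the old version
        row["valid_to"] = today
    history.append({"key": key, "attrs": attrs, "valid_from": today,
                    "valid_to": None, "current": True})
    return history

hist = []
scd2_upsert(hist, ("2017", "6"), {"amount": 10}, date(2017, 8, 1))
scd2_upsert(hist, ("2017", "6"), {"amount": 99}, date(2017, 8, 5))
print(len(hist), hist[-1]["attrs"])  # two versions kept; latest is current
```

Note this keeps every old version, whereas the delete-and-reload approach above replaces the period's data outright; which fits depends on whether history must be queryable.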

Former Member

Hi Nagaraju,

In my requirement, I have to delete all the data for a specific period if data for that same period comes in a new file, then append all the data from that new file (which includes new + old data). And if data for a new period comes, append it.
