
Remove duplicates in BODS 4.1

vijy_mukunthan
Active Contributor
0 Kudos

Dear Experts

I have a requirement to delete duplicates from my template table.

Ex:

MATNR   | LAST_DIRECT_COST | LINE_DISCOUNT | PRICE    | LOCATION_ORIGINAL
273     | 94.518           | <Null>        | 94.518   | AS
273     | 41.61            | <Null>        | 94.518   | INC
273     | 41.61            | <Null>        | <Null>   | INC
1002547 | 3304.688         | <Null>        | 3304.688 | AS
1002547 | 1485.23          | <Null>        | 3304.688 | INC
1002547 | 1485.23          | <Null>        | 3464.74  | INC
1002547 | 3464.74          | <Null>        | 3304.688 | PTE
1002639 | 16.67            | <Null>        | 16.67    | AS
1002639 | 829.25           | <Null>        | 16.67    | INC
1002639 | 829.25           | <Null>        | 1141.45  | INC
1002639 | 1141.45          | <Null>        | 16.67    | PTE

Here the MATNR (material number) is duplicated many times. How can I delete this data based on a condition? I have tried the Distinct rows option in the query transform, but it is not working.

I come from an ABAP background. In ABAP syntax:

DELETE ADJACENT DUPLICATES FROM <table_name> COMPARING matnr location_original.


This statement gives exactly the correct result:

MATNR   | LAST_DIRECT_COST | LINE_DISCOUNT | PRICE    | LOCATION_ORIGINAL
273     | 94.518           | <Null>        | 94.518   | AS
1002547 | 3304.688         | <Null>        | 3304.688 | AS
1002639 | 16.67            | <Null>        | 16.67    | AS

How can I achieve the same in BODS?

Kindly suggest how to achieve this in one or two steps.

Regards

Vijay Mukunthan

Accepted Solutions (0)

Answers (2)


former_member187605
Active Contributor
0 Kudos

Distinct rows only eliminates rows that are identical in every column; any row with a different value in at least one column is returned. Therefore, in your case, it returns all of them.

Why don't you use a simple condition in the where-clause of a query transform?

     LOCATION_ORIGINAL = 'AS'
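
For reference, the SQL pushed down to the source would look roughly like this (the source table name below is only a placeholder, not taken from your post):

     select MATNR, LAST_DIRECT_COST, LINE_DISCOUNT, PRICE, LOCATION_ORIGINAL
     from   <source_table>
     where  LOCATION_ORIGINAL = 'AS'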

vijy_mukunthan
Active Contributor
0 Kudos

Hi Arun and Dirk

Thanks for your reply.

Neither of the solutions can be implemented.

@ Arun

That type of query cannot be used because the price cannot be summed. If I do that, the price will be doubled, which is wrong.

@ Dirk

My previous dataflow has separate location-wise data, but I am merging all the location data into a single table; this is required so that users can validate the price. It is here that I need to remove the duplicates. This is my requirement.

Thanks

Vijay Mukunthan

former_member187605
Active Contributor
0 Kudos

I am sorry, but then I do not understand your requirement. My where-clause applied to your input will produce exactly the results included in your original post.

former_member198401
Active Contributor
0 Kudos

Could you please clarify your statement:

My previous dataflow has separate location-wise data, but I am merging all the location data into a single table; this is required so that users can validate the price. It is here that I need to remove the duplicates.

vijy_mukunthan
Active Contributor
0 Kudos

Hi Arun

I have addressed the duplicate removal in a different way.

Solution:

I wrote a SQL script in the Post-Load Commands to remove the duplicates. It worked fine. The script I used is as follows:

delete from se2_sd_15_sales_price_all_lc
where rowid not in
      ( select min(rowid)
        from se2_sd_15_sales_price_all_lc
        group by matnr );
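
Note that ROWID is Oracle-specific. For a target database without ROWID, a rough equivalent built on row_number() might look like this (SQL Server syntax, an untested sketch; the order by column is arbitrary here and only decides which row per matnr survives):

with ranked as
(
    select matnr,
           /* number the rows within each matnr; row 1 is the one we keep */
           row_number() over (partition by matnr order by location_original) as rn
    from se2_sd_15_sales_price_all_lc
)
delete from ranked where rn > 1;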

@ Dirk

I come from an ABAP background, so my mindset always tells me to write code to achieve things. Now it is time to change that mindset. I saw your design for how to achieve this, and I now understand that I should follow your steps. Nonetheless, I have 4 more developments where I will face lots of issues, and I will come back here for more knowledge.

Thanks and Regards

Vijay Mukunthan

former_member198401
Active Contributor
0 Kudos

Hi Vijay,

It can be seen from the table snapshot that there are no duplicate rows in your data. Only the MATNR is repeated, and the rows themselves are unique, so Data Services will always repeat MATNR along with the other attributes.

As for the logic of your ABAP query, you are grouping records by MATNR and LOCATION_ORIGINAL. The same can be done in Data Services by adding MATNR and LOCATION_ORIGINAL to the Group By tab and SUM(PRICE) under the Select tab in the query transform.

The SQL Query generated by Data Services will be as follows:

select MATNR, LAST_DIRECT_COST, LINE_DISCOUNT, SUM(PRICE), LOCATION_ORIGINAL
from TABLE_NAME
where LOCATION_ORIGINAL = 'AS'
group by MATNR, LOCATION_ORIGINAL

This should work...

Regards

Arun Sasi

former_member187605
Active Contributor
0 Kudos

That's an invalid SQL statement. With a GROUP BY clause, you need an aggregation function for every non-grouped column.
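
Just to illustrate, a version that would at least parse could look like this (only a sketch; whether MIN() is the right aggregate for LAST_DIRECT_COST and LINE_DISCOUNT depends on what the users actually need to see):

select MATNR,
       min(LAST_DIRECT_COST),
       min(LINE_DISCOUNT),
       sum(PRICE),
       LOCATION_ORIGINAL
from TABLE_NAME
where LOCATION_ORIGINAL = 'AS'
group by MATNR, LOCATION_ORIGINAL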

former_member198401
Active Contributor
0 Kudos

Hey Dirk,

I forgot to add the non-grouped columns... LAST_DIRECT_COST and LINE_DISCOUNT.