Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.
Delete duplicate entries from the internal table, it's urgent, please help.

Former Member
0 Kudos

Hi friends,

Hope everybody is doing well. Here is my query on deleting duplicate data from an internal table.

I have an internal table containing data in the following format.

Doc No      Comp Code  Vendor     Assignment

1500000009  JM11                  00000000
1500000008  JM11                  20070212   (repeating)
1500000007  JM11                  20070212
1500000006  JM11                  00000000
1500000005  JM11                  00000000
1500000004  JM11                  00000000   (repeating)
1500000003  JM11                  00000000   (repeating)
1500000002  JM11                  00000000
1500000001  JM11                  20050302
1500000000  JM11                  00000000
1500000003  JM11       10000088
1500000001  JM11       10000088
1500000030  JM11       10006260
1500000010  JM11       10006269
1500000008  JM11       10006269
1500000006  JM11       10006269
1500000004  JM11       10006269

If you look at the document numbers, you can see that some of them repeat. Some rows contain a vendor number but no assignment, and others contain an assignment but no vendor.

When the internal table contains repeated document numbers like this, I want to keep only the row that contains the vendor number.

Please help me with the appropriate logic, it's urgent.

Thanks a lot

mrutyun^

6 REPLIES

Former Member
0 Kudos

Hi,

Try this:


SORT it_data BY doc_no comp_cod vendor assignment.
DELETE ADJACENT DUPLICATES FROM it_data COMPARING ALL FIELDS.

Regards,

0 Kudos

SORT it_data BY doc_no comp_cod vendor assignment.

DELETE ADJACENT DUPLICATES FROM it_data COMPARING doc_no.

Regards,

Amit R.

Former Member
0 Kudos

SORT itab BY doc_no comp_cod assignment ASCENDING vendor DESCENDING.

DELETE ADJACENT DUPLICATES FROM itab COMPARING doc_no comp_cod.
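To see why this works, here is a minimal sketch using two of the repeated rows from the question (the structure and field names are assumptions, since the original declarations were not posted):

* The sort puts the row with a filled vendor first, so the
* duplicate delete keeps it and drops the vendor-less row.
DATA: BEGIN OF wa,
        doc_no(10)     TYPE c,
        comp_cod(4)    TYPE c,
        vendor(10)     TYPE c,
        assignment(10) TYPE c,
      END OF wa.
DATA itab LIKE STANDARD TABLE OF wa.

CLEAR wa.
wa-doc_no = '1500000008'. wa-comp_cod = 'JM11'. wa-assignment = '20070212'.
APPEND wa TO itab.
CLEAR wa.
wa-doc_no = '1500000008'. wa-comp_cod = 'JM11'. wa-vendor = '10006269'.
APPEND wa TO itab.

SORT itab BY doc_no comp_cod assignment ASCENDING vendor DESCENDING.
DELETE ADJACENT DUPLICATES FROM itab COMPARING doc_no comp_cod.
* itab now holds a single line: 1500000008 / JM11 / vendor 10006269.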

reward if useful.

Former Member
0 Kudos

Hi,

Deleting Adjacent Duplicate Entries

To delete adjacent duplicate entries, use the following statement:

DELETE ADJACENT DUPLICATES FROM <itab>
  [COMPARING <f1> <f2> ...
  |ALL FIELDS].

The system deletes all adjacent duplicate entries from the internal table <itab>. Entries are duplicates if they fulfill one of the following compare criteria:

Without the COMPARING addition, the contents of the key fields of the table must be identical in both lines.

If you use the addition COMPARING <f1> <f2> ..., the contents of the specified fields <f1> <f2> ... must be identical in both lines. You can also specify a field <fi> dynamically as the contents of a field <ni> in the form (<ni>). If <ni> is empty when the statement is executed, it is ignored. You can restrict the comparison to partial fields by specifying offset and length.

If you use the addition COMPARING ALL FIELDS, the contents of all fields of both lines must be identical.

You can use this statement to delete all duplicate entries from an internal table if the table is sorted by the specified compare criterion.

If at least one line is deleted, the system sets SY-SUBRC to 0; otherwise, to 4.
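A small sketch of the dynamic COMPARING variant described above (the table and field names here are made up for illustration, not from the original question):

* Sketch: the field to compare is supplied at runtime via FIELD.
DATA: BEGIN OF LINE2,
        COL1 TYPE I,
        COL2 TYPE I,
      END OF LINE2.
DATA: ITAB2 LIKE STANDARD TABLE OF LINE2,
      FIELD(10) TYPE C VALUE 'COL1'.

LINE2-COL1 = 1. LINE2-COL2 = 10. APPEND LINE2 TO ITAB2.
LINE2-COL1 = 1. LINE2-COL2 = 20. APPEND LINE2 TO ITAB2.

SORT ITAB2 BY COL1.
* The compare field is read from FIELD when the statement executes;
* if FIELD were empty, the dynamic specification would be ignored.
DELETE ADJACENT DUPLICATES FROM ITAB2 COMPARING (FIELD).
* ITAB2 now contains only the first line (COL1 = 1, COL2 = 10),
* and SY-SUBRC is 0 because a line was deleted.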

Examples

DATA: BEGIN OF LINE,
        COL1 TYPE I,
        COL2 TYPE I,
      END OF LINE.

DATA ITAB LIKE HASHED TABLE OF LINE WITH UNIQUE KEY COL1.

* Fill the table with the squares of 1 through 4.
DO 4 TIMES.
  LINE-COL1 = SY-INDEX.
  LINE-COL2 = SY-INDEX ** 2.
  INSERT LINE INTO TABLE ITAB.
ENDDO.

LINE-COL1 = 1.

* Chained DELETE TABLE: remove the line matching the work area
* (COL1 = 1) and the line with table key COL1 = 3.
DELETE TABLE ITAB: FROM LINE,
                   WITH TABLE KEY COL1 = 3.

LOOP AT ITAB INTO LINE.
  WRITE: / LINE-COL1, LINE-COL2.
ENDLOOP.

The output is:

2 4

4 16

The program fills a hashed table with a list of square numbers. The chained DELETE TABLE statement deletes the lines where the key field COL1 has the contents 1 or 3. Note that this example shows the DELETE TABLE variant for tables with a unique key, rather than DELETE ADJACENT DUPLICATES.

Regards,

Bhaskar

former_member404244
Active Contributor
0 Kudos

Hi,

Is the assignment field equal to '00000000'? If yes, you can do it like this:

DELETE itab WHERE assignment = '00000000'.

No LOOP is needed around the statement; DELETE ... WHERE processes the whole table in one pass. This removes every row whose assignment is '00000000'.

Regards,

Nagaraj

0 Kudos

Hi All,

Thanks a lot for the timely help and the answers you provided. I solved this problem myself.

Thanks a lot

mrutyun^