
Delta capability for enhanced standard master datasource

Former Member

Hi All,

I have read many documents about delta functionality for enhanced standard extractors, but I could not get a clear picture of it.

I have a datasource 0VENDOR_ATTR that I enhanced with an e-mail ID field; the data for the enhanced field comes from table ADR6. I have a couple of questions. Please guide me.

1) Will deltas work properly for the enhanced field after the full load, given a normal enhancement using CMOD?

2) If an e-mail ID is modified, will it be updated in BW, since I will be doing delta loads?

Thanks,

Karthik

Accepted Solutions (1)

Former Member

Hi Krishna,

I just wanted to add a few comments:

If you are already running a delta load on 0VENDOR_ATTR and you have now added a new field to it, the deltas will not be affected by that. However, when transporting to the higher systems you have to make sure the delta queue for 0VENDOR_ATTR is empty, because this is a structural change and your transport will fail if the queue contains any backlog records (even in the delta repeat). So make sure you clear them up before moving the change to the higher systems.

Now coming to delta generation if you change only the e-mail address: the 0VENDOR_ATTR datasource works with change pointers, which means it depends on the BDCP2 table (or BDCPS in older releases) for deltas. Any change you make to a vendor gets logged in this table, and if you have written the correct logic in the user exit to populate the e-mail address, a delta will certainly be generated even if you change only the e-mail address.

I checked in my system: I changed only the e-mail address of a vendor and a delta was generated, despite the fact that I have not enhanced my datasource with an e-mail address field.

So it all depends on your user exit, in which you should write the correct logic for extracting the e-mail address.
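
If you want to convince yourself that a vendor change really writes a change pointer, a very rough way to check is to count the BDCP2 entries before and after the change. This is only an illustrative sketch (it simply counts all entries, so run it in a dev/sandbox client only):

* Illustrative only: count the change pointers currently in BDCP2,
* then change the vendor's e-mail address and run the count again
DATA: lv_count TYPE i.

SELECT COUNT( * ) FROM bdcp2 INTO lv_count.

WRITE: / 'Change pointers in BDCP2:', lv_count.

If the count goes up after you change only the e-mail address, you know a delta record will be generated for that vendor.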

Let us know if you have any difficulty.

Thanks

Amit

Former Member

Hi Amit,

Thanks for the response.

Let's say the delta is running for 0VENDOR_ATTR and I have now appended a new field, ZZEMAIL. If I want the data from the beginning for ZZEMAIL, I think the only thing we can do is trigger a full load, which will overwrite all the data, and then trigger the subsequent deltas. Regarding the transports, I will make sure the delta queue is empty and then transport the datasource to the other systems. Correct me if I am wrong.

Can you please check my code and see if it is correct?

The e-mail ID needs to be added to 0VENDOR_ATTR, and the e-mail ID field comes from table ADR6, but there is no common key between the DataSource and that table: the key in 0VENDOR_ATTR is LIFNR, while the key in ADR6 is the address number (ADDNO). So I use LFA1, which has the key LIFNR, as an intermediate table; the common key between ADR6 and LFA1 is the address number. I want to do the linking with this logic.

Sample Code:

Select LIFNR ADDNO from LFA1 into IT_LFA1 where LIFNR = IT_VENDOR-LIFNR.

Select ADDNO EMAIL from ADR6 into IT_ADR6 where ADDNO = IT_LFA1-ADDNO.

Loop at IT_VENDOR into WA_VENDOR.

  Read IT_LFA1 into WA_LFA1 with key LIFNR = WA_VENDOR-LIFNR.

  IF SY-SUBRC = 0.

    Read IT_ADR6 into WA_ADR6 with key ADDNO = WA_LFA1-ADDNO.

    WA_LFA1-ZZEMAIL = WA_ADR6-EMAIL.

    WA_VENDOR-ZZEMAIL = WA_LFA1-ZZEMAIL.

    Modify IT_VENDOR from WA_VENDOR.

  ENDIF.

ENDLOOP.

I_T_DATA[] = IT_VENDOR[].


Will the deltas work properly if I use this code?

Thanks,

Karthik

Former Member

Hi Krishna,

Yes, you are right: you will have to do a full load to get all the historical e-mail IDs.

As for the code, please find a corrected version below:

-------------------------------------------------------

DATA: gt_vendor TYPE STANDARD TABLE OF biw_lfa1_s, " extract structure of 0VENDOR_ATTR (ZZEMAIL sits in its append) - adjust if yours differs
      gs_vendor LIKE LINE OF gt_vendor.

TYPES: BEGIN OF ty_email,
         adrnr     TYPE adrnr,
         smtp_addr TYPE ad_smtpadr,
       END OF ty_email,
       BEGIN OF ty_lfa1,
         lifnr TYPE lifnr,
         adrnr TYPE adrnr,
       END OF ty_lfa1.

DATA: gt_email TYPE STANDARD TABLE OF ty_email,
      gs_email LIKE LINE OF gt_email,
      gt_lfa1  TYPE STANDARD TABLE OF ty_lfa1,
      gs_lfa1  LIKE LINE OF gt_lfa1.

DATA: lv_index TYPE sy-tabix.

* work on a local copy of the data package
gt_vendor[] = i_t_data[].

* vendor -> address number (guard the FOR ALL ENTRIES against an empty driver table)
IF NOT gt_vendor[] IS INITIAL.
  SELECT lifnr adrnr FROM lfa1 INTO TABLE gt_lfa1
    FOR ALL ENTRIES IN gt_vendor
    WHERE lifnr = gt_vendor-lifnr.
ENDIF.

* address number -> e-mail address
IF NOT gt_lfa1[] IS INITIAL.
  SELECT addrnumber smtp_addr FROM adr6 INTO TABLE gt_email
    FOR ALL ENTRIES IN gt_lfa1
    WHERE addrnumber = gt_lfa1-adrnr.
ENDIF.

* fill ZZEMAIL for every record of the data package
LOOP AT gt_vendor INTO gs_vendor.
  lv_index = sy-tabix.
  READ TABLE gt_lfa1 INTO gs_lfa1 WITH KEY lifnr = gs_vendor-lifnr.
  IF sy-subrc EQ 0.
    READ TABLE gt_email INTO gs_email WITH KEY adrnr = gs_lfa1-adrnr.
    IF sy-subrc EQ 0.
      gs_vendor-zzemail = gs_email-smtp_addr.
      MODIFY gt_vendor FROM gs_vendor INDEX lv_index.
    ENDIF.
  ENDIF.
ENDLOOP.

* hand the enriched package back to the extractor
REFRESH i_t_data.
i_t_data[] = gt_vendor[].

-------------------------------------------------------

This should help you with the coding. Please modify it as needed.

Thanks

Amit

Former Member

Hi Amit,

Thanks for the code. It helped me with my requirement.

I have a doubt related to function module extractors. Can you tell me:

1) What are OPEN CURSOR, FETCH CURSOR and CLOSE CURSOR? I searched extensively on Google but could not find anything productive.

2) What is the execution procedure of the function module? I mean, how many calls does a function module go through during an extraction (not the steps within the FM)? I read many documents about it but could not understand it clearly. Can you explain if possible?

Thanks,

Karthik

Former Member

Hi Krishna,

I think there are many detailed documents on the CURSOR statements, but since you said you could not find anything productive, I can give you a brief idea here. For the details you can go back to the documentation.

CURSOR -

When you use an OPEN CURSOR statement, it sets a cursor on the database table from which the data is to be selected, based on your selection criteria. For example, let's say we want to extract data from KNA1 (customer master) for customer 000020; I can use an OPEN CURSOR statement with the selection criterion KUNNR = '000020'. The effect is that a cursor is positioned on KNA1 at database level at KUNNR = 000020. This statement does not extract any data into your internal table; it just sets the pointer.

Next comes the FETCH statement. It is used in conjunction with OPEN CURSOR, since FETCH actually extracts the data based on the cursor set by the OPEN statement. So if we have opened a cursor at KUNNR = 000020, the FETCH statement will start the selection from that point.

Once the selection is complete, the FETCH statement returns a sy-subrc other than 0 (i.e. NE 0); at that point we have to close the open cursor, which is done with CLOSE CURSOR.

In BW-related extraction FMs we use OPEN CURSOR WITH HOLD, which means the cursor is not closed implicitly by the database commits that happen between the calls; it stays open until we explicitly close it with a CLOSE CURSOR statement.

Since the FM is called as many times as there are data packages (as you can see in RSA3), or until the data is fully extracted, we use WITH HOLD so that the cursor remains open until processing is complete.
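
To make this concrete, here is a minimal sketch of how the cursor is typically handled inside an extraction FM. It follows the pattern of SAP's template function module RSAX_BIW_GET_DATA_SIMPLE; the names S_CURSOR, E_T_DATA, I_MAXSIZE, the NO_MORE_DATA exception and the KNA1/KUNNR example are taken from that template and are only illustrative, so adapt them to your own FM.

* Sketch of the cursor handling in an extraction FM
* (following the RSAX_BIW_GET_DATA_SIMPLE pattern - adapt names to your FM)
STATICS: s_cursor            TYPE cursor,   " survives the repeated FM calls
         s_counter_datapakid LIKE sy-tabix. " packages already sent

IF s_counter_datapakid = 0.
  " first data call: position the cursor, e.g. on customer 0000000020
  OPEN CURSOR WITH HOLD s_cursor FOR
    SELECT * FROM kna1
      WHERE kunnr = '0000000020'.
ENDIF.

" every data call: read exactly one package of i_maxsize rows
FETCH NEXT CURSOR s_cursor
  APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
  PACKAGE SIZE i_maxsize.

IF sy-subrc <> 0.
  " nothing left to read: close the cursor and tell BW the extraction is done
  CLOSE CURSOR s_cursor.
  RAISE no_more_data.
ENDIF.

s_counter_datapakid = s_counter_datapakid + 1.

The STATICS declaration is what lets the cursor and the package counter survive across the repeated calls of the same FM.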

Second, coming to your question on how many times the FM is called: this depends on the type of datasource (generic, standard, etc.). Generally, though, the sequence of calls is the same for any type; the same FM is called several times:

1. The first FM call passes the selection criteria and the fields to be selected for the datasource. For this, mostly the ROOSOURCE, ROOSFIELD and ROOSGEN tables are used; the datasource's delta mechanism and its fields are identified, including which fields are hidden. Any selection criterion we pass, such as KUNNR = 000020, is handed over as well, and a dynamic selection clause is created.

2. The second FM call uses this dynamic selection clause and field list for the data selection. This is where OPEN CURSOR and the other selection statements are actually executed, and where any additional logic is applied. The selected data is put into the output tables here.

So essentially two kinds of calls are made to the FM, with the data call repeated once per package; the skeleton after this list shows how they are usually distinguished in the code.

3. Processing is then handed off to the customer exits, or to any BTEs written for the datasource. All the selected data is processed by these user exits etc.

4. Finally, the data is passed to the BI system.
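
To relate this call sequence to code, here is a rough skeleton of how the initialization call and the data calls are usually separated inside a generic extraction FM. Again this follows the RSAX_BIW_GET_DATA_SIMPLE template; parameters such as I_INITFLAG, I_T_SELECT, I_T_FIELDS and the interface structure SRSC_S_IF_SIMPLE come from that template and may differ in your own FM.

* Skeleton of the call handling (RSAX_BIW_GET_DATA_SIMPLE pattern)
STATICS: s_s_if TYPE srsc_s_if_simple. " keeps selections and field list between calls

IF i_initflag = 'X'.
  " call 1 (initialization): only store the selection criteria and the
  " requested field list - no data is selected in this call
  APPEND LINES OF i_t_select TO s_s_if-t_select.
  APPEND LINES OF i_t_fields TO s_s_if-t_fields.
ELSE.
  " calls 2..n (data calls): build the dynamic WHERE clause from the stored
  " selections, open the cursor on the first data call, then fetch one
  " package per call until NO_MORE_DATA is raised (see the cursor sketch above)
ENDIF.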

Let me know if you have any questions.

Thanks

Amit


Former Member

Hi Amit,

Sorry for the late response, I was on leave. Thanks a lot for the detailed explanation; I understood a lot about the FM calls. I have a new requirement; can you give me sample code showing how to do it, please?

I have to extract data from the MCHA, AUSP and CABN tables. From these tables I need three fields: batch number (from MCHA), value (from AUSP) and name (from CABN).

Link between 3 tables:

MCHA and AUSP have a common field, the object ID;

AUSP and CABN have a common field, the internal characteristic number.

The output will be like this:

Batch    Name   Value

1001      A        40

1002      B        50

Can you help me, please? I do not have good ABAP knowledge. Thanks a lot again for the explanation.

Thanks,

Karthik

Answers (1)

RamanKorrapati
Active Contributor

Hi,

1. Yes, if your logic is correct.

The delta functionality won't be disturbed. If you need a full load (historical data for the enhanced field), then you need to delete the old init.

Do the full load, then do an init without data transfer; from then on use delta loads.

2. No.

Thanks

Former Member

Hi Ramanjaneyulu,

Then what should I do if, say, I have 10 modified e-mail IDs while my delta is running? Do I need to delete all the data and reload with a full load?

Thanks,

Karthik

RamanKorrapati
Active Contributor

Your datasource is a master data datasource.

So you can do a repair full request and reload those 10 records through InfoPackage selections.

Besides, why do you think the e-mail ID will change?

Former Member

Hi Ramanjaneyulu,


I was just curious what happens if this occurs after implementation.


I saw somewhere that we have to check the "change document" option for enhanced fields, in my case the e-mail ID. Is that true?


Thanks,

Karthik

RamanKorrapati
Active Contributor

Let's try it in the dev system and see.

Actually, your datasource is a standard one.

Its delta functionality therefore works on its own, not through custom logic.

If you want custom delta logic, then you may need an ABAP expert to change the extractor logic. But it is not a good idea to change the standard delta function; do it at your own risk.