on 11-02-2004 12:02 PM
Hi,
I have a few questions about the LO Cockpit.
I have done the following steps:
1. Activated the datasource.
2. The update group shows active.
3. In maintenance I have selected the fields.
4. Deleted the setup tables.
Now my question is: when I run the initial setup, do I need to bring the system down so that no updates are taking place? The second question is: when do I set the update mode to queued delta, and when does it take effect for the delta loads?
We are writing a function module to put extra fields on the delivery table and would need to pull these fields into the standard extractor. Can you guide me on how to accomplish this?
Please help me out, as it is very urgent.
Thanks
Amit
Hi Amit,
in a productive environment, you need to run the setup in a time frame with no activities related to the application. Best is no activity on the system at all, not even background tasks posting orders.
After successful set up run, start your initial upload to BW and initialize the delta.
From now on the R/3 system takes care of the delta. The delta data of the application will then be posted to the delta queue each time your scheduled report runs. This might be hourly, daily, or any other period, depending on your needs.
If you need extra fields in your extract structure, you need to create an append structure for the related MCXXXX structure, maintain your extract structure once more, and run a setup again. The newly appended fields need to be populated at extraction time using the user exit for transaction data.
If you have more questions let us know and don't forget to assign some points to the guys that help you.
regards
Siggi
Hi Siggi,
Thanks for the insight. I am running this right now in development and wanted to know if I need to bring the system down when I fill the setup tables.
I also wanted to know when to select the type of delta in LO. Is it before running the initial setup? Let us say for the initial load I wanted to run a direct delta and then for the other loads I wanted to run a queued delta; what would be the procedure?
<b>After successful set up run, start your initial upload to BW and initialize the delta.</b>
Could you expand a bit more on this, as this is the first time I am doing it? Could you also tell me the user exit for orders and where I need to call it? I don't have any experience with this and hence would need a detailed explanation.
I will definitely assign points to you Siggi as its all a matter of gratitude and I thank you in advance.
thanks
amit
For deliveries, what would be the MC* structure name?
Hi Amit,
even in the development box it would be nice if there is no activity on it, but I think it is not strictly necessary. You might get 'some' strange bookings into BW if somebody posts data to the application while the setup is running and the delta initialization is not yet done, but from my experience (in the development box) this is not relevant. Always delete the setup tables before running a new setup.
After the setup has finished successfully, go to BW and schedule an infopackage. You have two options now:
1. (I prefer this one in the productive environment) Schedule an infopackage for the initialization of the delta without data transfer and run the package (from now on, every new or changed record goes to the delta queue). Then run a second package and choose full upload (this data is selected from the setup tables).
2. Schedule an infopackage for the initialization of the delta with data transfer and run the package (depending on the amount of data, this might take a long time).
After the data is initially loaded schedule a new delta package that will always extract the data from the delta queue of the application.
For the enhancements, you only need to do some coding in SAP enhancement RSAP0001, in function module EXIT_SAPLRSAP_001, include ZXRSAU01.
Hope this brings some light into the dark and is of some help for your ongoing work.
regards
Siggi
Hi siggi,
thanks, that helped a lot. The only part that is still unanswered: for the initial load, if there is a huge amount of data, we would want to run the process in parallel by dividing the job. Can you explain the steps to run the process in parallel and where I create these jobs?
I have allotted almost full points and will allocate the full points once this is answered.
thanks
amit
Hi Amit,
normally you run the setup in your source system once for all the historic data. If you want to divide the load into two or more steps, you need a good selection criterion for it (change date, maybe document number). Then you schedule the uploads within BW, running different infopackages (full upload) with the different selections. These infopackages can be scheduled in parallel or one after the other, depending on the amount of data, the load on your machine, and so on.
Hope this helps.
Siggi
Hi Siggi,
I am dealing with the delivery line datasource. I am not familiar with the code that I need to put in the user exit. We have 3 new fields that have been added on the delivery line table, and we need the extractor to be enhanced to take these 3 new fields into account.
Please can you send across code that I can use?
thanks
amit
Hi Amit,
in the first step you need to append the same fields to structure MCLIPS. Then, within the LO Customizing Cockpit, change your extract structure again. You will see the appended fields there and will be able to select them into your extract structure. Then activate your datasource again and run a new setup (you need to delete the setup tables as well as the delta queue before changing the extract structure). Check your datasource with RSA3. If you are lucky, your fields will already be populated. If not, you need to apply some coding. I will provide you with the next steps on Monday, because I don't have access to a system now and need to check the name of the include.
have a nice weekend
regards
Siggi
Hi Amit,
here is the information you need for the coding enhancement. Go to SAP enhancement RSAP0001 and choose function module EXIT_SAPLRSAP_001 (transaction data). Edit the include ZXRSAU01 and add the following coding:
data: wa_data type <name of your extract structure>,
      wa_lips type lips,
      l_index type i.

CASE i_datasource.
  WHEN '2LIS_12_VCITM'.
    loop at c_t_data into wa_data.
      l_index = sy-tabix.
      select single * into wa_lips from lips
        where vbeln = wa_data-vbeln
          and posnr = wa_data-posnr.
      if sy-subrc = 0.
        wa_data-<name of your append fields> = wa_lips-<name of your append fields>.
*       ....
      endif.
      modify c_t_data from wa_data index l_index.
    endloop.
ENDCASE.
regards
Siggi
Hello Siggi,
Sorry to bother you again with this. The fields that we are adding are not on the LIPS table. A function module has been written to create 4 fields on the fly, and we need to populate the extract structure with these 4 fields so that the BW system gets them. If I can get the code I would be really thankful.
thanks
amit
Hey Siggi,
When you say MCLIPS, where do I do that? I am confused, since only if the fields are on LIPS will they come up on the right-hand side so that we can move them and append them to the extract structure.
If you say we need to call the function module, which part of the code would I need to change? This is the first time I am playing with the LO cockpit, and hence I have no idea.
Please help me out.
thanks
amit
Hi Amit,
the fields for MCLIPS must be appended using transaction SE11. After you activate MCLIPS, you are right, you will see the new fields on the right side in transaction LBWE for the maintenance of your extract structure and you will be able to put them into the extract structure with the arrow button.
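As an illustration only, the append in SE11 might look like this (the append name ZAMCLIPS, the field names and the data elements are placeholders, not your real ones; choose your own):

```abap
* In SE11, display structure MCLIPS, then choose Goto -> Append Structure
* and create a customer append. Append fields should begin with ZZ so
* they cannot clash with SAP's own fields.
*
* Append structure ZAMCLIPS (appended to MCLIPS):
*   Field       Component type
*   ZZFIELD1    <your data element>
*   ZZFIELD2    <your data element>
*   ZZFIELD3    <your data element>
*   ZZFIELD4    <your data element>
```

After activating the append, the ZZ fields show up in LBWE as described above.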
To give you an exact direction for the coding, I need to know the fm (import/export parameters, tables ...). If you will provide me with this information I'll try to give you the coding.
regards
Siggi
Hi Siggi,
Here is the function module:
name: Z_PALLET_SPOT_CALC_DELIVERY
FUNCTION z_pallet_spot_calc_delivery.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_VBELN) LIKE LIKP-VBELN DEFAULT '86000801'
*"     VALUE(I_POSNR) LIKE LIPS-POSNR OPTIONAL
*"     VALUE(I_VKORG) LIKE LIKP-VKORG DEFAULT 'US01'
*"     VALUE(I_VTWEG) LIKE LIPS-VTWEG DEFAULT '00'
*"  EXPORTING
*"     VALUE(E_PALLET_SPOT) LIKE ZZVS001-SPOT
*"     VALUE(E_REAL_PALLETS) LIKE ZZVS001-SPOT
*"     VALUE(E_TOTAL_WEIGHT) LIKE ZZVS001-BRGEW
*"     VALUE(E_TOTAL_VOLUME) LIKE ZZVS001-VOLUM
*"     VALUE(E_TOTAL_PALLETS_3) LIKE ZZVS001-ZSPOT_3
*"  TABLES
*"      T_LIPS STRUCTURE LIPS OPTIONAL
*"      T_ZZVS001 STRUCTURE ZZVS001 OPTIONAL
*"      T_ZZVS0012 STRUCTURE ZZVS0012 OPTIONAL
*"  EXCEPTIONS
*"      NO_DATA_FOUND
*"----------------------------------------------------------------------
DATA: l_lfimg LIKE lips-lfimg, " Delivery Quantity
l_matnr LIKE lips-matnr, " Material Number
l_meins LIKE lips-meins, " Base Unit of Measure
l_vrkme LIKE lips-vrkme, " Sales Unit
ls_zzvs001 LIKE zzvs001,
ls_zzvs0012 LIKE zzvs0012.
CLEAR: e_pallet_spot, e_real_pallets, e_total_weight, e_total_volume,
e_total_pallets_3, z_spots_3, z_spots_1, z_tot_wgt, z_tot_vol.
FREE: t_zzvs001, "Clear header line and contents
t_zzvs0012.
IF NOT i_vbeln IS INITIAL.
IF NOT i_posnr IS INITIAL.
SELECT *
INTO TABLE t_lips
FROM lips
WHERE vbeln EQ i_vbeln
AND posnr EQ i_posnr.
ELSE.
SELECT *
INTO TABLE t_lips
FROM lips
WHERE vbeln EQ i_vbeln.
ENDIF.
IF sy-subrc NE 0.
RAISE no_data_found.
ENDIF.
ENDIF.
LOOP AT t_lips.
CHECK t_lips-nopck IS INITIAL " Indicator not relevant for pici
AND NOT t_lips-komkz IS INITIAL. " Indicator for picking control
l_matnr = t_lips-matnr.
l_lfimg = t_lips-lfimg.
l_meins = t_lips-meins.
l_vrkme = t_lips-vrkme.
CLEAR: ls_zzvs001, " Clear header line
ls_zzvs0012.
ls_zzvs001-mandt = sy-mandt.
* Convert sales unit of measure to base unit of measure
IF l_meins NE l_vrkme.
PERFORM mat_unit_conv
USING l_lfimg " Delivery Quantity
l_matnr " Material Number
l_vrkme " Sales Unit
l_meins " Base Unit of Measure
CHANGING dummy_f
zz_umren
zz_umrez
rc.
CHECK rc = 0.
l_lfimg = dummy_f.
ENDIF.
* Convert a base unit of measure to 1 pallet
* MENG is the quantity to be converted in base unit of meas ie 'CA '
* cases. Returned ZZ_UMREZ contains the number of cases per 1 pallet
SELECT SINGLE *
FROM marm
WHERE matnr = l_matnr
AND meinh = 'PAL'.
CHECK sy-subrc = 0.
PERFORM mat_unit_conv
USING zz_lfimg " Constant 1000
l_matnr " Material Number
'PAL' " Pallet
l_meins " Base Unit of Measure
CHANGING dummy_f
zz_umren
zz_umrez
rc.
CHECK rc = 0.
* Get weight and volume and product attributes for material
PERFORM mat_maapv
USING l_matnr
i_vkorg
i_vtweg
CHANGING maapv
rc.
CHECK rc = 0.
* For PALLET SPOTS:
* If there is any value in the hundredths decimal position, it will
* round the tenths decimal position up by 1 tenth. Because of float-
* ing point calculations this is accomplished by adding .444 to calc
CLEAR: z_spots, vz_spots.
IF l_lfimg NE 0.
z_spots = ( ( l_lfimg * zz_umren ) / zz_umrez ) + z_5.
vz_spots = ( l_lfimg * zz_umren ) / zz_umrez.
vz_spots = vz_spots + z_4.
ENDIF.
* Calculate Number of Real Pallets
e_real_pallets = e_real_pallets + z_spots.
ls_zzvs001-real_pallets = z_spots.
* Product attribute 4 = Top Load product
* Product attribute 5 = Single Stack Prod(reserves 2 pallet spots)
*   Single Stack: move calc'd field to dec4_1 to get value in tenths
*   before multiplying by factor of 2.
* Product attribute 6 = Half Mod Prod(reserves 1/2 pallet spots)
*   Divide by factor of 2
* Product attribute 7 = Qtr Mod Prod(reserves 1/4 pallet spots)
*   Divide by factor of 4
z_spots_cpy = z_spots.
IF maapv-prat7 = 'X'. " Quarter Mod Product
ls_zzvs001-spot = z_spots.
ls_zzvs001-zqtrmod = z_spots_cpy.
ls_zzvs001-spot = ls_zzvs001-spot / 4.
vz_spots = vz_spots / 4.
z_spots = ls_zzvs001-spot.
ENDIF.
z_spots_cpy = z_spots.
IF maapv-prat6 = 'X'. " Half Mod Product
ls_zzvs001-spot = z_spots.
ls_zzvs001-zhalfmod = z_spots_cpy.
ls_zzvs001-spot = ls_zzvs001-spot / 2.
z_spots = ls_zzvs001-spot.
vz_spots = vz_spots / 2.
ENDIF.
IF maapv-prat5 = 'X'. " Single Stack Product
ls_zzvs001-spot = z_spots.
ls_zzvs001-zsnglstk = z_spots_cpy.
ls_zzvs001-spot = ls_zzvs001-spot * 2.
z_spots = ls_zzvs001-spot.
vz_spots = vz_spots * 2.
ELSE.
ls_zzvs001-spot = z_spots.
ENDIF.
IF maapv-prat4 = 'X'. " Top Load Product
ls_zzvs001-ztopload = z_spots_cpy.
ENDIF.
ls_zzvs001-prat4 = maapv-prat4.
ls_zzvs001-prat5 = maapv-prat5.
ls_zzvs001-prat6 = maapv-prat6.
ls_zzvs001-prat7 = maapv-prat7.
ls_zzvs001-brgew = maapv-brgew * l_lfimg.
ls_zzvs001-gewei = maapv-gewei.
ls_zzvs001-volum = maapv-volum * l_lfimg.
ls_zzvs001-voleh = maapv-voleh.
IF maapv-prat4 = 'X'
OR maapv-prat5 = 'X'.
ls_zzvs001-tlss = z_spots.
ENDIF.
IF maapv-prat7 = 'X'. " Quarter Mod Product
ls_zzvs001-real_pallets = ls_zzvs001-real_pallets / 4.
ELSEIF maapv-prat6 = 'X'. " Half Mod Product
ls_zzvs001-real_pallets = ls_zzvs001-real_pallets / 2.
ENDIF.
ls_zzvs001-zspot_3 = vz_spots.
APPEND ls_zzvs001 TO t_zzvs001.
z_tot_wgt = z_tot_wgt + ls_zzvs001-brgew.
z_tot_vol = z_tot_vol + ls_zzvs001-volum.
z_spots_3 = z_spots_3 + vz_spots.
z_spots_1 = z_spots_1 + ls_zzvs001-spot.
* Load additional information into secondary table structure. Quantity
* has already been converted to Base Unit of Measure
t_lips-matnr = '000003400024000000'.
t_lips-lfimg = 10023.
t_lips-meins = 'CA'.
IF maapv-prat7 = 'X'. " Quarter Mod Product
l_lfimg = t_lips-lfimg / 4.
ELSEIF maapv-prat6 = 'X'. " Half Mod Product
l_lfimg = t_lips-lfimg / 2.
ELSE.
l_lfimg = t_lips-lfimg.
ENDIF.
PERFORM get_additional_keyf
USING t_lips-matnr " Material Number
l_lfimg " Delivery Quantity
t_lips-meins " Base Unit of Measure
CHANGING ls_zzvs0012-ztotpalq " Total Pallets
ls_zzvs0012-zfullpalq " Full Pallets
ls_zzvs0012-zfulllayq " Full Layers
ls_zzvs0012-zfullcaq " Full Cases
ls_zzvs0012-zmisccaq. " Miscellaneous Cases
ls_zzvs0012-mandt = t_lips-mandt.
ls_zzvs0012-vbeln = t_lips-vbeln.
ls_zzvs0012-posnr = t_lips-posnr.
ls_zzvs0012-matnr = t_lips-matnr.
ls_zzvs0012-zspot_3 = ls_zzvs001-zspot_3.
APPEND ls_zzvs0012 TO t_zzvs0012.
ENDLOOP.
* this pallet total is 1 decimal precision
e_pallet_spot = z_spots_3.
IF maapv-prat7 EQ 'X'. "Qtr Mod
e_real_pallets = e_real_pallets / 4.
ELSEIF maapv-prat6 EQ 'X'. "Half Mod
e_real_pallets = e_real_pallets / 2.
ENDIF.
* this pallet total is 3 decimal precision
e_total_pallets_3 = z_spots_3.
e_total_weight = z_tot_wgt.
e_total_volume = z_tot_vol.
ENDFUNCTION.
I hope that's what is needed. I look forward to your kind reply.
Thanks so much in advance
Amit
Hi Amit,
here the coding:
data: wa_data type <name of your extract structure>,
      l_index type i.

CASE i_datasource.
  WHEN '2LIS_12_VCITM'.
    loop at c_t_data into wa_data.
      l_index = sy-tabix.
*     now call your function
      call function 'Z_PALLET_SPOT_CALC_DELIVERY'
        EXPORTING
          i_vbeln           = wa_data-vbeln
          i_posnr           = wa_data-posnr
          i_vkorg           = wa_data-vkorg
          i_vtweg           = wa_data-vtweg
        IMPORTING
          e_pallet_spot     = wa_data-pallet_spot
          e_real_pallets    = wa_data-real_pallets
          e_total_weight    = wa_data-total_weight
          e_total_volume    = wa_data-total_volume
          e_total_pallets_3 = wa_data-total_pallets_3
        EXCEPTIONS
          no_data_found     = 1
          OTHERS            = 2.
      if sy-subrc = 0.
        modify c_t_data from wa_data index l_index.
      endif.
    endloop.
ENDCASE.
Please check the field names, since I don't know if you named them as I did in this example, and check that you have vbeln, posnr, vkorg and vtweg in your extract structure. Then it should work.
regards
Siggi