on 09-14-2011 1:41 AM
Hi Gurus,
I am reading a master data table and populating the field ZSETLRUN in a cube. The data flows from a DSO to the cube, and in the transformation END routine I have written code to populate this field.
The master data InfoObject has 10 million records, and loading 100 thousand records from the DSO to the cube takes almost 1 hour in the transformation.
I am pasting the code below. Let me know if it can be optimised.
DATA: wa_i_tab1 TYPE tys_tg_1.
DATA: i_tab TYPE STANDARD TABLE OF tys_tg_1.
DATA: wa_result_pkg  TYPE tys_tg_1,
      wa_result_pkg1 TYPE tys_tg_1.

SELECT /bic/zsetlrun agreement /bic/zreb_sdat /bic/zlitem1
  FROM /bic/pzreb_sdat
  INTO CORRESPONDING FIELDS OF TABLE i_tab
  FOR ALL ENTRIES IN RESULT_PACKAGE
  WHERE /bic/zreb_sdat = RESULT_PACKAGE-/bic/zreb_sdat
    AND agreement      = RESULT_PACKAGE-agreement
    AND /bic/zlitem1   = RESULT_PACKAGE-/bic/zlitem1.

DELETE i_tab WHERE /bic/zsetlrun = ''.

SORT RESULT_PACKAGE BY agreement /bic/zreb_sdat /bic/zlitem1.
SORT i_tab BY agreement /bic/zreb_sdat /bic/zlitem1.

LOOP AT RESULT_PACKAGE INTO wa_result_pkg.
  READ TABLE i_tab INTO wa_i_tab1
       WITH KEY /bic/zreb_sdat = wa_result_pkg-/bic/zreb_sdat
                agreement      = wa_result_pkg-agreement
                /bic/zlitem1   = wa_result_pkg-/bic/zlitem1.
  IF sy-subrc = 0.
    MOVE wa_i_tab1-/bic/zsetlrun TO wa_result_pkg-/bic/zsetlrun.
    wa_result_pkg1-/bic/zsetlrun = wa_result_pkg-/bic/zsetlrun.
    MODIFY RESULT_PACKAGE FROM wa_result_pkg1
           TRANSPORTING /bic/zsetlrun.
  ENDIF.
  CLEAR: wa_i_tab1, wa_result_pkg1, wa_result_pkg.
ENDLOOP.
Hi Dheeraj,
I can suggest a few improvements to the code, though I am not sure how much they will cut your loading time. Just give them a try:
1. Avoid INTO CORRESPONDING FIELDS OF TABLE: declare the fields of i_tab in the same sequence as the field list of the SELECT, because CORRESPONDING FIELDS takes much more time.
2. Instead of looping into a work area and then using MODIFY, loop with a field symbol: LOOP AT RESULT_PACKAGE ASSIGNING <wa_result_pkg>. A value assigned through the field symbol updates RESULT_PACKAGE directly, so the MODIFY is no longer needed.
3. As you have sorted i_tab, you can add BINARY SEARCH to the READ TABLE (make sure the WITH KEY field order matches the SORT order).
The spots to change are marked in your code below:

DATA: wa_i_tab1 TYPE tys_tg_1.
DATA: i_tab TYPE STANDARD TABLE OF tys_tg_1.
DATA: wa_result_pkg  TYPE tys_tg_1,
      wa_result_pkg1 TYPE tys_tg_1.

SELECT /bic/zsetlrun agreement /bic/zreb_sdat /bic/zlitem1
  FROM /bic/pzreb_sdat
  INTO CORRESPONDING FIELDS OF TABLE i_tab    "<-- suggestion 1
  FOR ALL ENTRIES IN RESULT_PACKAGE
  WHERE /bic/zreb_sdat = RESULT_PACKAGE-/bic/zreb_sdat
    AND agreement      = RESULT_PACKAGE-agreement
    AND /bic/zlitem1   = RESULT_PACKAGE-/bic/zlitem1.

DELETE i_tab WHERE /bic/zsetlrun = ''.

SORT RESULT_PACKAGE BY agreement /bic/zreb_sdat /bic/zlitem1.
SORT i_tab BY agreement /bic/zreb_sdat /bic/zlitem1.

LOOP AT RESULT_PACKAGE INTO wa_result_pkg.    "<-- suggestion 2
  READ TABLE i_tab INTO wa_i_tab1
       WITH KEY /bic/zreb_sdat = wa_result_pkg-/bic/zreb_sdat
                agreement      = wa_result_pkg-agreement
                /bic/zlitem1   = wa_result_pkg-/bic/zlitem1.    "<-- suggestion 3
  IF sy-subrc = 0.
    MOVE wa_i_tab1-/bic/zsetlrun TO wa_result_pkg-/bic/zsetlrun.
    wa_result_pkg1-/bic/zsetlrun = wa_result_pkg-/bic/zsetlrun.
    MODIFY RESULT_PACKAGE FROM wa_result_pkg1
           TRANSPORTING /bic/zsetlrun.
  ENDIF.
  CLEAR: wa_i_tab1, wa_result_pkg1, wa_result_pkg.
ENDLOOP.
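Put together, the three suggestions would look roughly like this. This is only a sketch that reuses the type and field names from the post above; the local type ty_md and the empty-package guard are my additions, not part of the original routine:

* Sketch only: suggestions 1-3 applied to the original end routine.
* Suggestion 1: lookup structure whose field order matches the
* SELECT list, so CORRESPONDING FIELDS is not needed.
TYPES: BEGIN OF ty_md,
         /bic/zsetlrun  TYPE tys_tg_1-/bic/zsetlrun,
         agreement      TYPE tys_tg_1-agreement,
         /bic/zreb_sdat TYPE tys_tg_1-/bic/zreb_sdat,
         /bic/zlitem1   TYPE tys_tg_1-/bic/zlitem1,
       END OF ty_md.
DATA: i_tab    TYPE STANDARD TABLE OF ty_md,
      wa_i_tab TYPE ty_md.
FIELD-SYMBOLS: <result> TYPE tys_tg_1.

* FOR ALL ENTRIES with an empty driver table would read the whole
* master data table, so guard against an empty package.
IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT /bic/zsetlrun agreement /bic/zreb_sdat /bic/zlitem1
    FROM /bic/pzreb_sdat
    INTO TABLE i_tab
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE agreement      = RESULT_PACKAGE-agreement
      AND /bic/zreb_sdat = RESULT_PACKAGE-/bic/zreb_sdat
      AND /bic/zlitem1   = RESULT_PACKAGE-/bic/zlitem1.
ENDIF.

DELETE i_tab WHERE /bic/zsetlrun = ''.
SORT i_tab BY agreement /bic/zreb_sdat /bic/zlitem1.

* Suggestion 2: field symbol instead of work area + MODIFY.
LOOP AT RESULT_PACKAGE ASSIGNING <result>.
* Suggestion 3: BINARY SEARCH; the WITH KEY field order must
* match the SORT order above, or the search returns wrong rows.
  READ TABLE i_tab INTO wa_i_tab
       WITH KEY agreement      = <result>-agreement
                /bic/zreb_sdat = <result>-/bic/zreb_sdat
                /bic/zlitem1   = <result>-/bic/zlitem1
       BINARY SEARCH.
  IF sy-subrc = 0.
*   Writing through the field symbol updates RESULT_PACKAGE in
*   place, so no MODIFY is required.
    <result>-/bic/zsetlrun = wa_i_tab-/bic/zsetlrun.
  ENDIF.
ENDLOOP.

Note that the SORT of RESULT_PACKAGE itself is dropped here: with the binary search on i_tab, the order of RESULT_PACKAGE no longer matters for the lookup.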
Regards,
Durgesh.
Is there any specific reason for writing an end routine? Can't you achieve the same in a start routine? Generally, the following improve performance when you write a SELECT on a master data table:
1. Before writing the SELECT, ensure you do not have any duplicate records for the key in the master data.
2. Define your internal table as a HASHED table if the master data is huge and you are retrieving unique records by key.
3. Do not forget to include OBJVERS = 'A' in the SELECT, since this is also part of the key.
4. The WHERE clause should use the same key sequence as in the P table of your master data.
5. It is better to write this in a start routine, since you then do not need to loop over the result again: go with the update routine, read the internal table with the key, and populate your required fields.
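A rough sketch of points 2-4 applied to the SELECT from the original post. Field names are taken from the post; I am assuming the compound key of /BIC/PZREB_SDAT is AGREEMENT, /BIC/ZREB_SDAT, /BIC/ZLITEM1 plus OBJVERS, so the unique hashed key holds after restricting to OBJVERS = 'A':

* Sketch: hashed lookup table with a unique key, OBJVERS = 'A'.
DATA: i_tab TYPE HASHED TABLE OF tys_tg_1
        WITH UNIQUE KEY agreement /bic/zreb_sdat /bic/zlitem1,
      wa_i_tab1 TYPE tys_tg_1.

IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT /bic/zsetlrun agreement /bic/zreb_sdat /bic/zlitem1
    FROM /bic/pzreb_sdat
    INTO CORRESPONDING FIELDS OF TABLE i_tab
    FOR ALL ENTRIES IN RESULT_PACKAGE
*   WHERE clause in the key order of the P table, OBJVERS included.
    WHERE agreement      = RESULT_PACKAGE-agreement
      AND /bic/zreb_sdat = RESULT_PACKAGE-/bic/zreb_sdat
      AND /bic/zlitem1   = RESULT_PACKAGE-/bic/zlitem1
      AND objvers        = 'A'.
ENDIF.

* Inside the loop over RESULT_PACKAGE: a hashed key access is
* constant-time, so no SORT and no BINARY SEARCH are needed.
READ TABLE i_tab INTO wa_i_tab1
     WITH TABLE KEY agreement      = wa_result_pkg-agreement
                    /bic/zreb_sdat = wa_result_pkg-/bic/zreb_sdat
                    /bic/zlitem1   = wa_result_pkg-/bic/zlitem1.

With 10 million master data records, the constant-time hashed read is what removes the per-row search cost from the loop.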
Hope this helps.
Thanks,
VSK.
Hi VSK
I have to write this in the end routine as I have a constraint.
However, I need to improve the performance of the code written in the end routine.
I can see that while uploading the data, 95% of the total DTP upload time is spent in the end routine.
Generally, by what percentage would the performance improve if I used a hashed table and OBJVERS = 'A'?
My master data table is huge; it contains 10 million records.
For 1000 records, the comparison against the master data takes almost 1 hour.
Thanks
Dheeraj