10-09-2018 8:50 AM
Hi Experts,
How can I overcome a timeout issue while executing a report? It takes about 10 minutes and then times out with a dump at the point below.
LOOP AT itab ASSIGNING FIELD-SYMBOL(<mat_plant>).
  DATA(char) = VALUE tt_value( FOR char IN it_chars WHERE ( ..... ) ).
Can anyone suggest something? There are about 400,000 records being fetched in the step above.
I tried different approaches, like BYPASSING BUFFER, idx, ... but with no success.
Thanks,
Murali
10-09-2018 9:41 AM
Hi Murali,
Have you tried PACKAGE SIZE? You can go through the link below for a better understanding of it. From your sample code it is difficult for me to give you an exact solution, but the link may give you an idea of how to use PACKAGE SIZE to reduce the amount of data in the internal table you are then looping over.
https://archive.sap.com/discussions/thread/575190
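A minimal sketch of the PACKAGE SIZE idea (the field list and chunk size here are illustrative, not taken from the actual report):

```abap
" Process the selection in chunks instead of loading everything at once.
TYPES: BEGIN OF ty_chunk,
         matnr TYPE marc-matnr,
         werks TYPE marc-werks,
         cuobj TYPE marc-cuobj,
       END OF ty_chunk.

DATA lt_chunk TYPE STANDARD TABLE OF ty_chunk WITH EMPTY KEY.

SELECT matnr, werks, cuobj
  FROM marc
  INTO TABLE @lt_chunk
  PACKAGE SIZE 10000.

  " lt_chunk holds at most 10,000 rows here; process them, then the
  " next iteration of the SELECT loop replaces them with the next chunk.
  LOOP AT lt_chunk ASSIGNING FIELD-SYMBOL(<ls_chunk>).
    " ... build output for this chunk ...
  ENDLOOP.

ENDSELECT.
```

This keeps the memory footprint per iteration bounded even as the total record count grows.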
Thanks
Ila Chaudhary
10-09-2018 2:37 PM
Hi Murali,
More information about the report would be very helpful. For example, if the purpose of this report is just to provide some ALV output, I would recommend running it as a background job. You will get the ALV output as a spool file, which you can access via transaction SM37 or SP02.
If the report needs user interaction or GUI upload/download steps, you can try increasing the timeout limit, but I would not recommend it.
Please let me know if you need any further clarification.
Thanks and Regards,
Arpan Shukla
10-09-2018 7:35 PM
First: show your code, or explain in more detail what exactly goes on in the loop and with what type of data.
In the absence of more information: move as much of the processing logic outside the loop as possible. Are there records that can be skipped or deleted right away? Is there a way to do a specific manipulation/read beforehand, or in a separate query? Are you using a STANDARD TABLE, or can you switch to a SORTED TABLE / HASHED TABLE, which would be faster for keyed access?
Have you traced accurately where most of the time is lost inside the loop? Attack that part first.
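To illustrate the table-kind point (field names here are only an assumption, borrowed from the snippets in this thread):

```abap
" A LOOP ... WHERE over a STANDARD TABLE scans every row on each lookup.
" If the table is SORTED by the fields used in the WHERE condition,
" the kernel can locate the matching block via binary search instead.
TYPES: BEGIN OF ty_char,
         objectid TYPE matnr,
         plant    TYPE werks_d,
       END OF ty_char.

DATA lt_chars TYPE SORTED TABLE OF ty_char
     WITH NON-UNIQUE KEY objectid plant.

LOOP AT lt_chars ASSIGNING FIELD-SYMBOL(<ls_char>)
     WHERE objectid = 'MAT1' AND plant = 'PL01'.
  " Only the rows matching the key prefix are visited.
ENDLOOP.
```

With 400,000 rows and a lookup per outer record, the difference between a full scan and a keyed read per iteration is usually where most of the runtime goes.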
10-10-2018 3:48 PM
Hi Experts,
Attaching code.scn.txt.
Background is OK, but the report also runs in the foreground, and the number of records may grow in the future, so it should not dump.
Moderator note, for ease of reading, I've put the code inline.
IF it_mat_plant_keys IS NOT INITIAL.
  SELECT marc~matnr AS objectid, marc~werks AS plant,
         ibsymbol~atinn AS charname,
         ibin~klart AS class_type,
         ibsymbol~atwrt AS value,
         ibsymbol~atflv AS valuefrom
*        cabn~msehi AS uom
    INTO TABLE @DATA(lt_characters)
    FROM marc
    JOIN ibin ON ibin~instance = marc~cuobj
    JOIN ibinvalues ON ibinvalues~in_recno = ibin~in_recno
    JOIN ibsymbol ON ibsymbol~symbol_id = ibinvalues~symbol_id
*   JOIN cabn ON cabn~atinn = ibsymbol~atinn
    FOR ALL ENTRIES IN @it_mat_plant_keys
    WHERE marc~matnr = @it_mat_plant_keys-matnr AND
          marc~werks = @it_mat_plant_keys-werks AND
          ibin~delflag EQ @space AND
          ibin~valto EQ 99991231235959.

  IF lt_characters IS NOT INITIAL.
    "build final output table
    LOOP AT it_mat_plant_keys ASSIGNING FIELD-SYMBOL(<mat_plant_key>).
      DATA(chars) = VALUE tt_charvalues( FOR char IN lt_characters
        WHERE ( objectid = <mat_plant_key>-matnr AND plant = <mat_plant_key>-werks )
        ( objectid   = char-objectid
          charname   = char-charname
          class_type = char-class_type
          value      = char-value
          valuefrom  = char-valuefrom ) ).
      SORT chars BY charname class_type.
      APPEND VALUE ty_charvalues_out_mat( material   = <mat_plant_key>-matnr
                                          plant      = <mat_plant_key>-werks
                                          charvalues = chars ) TO et_characterstics.
    ENDLOOP.
  ENDIF.
ENDIF.
ENDMETHOD.
10-11-2018 8:51 AM
Do what Nic suggests: define lt_characters before your SELECT as a SORTED table with NON-UNIQUE key objectid plant.
And of course it works in the background. Background jobs don't have the dialog timeout that foreground sessions do...
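Applied to the code above, that suggestion would look roughly like this (the line type is an assumption, since the real type definitions aren't shown in the thread):

```abap
" Assumed line type matching the SELECT list; adjust to the real types.
TYPES: BEGIN OF ty_character,
         objectid   TYPE matnr,
         plant      TYPE werks_d,
         charname   TYPE atinn,
         class_type TYPE klassenart,
         value      TYPE atwrt,
         valuefrom  TYPE atflv,
       END OF ty_character.

" SORTED + NON-UNIQUE KEY objectid plant: the FOR ... WHERE filter in
" the VALUE constructor can then use key access instead of scanning
" all 400,000 rows once per outer record.
DATA lt_characters TYPE SORTED TABLE OF ty_character
     WITH NON-UNIQUE KEY objectid plant.

" The existing SELECT then fills this table instead of an inline one:
" SELECT ... INTO TABLE @lt_characters ...
```

The only change to the existing code is replacing `INTO TABLE @DATA(lt_characters)` with `INTO TABLE @lt_characters`; the rest of the loop stays the same.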
10-11-2018 7:26 AM
Could you try CORRESPONDING instead of the LOOP with FOR, like this:
chars = CORRESPONDING #( lt_characters FROM it_mat_plant_keys
                         USING KEY mkey objectid = matnr
                                        plant    = werks
                         MAPPING objectid   = objectid
                                 charname   = charname
                                 class_type = class_type
                                 value      = value
                                 valuefrom  = valuefrom ).
I don't know whether it works or not, but it could be worth a try...
I also thought about a MESH type to handle your joined data via associations, but that may be another story...