
Welcome! This article is an extension of the original article; here we will look at more complex custom logic for a custom Rule Action in SAP Corporate Serialization (SAP CorS). More information on CorS can be found here and in the SAP Help.

 

1. What has already been done

In the previous article we created a BAdI implementation of /CORS/BADI_RULE_ACTION with custom logic for a custom Rule Action that stores some event data in a CSV file on the application server. It is executed for every packing event, which is fine if you want to process every event one by one and have a separate file for each consecutive event. But what if some events are related to each other, are actually sent in one notification message, and should be processed as a group, so that one file is created per group of events (i.e. per notification message)? In that case we need to adjust our code.

Code from the previous article:

 

METHOD /CORS/IF_BADI_RULE_ACTION~EXECUTE.
  DATA:
        lt_output TYPE ZCORS_T_NTF_CONTENT,
        lt_file TYPE STANDARD TABLE OF string,
        lv_filepath TYPE string.

  CONSTANTS:
        lc_dir TYPE eps2filnam VALUE '/tmp',
        lc_format TYPE c LENGTH 3 VALUE 'csv'.

  DATA(lv_filename) = |test{ sy-datum }_{ iv_evtid }|.

"get event data from DB tables
  SELECT @( |{ sy-datum DATE = ENVIRONMENT }| ) AS ntf_date,
         CONCAT( o1~product, o1~serial ) AS obj1,
         CONCAT( o2~product, o2~serial ) AS obj2
    FROM /CORS/DM_EVT_REL AS r
    INNER JOIN /CORS/DM_OBJ_IDS AS o1
      ON r~objid = o1~objid
    INNER JOIN /CORS/DM_OBJ_HRY AS h
      ON o1~objid = h~objid_child
    INNER JOIN /CORS/DM_OBJ_IDS AS o2
      ON h~objid_parent = o2~objid
    WHERE r~evtid = @iv_evtid
      AND o1~obj_type = '18'
      AND o2~obj_type = '17'
    INTO CORRESPONDING FIELDS OF TABLE @lt_output.
  
  SORT lt_output BY ntf_date obj2 obj1 ASCENDING.
  DELETE ADJACENT DUPLICATES FROM lt_output.

  IF lt_output IS NOT INITIAL.
"create csv file and save on AS
    lt_file = VALUE #( ( |Date,Parent serial no,Child serial ID| ) ).
    LOOP AT lt_output ASSIGNING FIELD-SYMBOL(<ls_output>).
      lt_file = VALUE #( BASE lt_file ( |{ <ls_output>-ntf_date },{ <ls_output>-obj2 },{ <ls_output>-obj1 }| ) ).
    ENDLOOP.

    lv_filepath = |{ lc_dir }/{ lv_filename }.{ lc_format }|.
    OPEN DATASET lv_filepath FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
    IF sy-subrc = 0.
      LOOP AT lt_file ASSIGNING FIELD-SYMBOL(<ls_file>).
        TRANSFER <ls_file> TO lv_filepath.
      ENDLOOP.
      CLOSE DATASET lv_filepath.
    ENDIF.
  ENDIF.
ENDMETHOD.

 


2. What needs to be adjusted

To log which events of the group have already been processed, we need to create a custom table ZCORS_EVENT_LOG.
(Screenshot: structure of custom table ZCORS_EVENT_LOG with key field MSGGUID and the counter fields EVENTS and PROCESSED)
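
For reference, a minimal definition of such a table could look like the DDIC DDL sketch below. The field names mirror the ones used in the code later in this article; the type of MSGGUID reuses the data element from the variable declaration there and is an assumption, so use whatever type MSGGUID_IN has in /CORS/DM_EVENT in your system.

@EndUserText.label : 'Processed events per notification message'
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #RESTRICTED
define table zcors_event_log {
  key mandt   : abap.clnt not null;         // client
  key msgguid : /cors/e_msguid_in not null; // inbound AIF message GUID (assumed type)
  events      : abap.int4;                  // total number of events in the group
  processed   : abap.int4;                  // events of the group already processed
}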

Then we need to adjust our code to meet the new requirements:

  • First, we need to find all events related to the current event we are processing. For that we will use table /CORS/DM_EVENT: we need to find all records in this table with the same MSGGUID_IN (AIF Message GUID (Inbound Interface)) value as our current event.
  • We need to determine whether we are dealing with the first event of a group or whether some events of the group have already been processed. For that we will use our new table ZCORS_EVENT_LOG.
    • If our current event is the first of a new group (no record with the same MSGGUID_IN exists in that table), we need to add a new row to our log table.
    • If there is a record with the same MSGGUID_IN, we need to check whether the current event is the last of the group (by comparing the number of already processed events PROCESSED with the total number of events in the group EVENTS). If we are processing the last event, we execute the logic from the previous article, adjusting only the data selection: we need to get the data from the DB tables for all events of the group. After processing we can delete the corresponding record from ZCORS_EVENT_LOG.
    • If we are processing neither the first nor the last event of the group, we just need to increase ZCORS_EVENT_LOG-PROCESSED.

Here is example code with the changes:

METHOD /CORS/IF_BADI_RULE_ACTION~EXECUTE.
  DATA:
        lt_output TYPE ZCORS_T_NTF_CONTENT,
        lt_file TYPE STANDARD TABLE OF string,
        lv_filepath TYPE string,
        lt_return TYPE STANDARD TABLE OF bapiret2,
        lv_msgguid TYPE /cors/e_msguid_in,
        lv_new_msgguid TYPE abap_bool,
        lv_last_event TYPE abap_bool,
        lv_events_number TYPE i.

  CONSTANTS:
        lc_dir TYPE eps2filnam VALUE '/tmp',
        lc_format TYPE c LENGTH 3 VALUE 'csv'.

  DATA(lv_filename) = |test{ sy-datum }|.

"find all events
  SELECT msgguid_in, evtid
    FROM /CORS/DM_EVENT
    WHERE msgguid_in = ( SELECT msgguid_in
                           FROM /CORS/DM_EVENT
                           WHERE evtid = @iv_evtid )
    INTO TABLE @DATA(lt_events).

  IF sy-subrc = 0.
    lv_msgguid = lt_events[ 1 ]-msgguid_in.
    lv_filename = lv_msgguid.

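    "check whether this notification group is already tracked in ZCORS_EVENT_LOG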
    SELECT SINGLE *
      FROM zcors_event_log
      WHERE msgguid = @lv_msgguid
      INTO @DATA(ls_event_log).

    IF sy-subrc = 0.
      IF ls_event_log-events - ls_event_log-processed = 1.
        lv_last_event = abap_true.
      ENDIF.
    ELSE.
      lv_new_msgguid = abap_true.
      lv_events_number = lines( lt_events ).
      IF lv_events_number = 1.
        lv_last_event = abap_true.
      ENDIF.
    ENDIF.
  ENDIF.

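  "first event of a multi-event group: prepare a new log record
  "(it is written to the DB via MODIFY in the not-last-event branch below)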
  IF lv_new_msgguid = abap_true AND
     lv_last_event  = abap_false.
    ls_event_log = VALUE ZCORS_EVENT_LOG( mandt     = sy-mandt
                                          msgguid   = lv_msgguid
                                          events    = lv_events_number
                                          processed = 0 ).
  ENDIF.

  IF lv_last_event = abap_true.
"get data for all events of the group from DB tables 
    SELECT @( |{ sy-datum DATE = ENVIRONMENT }| ) AS ntf_date,
           CONCAT( o1~product, o1~serial ) AS obj1,
           CONCAT( o2~product, o2~serial ) AS obj2
      FROM @lt_events AS e
      INNER JOIN /CORS/DM_EVT_REL AS r
        ON e~evtid = r~evtid
      INNER JOIN /CORS/DM_OBJ_IDS AS o1
        ON r~objid = o1~objid
      INNER JOIN /CORS/DM_OBJ_HRY AS h
        ON o1~objid = h~objid_child
      INNER JOIN /CORS/DM_OBJ_IDS AS o2
        ON h~objid_parent = o2~objid
      WHERE o1~obj_type = '18'
        AND o2~obj_type = '17'
      INTO CORRESPONDING FIELDS OF TABLE @lt_output.

    SORT lt_output BY ntf_date obj2 obj1 ASCENDING.
    DELETE ADJACENT DUPLICATES FROM lt_output.

    IF lt_output IS NOT INITIAL.
"create csv file and save on AS
      lt_file = VALUE #( ( |Date,Parent serial no,Child serial ID| ) ).
      LOOP AT lt_output ASSIGNING FIELD-SYMBOL(<ls_output>).
        lt_file = VALUE #( BASE lt_file ( |{ <ls_output>-ntf_date },{ <ls_output>-obj2 },{ <ls_output>-obj1 }| ) ).
      ENDLOOP.

      lv_filepath = |{ lc_dir }/{ lv_filename }.{ lc_format }|.
      OPEN DATASET lv_filepath FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
      IF sy-subrc = 0.
        LOOP AT lt_file ASSIGNING FIELD-SYMBOL(<ls_file>).
          TRANSFER <ls_file> TO lv_filepath.
        ENDLOOP.
        CLOSE DATASET lv_filepath.
      ENDIF.
    ENDIF.
    IF lv_new_msgguid = abap_false.
      "delete record from logger as all events were processed
      DELETE ZCORS_EVENT_LOG FROM @ls_event_log.
    ENDIF.

  ELSE."not last_event

    IF ls_event_log IS NOT INITIAL.
      ls_event_log-processed += 1.
      MODIFY ZCORS_EVENT_LOG FROM @ls_event_log.
    ENDIF.

  ENDIF.
ENDMETHOD.

Additional notes for the code above: 

  • This code also works if there is only one event in the group, but in that case it does not create a record in ZCORS_EVENT_LOG. So if you want this table to keep a record of everything that has been processed, without deleting those entries afterwards, you will need to adjust the code slightly (see the sketch after this list).
  • For message processing you can use the importing parameter IO_MESSAGES.
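
As a minimal sketch of such an adjustment, assuming you only want to keep the log records and mark a group as completed instead of deleting its row, the end of the last-event branch could look like this instead of the DELETE:

    "keep the record and mark the group as fully processed instead of deleting it
    IF lv_new_msgguid = abap_false.
      ls_event_log-processed = ls_event_log-events.
    ELSE.
      "single-event group: create the record right away, already completed
      ls_event_log = VALUE zcors_event_log( mandt     = sy-mandt
                                            msgguid   = lv_msgguid
                                            events    = lv_events_number
                                            processed = lv_events_number ).
    ENDIF.
    MODIFY zcors_event_log FROM @ls_event_log.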

 

Thank you again for taking the time to read this article; I hope you found it helpful. Good luck with your future developments!