CRM_ORDER_MAINTAIN Performance Tuning


I am working on a development which does the following:
closes all activities in the system (of a certain process type)
creates one activity each for all accounts present in the system
also maps the marketing attributes per account
creates a survey with the responses as the corresponding marketing attributes and attaches the survey to the account

Now there are 16,000 accounts in the system, and I have used CRM_ORDER_MAINTAIN three times:
once while closing the activities,
once while creating new activities without a survey (based on certain conditions),
and once while creating activities with surveys.

I am facing performance issues: the code is taking 40,000-odd hours to complete in a batch job.
I have run SE30/SAT and SM50; the major time is consumed by CRM_ORDER_MAINTAIN. Could you kindly suggest a way to improve the performance?
I am pasting the relevant portion of the code here:
call function 'CRM_ORDER_MAINTAIN'
      exporting
        it_activity_h     = lt_activity_h
        it_appointment    = lt_appointment
        it_partner        = lt_partner
        it_status         = lt_status
      changing
        ct_orderadm_h     = lt_orderadm_h
        ct_input_fields   = ct_input_fields2
      exceptions
        error_occurred    = 1
        document_locked   = 2
        no_change_allowed = 3
        no_authority      = 4
        others            = 5.
    if sy-subrc eq 0.
*   build save table
      loop at lt_orderadm_h into ls_orderadm_h.
        ls_objects_to_save = ls_orderadm_h-guid.
        append ls_objects_to_save to lt_objects_to_save.
        clear:ls_orderadm_h, ls_objects_to_save.
      endloop.
    endif.
    if lt_objects_to_save[] is not initial.
      call function 'CRM_ORDER_SAVE'
        exporting
          it_objects_to_save   = lt_objects_to_save
          iv_update_task_local = 'X'
          iv_save_frame_log    = 'X'
        importing
          et_saved_objects     = lt_saved_objects
*         et_exception         = lt_exception
*         et_objects_not_saved = lt_objects_not_saved
        exceptions
          document_not_saved   = 1
          others               = 2.
      if sy-subrc eq 0.
        clear: lt_return2.

        call function 'BAPI_TRANSACTION_COMMIT'            "#EC FB_NORC
          exporting
            wait   = 'X'
          importing
            return = lt_return2.


        call function 'CRM_ORDER_INITIALIZE'
          exporting
            it_guids_to_init  = lt_objects_to_save
            iv_initialize_whole_buffer = 'X'
            iv_init_frame_log = 'X' ""true
            iv_keep_lock      = ' '  "false
          exceptions
            error_occurred    = 1
            others            = 2.
      endif.
    endif.
Accepted Solutions (0)

Answers (1)

former_member182421
Active Contributor

I would call CRM_ORDER_INITIALIZE whether the save works or not; the API is just a pain anyway. I would also suggest using parallel processing to improve the performance.
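Applied to the snippet above, that advice amounts to moving CRM_ORDER_INITIALIZE out of the sy-subrc check, so the one-order buffer is released even when a save fails. A minimal sketch (parameter lists shortened from the original):

```abap
* Sketch: initialize in every case, not only after a successful save,
* so failed packets do not keep accumulating in the one-order buffer.
call function 'CRM_ORDER_SAVE'
  exporting
    it_objects_to_save   = lt_objects_to_save
    iv_update_task_local = 'X'
  importing
    et_saved_objects     = lt_saved_objects
  exceptions
    document_not_saved   = 1
    others               = 2.

if sy-subrc = 0.
  call function 'BAPI_TRANSACTION_COMMIT'
    exporting
      wait = 'X'.
endif.

* moved outside the sy-subrc check: runs on success and on failure
call function 'CRM_ORDER_INITIALIZE'
  exporting
    it_guids_to_init           = lt_objects_to_save
    iv_initialize_whole_buffer = 'X'
  exceptions
    error_occurred             = 1
    others                     = 2.
```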


Hi Luis/Gurus,

Thanks. Actually, I tried parallel processing, again three times:

Activity change: 16,000 accounts

Activity create: 16,000 accounts

In between,

when I call SPBT_INITIALIZE to get the number of servers available, at times it shows 0.

I loop, wait 5 seconds, and call SPBT_INITIALIZE again, but it is still 0 and the job just keeps running.

However, this happens on alternate runs:

Run 1 works excellently.

Run 2 hangs, even if I run it 10 minutes later.

Run 3 works excellently.

I'm a bit zapped by this.

Please help.
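For reference, the retry loop described above could be sketched with a capped wait and a sequential fallback. One likely pitfall (an assumption worth checking): SPBT_INITIALIZE may only be called once per program run, and a repeated call raises PBT_ENV_ALREADY_INITIALIZED without refreshing the counts, so inside the loop the free work-process count should be re-read with SPBT_GET_CURR_RESOURCE_INFO instead. The group name here is an example:

```abap
* Sketch: initialize the RFC server group once, then poll the free
* work-process count with a capped retry instead of waiting forever.
data: lv_max   type i,
      lv_free  type i,
      lv_tries type i.

call function 'SPBT_INITIALIZE'
  exporting
    group_name                  = 'parallel_generators'  "example group
  importing
    max_pbt_wps                 = lv_max
    free_pbt_wps                = lv_free
  exceptions
    pbt_env_already_initialized = 1
    others                      = 2.

while lv_free = 0 and lv_tries < 12.     "cap the wait at ~1 minute
  wait up to 5 seconds.
  call function 'SPBT_GET_CURR_RESOURCE_INFO'
    importing
      max_pbt_wps             = lv_max
      free_pbt_wps            = lv_free
    exceptions
      pbt_env_not_initialized = 1
      others                  = 2.
  lv_tries = lv_tries + 1.
endwhile.

if lv_free = 0.
  " fall back to sequential processing rather than hanging the job
endif.
```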

former_member182421
Active Contributor

Did you copy-paste the example from the SAP Help portal?

Implementing Parallel Processing (SAP Library - Background Processing)

How many orders do you process per aRFC call?


Hi Luis ,

You got it right, I did... but I had to break the data up into packets, one per free server available. It works very well when a server is available, but it was hanging when a server was temporarily unavailable.

For the time being I have changed the sequencing, i.e. run 2, then run 1, then run 3, since run 2 has the most data. That is working for me, though I am not sure how it will behave at higher data volumes.

former_member182421
Active Contributor

I only use one server group and it works well. I would suggest making packets of 1-3 orders per aRFC call.
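A sketch of that packeting scheme, assuming a hypothetical RFC-enabled wrapper ZRFC_MAINTAIN_ACTIVITIES that runs the maintain/save/initialize sequence for the packet it receives (the wrapper, the group name, and the form names are all assumptions, not standard objects):

```abap
* Sketch: dispatch packets of up to 3 order GUIDs per aRFC call.
* ZRFC_MAINTAIN_ACTIVITIES is a hypothetical RFC-enabled wrapper
* around CRM_ORDER_MAINTAIN / CRM_ORDER_SAVE / CRM_ORDER_INITIALIZE.
constants lc_pack_size type i value 3.

data: lt_packet type crmt_object_guid_tab,
      lv_guid   type crmt_object_guid,
      gv_taskno type i.

loop at lt_all_guids into lv_guid.
  append lv_guid to lt_packet.
  if lines( lt_packet ) >= lc_pack_size.
    perform dispatch_packet using lt_packet.
    clear lt_packet.
  endif.
endloop.
if lt_packet is not initial.          "flush the last partial packet
  perform dispatch_packet using lt_packet.
endif.

form dispatch_packet using pt_guids type crmt_object_guid_tab.
  data lv_task type c length 8.
  gv_taskno = gv_taskno + 1.
  lv_task = gv_taskno.
  condense lv_task.                   "unique task name per packet
  do.
    call function 'ZRFC_MAINTAIN_ACTIVITIES'
      starting new task lv_task
      destination in group 'parallel_generators'
      performing on_task_end on end of task   "form collecting results (not shown)
      exporting
        it_guids              = pt_guids
      exceptions
        communication_failure = 1
        system_failure        = 2
        resource_failure      = 3.
    if sy-subrc = 3.                  "no free work process: wait, retry
      wait up to 1 seconds.
    else.
      exit.
    endif.
  enddo.
endform.
```

Keeping packets this small limits the damage of a single failed save and lets CRM_ORDER_INITIALIZE clear the buffer per task rather than per run.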