
Mass Update a Z-field in Std Table with Split & Job Submit

ricky_shaw
Contributor

Hi,
I need to do a mass update of a single Z* custom field in a standard table (say EANL).
The number of records expected is around 30 million.

I am using the OPEN CURSOR and FETCH NEXT CURSOR technique to get 50,000 records (one package) at a time and update them in a DO...ENDDO loop. Of course, 'Package' is a selection-screen field.
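For reference, the cursor-and-package technique described above usually looks roughly like this (a minimal sketch, not the poster's actual code: zzfield is a hypothetical custom field and p_record is the package-size parameter from the selection screen):

```abap
data: lt_eanl type standard table of eanl,
      c_curs  type cursor.

open cursor with hold c_curs for
  select * from eanl.

do.
  " fetch one package of p_record rows at a time
  fetch next cursor c_curs into table lt_eanl package size p_record.
  if sy-subrc <> 0.
    exit.
  endif.

  " set the custom field on every row of the package
  loop at lt_eanl assigning field-symbol(<ls_eanl>).
    <ls_eanl>-zzfield = 'X'.      " hypothetical Z-field
  endloop.

  update eanl from table lt_eanl.
  commit work.
enddo.

close cursor c_curs.
```

WITH HOLD keeps the cursor open across the COMMIT WORK inside the loop; without it, the commit would invalidate the cursor.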

Is this a right approach?

Now how do I make this fetch-cursor-and-update (using an FM) run in different background jobs?

I know there are standard FMs like JOB_OPEN and JOB_CLOSE, plus the SUBMIT statement. But how do I fit these pieces together (fetch cursor + update + jobs) to make use of these FMs in my program?

Can anyone please suggest?

Thanks

thomas_mller13
Participant
If you access the table in a fully qualified manner, you do not need those batch jobs. You can do the task in a single loop and commit in bulks. That will not run very long.
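The single-loop-with-bulk-commits idea might be sketched like this (an illustrative sketch, not from the thread: it assumes ANLAGE is the key field of EANL, zzfield is the hypothetical custom field, and the 50,000 bulk size matches the package size mentioned above):

```abap
data lv_done type i.

" read key fields only, to keep memory modest
select anlage from eanl into table @data(lt_keys).

loop at lt_keys assigning field-symbol(<ls_key>).
  " fully qualified update on the primary key
  update eanl set zzfield = 'X'
    where anlage = <ls_key>-anlage.
  lv_done = lv_done + 1.
  if lv_done mod 50000 = 0.
    commit work.                  " commit one bulk
  endif.
endloop.

commit work.                      " commit the final partial bulk
```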
thomas_mller13
Participant

Your code must look like this:

thomas_mller13_0-1713986610634.png

 

im_repid is the name of your program. This method schedules your program in batch. But you can also schedule your program directly in batch with transaction SM36. If you want to process certain bulks, you have to select the bulks and schedule each one with this method. This is more complicated, since a bulk of 50,000 is too large for a report parameter. You could think about writing the key fields to a temporary database table and selecting the bulks from that table using a bulk ID.
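The JOB_OPEN / SUBMIT / JOB_CLOSE pattern referenced in this thread is typically wired up as follows (a minimal sketch; zmass_update is a placeholder report name and p_record a placeholder parameter):

```abap
data: lv_jobname  type tbtcjob-jobname value 'ZMASS_UPDATE',
      lv_jobcount type tbtcjob-jobcount.

" 1. open a new background job
call function 'JOB_OPEN'
  exporting
    jobname          = lv_jobname
  importing
    jobcount         = lv_jobcount
  exceptions
    cant_create_job  = 1
    invalid_job_data = 2
    jobname_missing  = 3
    others           = 4.

if sy-subrc = 0.
  " 2. attach the report as a step of the job just opened
  submit zmass_update
    via job lv_jobname number lv_jobcount
    with p_record = 50000
    and return.

  " 3. close the job and release it for immediate start
  call function 'JOB_CLOSE'
    exporting
      jobname   = lv_jobname
      jobcount  = lv_jobcount
      strtimmed = 'X'
    exceptions
      others    = 1.
endif.
```

Calling this in a loop, once per bulk, gives one background job per bulk; each submitted run must be told (e.g. via parameters) which bulk to process.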

 

ricky_shaw
Contributor
Sorry, there is a correction: it is 30 million recs and NOT 3 million recs.

Accepted Solutions (0)

Answers (3)

ricky_shaw
Contributor

Hi Thomas,

Can you please post a better picture of your program? I could not see it properly even in full screen.

Thanks

Szczerbowski
Active Participant

Hi, 

My routine inserts about 18 million rows every now and then.
It's built from a report that has all the needed data, loops over it, and at every 'step' of about 2 million rows commits via an update module to the backend:


    LOOP AT user_items ASSIGNING FIELD-SYMBOL(<ins>).
      INSERT <ins> INTO TABLE inserted.
      " flush every 2 million rows, and once more on the last row
      IF lines( inserted ) >= 2000000 OR sy-tabix = num_of_all_lines.
        CALL FUNCTION 'YIMT_BW_AUTH_DB' IN UPDATE TASK
          TABLES
            inserted = inserted.
        CLEAR inserted.
        COMMIT WORK AND WAIT.
      ENDIF.
    ENDLOOP.

M.

ricky_shaw
Contributor

Hi Szczerbowski, my question is about the way to schedule this as a background job. My DB update is working fine.

Did you happen to schedule it as a background job? If so, can you please suggest the code?

Thanks

 

ricky_shaw
Contributor

I am using a P_SUBMIT checkbox on the selection screen, and when it is checked the program submits the jobs.

My code is below:

*  Split into jobs
 1) if p_submit = 'X'.
    perform submit_jobs.
    stop.
  endif.


 2) open cursor with hold c_curs1 for
   select * from sometable
    where field1 in s_field1
     and field2 lt sy-datum
     and loevm = space.
 
*    fetch data in packets
  do.
    fetch next cursor c_curs1 into table it_tab package size p_record.
 
    if sy-subrc = 0.
      sort it_tab by field1.

      describe table it_tab lines data(lv_lines).

      perform update_data.
    else. "   if no further data then exit
      message 'No (more) data found' type 'S'.
      exit.
    endif.

    refresh it_tab.
  enddo.

  close cursor c_curs1.

3) Inside perform submit_jobs I have the FMs for JOB_OPEN and JOB_CLOSE, and between them a SUBMIT of the program, etc.

  submit (lv_prog) via job gv_jobname number gv_jobcount
      with s_field1 in s_contr
      with p_record eq p_record
      with p_submit eq 'X'
       and return.

But the program is executing 1) and stopping there.

The data update logic is working perfectly.

Can someone suggest?

ricky_shaw
Contributor
Hi, can I use some PERFORM update_data (the table update logic) between JOB_OPEN and JOB_CLOSE instead of a SUBMIT statement?