
Select 1000 rows each time program starts

former_member246634

Hi experts,

I need to write a report that, each time a user runs it, selects the next 1000 records from a DB table, let's say BKPF.

Run no. | First record no. | Last record no.
--------|------------------|----------------
1       | 1                | 1000
2       | 1001             | 2000
3       | 2001             | 3000
4       | 3001             | 4000

What is the best way to achieve this? Should I store the last selected record in a Z DB table and then select the next 1000 records with key fields greater than that record?

//Edit: I need to select the records as if they were indexed: not a random 1000 records each time, but the following 1000 records in sequence.
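A minimal sketch of that Z-table idea, paging over BKPF's primary key (the table ZBKPF_LASTKEY with fields BUKRS, BELNR and GJAHR is a hypothetical name; replace it with whatever fits your naming conventions):

   DATA: LS_LAST  TYPE ZBKPF_LASTKEY,
         LT_BKPF  TYPE STANDARD TABLE OF BKPF,
         LS_TAIL  TYPE BKPF,
         LV_LINES TYPE I.

*  Read the key of the last record sent in the previous run
   SELECT SINGLE * FROM ZBKPF_LASTKEY INTO LS_LAST.

*  Select the next 1000 records beyond that key, in primary key order
   SELECT * FROM BKPF INTO TABLE LT_BKPF
          UP TO 1000 ROWS
          WHERE   BUKRS > LS_LAST-BUKRS
             OR ( BUKRS = LS_LAST-BUKRS AND BELNR > LS_LAST-BELNR )
             OR ( BUKRS = LS_LAST-BUKRS AND BELNR = LS_LAST-BELNR
                                        AND GJAHR > LS_LAST-GJAHR )
          ORDER BY PRIMARY KEY.

*  Remember the key of the last record read, for the next run
   DESCRIBE TABLE LT_BKPF LINES LV_LINES.
   IF LV_LINES > 0.
     READ TABLE LT_BKPF INDEX LV_LINES INTO LS_TAIL.
     MOVE-CORRESPONDING LS_TAIL TO LS_LAST.
     MODIFY ZBKPF_LASTKEY FROM LS_LAST.
     COMMIT WORK.
   ENDIF.

Unlike remembering a row count, a stored key is not thrown off when records are deleted between runs.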

I'd be grateful for any help,

Bartłomiej

4 Replies

Former Member

Hi Bartłomiej,

Could you please explain the business requirements for this?

It seems like an odd requirement.

Regards

Arden


This report (or RFC function) will be called by an external system. Our goal is to send a huge number of records to that system. We decided to use packages of 1000 records per connection, because sending the whole DB table at once doesn't work and won't.


BW extractors have the same requirement, for example; there, OPEN CURSOR / FETCH NEXT CURSOR is used.

Check FM RSAX_BIW_GET_DATA_SIMPLE.


* Determine number of database records to be read per FETCH statement
* from input parameter I_MAXSIZE. If there is a one-to-one relation
* between DataSource table lines and database entries, this is trivial.
* In other cases, it may be impossible and some estimated value has to
* be determined.
     IF S_COUNTER_DATAPAKID = 0.       "First call for this request
       OPEN CURSOR WITH HOLD S_CURSOR FOR
       SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
                                WHERE CARRID IN L_R_CARRID AND
                                      CONNID IN L_R_CONNID.
     ENDIF.                             "First data package ?

* Fetch records into interface table,
*   named E_T_'Name of extract structure'.
     FETCH NEXT CURSOR S_CURSOR
                APPENDING CORRESPONDING FIELDS
                OF TABLE E_T_DATA
                PACKAGE SIZE S_S_IF-MAXSIZE.
     IF SY-SUBRC <> 0.
       CLOSE CURSOR S_CURSOR.
       RAISE NO_MORE_DATA.
     ENDIF.
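For the BKPF case, the same pattern could be wrapped in an RFC-enabled function module along these lines (a sketch only; the Z names are assumptions, S_CURSOR must be a global variable in the function group so it survives between calls, and the cursor lives only as long as the external system keeps the same connection open, so each fresh connection starts again from the first record):

   FUNCTION Z_GET_BKPF_PACKAGE.
   *"  IMPORTING VALUE(I_FIRST_CALL) TYPE ABAP_BOOL
   *"  TABLES E_T_BKPF STRUCTURE BKPF
   *"  EXCEPTIONS NO_MORE_DATA

     IF I_FIRST_CALL = ABAP_TRUE.
   *   Open the cursor once; WITH HOLD keeps it across COMMIT WORK
       OPEN CURSOR WITH HOLD S_CURSOR FOR
         SELECT * FROM BKPF ORDER BY PRIMARY KEY.
     ENDIF.

   * Each call returns the next package of 1000 records
     FETCH NEXT CURSOR S_CURSOR
           INTO TABLE E_T_BKPF
           PACKAGE SIZE 1000.
     IF SY-SUBRC <> 0.
       CLOSE CURSOR S_CURSOR.
       RAISE NO_MORE_DATA.
     ENDIF.

   ENDFUNCTION.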

Former Member

I had the same requirement: send 200 records at a time via proxy to an SQL DB.

1. After the SELECT query, get the total number of records in the final output table ITAB.

2. Define INDEX_LOW and INDEX_HIGH.

3. Create one more temporary table and copy the ITAB data into it: itab_temp[] = itab[].

4. Process as shown in the sample code below.

*Sample code

   DATA: LV_LINES     TYPE I,
         LV_INDEX_LOW TYPE I VALUE 1,
         LV_COUNT     TYPE I.

   DESCRIBE TABLE LT_ITAB LINES LV_LINES.

   IF LV_LINES > 200.

     WHILE LV_LINES > 200.
       LV_COUNT = LV_COUNT + 200.
*      Copy the next 200 records into the temporary table and send them
       APPEND LINES OF LT_ITAB FROM LV_INDEX_LOW TO LV_COUNT TO LT_ITAB_TEMP.
       LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB_TEMP[].
       TRY.
           CALL METHOD LR_PROXY->SI_PROXY
             EXPORTING
               OUTPUT = LT_OUTPUT.
           COMMIT WORK.
           WAIT UP TO 10 SECONDS.
         CATCH CX_AI_SYSTEM_FAULT.
*          Handle the proxy error as required
       ENDTRY.
       LV_INDEX_LOW = LV_INDEX_LOW + 200.
       LV_LINES = LV_LINES - 200.
       REFRESH LT_ITAB_TEMP.
     ENDWHILE.

*==========Push the rest of the data, which is less than 200 records==========*
     IF LV_LINES > 0.
       REFRESH LT_ITAB_TEMP.
*      FROM without TO appends the remaining lines up to the table end
       APPEND LINES OF LT_ITAB FROM LV_INDEX_LOW TO LT_ITAB_TEMP.
       LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB_TEMP[].
       TRY.
           CALL METHOD LR_PROXY->SI_PROXY
             EXPORTING
               OUTPUT = LT_OUTPUT.
           COMMIT WORK.
           WAIT UP TO 10 SECONDS.
         CATCH CX_AI_SYSTEM_FAULT.
       ENDTRY.
     ENDIF.

   ELSE.
*    If the total number of records is 200 or fewer, send them in one call
     LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB[].
     TRY.
         CALL METHOD LR_PROXY->SI_PROXY
           EXPORTING
             OUTPUT = LT_OUTPUT.
         COMMIT WORK.
         WAIT UP TO 10 SECONDS.
       CATCH CX_AI_SYSTEM_FAULT.
     ENDTRY.
   ENDIF.

NOTE: This logic is for pushing data in chunks via a PI proxy.

Please modify the logic as per your requirement.

Regds,

Lokes.