
Select 1000 rows each time program starts

Hi experts,

I need to write a report that, every time it is run by a user, selects the next 1000 records from a DB table, let's say BKPF.

Run no.   First record no.   Last record no.
1         1                  1000
2         1001               2000
3         2001               3000
4         3001               4000

What is the best way to achieve this? Should I store the key of the last selected record in a Z DB table and then select the next 1000 records with key fields greater than that record?

//Edit: I need to select the records as if they were indexed sequentially: not some random 1000 records each time, but the next 1000 records in order.
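
Roughly what I have in mind (just a sketch: ZBKPF_CURSOR would be a new Z table holding the last processed key, and BKPF's key fields BUKRS/BELNR/GJAHR are used only as an example of a stable sort order):

DATA: LT_BKPF   TYPE STANDARD TABLE OF BKPF,
      LS_LAST   TYPE BKPF,
      LS_CURSOR TYPE ZBKPF_CURSOR,
      LV_LINES  TYPE I.

" Where did the previous run stop? (initial on the very first run)
SELECT SINGLE * FROM ZBKPF_CURSOR INTO LS_CURSOR.

" Next 1000 records after that key, in a stable order
SELECT * FROM BKPF
  INTO TABLE LT_BKPF
  UP TO 1000 ROWS
  WHERE BUKRS > LS_CURSOR-BUKRS
     OR ( BUKRS = LS_CURSOR-BUKRS AND BELNR > LS_CURSOR-BELNR )
     OR ( BUKRS = LS_CURSOR-BUKRS AND BELNR = LS_CURSOR-BELNR
          AND GJAHR > LS_CURSOR-GJAHR )
  ORDER BY BUKRS BELNR GJAHR.

IF SY-SUBRC = 0.
  " Remember the last key of this package for the next run
  DESCRIBE TABLE LT_BKPF LINES LV_LINES.
  READ TABLE LT_BKPF INTO LS_LAST INDEX LV_LINES.
  LS_CURSOR-BUKRS = LS_LAST-BUKRS.
  LS_CURSOR-BELNR = LS_LAST-BELNR.
  LS_CURSOR-GJAHR = LS_LAST-GJAHR.
  MODIFY ZBKPF_CURSOR FROM LS_CURSOR.
  COMMIT WORK.
ENDIF.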

I'd be grateful for any help,

Bartłomiej


2 Answers

  • Former Member
    Nov 03, 2015 at 06:52 AM

    Hi Bartlomiej

    Could you please explain the business requirements for this?

    It seems like an odd requirement.

    Regards

    Arden


    • BW extractors have the same requirement, for example; there, OPEN CURSOR / FETCH NEXT CURSOR is used.

      Check FM RSAX_BIW_GET_DATA_SIMPLE.

      * Determine number of database records to be read per FETCH statement
      * from input parameter I_MAXSIZE. If there is a one to one relation
      * between DataSource table lines and database entries, this is trivial.
      * In other cases, it may be impossible and some estimated value has to
      * be determined.

        IF S_COUNTER_DATAPAKID = 0.        "First data package -> OPEN CURSOR
          OPEN CURSOR WITH HOLD S_CURSOR FOR
            SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
              WHERE CARRID IN L_R_CARRID
                AND CONNID IN L_R_CONNID.
        ENDIF.                             "First data package ?

      * Fetch records into interface table,
      * named E_T_'Name of extract structure'.
        FETCH NEXT CURSOR S_CURSOR
          APPENDING CORRESPONDING FIELDS
          OF TABLE E_T_DATA
          PACKAGE SIZE S_S_IF-MAXSIZE.
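
      Adapted to BKPF, a rough sketch of the same pattern inside a single report run (the 1000-row package size is taken from the question; note that the cursor only lives for the duration of one program run, so by itself it does not remember the position between separate runs):

      DATA: LT_BKPF  TYPE STANDARD TABLE OF BKPF,
            L_CURSOR TYPE CURSOR.

      OPEN CURSOR WITH HOLD L_CURSOR FOR
        SELECT * FROM BKPF
          ORDER BY BUKRS BELNR GJAHR.      "stable order so packages do not overlap

      DO.
        FETCH NEXT CURSOR L_CURSOR
          INTO TABLE LT_BKPF
          PACKAGE SIZE 1000.
        IF SY-SUBRC <> 0.
          EXIT.                            "no more data
        ENDIF.
        "Process the current package of up to 1000 records here
      ENDDO.

      CLOSE CURSOR L_CURSOR.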

  • Former Member
    Nov 03, 2015 at 07:38 AM

    I had a similar requirement: sending 200 records at a time via proxy to an SQL DB.

    1. After the SELECT query, get the total number of records in the final output ITAB.

    2. Define index_low and index_high.

    3. Create one more temporary table and copy the ITAB data into it:

    itab_temp[] = itab[].

    4. Process as below.

    *Sample code
    DESCRIBE TABLE LT_ITAB[] LINES LV_LINES.

    LV_INDEX_LOW = 1.                                "start of the first package

    IF LV_LINES > 200.

      WHILE LV_LINES > 200.
    *   Build the next package of 200 records
        LV_COUNT = LV_COUNT + 200.
        APPEND LINES OF LT_ITAB[] FROM LV_INDEX_LOW TO LV_COUNT TO LT_ITAB_TEMP[].
        LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB_TEMP[].

        TRY.
            CALL METHOD LR_PROXY->SI_PROXY
              EXPORTING
                OUTPUT = LT_OUTPUT.
            COMMIT WORK.
            WAIT UP TO 10 SECONDS.
          CATCH CX_AI_SYSTEM_FAULT.                  "handle proxy errors as needed
        ENDTRY.

        LV_INDEX_LOW = LV_INDEX_LOW + 200.
        LV_LINES     = LV_LINES - 200.
        REFRESH LT_ITAB_TEMP[].
      ENDWHILE.

    *==========Push the rest of the data, which is less than 200 records==========*
      IF LV_LINES NE 0 AND LV_LINES <= 200.
        REFRESH LT_ITAB_TEMP[].
        LV_COUNT = LV_COUNT + LV_LINES.              "last line of the remainder
        APPEND LINES OF LT_ITAB[] FROM LV_INDEX_LOW TO LV_COUNT TO LT_ITAB_TEMP[].
        LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB_TEMP[].

        TRY.
            CALL METHOD LR_PROXY->SI_PROXY
              EXPORTING
                OUTPUT = LT_OUTPUT.
            COMMIT WORK.
            WAIT UP TO 10 SECONDS.
          CATCH CX_AI_SYSTEM_FAULT.
        ENDTRY.
      ENDIF.

    ELSE.
    ** If the number of records is 200 or less, send them in one call
      LT_OUTPUT-MT_PROXY-RECORD = LT_ITAB[].

      TRY.
          CALL METHOD LR_PROXY->SI_PROXY
            EXPORTING
              OUTPUT = LT_OUTPUT.
          COMMIT WORK.
          WAIT UP TO 10 SECONDS.
        CATCH CX_AI_SYSTEM_FAULT.
      ENDTRY.
    ENDIF.

    NOTE: This logic is for pushing data in chunks via a PI proxy.

    Please modify the logic as per your requirement.
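
    For the 1000-record packages from the question, the same idea boils down to something like this (LT_BKPF and the processing step are only placeholders):

    DATA: LT_BKPF    TYPE STANDARD TABLE OF BKPF,
          LT_PACKAGE TYPE STANDARD TABLE OF BKPF,
          LV_FROM    TYPE I VALUE 1,
          LV_TO      TYPE I,
          LV_TOTAL   TYPE I.

    DESCRIBE TABLE LT_BKPF LINES LV_TOTAL.

    WHILE LV_FROM <= LV_TOTAL.
      LV_TO = LV_FROM + 999.
      IF LV_TO > LV_TOTAL.
        LV_TO = LV_TOTAL.
      ENDIF.

      REFRESH LT_PACKAGE.
      APPEND LINES OF LT_BKPF FROM LV_FROM TO LV_TO TO LT_PACKAGE.

      "Process or send the current package of up to 1000 records here

      LV_FROM = LV_TO + 1.
    ENDWHILE.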

    Regds,

    Lokes.
