Split an Internal Table into Multiple Internal Tables Dynamically Based on Number of Records

I want to split an internal table into multiple internal tables, because the main internal table contains a huge number of records, which causes the system to generate a dump.

If the number of records is small, the records are processed without issue.

For example:

ITAB1 = 100,000 records. Split this table into multiple internal tables:

TAB1 = 50000 Records

TAB2 = 50000 Records.

I don't want to follow the hard-coded approach of checking sy-tabix = 50000 and appending into TAB1 or TAB2 accordingly.

Can this be done with dynamic logic, so that the system identifies when it has reached the size limit and automatically starts filling another internal table?

Hope my requirement is clear.

Please guide me in resolving this issue.


2 Answers

  • Jan 04, 2017 at 07:07 PM

    I guess the first question is: what in the world are you wanting to read into an internal table that could be that large to begin with (and not better suited for another toolset that handles "big data", for instance)? Are you absolutely sure you should be pulling that much data?


  • Jan 04, 2017 at 08:22 PM

    As Christopher said, I doubt this approach is the correct way. Also, wouldn't the dump already be generated once your first table exceeds the limit?

    I gave it a try anyway:

    TYPES: BEGIN OF lty_line,
             column1 TYPE i,
             column2 TYPE c LENGTH 4,
           END OF lty_line.
    CONSTANTS: lc_test_data_amount TYPE i VALUE 100000,
               lc_split_at_amount  TYPE i VALUE 10000.
    DATA: lt_big_table    TYPE STANDARD TABLE OF lty_line,
          lv_string       TYPE string,
          lt_small_tables TYPE STANDARD TABLE OF REF TO data,
          lr_small_table  TYPE REF TO data.
    FIELD-SYMBOLS: <lg_target> TYPE STANDARD TABLE.
    " Generate test data
    DO lc_test_data_amount TIMES.
      CALL FUNCTION 'GENERAL_GET_RANDOM_STRING'
        EXPORTING
          number_chars  = 4    " Specifies the number of generated chars
        IMPORTING
          random_string = lv_string.    " Generated string
      APPEND VALUE #( column1 = sy-index
                      column2 = CONV #( lv_string ) ) TO lt_big_table.
      CLEAR lv_string.
    ENDDO.
    " Split
    DATA(lo_descr) = CAST cl_abap_tabledescr(
                       cl_abap_typedescr=>describe_by_data( lt_big_table ) ).
    LOOP AT lt_big_table ASSIGNING FIELD-SYMBOL(<ls_line>).
      IF ( sy-tabix - 1 ) MOD lc_split_at_amount = 0.
        " Start a new target table every lc_split_at_amount records
        CREATE DATA lr_small_table TYPE HANDLE lo_descr.
        ASSERT lr_small_table IS BOUND.
        APPEND lr_small_table TO lt_small_tables.
        ASSIGN lr_small_table->* TO <lg_target>.
        ASSERT <lg_target> IS ASSIGNED.
      ENDIF.
      APPEND <ls_line> TO <lg_target>.
    ENDLOOP.
    UNASSIGN: <lg_target>, <ls_line>.
    FREE lr_small_table.
    BREAK-POINT. " lt_small_tables contains references to the split tables
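    To make the result usable, here is a minimal sketch (not part of the original answer) of how the split tables could be consumed afterwards: loop over the data references in lt_small_tables and dereference each one back into a table-typed field symbol. The names <lt_chunk> and lr_chunk are illustrative.

    ```abap
    " Process each split table; <lt_chunk> must be a generic table type
    " so the dereferenced data object can be used as an internal table.
    FIELD-SYMBOLS: <lt_chunk> TYPE STANDARD TABLE.
    LOOP AT lt_small_tables INTO DATA(lr_chunk).
      ASSIGN lr_chunk->* TO <lt_chunk>.
      " <lt_chunk> now holds up to lc_split_at_amount lines of lty_line;
      " process the chunk here, e.g. hand it to a batched DB update.
    ENDLOOP.
    ```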

    • You could try getting a size limit from there and comparing against it using something like this, I guess:

      DATA(lv_max_size) = ???.
      DESCRIBE FIELD: lt_tab      LENGTH DATA(lv_size_current) IN BYTE MODE,
                      ls_new_line LENGTH DATA(lv_size_additional) IN BYTE MODE.
      IF ( lv_size_current + lv_size_additional ) > lv_max_size.
        " On to the next itab...
      ENDIF.

      But I doubt the behavior is "well defined" (in ABAP); it is probably release-, hardware- and/or workload-dependent. Looking through the documentation, there is even wording like "usually" in places, so I doubt that approach is worth pursuing.
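      If one did want to experiment with that anyway, the check could be folded into the split loop from the answer above. A rough, untested sketch, reusing the commenter's illustrative names lt_tab and ls_new_line; the byte budget lv_max_size is an assumed value you would have to determine yourself:

      ```abap
      " Split by estimated byte size instead of a fixed record count.
      " Note DESCRIBE FIELD on a table yields the width of one line,
      " so the table's total size is estimated as lines * line width.
      DATA(lv_max_size) = 1000000.                " assumed byte budget
      DESCRIBE FIELD ls_new_line LENGTH DATA(lv_line_size) IN BYTE MODE.
      IF ( lines( lt_tab ) + 1 ) * lv_line_size > lv_max_size.
        " Adding ls_new_line would exceed the budget: start the next itab
      ENDIF.
      ```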