Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

DUMP because of insufficient memory

Former Member

Hi friends,

When an internal table holds about 10 lakh (1 million) records, the program terminates with a short dump due to insufficient memory while processing it. How can this be solved? Is there an alternative way of handling such a large volume of data?

Can anyone help on this issue.

Thanks.

3 REPLIES

Former Member

Hi,

Try this:

1. Use field symbols to process an internal table with a huge amount of data. This is fast because the field symbol only points to the table line instead of copying it into a work area. Here is an example of using a field symbol with an internal table:

REPORT demo_int_tables_read_assigning.

* Line structure and a hashed table keyed on col1
DATA: BEGIN OF line,
        col1 TYPE i,
        col2 TYPE i,
      END OF line.

DATA itab LIKE HASHED TABLE OF line WITH UNIQUE KEY col1.

* Field symbol typed like a line of the table
FIELD-SYMBOLS <fs> LIKE LINE OF itab.

* Fill the table with a few sample rows
DO 4 TIMES.
  line-col1 = sy-index.
  line-col2 = sy-index ** 2.
  INSERT line INTO TABLE itab.
ENDDO.

* ASSIGNING makes <fs> point at the table line itself - no copy is made
READ TABLE itab WITH TABLE KEY col1 = 2 ASSIGNING <fs>.

* Changing <fs> changes the table line directly
<fs>-col2 = 100.

LOOP AT itab INTO line.
  WRITE: / line-col1, line-col2.
ENDLOOP.
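
With a table of a million rows, the same technique matters most inside loops: LOOP AT ... ASSIGNING works on the table line directly instead of copying each row into a work area. A minimal sketch, reusing the itab and <fs> declared above:

LOOP AT itab ASSIGNING <fs>.
* the line is changed in place; no work area copy and no MODIFY needed
  <fs>-col2 = <fs>-col2 + 1.
ENDLOOP.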

2. Try to minimize the number of entries in the internal table by using more restrictive criteria in the WHERE clause of the SELECT. This is the most important step.

3. While processing, read with BINARY SEARCH on a sorted table, or if possible use a hashed table (see the sketch after this list).

4. Use transaction SE30 (runtime analysis) to check exactly where the program spends the most time and memory, and improve that part of the code first. The Tips & Tricks section of SE30 has examples for this.
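
For point 3, a rough sketch of the sorted read; the names lt_data, ls_data and the key value are only illustrative, not from the original post:

DATA: BEGIN OF ls_data,
        key   TYPE i,
        value TYPE i,
      END OF ls_data.
DATA lt_data LIKE STANDARD TABLE OF ls_data.

* BINARY SEARCH requires the table to be sorted by the read key
SORT lt_data BY key.

READ TABLE lt_data INTO ls_data
     WITH KEY key = 42
     BINARY SEARCH.
IF sy-subrc = 0.
  WRITE: / ls_data-key, ls_data-value.
ENDIF.

* Alternatively, declare the table as HASHED (as in the example above)
* or SORTED so that key access is fast without an explicit SORT.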

If you want further information, reply with sample code.

Jogdand M B

Former Member

Hi,

1) The first alternative is to make use of field symbols.

2) The second alternative is to split the data into several smaller internal tables and process them one portion at a time in a loop.

3) If the problem occurs while selecting the data into the internal table, make use of the PACKAGE SIZE addition of the SELECT statement (see the sketch below).
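
A rough sketch of a packaged SELECT; SFLIGHT and the WHERE condition are only illustrative. On each pass the table holds just the current package, so the full result set never has to fit into memory at once, and a restrictive WHERE clause (point 2 in the reply above) keeps the total volume down in the first place:

DATA lt_flights TYPE TABLE OF sflight.

SELECT * FROM sflight
       INTO TABLE lt_flights
       PACKAGE SIZE 10000
       WHERE carrid = 'LH'.

* process only the current block of at most 10000 rows here

ENDSELECT.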

Regards

Sailaja.

Former Member

Try using extracts instead of an internal table.
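
Extracts are the classic alternative meant here: the extract dataset is managed by the runtime and can be paged out, so it is not bound to the same limits as one big internal table. A minimal sketch with made-up fields f1 and f2:

REPORT demo_extract_sketch.

DATA: f1 TYPE i,
      f2 TYPE i.

* The header field group defines the record layout of the extract
FIELD-GROUPS: header.
INSERT f1 f2 INTO header.

* Fill the extract dataset record by record
DO 5 TIMES.
  f1 = sy-index.
  f2 = sy-index ** 2.
  EXTRACT header.
ENDDO.

* Sort and read the extract back; once SORT or LOOP has been executed,
* no further EXTRACT statements are allowed
SORT.

LOOP.
  WRITE: / f1, f2.
ENDLOOP.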