Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Using internal tables to handle a huge volume of data

Former Member

Hi,

I'm using internal tables to store a huge volume of data coming from the database, with each table containing gigabytes of data, and I need all the records in those tables.

Could someone help me out on using internal tables, and on the right approach when handling a LARGE volume of data?

Thanks,

ravi

1 ACCEPTED SOLUTION

jayanthi_jayaraman
Active Contributor

Hi,

Welcome to SDN.

If the data is already in an internal table, you can read it using a LOOP or a READ statement.

If you are sure that the condition you give fetches only one record, use READ:

SORT internal_table BY field1 field2.

READ TABLE internal_table INTO workarea
     WITH KEY field1 = value1
              field2 = value2
     BINARY SEARCH.
IF sy-subrc EQ 0.
  ...
ENDIF.

If the condition can match more than one record, use LOOP:

SORT internal_table BY field1 field2.

LOOP AT internal_table INTO workarea
     WHERE field1 = value1
       AND field2 = value2.
  ...
ENDLOOP.

By giving conditions in the LOOP, you can process the records efficiently.

Before using READ or LOOP, sort the table by its key fields, and use BINARY SEARCH when reading records.

Hope it helps. If so, reward points by clicking the green star [6 points], blue star [10 points - problem solved], or yellow star [2 points] on the left side of the reply. If you need more clarification, get back.
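As a small addition to the advice above: declaring the table as a SORTED TABLE makes the sort order part of the table type, so no explicit SORT is needed and key access is optimized automatically. A minimal sketch, assuming hypothetical key fields field1/field2 of an illustrative structure ty_rec:

```abap
* Sketch: a sorted table keeps itself ordered by its key,
* so key access is optimized without an explicit SORT.
TYPES: BEGIN OF ty_rec,
         field1 TYPE i,
         field2 TYPE i,
         text   TYPE string,
       END OF ty_rec.

DATA: internal_table TYPE SORTED TABLE OF ty_rec
                     WITH NON-UNIQUE KEY field1 field2,
      workarea       TYPE ty_rec.

* Key access on a sorted table uses a binary search implicitly.
READ TABLE internal_table INTO workarea
     WITH TABLE KEY field1 = 1 field2 = 2.
IF sy-subrc EQ 0.
  "... process the record
ENDIF.
```

The trade-off is that inserts must respect the sort order, but for read-mostly processing of large tables this is usually the simpler and safer option.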

7 REPLIES


Former Member

Hi,

Check the link below for the relevant documentation. Hope it serves your purpose.

http://help.sap.com/saphelp_47x200/helpdata/en/fc/eb35de358411d1829f0000e829fbfe/content.htm

You can also use an EXTRACT for your purpose. Check the link below for an explanation of EXTRACT.

http://help.sap.com/saphelp_47x200/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm

Regards,

Vara

athavanraja
Active Contributor

Welcome to SDN.

If the volume of data is huge, it is better to use EXTRACT and FIELD-GROUPS.

Check out the demo program from the ABAP keyword documentation:

REPORT demo_extract.

NODES: spfli, sflight.

FIELD-GROUPS: header, flight_info, flight_date.

START-OF-SELECTION.

INSERT: spfli-carrid spfli-connid sflight-fldate
        INTO header,
        spfli-cityfrom spfli-cityto
        INTO flight_info.

GET spfli.
  EXTRACT flight_info.

GET sflight.
  EXTRACT flight_date.
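To complete the picture (not part of the snippet above): once all GET events have run, the extract dataset is typically sorted and processed in a single pass after END-OF-SELECTION. A sketch of that continuation, following the pattern of the keyword documentation:

```abap
* Sketch: after extraction, sort the dataset by the header
* fields and process it with the extract form of LOOP.
END-OF-SELECTION.

  SORT BY spfli-carrid spfli-connid.

  LOOP.
    AT flight_info.
      WRITE: / spfli-carrid, spfli-connid,
               spfli-cityfrom, spfli-cityto.
    ENDAT.
    AT flight_date.
      WRITE: / sflight-fldate.
    ENDAT.
  ENDLOOP.
```

An extract dataset is held outside the program's internal table memory and can be sorted and looped over very efficiently, which is why it was the classic recommendation for very large data volumes.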

Regards

Raja

Former Member

You can use a logical database selection for retrieving a huge number of records.
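For illustration, a minimal sketch of a report attached to a logical database (assuming the flight-data LDB F1S found in SAP demo systems; the LDB name is set in the report attributes, not in the code):

```abap
* Sketch: the logical database drives the GET events, supplies
* the selection screen, and reads the data hierarchy for you.
REPORT zdemo_ldb.

NODES: spfli, sflight.

GET spfli.
  WRITE: / spfli-carrid, spfli-connid.

GET sflight.
  WRITE: / sflight-fldate, sflight-price.
```

The report itself never issues a SELECT; the logical database delivers the records node by node, which keeps the memory footprint of the report small.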

Former Member

Hi Ravi,

If you do not access the contents of the internal table by index, i.e., if you always access a record using its key, then a hashed table is most likely to give you the best performance. You may search this forum for "hashed table" to find some of the recent discussions.
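A minimal sketch of the hashed-table approach, with an illustrative structure (the field names are placeholders, not from the original post):

```abap
* Sketch: a hashed table offers near constant-time access by its
* unique key, independent of the number of rows - ideal when a
* large table is only ever read by full key.
TYPES: BEGIN OF ty_rec,
         id   TYPE i,
         text TYPE string,
       END OF ty_rec.

DATA: lt_data TYPE HASHED TABLE OF ty_rec WITH UNIQUE KEY id,
      ls_rec  TYPE ty_rec.

* Access must always go through the full table key; index access
* (READ ... INDEX) is not allowed on hashed tables.
READ TABLE lt_data INTO ls_rec WITH TABLE KEY id = 42.
IF sy-subrc EQ 0.
  "... process the record
ENDIF.
```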

Secondly, regarding the memory your program will require: you have said that you need all the data, which comes to several gigabytes. Are you sure that this is a correct estimate? If it is, then it is recommended that you process the data in chunks: retrieve a few records at a time and alternate between data retrieval and processing.
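The chunked retrieval described above can be sketched with the PACKAGE SIZE addition of SELECT (sflight is used here only as an example table):

```abap
* Sketch: process the data in chunks instead of loading it all
* at once. PACKAGE SIZE fills the internal table with at most
* 10000 rows per pass of the SELECT loop, replacing the previous
* chunk each time, so memory usage stays bounded.
DATA: lt_chunk  TYPE STANDARD TABLE OF sflight,
      ls_flight TYPE sflight.

SELECT * FROM sflight
         INTO TABLE lt_chunk
         PACKAGE SIZE 10000.

  LOOP AT lt_chunk INTO ls_flight.
    "... process one record of the current chunk
  ENDLOOP.

ENDSELECT.
```

Note that with PACKAGE SIZE the SELECT becomes a loop that must be closed with ENDSELECT, and any results you need across chunks have to be accumulated separately.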

A more concrete solution can be provided if you give the details of the tables, their number of records, the conditions you are using to fetch the data, and so on.

Reward points if it helps. Please get back if you have further doubts.

Regards,

Anand Mandalika.

Former Member

Hi,

I appreciate your prompt response.

I still have a question: are these measures going to cost too much space on the application server?

thanks,

ravi.


Hi Ravi,

That is exactly what I have tried to address in the second part of my post above.

Whether your application server is capable of running programs with extremely large memory requirements depends on the hardware you have and on some SAP settings, for example the memory area for the work process, the roll area, and so on.

Regards,

Anand Mandalika.