Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

maximum size of internal table

Former Member

hello community,

I have selected data from the ZZALLA table into an internal table, but the program dumps because the length of the internal table is exceeded. Is there any way we can increase the maximum size of the internal table and resolve the dump?

18 Replies

RichHeilman
Developer Advocate

Sounds like your internal table is not typed correctly. The internal table line type should be exactly what you are extracting from the table. So if you are extracting all columns of the table, then you should define your internal table like this.

data: itab type table of zzalla.

select * into table itab from zzalla.

If this does not resolve your issue, please post the DATA statement for your internal table, any TYPES statement which may be defined, and the SELECT statement which is dumping.

Regards,

Rich Heilman


No, my question is that the ZZALLA table actually has a huge number of records. For fewer records the output comes fine, but with a huge number of records a message like this appears:

No storage space available for internal table.

You attempted to extend an internal table, but the required space was not available.


hi,

The internal table's size can be declared by the programmer with the OCCURS addition:

occurs 0, occurs 1, occurs 2, occurs 3, ... occurs n, etc.

The maximum memory for an internal table is with OCCURS 0: zero here means roughly 2 gigabytes of data. After that there is no further memory growth in the program, so you will get a short dump. For that reason field-symbols are used, to work through the internal table data dynamically, to avoid the short dump and to improve the performance of the reads.

No, the limit depends on the available memory itself, because whenever more space is needed it is allocated automatically.

If you define an internal table using OCCURS n, the system initially reserves n rows and allocates further blocks as needed.
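To illustrate the two declaration styles, a minimal sketch (zzalla is the table from the original question; the OCCURS value is only an initial allocation hint, not a hard cap on the number of rows):

* obsolete OCCURS syntax - n is just the initial allocation
data: itab_occ type zzalla occurs 0.

* current syntax - a standard table without a header line
data: itab type standard table of zzalla.

In both cases the table grows dynamically until the work process runs out of memory.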

Edited by: subas Bose on Mar 11, 2010 6:15 PM

copy/pasted, one possible source: Please don't do this again, otherwise I'll have to suggest your ID for deletion.

Edited by: Thomas Zloch on Mar 11, 2010 6:28 PM

former_member641978
Active Participant

Hi

You can ask your Basis team to allocate more memory for SAP use (if it is not already at its maximum).

It worked for me the only time I encountered such a problem.

Best Regards

Yossi Rozenberg


Ok, so the best way to handle large volumes is to read in packages. The SELECT statement supports an extension called PACKAGE SIZE.

http://help.sap.com/abapdocu_70/en/ABAPINTO_CLAUSE.htm

Basically this constructs a loop with SELECT...ENDSELECT: it selects n records (depending on your package size) from the DB and places them in the internal table, and you do the processing you require inside the SELECT...ENDSELECT. On the next pass through the loop it fetches the next n records, and you process them accordingly.
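A minimal sketch of that pattern, using your ZZALLA table (the package size is just an example value):

data: lt_package type standard table of zzalla,
      ls_row     type zzalla.

select * from zzalla into table lt_package package size 50000.
* lt_package holds at most 50,000 rows here - process them now.
  loop at lt_package into ls_row.
*   ... aggregate / write / forward ls_row ...
  endloop.
* On the next pass the table is refilled with the next block,
* so memory consumption stays bounded.
endselect.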

Regards,

Rich Heilman


hi Thomas,

"Please don't do this again, otherwise I'll have to suggest your ID for deletion."

How can that be meaningful, Thomas?

For all my contributions, you are giving me the above line as a gift.


Copy/pasting without naming a source is seen as cheating. There is no problem if this was the last time.

Please keep contributing "the right way".

Thanks

Thomas


No, it is not working with PACKAGE SIZE either.


You need to set the package size to something manageable, like 50000, and then process 50,000 lines at a time. If this is not working, post the relevant code and we can take a look.

Regards,

Rich Heilman


Actually, we are working in development, where there are very few records, and the dump is coming in the production server.

So for testing we first want to write code such that the dump for an oversized internal table occurs, and then resolve it. Also, we have no authorization to create a Z table in the development server for testing. Just look at my query below:

do 100000 times.

select * from zzalla APPENDING table t_table package size 50000.

endselect.

enddo.


Simply appending the data to a table will not solve your issue. You have a memory overload. So that means you need to do processing of the 50000 inside the SELECT...ENDSELECT and throw them away.

select * from zzalla into table t_table package size 50000.

* Process the records in here.  Now the next loop will throw away these
* records and bring 50000 new ones, process them here.

endselect.

So basically you are processing only 50000 records at a time. You will not be able to process them all at the same time as you can see in your production system.

Regards,

Rich Heilman


Actually, the issue is that through PACKAGE SIZE we divide the data into two internal tables. That is OK, but finally we want the whole data in one table, so we have to append the data of both tables into one table, and at that point the dump comes.

Is there any possibility to create a view for an internal table?

Edited by: rajcool on Mar 11, 2010 9:12 PM


So the real question is: what are you doing with all of this data? If you are getting dumps based on the sheer volume of data, then it is too much data to present to the user to manipulate, right? So what is it that you are doing with this data? I would think that you would want to summarize it in some way which is understandable or manageable to someone. In that case, you could use the package size to read chunks of data and summarize them into another internal table. Again, you are fetching too much data from the DB, and there is no other way around this other than increasing system-wide settings, which I don't think you want to do. Again, what is it that you want to do with all this data?
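A rough sketch of summarizing per package - the summary structure below is only an assumption, since ZZALLA's real fields are not posted here:

types: begin of ty_summary,
         key_field type c length 10,  " assumed grouping column
         row_count type i,
       end of ty_summary.

data: lt_package type standard table of zzalla,
      ls_package type zzalla,
      lt_summary type standard table of ty_summary,
      ls_summary type ty_summary.

select * from zzalla into table lt_package package size 50000.
  loop at lt_package into ls_package.
*   Map whatever grouping field you need from ls_package into
*   ls_summary-key_field here, then aggregate:
    ls_summary-row_count = 1.
    collect ls_summary into lt_summary.
  endloop.
* lt_package is overwritten on the next pass; only the much
* smaller lt_summary keeps growing.
endselect.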

Regards,

Rich Heilman


Actually, we are doing a daily delta load for the extractor ZZALLA. A function module is used in that extractor, and our SELECT query is defined in that function module. When we run the delta load, it fails with a dump due to internal table memory in R/3. (We need to see if there can be some solution when huge data comes for one day. This has happened twice, with two delta loads.)

So for that we are trying to write some code through which the size of the internal table can be increased.


I've written extractors and ended up with this problem. I solved it using OPEN CURSOR WITH HOLD ... FETCH NEXT CURSOR etc. (There are plenty of examples; in fact, the template extractor function module RSAX_BIW_GET_DATA contains a lot of hints itself.)

I suspect you're trying to extract all the data in one (BW) package.
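Roughly, the skeleton inside the extractor function module looks like this (just a sketch - the names are placeholders, and RSAX_BIW_GET_DATA shows the complete pattern):

* held across calls of the extractor function module
statics: s_cursor type cursor,
         s_open   type c length 1.

data: lt_data type standard table of zzalla.

if s_open is initial.
* first call: open the cursor once and keep it across roll-outs
  open cursor with hold s_cursor for
    select * from zzalla.
  s_open = 'X'.
endif.

* every call: fetch only one package of rows
fetch next cursor s_cursor
  into table lt_data
  package size 50000.
if sy-subrc <> 0.
  close cursor s_cursor.
* no more data - a real extractor would raise NO_MORE_DATA here
endif.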

matt

Former Member

Hi,

It can go to a dump saying out of memory. A possible way of dealing with this is to increase the memory limit on the application server. This is usually done by Basis.
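(For reference only: the limits involved are usually SAP memory profile parameters such as ztta/roll_extension and the abap/heap_area_* parameters; which one bites first and what values are sensible depend on the system, so this really is a Basis decision.)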

Former Member

This message was moderated.
