
Query on memory consumption during SQL

Former Member

Hi SAP Gurus,

Could I kindly request your input on the following scenario?

To put it quite simply, we have a program that is required to retrieve all the fields from a large custom table, i.e. the SELECT statement uses an asterisk. Unfortunately, there isn't really a way to avoid this short of a total overhaul of the code, so we had to settle for it (for now).

The program retrieves from the database table using a WHERE clause that filters on a single company code value. Kindly note that company code is not the only key field in the table. To help with memory consumption, the original developer employed retrieval by packages (also note that the total length of each record is 1803).
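Roughly, the packaged retrieval looks like the following sketch (simplified, with placeholder names; the actual statement from the program is posted further down the thread):

SELECT * FROM <custom table>        " all fields, unavoidable for now
         INTO TABLE lt_temp
         PACKAGE SIZE l_package     " e.g. 277 rows per package
         WHERE bukrs = p_bukrs.     " single company code value
  " ... process the current package held in lt_temp ...
ENDSELECT.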

The problem encountered is as follows:

- Using company code A, retrieving around 700k entries in packages of 277, the program ran without any issues.

- However, using company code B, retrieving around 1.8m entries in packages of 277, the program encountered a TSV_TNEW_PAGE_ALLOC_FAILED short dump. The error occurs the very first time the program passes through the SELECT statement, so it has not even reached any further internal table processing yet.

Just about the only notable difference between the two company codes is the number of corresponding records they have in the table. I checked whether company code B had longer values in its columns than company code A, but they are just the same.

What I do not quite understand is why the memory consumption changed just by changing the company code in the selection. I thought the memory consumed for both company codes should be the same, at least at the beginning, since we are retrieving by packages and therefore not trying to get all of the records at once. The fact that it failed at the very start, however, has shown me that I'm gravely mistaken.

Could someone please enlighten me on how memory is consumed during database retrieval?

Thanks!


4 REPLIES

former_member192616
Active Contributor

Hi,

could you please post your OPEN SQL statement?

Kind regards,

Hermann

Former Member

Hi Hermann,

The SQL statement is as follows:

SELECT * FROM <custom table>
         INTO TABLE lt_temp
         PACKAGE SIZE l_package
         FOR ALL ENTRIES IN lt_bukrs
         WHERE bukrs = lt_bukrs-bukrs
         ORDER BY PRIMARY KEY.

  <....>

ENDSELECT.

"For all entries" was used, but the tests I've been doing had only been for a single value.

The package size is computed as 500,000 / <length of table row> (the custom table is determined dynamically). In this particular test the row length is 1803, so the package size comes out as 277.
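For illustration, a minimal sketch of how that package size could be derived (the names here are placeholders; in the real program the row length is determined dynamically from the table's structure):

" Roughly 500,000 divided by the row length gives the number of rows per package.
CONSTANTS lc_buffer_size TYPE i VALUE 500000.
DATA l_row_length TYPE i VALUE 1803.   " determined dynamically in the real program
DATA l_package    TYPE i.

l_package = lc_buffer_size / l_row_length.   " 500000 / 1803 -> 277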

I also continued testing earlier today, looking for "valid" company codes that will not cause an error. To do the check faster, I decided to bulk a lot of them together in the lt_bukrs table. What I noticed was that the bigger the total no. of records is expected to be retrieved overall from the table, the more memory is consumed. I proceeded to check what would happen if I try to extract from company code A once more, but add a bunch of other company codes so that the total records is nearly equivalent to the failing company code B. As expected, the program short dumped on page alloc error.

I still don't understand why the packaging didn't work though...

Edited by: mjhermi on Jul 7, 2010 6:03 AM


Hi,

with FAE (FOR ALL ENTRIES) the whole query is executed even for a single record in the itab, and all results for the company code are transferred from the database to the DBI (database interface), since the duplicates are removed by the DBI, not by the database.

If you use PACKAGE SIZE, the result set is buffered in a system table in the DBI (which allocates memory from your user quota), and from there the packages are built and handed over to your application (into table lt_temp).

See the recent ABAP documentation:

Since duplicate rows are only removed on the application server, all rows specified using the WHERE condition are sometimes transferred to an internal system table and aggregated here. This system table has the same maximum size as the normal internal tables. The system table is always required if addition PACKAGE SIZE or UP TO n ROWS is used at the same time. These do not affect the amount of rows transferred from the database server to the application server; instead, they are used to transfer the rows from the system table to the actual target area.

What you should do: calculate the size needed for your big company code B, i.e. the number of rows multiplied by the line length. That is the minimum amount you need for your user memory quota (quotas can be checked with ABAP report RSMEMORY). If the amount of memory is sufficient, then try without PACKAGE SIZE.
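As a rough sense of scale, taking the figures mentioned earlier in the thread (about 1.8 million rows for company code B and a line length of 1803, assuming that length is in bytes), the flat row data alone comes to roughly 1,800,000 * 1803 ≈ 3.2 GB before any internal table overhead, so the quota would have to cover at least that. Without PACKAGE SIZE, the statement would look like this: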

SELECT * FROM <custom table>
         INTO TABLE lt_temp
         FOR ALL ENTRIES IN lt_bukrs
         WHERE bukrs = lt_bukrs-bukrs
         ORDER BY PRIMARY KEY.

This might actually use less memory than the PACKAGE SIZE variant of the FOR ALL ENTRIES: since with FAE the data is buffered in the DBI anyway (and subtracted from your quota), you can read it in one go and avoid holding portions of it twice (once in the DBI buffer and again as a package in lt_temp).

If the amount of memory is still too big, you have to either increase the quotas, select less data (additional WHERE conditions), or avoid using FAE in this case so that you do not read all the data in one go, for example by processing one company code at a time as sketched below.
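A minimal sketch of that last option, assuming lt_bukrs holds the company codes and with ls_bukrs introduced here purely for illustration; without FOR ALL ENTRIES, PACKAGE SIZE does not force the whole result set for all company codes into the DBI system table first:

DATA ls_bukrs LIKE LINE OF lt_bukrs.

LOOP AT lt_bukrs INTO ls_bukrs.
  " One company code per SELECT, so each pass of the packaged loop
  " only holds l_package rows of that company code in lt_temp.
  SELECT * FROM <custom table>
           INTO TABLE lt_temp
           PACKAGE SIZE l_package
           WHERE bukrs = ls_bukrs-bukrs
           ORDER BY PRIMARY KEY.
    " ... process the current package of lt_temp ...
  ENDSELECT.
ENDLOOP.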

Hope this helps,

Hermann


Hi Hermann,

Thank you very much for your reply. I was able to do a bit more testing and figured out that using FAE with package sizes definitely does not help with memory consumption; your answer has confirmed this for me.

I'll try out what you have suggested. At least this is a lot clearer to me now.

Thanks again!