
TSV_TNEW_PAGE_ALLOC_FAILED

Former Member

Hi SDN,

Whenever I schedule a custom report as a background job in the Production system, the next day it goes to a dump with the message <b>TSV_TNEW_PAGE_ALLOC_FAILED</b>. This is the second time I have got this type of dump with this message. Kindly suggest a good solution for what I need to do. The following is the information shown in the dump.

No storage space available for extending table "IT_53".

What happened?

You attempted to extend an internal table, but the required space was not available.

What can you do?

Please make a note of the actions and input which caused the error.

To resolve the problem, contact your SAP system administrator.

Choose "Print" for a hard copy of the termination message. You can display and administer short dump messages using transaction ST22.

Try to find out (e.g. by targeted data selection) whether the transaction will run with less main memory.

If there is a temporary bottleneck, execute the transaction again.

If the error persists, ask your system administrator to check the following profile parameters:

o ztta/roll_area (1.000.000 - 15.000.000)

  Classic roll area per user and internal mode; the usual amount of roll area per user and internal mode.

o ztta/roll_extension (10.000.000 - 500.000.000)

  Amount of memory per user in extended memory (EM).

o abap/heap_area_total (100.000.000 - 1.500.000.000)

  Amount of memory (malloc) for all users of an application server. If several background processes are running on one server, temporary bottlenecks may occur. Of course, the amount of memory (in bytes) must also be available on the machine (main memory or file system swap).

  Caution: the operating system must be set up so that there is also enough memory for each process. Usually, the maximum address space is too small. Ask your hardware manufacturer or your competence center about this.

o abap/heap_area_dia (10.000.000 - 1.000.000.000)

  Restriction of memory allocated to the heap with malloc for each dialog process.

Parameters for background processes:

o abap/heap_area_nondia (10.000.000 - 1.000.000.000)

  Restriction of memory allocated to the heap with malloc for each background process.

Other memory-relevant parameters are:

o em/initial_size_MB (35 - 1200)

  Extended memory area from which all users of an application server can satisfy their memory requirement.

Error analysis

The internal table "IT_53" could not be enlarged further. To extend the internal table, 12416 bytes of storage space was needed, but none was available. At this point, the table "IT_53" has 1793712 entries.

Please note: to facilitate error handling, the internal table "IT_53" was deleted.

Last error logged in SAP kernel

Component............ "EM"
Place................ "SAP-Server sscprda3_PRD_02 on host sscprda3 (wp 20)"
Version.............. 37
Error code........... 7
Error text........... "Warning: EM-Memory exhausted: Workprocess gets PRIV "
Description.......... " "
System call.......... " "
Module............... "emxx.c"
Line................. 1650

The error reported by the operating system is:
Error number..... " "
Error text....... " "

How to correct the error

The amount of storage space (in bytes) filled at termination time was:

Roll area...................... 6066656
Extended memory (EM)........... "-2146374184"
Assigned memory (HEAP)......... 575633856
Short area..................... 16079
Paging area.................... 24576
Maximum address space.......... " "


A good solution would be appreciated.

Regards,

Kumar..

Accepted Solutions (1)

Former Member

Check with your DBA whether the cube tables on the Oracle side (provided your backend database is Oracle) have statistics created for them. We recently ran into this kind of issue and solved it by creating statistics at the database level. Also try reducing your data package size, and increasing the virtual memory or roll memory.

Answers (3)

Former Member

Hi all,

Thanks for the immediate response. My problem has been resolved. I sat down with the Basis team and we resolved the problem.

Thanks to everyone.

Regards,

Kumar

Former Member

Hi Kumar,

This is Vijay; I am an SAP Basis administrator.

We are also facing the same problem on an R/3 Production server.

What solution did you find for this?

Could you please describe the solution in detail?

Thanks

Vijay Ganga

Former Member

Hi Vijay,

We get these types of error dumps for the following reasons:

1) Sometimes, due to month-end/year-end processing, there is tremendous strain on the system, and this can cause this type of error.

2) It may happen due to a lack of memory.

3) Suppose a database table has 3 million records: selecting those records into an internal table and then moving them from one internal table to another can also produce this type of error.

<u>Expected solutions</u>:

1) Reduce the selection criteria: divide a huge selection into small chunks (see the sketch after this list). For example, if we are selecting data from 2002 to the current date, split it into smaller ranges such as 2002 to 2003, and so on.

2) Check the ABAP code with an ABAPer.

3) Check, through DB02, how many background jobs were scheduled on the day the problem occurred. If too many background jobs are scheduled, we get these types of dumps.

4) Also check the tablespaces and buffer memory.

For more details please see note 369726.
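As a rough sketch of point 1 (an illustration only, not code from the original report; the hard-coded start year and the variable names are assumptions), the posting-date range could be processed one year at a time:

* Process the posting-date range one year at a time so that I_BSIS
* never has to hold the full 2002-to-date result set at once.
DATA: LR_BUDAT   TYPE RANGE OF BSIS-BUDAT,
      LS_BUDAT   LIKE LINE OF LR_BUDAT,
      LV_YEAR(4) TYPE C,
      LV_DATE(8) TYPE C.

LV_YEAR = '2002'.
WHILE LV_YEAR <= SY-DATUM(4).
  REFRESH LR_BUDAT.
  CLEAR LS_BUDAT.
  LS_BUDAT-SIGN   = 'I'.
  LS_BUDAT-OPTION = 'BT'.
  CONCATENATE LV_YEAR '0101' INTO LV_DATE.
  LS_BUDAT-LOW  = LV_DATE.
  CONCATENATE LV_YEAR '1231' INTO LV_DATE.
  LS_BUDAT-HIGH = LV_DATE.
  APPEND LS_BUDAT TO LR_BUDAT.

  SELECT BUKRS HKONT GJAHR BELNR BUDAT BLDAT XBLNR BLART SHKZG DMBTR
    FROM BSIS
    INTO TABLE I_BSIS
    WHERE BUKRS IN S_BUKRS
      AND HKONT IN S_HKONT
      AND BUDAT IN LR_BUDAT
      AND BLDAT IN S_BLDAT
      AND NOT XBLNR LIKE '%CONV%'
      AND BLART IN S_BLART.

* Process and output this year's records here, then release the memory.
  CLEAR I_BSIS[].

  LV_YEAR = LV_YEAR + 1.
ENDWHILE.

Each pass keeps the internal table limited to one year's worth of data instead of the whole selection.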

Due to month-end processing and tremendous strain on the system, I got these dumps.

Hope this helps.

Let me know if you need more details.

Regards,

Kumar.

Former Member

We have the same problem with one custom program.

1. We reduced the selection criteria, in vain.

2. Our ABAPer is still busy debugging.

3. DB02? Where can I check scheduled jobs?

4. Tablespaces and buffers seem ok with us.

Vijay, did you come right?

Former Member

Hi Lakshman,

Check the <b>SELECT</b> statement or the code where the dump occurs, as it tries to use all of your system memory and finally dumps. What exactly are you trying to do in the custom report? Copy and paste the code where the dump gets generated to provide more useful info.

Thanks

Viswa

Former Member

Hi Viswa,

Thanks for the immediate response. The following are the only SELECT statements used in my custom report. The dump points to the second SELECT statement. That one seems OK to me, but I cannot figure out why it dumps.

SELECT BUKRS
       HKONT
       GJAHR
       BELNR
       BUDAT
       BLDAT
       XBLNR
       BLART
       SHKZG
       DMBTR
  FROM BSIS
  INTO TABLE I_BSIS
  WHERE BUKRS IN S_BUKRS
    AND HKONT IN S_HKONT
    AND BUDAT IN S_BUDAT
    AND BLDAT IN S_BLDAT
    AND NOT XBLNR LIKE '%CONV%'
    AND BLART IN S_BLART.

SELECT BUKRS
       BELNR
       GJAHR
       BLART
       BLDAT
       BUDAT
       CPUDT
       XBLNR
       BKTXT
       GRPID
  FROM BKPF
  INTO TABLE I_BKPF
  FOR ALL ENTRIES IN I_BSIS
  WHERE BUKRS EQ I_BSIS-BUKRS
    AND BELNR EQ I_BSIS-BELNR
    AND GJAHR EQ I_BSIS-GJAHR
    AND CPUDT IN S_CPUDT. "entry date

Please suggest a solution. Good solutions are appreciated.

Regards,

Kumar.

Former Member

When you use FOR ALL ENTRIES in the SELECT statement, you need to make sure that the entries coming into the itab I_BSIS are not duplicated, and that their values are compatible with BUKRS, BELNR, GJAHR and CPUDT. Try to debug and check the contents of the itab I_BSIS.
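A rough sketch of that idea (an illustration only, not code from the original report; the work area, key table and field-symbol names are invented here) is to build a de-duplicated key table so that FOR ALL ENTRIES passes each document key to BKPF only once:

DATA: BEGIN OF WA_KEY,
        BUKRS LIKE BSIS-BUKRS,
        BELNR LIKE BSIS-BELNR,
        GJAHR LIKE BSIS-GJAHR,
      END OF WA_KEY.
DATA: I_KEYS LIKE STANDARD TABLE OF WA_KEY.

FIELD-SYMBOLS: <FS_BSIS> LIKE LINE OF I_BSIS.

* Collect only the document keys from the BSIS line items.
LOOP AT I_BSIS ASSIGNING <FS_BSIS>.
  WA_KEY-BUKRS = <FS_BSIS>-BUKRS.
  WA_KEY-BELNR = <FS_BSIS>-BELNR.
  WA_KEY-GJAHR = <FS_BSIS>-GJAHR.
  APPEND WA_KEY TO I_KEYS.
ENDLOOP.

* Remove duplicate keys so FOR ALL ENTRIES does not repeat them.
SORT I_KEYS BY BUKRS BELNR GJAHR.
DELETE ADJACENT DUPLICATES FROM I_KEYS COMPARING ALL FIELDS.

IF NOT I_KEYS[] IS INITIAL.
  SELECT BUKRS BELNR GJAHR BLART BLDAT BUDAT CPUDT XBLNR BKTXT GRPID
    FROM BKPF
    INTO TABLE I_BKPF
    FOR ALL ENTRIES IN I_KEYS
    WHERE BUKRS EQ I_KEYS-BUKRS
      AND BELNR EQ I_KEYS-BELNR
      AND GJAHR EQ I_KEYS-GJAHR
      AND CPUDT IN S_CPUDT. "entry date
ENDIF.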

Former Member

Hi Kumar,

Add an IF condition like the one below, because if this internal table is empty, the FOR ALL ENTRIES clause will read the whole BKPF table. That could be the reason for the insufficient-memory dump.

<b>If not I_BSIS[] is initial.</b>

  SELECT BUKRS
         BELNR
         GJAHR
         BLART
         BLDAT
         BUDAT
         CPUDT
         XBLNR
         BKTXT
         GRPID
    FROM BKPF
    INTO TABLE I_BKPF
    FOR ALL ENTRIES IN I_BSIS
    WHERE BUKRS EQ I_BSIS-BUKRS
      AND BELNR EQ I_BSIS-BELNR
      AND GJAHR EQ I_BSIS-GJAHR
      AND CPUDT IN S_CPUDT. "entry date

<b>endif.</b>

Thanks

Viswa

Former Member

Hi Lakshman Kumar,

If the dump happens during data loading, please try to reduce your packet size.

If the dump happens regardless of data loading, you can try to optimize the custom report with the PACKAGE SIZE addition in the initial SELECT statement.

Example:

... INTO|APPENDING [CORRESPONDING FIELDS OF] TABLE itab [PACKAGE SIZE n]

I think PACKAGE SIZE 1000 would be a good value.

Please also check note 103747.
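A rough sketch of how that could look with the first SELECT of this report (the table, field and select-option names are taken from the code posted above; the processing inside the loop is only indicated by a comment):

SELECT BUKRS HKONT GJAHR BELNR BUDAT BLDAT XBLNR BLART SHKZG DMBTR
  FROM BSIS
  INTO TABLE I_BSIS
  PACKAGE SIZE 1000
  WHERE BUKRS IN S_BUKRS
    AND HKONT IN S_HKONT
    AND BUDAT IN S_BUDAT
    AND BLDAT IN S_BLDAT
    AND NOT XBLNR LIKE '%CONV%'
    AND BLART IN S_BLART.

* I_BSIS now holds only the next package of up to 1000 rows; process it
* here (e.g. read the matching BKPF headers and write the output) before
* the next loop pass replaces it, so the full result set never sits in
* memory at once.

ENDSELECT.

With PACKAGE SIZE, the SELECT becomes a loop that has to be closed with ENDSELECT, and with INTO TABLE the internal table is refilled on every pass, which keeps its size bounded.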

Regards,

Sascha

Former Member

Hi Sascha,

Thanks for the immediate response. In fact, in my SELECT statement I did not use PACKAGE SIZE, APPENDING, or INTO CORRESPONDING FIELDS.

Even so, it went to a dump. Could you please provide further suggestions?

Regards,

Kumar.