Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Short dump on write statement

Former Member
0 Kudos

Hello All,

We have a list report which displays contents of an *.xml file.

It loops over an internal table and uses WRITE statements with the COLOR option to display the data of the file.

The internal table has a single column of type CHAR, length 3600.

This internal table has 0.5 million records.

Currently the program is short-dumping on one of these WRITE statements with the error TSV_TNEW_PAGE_ALLOC_FAILED.

It is happening only for one such case.

The memory is probably insufficient.

Is there any specific SAP documentation explaining the memory usage of the WRITE statement in such cases?

Thanks,

Prashant

16 REPLIES

matt
Active Contributor
0 Kudos

How are you using the write statement? It's highly unlikely that the issue is directly related to WRITE, so you're not going to get any documentation about memory usage of "WRITE".

Former Member
0 Kudos

Hi Matthew,

The following code is being executed within a loop over the internal table containing 0.5 million records:

color = color + 1.
     color = color MOD 2.
   ENDON.
   FORMAT RESET.
   IF color = 1.
     WRITE: /2(79) space COLOR 2.
     IF text1 = '  0'.
       text = sy-tabix.
       WRITE: 3 text COLOR 2, 11 text2 COLOR 2.
     ELSE.
       WRITE: 2 '+' COLOR 2, 3 text1 COLOR 2, 11 text2 COLOR 2.
     ENDIF.
   ELSE.
     WRITE: /2(79) space COLOR 4.
     IF text1 = '  0'.
       text = sy-tabix.
       WRITE: 3 text COLOR 4, 11 text2 COLOR 4.
     ELSE.
       WRITE: 2 '+' COLOR 4, 3 text1 COLOR 4, 11 text2 COLOR 4.
     ENDIF.
   ENDIF.
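
For what it's worth, the same alternating-color logic could be condensed by setting the color once with FORMAT COLOR rather than repeating every WRITE twice. A sketch only, under the assumption that the surrounding variables (text, text1, text2) are as in the snippet above; lv_color is an assumed name:

```abap
* Sketch: hold the alternating color in a variable so each
* WRITE statement appears only once.
DATA lv_color TYPE i.

lv_color = ( lv_color + 1 ) MOD 2.
FORMAT RESET.
IF lv_color = 1.
  FORMAT COLOR 2.       " applies to all following output
ELSE.
  FORMAT COLOR 4.
ENDIF.
WRITE: /2(79) space.
IF text1 = '  0'.
  text = sy-tabix.
  WRITE: 3 text, 11 text2.
ELSE.
  WRITE: 2 '+', 3 text1, 11 text2.
ENDIF.
FORMAT RESET.
```

This does not change the memory behaviour, but it halves the number of WRITE statements to maintain.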

Thanks,

Prashant

matt
Active Contributor
0 Kudos

I assume then that your logic is:

LOOP AT data.

write data.

ENDLOOP.

My guess is that you're already on the cusp of running out of memory. Any action - writing to a list or whatever - will increase memory usage to some extent. The WRITE just pushes it over the edge.

You can check by running in debug with the memory monitor. How much does memory usage increase with each iteration of your loop?

I do wonder, however, what the purpose is of outputting half a million records? Who is ever going to read it?!

Former Member
0 Kudos

Hi Matthew,

It is an old report used to view the data of files stored on a UNIX server.

We are considering a workaround: download the file separately and then view it.

It will spare us the short dump.

Thanks,

Prashant

raymond_giuseppi
Active Contributor
0 Kudos

You could try to debug your report and, before the dump occurs (not always easy), run a memory usage analysis.

(ref: The Memory Analysis tool)

Regards,

Raymond

0 Kudos

Hi Raymond,

The memory analysis showed that the internal table was consuming the bulk of the memory.

However, the dump occurs only after the data has been read from the dataset completely, while we are looping over this internal table.

Thanks,

Prashant

Former Member
0 Kudos

Yes, the memory is insufficient. I think the data volume is too large.

Check with the Basis team; they can increase the memory.

0 Kudos

Hi Sai,

It is happening with this one program only, and only for this one file.

So increasing the memory is not a good idea, as per the basis team.

Thanks,

Prashant

former_member189845
Active Participant
0 Kudos

This message was moderated.

Former Member
0 Kudos

Hi Prashant,

The dump means that there is not enough runtime memory to hold the data required to complete the process. The cause may not be the WRITE statement itself. The best thing to do is to check the memory parameters, or to reduce the data volume being processed.

Could you please provide the complete long text of the dump? That would give us more insight into what actually occurred.

Best regards,

Praveenkumar T

0 Kudos

Hi Praveen,

Thanks a lot, but due to our client's privacy policies we cannot divulge such information on public portals.

Thanks,

Prashant

raymond_giuseppi
Active Contributor
0 Kudos

What use will you have for the spool thus generated? Could you consider "breaking" it into smaller spools, with a database commit between every spool open/close to free some memory (NEW-PAGE PRINT ON/OFF)?
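
A sketch of that approach, assuming the internal table and work area names (itab, wa) and an arbitrary chunk size of 10,000 lines:

```abap
* Sketch: split the list output into several smaller spool requests
* instead of one huge list. Names and chunk size are assumed.
CONSTANTS lc_chunk TYPE i VALUE 10000.

LOOP AT itab INTO wa.
  IF sy-tabix MOD lc_chunk = 1.
    NEW-PAGE PRINT OFF.                       " close previous spool, if any
    NEW-PAGE PRINT ON NEW-SECTION NO DIALOG.  " open a fresh spool request
  ENDIF.
  WRITE: / wa.
ENDLOOP.
NEW-PAGE PRINT OFF.                           " close the last spool
```

Each NEW-SECTION starts a new spool request, so no single list has to hold all half-million lines at once.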

Regards,

Raymond

0 Kudos

Hi Raymond,

The report is to display the data of a file from unix server.

We are currently working on a workaround because this report is used by a lot of users and the impact is very high.

Also, we generally do not have such bulky files, so we were looking for some small breakfix to get this program up and working.

Thanks,

Prashant

0 Kudos

As the file is read from a UNIX server, I guess you use READ DATASET to retrieve the contents?

If that is the case, you could break up the reading and writing of the file into several parts, e.g. after reading 100 lines, write those 100 lines and then proceed to read the next 100.
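
A minimal sketch of that idea: write each line as it is read, so the program never has to hold the whole file in an internal table. The file path and variable names are assumed, not taken from the original program:

```abap
* Sketch: stream the file to the list instead of buffering
* all 0.5 million lines first. lv_file is a hypothetical path.
DATA: lv_file TYPE string VALUE '/tmp/example.xml',  " assumed path
      lv_line TYPE c LENGTH 3600.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET lv_file INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.            " end of file reached
    ENDIF.
    WRITE: / lv_line.  " display directly, no giant internal table
  ENDDO.
  CLOSE DATASET lv_file.
ENDIF.
```

Note that the list buffer itself still grows with each WRITE, so this helps most when combined with limiting or splitting the output.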

paul_bakker2
Active Contributor
0 Kudos

Hi,

I'm sorry, but the design of this program sounds loco.

Loading 500K records into an internal table, and then trying to write it all out to the spool?

That's too much data for anyone to make sense of (even with color coding). Is it meant to be stored away, for auditing purposes?

It might be better to create an ALV report where the users can select which parts of the file they want to view. This way they can use sorting, view it on the screen, hide columns, etc.
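
If the ALV route is taken, a minimal sketch using the standard SALV framework could look like this, assuming the file lines are in a table lt_lines with a structured line type (names are assumptions, not from the original report):

```abap
* Sketch: display the already-filled internal table via SALV,
* giving users sorting, filtering, and column control for free.
DATA lo_alv TYPE REF TO cl_salv_table.

TRY.
    cl_salv_table=>factory( IMPORTING r_salv_table = lo_alv
                            CHANGING  t_table      = lt_lines ).
    lo_alv->get_functions( )->set_all( abap_true ).  " enable toolbar functions
    lo_alv->display( ).
  CATCH cx_salv_msg.
    MESSAGE 'ALV display failed' TYPE 'I'.
ENDTRY.
```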

Or perhaps just use an XML editor to view the file?

cheers

Paul

0 Kudos

Hi Paul,

It is a very old report. Generally we do not encounter any such dumps.

But it is for this particular file that we received this dump.

So we were thinking of some breakfix rather than changing the process.

This report is generally used to view files stored on the UNIX application server.

We do not have such bulky files in general, it is just a one off case.

Thanks,

Prashant