Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

The maximum size of an SQL statement has been exceeded.

colin_cheong
Contributor
0 Kudos

In ST22, the ABAP dump shows that one of the possible reasons is "The maximum size of an SQL statement has been exceeded."

How can I find out what this maximum size is, and how can I tell whether it has been reached?

Is this shown anywhere in the log?

1 ACCEPTED SOLUTION

Former Member
0 Kudos

Hi,

Can you paste your SQL query?

Otherwise, why don't you check the SAP Notes related to this dump error? You should find some notes that describe this kind of issue, for example Note 946196.

Also check SAP Note 1002491.

Hope it helps.

Thanks

Praveen

Edited by: praveen parasiya on Dec 21, 2007 3:17 AM

11 REPLIES


0 Kudos

Here's the relevant part of it:

An exception occurred. This exception will be dealt with in more detail
below. The exception, assigned to the class 'CX_SY_OPEN_SQL_DB', was not
caught, which led to a runtime error. The reason for this exception is:

The SQL statement generated from the SAP Open SQL Statement violates a
restriction imposed by the database system used in R/3.

Possible errors:
o The maximum size of an SQL statement has been exceeded.
o The statement contains too many input variables.
o The space needed for the input data exceeds the available memory.
o ...
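For context (an editorial illustration, not from the thread): each single-value row in a select-options range table typically becomes one host variable in the generated WHERE clause, so the number of rows in SO_MATNR roughly approximates the number of input variables the database has to accept. A minimal sketch to check that count:

```abap
" Hedged sketch: estimate how many input variables the clause
" WHERE matnr IN so_matnr will generate. Each 'I'/'EQ' row in
" the range table becomes roughly one host variable in the
" final SQL statement sent to the database.
DATA: lv_lines TYPE i.

DESCRIBE TABLE so_matnr LINES lv_lines.
WRITE: / 'Rows in SO_MATNR (approx. input variables):', lv_lines.
```

If this number runs into the thousands, the generated statement can exceed the database limit described in the dump.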

0 Kudos

Please show us the actual SQL from your ABAP.

0 Kudos

Here is the SQL:

SELECT MATKL MATNR MEINS INTO TEMP_KEY FROM MARA
  WHERE MATNR IN SO_MATNR.
  MOVE-CORRESPONDING TEMP_KEY TO ITAB_KEY1.
  IF WS_PCHR <> SPACE.
    CLEAR ITAB_CHAR.
    CLEAR ITAB_KEY1-CHARACTER. REFRESH ITAB_KEY1-CHARACTER.
    PERFORM GET_CHAR TABLES ITAB_CHAR ITAB_KEY1-CHARACTER
                     USING ITAB_KEY1-MATNR.
  ENDIF.
  APPEND ITAB_KEY1. CLEAR ITAB_KEY1.
ENDSELECT.

0 Kudos

I would guess with this particular SELECT that the problem is that so_matnr is very big, i.e. it contains a lot of material numbers. Is that true?

0 Kudos

Not really.

After going through the correspondence, I have a feeling that it's the opposite of what you said: the entries were left blank, which may result in an enormous number of possible selections.

I guess I will have to fix the logic to close this loophole during the record search.
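If an unrestricted selection really is the loophole, one way to close it (a sketch; the check and message text are editorial assumptions, not from the original program) is to refuse to run when the select-options are empty:

```abap
" Hedged sketch: guard against an unrestricted selection.
" SO_MATNR is the select-options table from the original post.
IF so_matnr[] IS INITIAL.
  " Force the user to restrict the selection instead of
  " reading the whole of MARA.
  MESSAGE 'Please restrict the material number selection' TYPE 'E'.
ENDIF.
```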

0 Kudos

Hi friend,

I think your SQL statement is mismatched, and that is why it goes into a short dump. It is executed inside a loop, so its size can grow without limit.

If it is useful, reward points.

regards,

satish.

former_member181962
Active Contributor
0 Kudos

Did you give a lot of entries in your select-options, something like over 1,000 entries?

I think that must be the problem.

Regards,

Ravi Kanth Talagana
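One way to act on this suggestion (an editorial sketch, under the assumption that SO_MATNR holds only single 'EQ' values; the chunk size of 500, and the names LT_CHUNK and LT_RESULT, are arbitrary) is to split the range table into smaller pieces and run one SELECT per chunk:

```abap
" Hedged sketch: select in chunks so the generated SQL stays
" below the database's statement-size limit.
DATA: lt_chunk  LIKE so_matnr OCCURS 0 WITH HEADER LINE,
      lt_result LIKE temp_key OCCURS 0 WITH HEADER LINE,
      lv_total  TYPE i,
      lv_count  TYPE i.

DESCRIBE TABLE so_matnr LINES lv_total.

LOOP AT so_matnr.
  APPEND so_matnr TO lt_chunk.
  DESCRIBE TABLE lt_chunk LINES lv_count.
  " Fire one SELECT per 500 entries, and once more at the end.
  IF lv_count >= 500 OR sy-tabix = lv_total.
    SELECT matkl matnr meins
      FROM mara
      APPENDING CORRESPONDING FIELDS OF TABLE lt_result
      WHERE matnr IN lt_chunk.
    REFRESH lt_chunk.
  ENDIF.
ENDLOOP.
```

Note that this split is only safe when every row is a single inclusive value; ranges with patterns or exclusions cannot be partitioned this way without changing the result.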

0 Kudos

I don't think the user does that. Otherwise, they would probably have saved it as a variant for future use.

What other possibilities are there?

0 Kudos

1. Declare TEMP_KEY as an internal table.

2. Use the PACKAGE SIZE option — this gets the data in bite-sized chunks:

SELECT MATKL MATNR MEINS
       INTO CORRESPONDING FIELDS OF TABLE TEMP_KEY
       PACKAGE SIZE 1000
       FROM MARA
       WHERE MATNR IN SO_MATNR.
  LOOP AT TEMP_KEY.
    MOVE-CORRESPONDING TEMP_KEY TO ITAB_KEY1.
    IF WS_PCHR <> SPACE.
      CLEAR ITAB_CHAR.
      CLEAR ITAB_KEY1-CHARACTER. REFRESH ITAB_KEY1-CHARACTER.
      PERFORM GET_CHAR TABLES ITAB_CHAR ITAB_KEY1-CHARACTER
                       USING ITAB_KEY1-MATNR.
    ENDIF.
    APPEND ITAB_KEY1. CLEAR ITAB_KEY1.
  ENDLOOP.
ENDSELECT.
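An equivalent way to fetch in packages (an editorial sketch; useful when the processing inside the loop triggers a database commit, which would invalidate a plain SELECT loop) is an explicit cursor:

```abap
" Hedged sketch: explicit cursor with PACKAGE SIZE, as an
" alternative to the SELECT ... ENDSELECT package loop above.
DATA: lv_cursor TYPE cursor.

OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT matkl matnr meins
         FROM mara
         WHERE matnr IN so_matnr.

DO.
  FETCH NEXT CURSOR lv_cursor
        INTO CORRESPONDING FIELDS OF TABLE temp_key
        PACKAGE SIZE 1000.
  IF sy-subrc <> 0.
    EXIT.  " no more packages
  ENDIF.
  " Process the current package of TEMP_KEY here.
ENDDO.

CLOSE CURSOR lv_cursor.
```

Either way, note that fetching in packages limits memory use per package; if the WHERE clause itself is too large, the selection must also be split as discussed earlier in the thread.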

0 Kudos

Hi Ravi Kanth,

My SELECT query is trying to access over 10,000 records, and I am getting the same short dump. How should I deal with this? My requirement expects me to pick up that many records from the table.

Please suggest how I can go about this problem.