Application Development Discussions

ABAP LDB

Former Member
0 Kudos

Hi All,

The following query gives a short dump when I run it in a report program.

SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp
       WHERE objek IN dd_kunnr
         AND atinn = g_atinn.

The internal table definition is:

DATA: BEGIN OF wa_dtc,
        kunnr TYPE ausp-objek,
        atwrt TYPE ausp-atwrt,
      END OF wa_dtc.

DATA i_dtc LIKE HASHED TABLE OF wa_dtc
     WITH UNIQUE KEY kunnr.

dd_kunnr is a table field from Logical database DDF.

This happens only for one customer; it works for a different customer. The program creates an aged trial balance report.

It cannot be a type mismatch, because it runs for a different customer. dd_kunnr is getting the data (there are about 1040 entries).

Any ideas?

1 ACCEPTED SOLUTION

Former Member
0 Kudos

Glad to hear that, so can you please reward and close?

Thanks,

Srinivas

14 REPLIES

Former Member
0 Kudos

What is the short dump? I think you are getting it because of the unique-key restriction in your internal table definition.

If you declare your internal table as a standard table without a key and then remove the duplicate customers with the following statements, you should be fine.

SORT i_dtc BY kunnr.

DELETE ADJACENT DUPLICATES FROM i_dtc COMPARING kunnr.

Regards,

Srinivas
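Put together, the standard-table variant suggested above might look something like this (a sketch only; the program's existing keyed reads on the hashed table would then need to become READ TABLE ... BINARY SEARCH reads on the sorted table):

```abap
* Sketch of the suggested standard-table variant.
TYPES: BEGIN OF ty_dtc,
         kunnr TYPE ausp-objek,
         atwrt TYPE ausp-atwrt,
       END OF ty_dtc.
DATA i_dtc TYPE STANDARD TABLE OF ty_dtc.

SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp
       WHERE objek IN dd_kunnr
         AND atinn = g_atinn.

SORT i_dtc BY kunnr.
DELETE ADJACENT DUPLICATES FROM i_dtc COMPARING kunnr.
```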

0 Kudos

If I change the table definition from a hashed table to a standard table, program performance may go down (it may take more time and could even time out). Also, the program currently reads the hashed table by its key, so that code would need to change as well.

Let me know whether changing to a standard table is really the better option.

Thanks,

Sobhan.

0 Kudos

Hi Sobhan,

I know that performance will take a hit if you change the table to a standard table. What I am attempting here is to find the root cause. If duplicate values of kunnr are causing the dump, then you need to at least remove the "unique" restriction.

Srinivas

0 Kudos

Also remember that classification/characteristic values can be date dependent, meaning they have a validity start and end date. So if you add the date to the mix, you may not get the duplicate issue.

Can you tell me what the dump says?

Srinivas
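If date dependency is in play, the check described above could be sketched like this. It assumes the change-management fields DATUV (valid from) and LKENZ (deletion indicator) are actually maintained on AUSP in your system, which is only the case when engineering change management is active — verify that first.

```abap
* Sketch only: restrict AUSP rows to those valid on the key date.
* DATUV / LKENZ are assumptions; they are only filled when
* engineering change management is in use.
SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp
       WHERE objek IN dd_kunnr
         AND atinn = g_atinn
         AND datuv LE sy-datum
         AND lkenz = space.
```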

0 Kudos

Srinivas,

Thank you very much for the quick response.

These are the dump details:

The SQL statement generated from SAP Open SQL violates
a restriction imposed by the database system used in R/3.
For details, refer to either the system log or the developer trace.

Possible reasons for the error:

1. Maximum size of an SQL statement exceeded
2. The statement contains too many input variables
3. The input data requires more space than is available

Thanks,

Sobhan.

0 Kudos

I cannot tell much from this error, but try changing either the definition of the internal table or the definition of dd_kunnr, which refers to the LDB table field.

Change the table first and see if it works. If it does, your issue is with the internal table definition; we can then tackle the performance issue.

Regards,

Srinivas

0 Kudos

Hi All,

I changed the hashed table to a standard internal table (without a key), but I am still getting the same problem. I looked at the developer trace and found the following error messages:

Error code: DBIF_RSQL_INVALID_RSQL

RSQL Error 13 when accessing table AUSP

ERROR => dbtran error (set_input_da_spec) : statement too big.

My new internal table definition is as follows:

DATA: BEGIN OF wa_dtc,
        kunnr TYPE ausp-objek,
        atwrt TYPE ausp-atwrt,
      END OF wa_dtc.

DATA i_dtc LIKE wa_dtc OCCURS 0 WITH HEADER LINE.

The actual query is:

SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp
       WHERE objek IN dd_kunnr
         AND atinn = g_atinn.

SORT i_dtc BY kunnr.
DELETE ADJACENT DUPLICATES FROM i_dtc COMPARING kunnr.

Any new ideas?

Thanks,

Sobhan.

0 Kudos

It looks like your error is due to the large number of values in dd_kunnr. Are the dd_kunnr values single values rather than ranges? You mentioned that dd_kunnr had 1040 entries; if they are all single values, that produces a very long SQL statement, which is why you are getting this error.

It is not the internal table definition. How are you filling dd_kunnr? Please also give us the data declaration of this variable.

Srinivas

Former Member
0 Kudos

Hi, let me add something.

First, Srinivas Adavi is right: you should replace the hashed table with a standard internal table. Your select condition does not cover the full key, so the result can contain multiple entries. If several result entries have the same objek (kunnr), your application will dump, because you defined the table as a unique-key hashed table. Using a standard table first lets us analyze the reason for the dump.

Second, your dump info suggests the problem is in the select statement itself. In my experience, this occurs when the range is too large and the generated statement exceeds the maximum size.

Can you check dd_kunnr in the dump case — is there really only a single customer in it?

Please also be aware that objek and kunnr do not have the same type, so it is not ideal to compare them directly.

thanks

Former Member
0 Kudos

Hi. As you mentioned, "dd_kunnr is getting the data (there are about 1040 entries)", and according to what you posted, the error is that the statement is too big.

As I said in my earlier reply, the problem is that the range is too large, which makes the generated select statement exceed the maximum length. You can either adjust the configuration to enlarge the limit, or split the select into several steps.

Try reducing dd_kunnr; if the dump no longer happens, my assumption is confirmed.

thanks
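The "split into several steps" idea mentioned above could be sketched like this: process dd_kunnr in fixed-size packages so that no single generated SQL statement becomes too long. The package size of 500 is an assumption to tune, and i_dtc is assumed here to be a standard table that is deduplicated afterwards.

```abap
CONSTANTS lc_block TYPE i VALUE 500.   " assumed package size
DATA: lt_part LIKE dd_kunnr[],         " one package of the range
      lv_idx  TYPE i VALUE 1.

DO.
  CLEAR lt_part.
  " copy the next package of range lines
  APPEND LINES OF dd_kunnr[] FROM lv_idx TO lv_idx + lc_block - 1
         TO lt_part.
  IF lt_part IS INITIAL.
    EXIT.                              " no lines left
  ENDIF.
  SELECT objek atwrt APPENDING TABLE i_dtc
         FROM ausp
         WHERE objek IN lt_part
           AND atinn = g_atinn.
  lv_idx = lv_idx + lc_block.
ENDDO.

SORT i_dtc BY kunnr.
DELETE ADJACENT DUPLICATES FROM i_dtc COMPARING kunnr.
```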

0 Kudos

Thank you all.

Srinivas,

I don't know how to find the DD_KUNNR definition; it is not available in the program. What I know is that the logical database name is DDF and its nodes are KNA1, KNB1, BSID, BKPF, and BSEG. If you can tell me how to find it, I will look it up.

zhenglin gu,

I tried splitting it into two select statements as a simple test, like this, but the same short dump occurred at the first select statement.

SELECT objek atwrt INTO TABLE i_dtc2
       FROM ausp
       WHERE objek IN dd_kunnr.

SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp FOR ALL ENTRIES IN i_dtc2
       WHERE objek = i_dtc2-objek
         AND atinn = g_atinn.

Can you please let me know if there is a better way to split the query, and how I can reduce dd_kunnr?

Thanks ,

Sobhan.

0 Kudos

OK, so DD_KUNNR is a select-option for KNA1-KUNNR.

Now we know that your problem is caused by too many single values for kunnr in DD_KUNNR. What I suggest is to select from AUSP based only on G_ATINN and then delete the records that don't satisfy the DD_KUNNR criteria.


SELECT objek atwrt INTO TABLE i_dtc
                   FROM ausp
                 WHERE atinn = g_atinn.
DELETE i_dtc WHERE NOT objek IN dd_kunnr.

If it helps, please reward and close the post.

Thanks,

Srinivas
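If reading all of AUSP for one characteristic turns out to be too heavy, the accepted approach could additionally be narrowed by class type. The value '011' (customer classification) is an assumption here — check the class type actually used in your data.

```abap
* Sketch: same approach as the accepted answer, additionally
* restricted to one class type to reduce the rows read.
* '011' is an assumed customer class type, not taken from the thread.
SELECT objek atwrt INTO TABLE i_dtc
       FROM ausp
       WHERE atinn = g_atinn
         AND klart = '011'.
DELETE i_dtc WHERE NOT objek IN dd_kunnr.
```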

0 Kudos

Srinivas,

Excellent. It worked. I will award the points.

Thanks,

Sobhan.
