Former Member

Dump DBIF_RSQL_INVALID_CURSOR while writing data to local PC

Hi all, 😊

I've been spending a lot of time on this issue without success. My requirement is to read data from several DB tables and write it to the presentation server (local PC).

Some of the tables are small and cause no problems, but some of them are up to 8 GB or 250,000 rows in size, and an internal table can store roughly 500 MB.

I tried the FETCH NEXT CURSOR statement and a SELECT * ... ENDSELECT loop to read the data into an internal table in packages of 5,000 rows. The first pass works fine: the internal table handles the 5,000 rows without any problems. But after the first package has been written to the local Excel file, the program tries to fetch the next package (the next 5,000 rows) and dumps:

==> DBIF_RSQL_INVALID_CURSOR

Error analysis:

One of the database selections included a database commit. The selection was then supposed to continue. Before a database commit, however, all outstanding database selections must be concluded.

Possible causes in the application program:

Within a loop (SELECT/LOOP/EXEC SQL or a FETCH statement), one of the following statements is used:

- MESSAGE (apart from MESSAGE S...)
- COMMIT WORK
- ROLLBACK WORK
- CALL SCREEN
- CALL DIALOG
- CALL TRANSACTION
- SUBMIT
- BREAK-POINT
- WAIT

................

And the DB table has no primary key or anything similar that I could use in a WHERE clause. 😭

I know there are similar discussions on the web and in this forum. I have read a lot of them (nearly all of them), but none of them helped.

So please help me!

Thank you!

Here is my code:

FIELD-SYMBOLS: <gs_fs>    TYPE any,
               <gs_wa>    TYPE any,
               <gs_table> TYPE ANY TABLE,
               <gs_comp>  TYPE any.

CONSTANTS: co_package_size TYPE i VALUE 5000.

DATA: lt_write_data_to_file TYPE TABLE OF string,
      l_xout                TYPE string,
      l_xout_field          TYPE string,
      l_tabname             TYPE ddobjname,   " filled earlier with the DB table name
      l_output_path         TYPE string,      " filled earlier with the target file path (declaration was missing in the snippet)
      l_dref                TYPE REF TO data,
      l_number_rows         TYPE i,
      l_bool                TYPE string VALUE 'true',
      l_db_cursor           TYPE cursor,
      l_start               TYPE i VALUE 0,
      l_end                 TYPE i VALUE co_package_size,
      l_lines               TYPE i.

CLASS cl_abap_char_utilities DEFINITION LOAD.
CONSTANTS: c_hor_tab TYPE c VALUE cl_abap_char_utilities=>horizontal_tab.

" Create a generic standard table with the line type of the DB table.
CREATE DATA l_dref TYPE STANDARD TABLE OF (l_tabname) WITH NON-UNIQUE DEFAULT KEY.
ASSIGN l_dref->* TO <gs_table>.

SELECT COUNT(*) FROM (l_tabname) INTO l_number_rows.

IF l_number_rows >= 1.

  OPEN CURSOR l_db_cursor FOR SELECT * FROM (l_tabname).

  DO.
    " <== Here comes the dump, after the first 5000 rows were written to the file
    FETCH NEXT CURSOR l_db_cursor INTO TABLE <gs_table> PACKAGE SIZE co_package_size.

    IF sy-subrc <> 0.
      CLOSE CURSOR l_db_cursor.
      EXIT.
    ENDIF.

    " Flatten every row of the package into one tab-separated string.
    LOOP AT <gs_table> ASSIGNING <gs_wa>.
      CLEAR l_xout.

      DO.
        ASSIGN COMPONENT sy-index OF STRUCTURE <gs_wa> TO <gs_fs>.

        IF sy-subrc = 0.
          IF sy-index = 1.
            l_xout = <gs_fs>.
          ELSE.
            l_xout_field = <gs_fs>.
            CONCATENATE l_xout l_xout_field INTO l_xout SEPARATED BY c_hor_tab.
          ENDIF.
        ELSE.
          EXIT.
        ENDIF.
      ENDDO.

      APPEND l_xout TO lt_write_data_to_file.
    ENDLOOP.

    " <== Here I already tried a lot of download functions and also implemented my own download function
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename              = l_output_path
        filetype              = 'ASC'
        write_field_separator = 'X'
        append                = 'X'
      TABLES
        data_tab              = lt_write_data_to_file.

    CLEAR lt_write_data_to_file.
  ENDDO.

ENDIF.


1 Answer

  • May 15, 2014 at 01:33 PM

    GUI_DOWNLOAD and similar functions for local download interrupt the database cursor of package selects, as you have found out the hard way. I don't see an easy workaround for this at the moment, short of trying something very different, like downloading to the application server and then having the file transferred via FTP, or similar.
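
    For illustration, here is a rough sketch of the app-server variant, reusing the declarations from your program; the path in lv_file is just an assumption, and the transfer of the finished file to the PC (FTP, CG3Y, ...) is not shown. Writing the packages with OPEN DATASET / TRANSFER does not cause a round trip to the frontend, so the FETCH cursor stays valid:

    " Sketch: stream each package to an application server file instead of GUI_DOWNLOAD.
    DATA: lv_file TYPE string VALUE '/tmp/export.txt'.  " assumption: writable path on the app server

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    OPEN CURSOR l_db_cursor FOR SELECT * FROM (l_tabname).
    DO.
      FETCH NEXT CURSOR l_db_cursor INTO TABLE <gs_table> PACKAGE SIZE co_package_size.
      IF sy-subrc <> 0.
        CLOSE CURSOR l_db_cursor.
        EXIT.
      ENDIF.

      LOOP AT <gs_table> ASSIGNING <gs_wa>.
        " ... build l_xout exactly as in your field loop ...
        TRANSFER l_xout TO lv_file.   " writes to the app server, no implicit DB commit
      ENDLOOP.
    ENDDO.

    CLOSE DATASET lv_file.
    " Afterwards move the file to the frontend, outside the cursor loop.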

    "An internal table can storage round about 500MB"

    Standard system settings (as I know them) allow up to 4 GB of memory usage before your process aborts with memory related short dumps. Please double-check the limit with your system administrator. Maybe the parameters can be increased temporarily until your task is done.

    "some of them are up to 8GB or 250.000 rows big"

    This would be around 32K per row, are you sure about the 8 GB size?

    Maybe you can try reading it all into an internal table at once and then doing the GUI_DOWNLOAD in packages using the APPEND parameter (that's how I did something similar a while ago); a rough sketch follows.
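
    A hedged sketch of that approach (variable names are mine; it assumes the flattened rows already sit in lt_all and reuses l_output_path and co_package_size from your program):

    " Sketch: download an already-filled table of strings in slices, appending to one file.
    DATA: lt_all   TYPE TABLE OF string,   " all rows, already flattened to tab-separated strings
          lt_slice TYPE TABLE OF string,
          lv_from  TYPE i VALUE 1,
          lv_to    TYPE i.

    " ... fill lt_all with one big SELECT plus the field loop, so no cursor stays open ...

    WHILE lv_from <= lines( lt_all ).
      lv_to = lv_from + co_package_size - 1.
      CLEAR lt_slice.
      APPEND LINES OF lt_all FROM lv_from TO lv_to TO lt_slice.

      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename = l_output_path
          filetype = 'ASC'
          append   = 'X'                   " append each slice to the same file
        TABLES
          data_tab = lt_slice.

      lv_from = lv_to + 1.
    ENDWHILE.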

    Thomas


    • You said there is no primary key; I find that hard to believe, since DB tables as I know them must have one. Please clarify. Using the primary key, you could try to come up with a logic like this (a concrete sketch follows the pseudocode):

      DO
        SELECT ... UP TO x ROWS WHERE primary_key > saved_value ORDER BY PRIMARY KEY
        EXIT if sy-subrc <> 0
        download
        save last primary key value(s)
      ENDDO
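
      In concrete (classic) ABAP this could look roughly like the sketch below; table ztab_big and its single key field keyfld are placeholders for your table and key (a multi-field key needs a tuple-style comparison instead):

      DATA: lt_pack     TYPE TABLE OF ztab_big,
            ls_last     TYPE ztab_big,
            lv_last_key TYPE ztab_big-keyfld,
            lv_lines    TYPE i.

      DO.
        SELECT * FROM ztab_big
          INTO TABLE lt_pack
          UP TO 5000 ROWS
          WHERE keyfld > lv_last_key
          ORDER BY PRIMARY KEY.
        IF sy-subrc <> 0.
          EXIT.                            " no more rows
        ENDIF.

        " ... flatten lt_pack and download it with GUI_DOWNLOAD, append = 'X' ...

        " remember the last key read, so the next SELECT continues after it
        lv_lines = lines( lt_pack ).
        READ TABLE lt_pack INDEX lv_lines INTO ls_last.
        lv_last_key = ls_last-keyfld.
      ENDDO.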

      Thomas