Former Member
Jan 26, 2007 at 08:19 PM

Fetch next cursor/ invalid database interruption

331 Views

Hi,

I need to select documents from the BKPF table, select the corresponding line items from BSEG, process them, and write them to a text file. For a particular company code there are millions of records in BKPF and millions in BSEG.

An internal table cannot hold millions of documents, so I am selecting in packets using OPEN CURSOR and FETCH NEXT CURSOR.

My issue: after I fetch 1000 records, process them, and write them to the file, the fetch for the next 1000 records terminates with the message "Invalid interruption of database".

E.g. for a particular selection there are 50,000 records in total to be downloaded to the text file.

Code:

OPEN CURSOR gc_bkpf FOR
  SELECT bukrs belnr gjahr blart budat monat
         cpudt cputm usnam xblnr bktxt waers
    FROM bkpf
    WHERE bukrs EQ p_bukrs
      AND bstat IN (' ', 'A', 'B')
      AND budat IN sbudat
      AND gjahr IN sgjahr
      AND monat EQ i_monat-monat.

DO.
  REFRESH out_tab.
  CLEAR out_tab.
  REFRESH t_bkpf.

  FETCH NEXT CURSOR gc_bkpf
    INTO TABLE t_bkpf
    PACKAGE SIZE 1000.

  IF sy-subrc NE 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ELSEIF sy-subrc EQ 0.

*   ( processing logic )

    CONCATENATE f_itab-belnr zdelim
                f_itab-blart zdelim
                f_itab-bktxt zdelim
                f_itab-xblnr zdelim
                f_itab-hkont zdelim
                f_itab-ktoks zdelim
                f_itab-txt20 zdelim
                f_itab-usnam zdelim
                f_itab-bukrs zdelim
                f_itab-monat zdelim
                f_itab-cpudt zdelim
                f_itab-budat zdelim
                f_itab-waers zdelim
                wrbtr1 zdelim
                dmbe2 zdelim
                kzkrs zdelim
           INTO out_tab.
    APPEND out_tab.
  ENDIF.

* Download records
  PERFORM dowload_new.  " in this FORM I use FM GUI_DOWNLOAD or OPEN DATASET
ENDDO.

After the first 1000 records are downloaded, the fetch for the next 1000 terminates. However, I noticed that if I remove PERFORM dowload_new from the loop and place it after ENDDO, the program does not terminate and all 50,000 records are downloaded.
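This is roughly what I mean by that variant (the cumulative table gt_out_all is just a placeholder name I am using here, and the download form would have to be changed to write that table instead of out_tab):

DATA: gt_out_all LIKE out_tab OCCURS 0.

DO.
  REFRESH: out_tab, t_bkpf.
  CLEAR out_tab.

  FETCH NEXT CURSOR gc_bkpf
    INTO TABLE t_bkpf
    PACKAGE SIZE 1000.
  IF sy-subrc NE 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ENDIF.

* ( same processing logic as above fills out_tab for this packet )
  APPEND LINES OF out_tab TO gt_out_all.
ENDDO.

* Single download, only after the cursor has been closed
PERFORM dowload_new.  " adapted to download gt_out_all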

I know this will work for a smaller number of documents, but I am concerned about very large record counts.

Please let me know how to solve this issue.

Thanks