on 07-22-2016 8:09 PM
Hi All,
Is there any limitation on the number of records that can be inserted into a global temporary table?
We have one job that processes around 90,000,000 records. During processing we use dynamic SQL and store the result in a global temporary table.
Every time we run this job there is an inconsistency in the data: around 1 million records sometimes appear and sometimes do not.
So, is there any limit on the number of records, or on the temporary memory space that can be used to store the data? I cannot find any other reason for this data inconsistency.
Given that the data needs to fit into memory, I'd say that's a pretty obvious limit.
Since you haven't explained the 'inconsistencies' in any detail, we have to guess at the cause. My guess would be: there's a programming error.
Thanks Lars,
As I said, we used dynamic SQL, so we tried a different approach: replacing the dynamic SQL and the global temporary table with static code. It turned out the data then came through consistently at about 90,000,000 records.
I have checked the dynamic SQL. It's a simple insert into a global temporary table.
So my only guess is that it is because of the global temporary table, but I have not found any material saying that there is some kind of limitation.