on 05-26-2017 6:30 PM
At the moment I am just trying to write a report.
When I preview, it runs for about 25 minutes and then says it is out of memory. I only have 3 fields on the report so far: one is a Group header, one a Group footer.
it's querying an MS-SQL server database that has 480+ million records.
It's normalized, and I'm accessing three tables, one field from each.
I've googled and searched here, and I haven't found anything that discusses how to fix this. Surely CR can handle a large dataset like this. (Surely?) 🙂
Thank-you
1. Open your report
2. Go Database > Show SQL Query
3. Copy the SQL Query
4. Create a new report connecting to the same database, but this time use the "Add Command" option
5. Paste the SQL query into the Command window and modify the first line: use "SELECT TOP 1000000" instead of a plain "SELECT"
6. Refresh the report. If it works, increase the number
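As a concrete sketch of step 5, assuming the report pulls one field from each of three tables joined on a shared key (the table and column names below are hypothetical placeholders; substitute whatever your own "Show SQL Query" output contains):

```sql
-- Hypothetical names: Table1/Table2/Table3, FieldA/FieldB/FieldC, and
-- RowKey stand in for your real schema. TOP caps the rows returned so
-- the report can refresh without exhausting 32-bit process memory.
SELECT TOP 1000000
       t1.FieldA, t2.FieldB, t3.FieldC
FROM   Table1 t1
JOIN   Table2 t2 ON t2.RowKey = t1.RowKey
JOIN   Table3 t3 ON t3.RowKey = t1.RowKey
```

Once the report refreshes cleanly, raise the TOP value (10 million, 100 million, …) to find where memory runs out.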
Though this thread doesn't solve the original problem, it certainly explains it.
The result limit portion will at least allow me to get reports written until I discover how to work around this.
I'll likely be posting a question soon on 'How do people work around the 32-bit memory limitation?'
I am guessing stored procedures, commands, views, or who knows, maybe there is a 64 bit version!
Thank-you for your help today,
-= Bruce
Can we get some more information?
1. Are the fields just numeric fields, or are they Memo-type fields up to 2048 characters long? It makes a difference for memory consumption.
2. Are the three tables linked or not? That also makes a difference.
Run the report with Task Manager open and monitor memory consumption. If memory goes above 1.7 GB, it is probably hitting the 2 GB Windows limit for a 32-bit application.
All fields are listed as VARCHAR and sized to the exact number of characters needed per field (most under 20, a few as high as 50).
Though the actual data in the fields can be numeric, IP addresses, or dates, the field type is VARCHAR.
The specific query being performed uses three fields: two (again, stored in the DB as VARCHAR) are numeric, and one is text.
All tables are linked IN CRYSTAL using a common key I added to each table that is unique per row. There are no duplicate 'keys'.
Each table in MS-SQL has a clustered index on that key field only.
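A minimal sketch of that index setup, using hypothetical names (`Table1` and `RowKey` stand in for the real table and the key column that was added):

```sql
-- Hypothetical names. A unique clustered index physically orders the
-- table by the key, which helps the server resolve the key-based joins
-- Crystal generates; repeat per table with the matching table name.
CREATE UNIQUE CLUSTERED INDEX IX_Table1_RowKey ON Table1 (RowKey);
```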
I have enabled 'perform queries on server' and am monitoring the query in the server's Activity Monitor.
Since this is such a time-consuming process, I will update this reply once it completes to answer your last unanswered question (memory consumption on the Crystal machine).
As promised, here are the results of 'out of memory' test:
Task manager shows:
1,626,156 K at the time Crystal pops up the 'Out of memory' box.
Crystal shows it stopped at:
76,115,046 rows
I will try limiting the dataset. The problem with this, even if it works, is that I need to be able to run reports on all the data, whether it's 400 million or 800 million rows. 🙂
That is what I expected: 1.6 GB definitely means you are reaching the 2 GB Windows limitation. There is no way to change that limit.
Another workaround is to reduce the number of columns in the report.
I just spent the last 40 minutes searching for how to limit the results of a dataset per your reply:
You can also limit your dataset and test for 1 million records, 10 million records, 100 million records.
I can't find it anywhere. Can you explain how to limit how many records are queried on the MS-SQL backend?