Bad performance with CR Runtime 2013 SP18

Dec 20, 2016 at 10:48 AM

Former Member

Hi,

We have a new application based on SQL Server 2014, and all printing is done with Crystal Reports via the CR runtime for .NET, which we have now updated to the latest SP18.

The new application is deployed at several customers as an upgrade from an old version that was based on Pervasive.SQL with the old CR print engine (crpe.dll), which always ran very fast everywhere!

Our big problem now is that after sometimes selling an expensive upgrade (SQL Standard, etc.), customers are angry because they very often get poor performance when previewing/printing reports: ±5-10 seconds to print an invoice of a single A4 page, where the old CRPE took 1-2 seconds, and on some PCs it can be much worse (±20-30 seconds!?). Especially on Win8 64-bit and Win7 32-bit it is a real disaster, always more than 12-15 seconds, while on Win10 64-bit it takes 3-4 seconds, which is still twice as slow as before with Pervasive.SQL/CRPE.

How can printing take so much time with the CR runtime connected to a SQL Server 2014 database? The application itself runs fast; it is only this huge response time when printing.

What we have already tried :

- Deactivate "Verify on First Refresh" in all reports: no significant change.

- Check the "No Printer" option in the report layouts: no change.

- Upgrade to the latest available SP18: nothing changes.

- Preload an empty report at application start so that the first preview/print is faster: a few seconds gained on the first report, though application start-up is of course now a bit slower; still, it is better for customers.

- Change the spooler setting from the default "Start printing immediately" to "Start printing after last page is spooled": nothing changes.

What more can we do?

The best option, of course, would be to upgrade Win8 to Win10 and sell new Win10 computers to replace the old Win7 32-bit machines. But first, we cannot do that everywhere (some customers do not want to upgrade because of other graphics programs, etc.), and second, even then the response time is still double what CRPE gave us before. So we definitely have to find a way to make it faster, otherwise we will lose some customers where the upgrade is already installed, and we cannot sell/install the upgrade at the other customers, since we have decided to solve these performance problems before continuing to roll out upgrades.

Any idea how to speed up this tremendous black box called "CR runtime .NET"?

Thanks and regards

Alain

Eicher B.C. (Belgium)


3 Answers

Dell Stinnett-Christy Dec 20, 2016 at 02:38 PM

The old crpe engine was, I believe, direct access to the COM objects that run reports. The .NET SDK is a .NET wrapper around some COM objects and is slower because there is more going on to communicate with the runtime - you not only have the COM communication but also the .NET framework communication.

There are a few things you can do in your reports to speed them up:

1. Make sure that your entire formula in the Select Expert is being passed to the database for processing. In the report, go to Database >> View SQL to see the query that is being run and make sure that all conditions from the Select Expert are there. If they're not, the report is processing some of the data in memory which can slow the report down.

a. Avoid "if" statements in the Select Expert.

b. Avoid using Crystal formulas in the Select Expert, with the exception of things like CurrentDate. Frequently you can replace the Crystal formulas with a SQL Expression that will force the data to be processed in the database.

2. Do NOT use a Command that is linked to anything - a table or another command. This will cause all of the data to be pulled into memory so that the joins can be processed there.

3. Avoid connecting to more than one database. This will also cause data to be processed in memory.

4. If you're summarizing a lot of data in your report, it may be better to use a command to push the summaries down to the database (see the sketch after this list). This will speed things up for a couple of reasons:

a. The database is more optimized and usually has more memory for this type of processing.

b. You're returning much less data to the report so there is less network traffic.
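As a concrete, hypothetical example of point 4: instead of letting the report read every detail row, the whole aggregation can go into the command text so SQL Server does the work and only the summary rows come back. The table, column and parameter names below (InvoiceLines, InvoiceNo, Qty, UnitPrice, {?InvoiceNo}) are invented for illustration; this is only a sketch of the pattern, not your actual schema:

-- Crystal "Add Command" text: summarize on the server, return only aggregated rows.
-- All object names here are hypothetical.
SELECT   il.InvoiceNo,
         SUM(il.Qty * il.UnitPrice) AS LineTotal,
         COUNT(*)                   AS LineCount
FROM     dbo.InvoiceLines AS il
WHERE    il.InvoiceNo = {?InvoiceNo}   -- command parameter, evaluated by SQL Server
GROUP BY il.InvoiceNo;

Because the WHERE clause is part of the command, the record selection is also guaranteed to run on the server, which is the same goal as point 1.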

And in the database, make sure you have indexes for the field or combination of fields you're using in your joins.
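For example, if a report joins an invoice header table to its lines on an invoice number, an index like the following (names invented purely for illustration) keeps SQL Server from scanning the whole lines table on every join:

-- Hypothetical index on the join column.
CREATE NONCLUSTERED INDEX IX_InvoiceLines_InvoiceNo
    ON dbo.InvoiceLines (InvoiceNo);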

-Dell

Former Member Dec 22, 2016 at 09:11 AM

Thank you for your detailed response!

I think points 1, 2 and 4 do not apply to us, as the Select Expert is always empty in our reports, BUT indeed we cannot avoid using multiple databases! All our reports are based on 3 databases, with outer joins indeed, as I think we had no other choice.

Is this alone the reason why it is so slow?

What do you mean by "processed in memory"? In memory on the server, or on each PC/terminal, so that everything is passing over the network?! That would indeed be a disaster. We have always had multiple databases, also with Pervasive, and everything worked fine in crpe.

Our application is used by multiple companies; each one of course has its own database with its products, invoices, etc., but they also share common data like zip codes, Intrastat codes, and so on, and sometimes also common customers/suppliers. So all this data is grouped once for all companies in a COMMON database. It would be stupid to replicate all this data, with all the disadvantages that would have for users. And we also have a third database for each user, where we write the document that has to be printed: that is how we can avoid using the Select Expert, since the selection must of course always be dynamic, on an invoice number, a receipt, a product label, etc.

So I really don't know how to avoid using those 3 databases... What can we do to speed it up?

Alain

Dell Stinnett-Christy Dec 22, 2016 at 03:47 PM

"In memory" is in memory wherever the application is running. If it's a web app or a three-tier app where there is server-side processing, this means they process on the server and multiple concurrent reports would compete for memory. If it's a desk-top app, this means on each PC/Terminal.

I haven't really worked with Pervasive. Does it give you the ability to create a "database link" to a database from within another database? If so, I would create a link from each customer database to the common database and then views in the customer db that pull the data from the common db. This will get rid of one of your database connections.
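Since the new back end is SQL Server 2014, and assuming the company database and the common database sit on the same instance, no separate database link is even needed: a view using a three-part name is enough. A minimal sketch, with made-up names (CompanyDB, CommonDB, ZipCodes):

-- Run inside the company database; all names are hypothetical.
CREATE VIEW dbo.vw_ZipCodes
AS
SELECT z.ZipCode,
       z.City,
       z.Country
FROM   CommonDB.dbo.ZipCodes AS z;

The report then connects only to the company database and reads dbo.vw_ZipCodes like a local table, which removes one of the three connections.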

If it were my project, I might also look at getting rid of the individual user databases and instead use a table in the customer database that not only has the info about the report to run, but also a key on the user ID so that it's easier to select the correct record in the table.
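A rough sketch of what that table could look like (everything here is invented; it only illustrates the idea of one shared table keyed by user instead of a database per user):

-- Hypothetical replacement for the per-user print databases.
CREATE TABLE dbo.PrintRequests (
    RequestId  INT IDENTITY(1,1) PRIMARY KEY,
    UserId     NVARCHAR(50) NOT NULL,    -- who queued the document
    DocumentNo NVARCHAR(30) NOT NULL,    -- invoice number, receipt, label, ...
    CreatedAt  DATETIME2    NOT NULL DEFAULT SYSDATETIME()
);

CREATE NONCLUSTERED INDEX IX_PrintRequests_UserId
    ON dbo.PrintRequests (UserId);

The report (or its command) would then simply filter on the user, e.g. WHERE UserId = {?UserId}, so each user only picks up their own pending document.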

I realize this type of re-architecture is probably a long shot, though.

-Dell
