Hi,
We are in the support phase of a project.
As is the usual scenario in data loading, we delete the indexes before running an InfoPackage on the relevant cube, and after loading we recreate the indexes and refresh the statistics.
But the time taken to refresh statistics has now grown to a level we cannot afford: it has increased from about 10 minutes to 60 minutes.
As I understand it, the role of statistics is to capture information about query performance on that cube, viz. DB response time, query runtime and so on.
Can we drop the statistics refresh step altogether? If so, what would the impacts be?
Thanks in advance.
Regards
Naveen.A
Hi Naveen,
With a huge amount of data you can reduce the percentage of data sampled for the statistics calculation, and you can also run this job only once per week instead of after every upload.
Statistics are very important, particularly during the first uploads, but it is not necessary to refresh them after every single upload.
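To see why a reduced sampling percentage is usually safe, here is a minimal Python sketch (illustrative only, not SAP or database code: the fake column, the 10% sample and the stats computed are all assumptions for the demo). The point is that optimizer-style summary statistics computed from a sample come out essentially the same as from a full scan, at a fraction of the cost:

```python
import random

random.seed(42)

# Fake "cube" column: 100,000 rows drawn from 500 distinct values.
column = [random.randint(1, 500) for _ in range(100_000)]

def column_stats(values):
    """The kind of summary an optimizer wants: range and distinct count."""
    return {
        "min": min(values),
        "max": max(values),
        "ndv": len(set(values)),  # number of distinct values
    }

# Full scan vs. a 10% sample: the sample reads a tenth of the rows
# but yields the same min/max and distinct-value count here.
full = column_stats(column)
sample = column_stats(random.sample(column, len(column) // 10))

print("full scan :", full)
print("10% sample:", sample)
```

With reasonably uniform data the two summaries match; heavily skewed columns are where a larger sample (or a full computation) still pays off.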
Ciao.
Riccardo.
Naveen,
Statistics are maintained from a sample of the data; you could set that sample size to a lower limit.
On dropping statistics: you lose some query performance monitoring information and data load tuning information, but if your queries are already tuned and your data loads are fine, you could drop them. Also note that the statistics tables themselves grow very large, which is what makes the refresh slow. Archive your statistics and delete the old entries that have already been archived for faster performance.
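The archive-then-delete idea boils down to keeping only recent rows in the statistics tables. A minimal Python sketch of that housekeeping step (the row shape, dates and the 90-day retention window are illustrative assumptions, not SAP defaults):

```python
from datetime import date, timedelta

# Hypothetical statistics-table rows: (collection_date, payload).
stats_rows = [
    (date(2006, 1, 15), "stats snapshot A"),
    (date(2006, 6, 1), "stats snapshot B"),
    (date(2006, 9, 20), "stats snapshot C"),
]

def archive_and_prune(rows, today, keep_days=90):
    """Split rows older than keep_days into an archive list; keep the rest."""
    cutoff = today - timedelta(days=keep_days)
    archive = [r for r in rows if r[0] < cutoff]
    keep = [r for r in rows if r[0] >= cutoff]
    return archive, keep

archive, keep = archive_and_prune(stats_rows, today=date(2006, 10, 1))
print("archived:", len(archive), "kept:", len(keep))
```

Smaller live statistics tables mean the refresh job has far less to scan and rewrite each time.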
Arun
Assign points if useful