
Data binding for cascading filters - massive impact on performance?

Aug 17, 2017 at 09:24 AM


Hi All,

I'm currently working on a project with a requirement for a large number (20) of cascading filters working together to filter ~10 data sources. Filters are applied using dropdown boxes, all data bound to different dimensions of the same data source. Some of the dropdowns will be populated with large amounts of data - more than 20,000 items.

Data binding all the dropdowns appears to have had a massive negative impact on performance, to the point that the dashboard is unusable.

I've checked BO and BW server utilisation; neither is being overworked. The backend is BW on HANA, so I would have expected good performance.

Any tips or ideas on how to improve performance?

Thanks a million,
David


1 Answer

Martin Pankraz Aug 17, 2017 at 11:12 AM

Hi David,

I would recommend checking Design Studio's performance assistant. You can run it by adding the URL parameter &PROFILING=X. There are a couple of blog posts explaining the numbers, such as

https://blogs.sap.com/2013/12/08/design-studio-tips-and-tricks-measuring-performance/
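For reference, the parameter is simply appended to the application URL; server, port and application name below are placeholders, not values from your system:

```
http://<server>:<port>/aad/web.do?APPLICATION=<YOUR_APP>&PROFILING=X
```

The runtime then shows per-step statistics (rendering time, data source fetch time, etc.) that should reveal where the dropdown population is spending its time.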

You might also want to check the performance best practice guide.

In addition to that, may I ask whether you have already tried the Dimension Filter component? In it you can define target data sources to implement a cascading filter. Judging from your posting, it sounds like you are using the standard dropdown instead. It might be worth checking it out.

Apart from that, I would also check whether it is possible to reduce the number of queries by merging them. Finally, have a look at parallel data source execution.

Kind regards

Martin


Hi Martin,

Thanks for your suggestions.

I've been running traces in Design Studio - all I've learnt from these is that a lot of time is spent retrieving data sources. What I'm really wondering is: is it typical for data binding a set of dropdowns to a data source to take this long? What could I do to improve the time taken to populate the dropdowns? Each dropdown is populated with a single dimension of at most 50,000 items.

I've proposed the Dimension Filter, but it's not what the client is looking for aesthetically.

Finally - how would merging queries improve load time for the data-bound filter dropdowns? All dropdowns are already populated from a single query.

Thanks,
David


Hi David,

So it seems you need to stop loading values all the time. Did you consider getting rid of data binding and populating the values using the dropdown's scripting methods? You will then need to update the items whenever another filter changes the result set assigned to the dropdown. That shifts the workload to the front end. Apart from that, having 500+ entries in a dropdown is not very useful anyway.
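As a rough sketch of the scripting approach: the component and dimension names below (DS_1, DS_2, DROPDOWN_1, DROPDOWN_2, "0MATERIAL", "0PLANT") and the item limit are placeholders for illustration, not taken from your application.

```
// On Startup (or On Variable Initialization):
// populate the dropdown via script instead of data binding,
// capping the number of members retrieved.
DROPDOWN_1.setItems(DS_1.getMemberList("0MATERIAL",
	MemberPresentation.INTERNAL_KEY, MemberDisplay.TEXT, 200, "All"));

// On Select of DROPDOWN_1:
// push the chosen value to the target data source(s) ...
DS_2.setFilter("0MATERIAL", DROPDOWN_1.getSelectedValue());
// ... and refresh the dependent dropdown from the now-filtered result set,
// which is what makes the filters cascade.
DROPDOWN_2.setItems(DS_2.getMemberList("0PLANT",
	MemberPresentation.INTERNAL_KEY, MemberDisplay.TEXT, 200, "All"));
```

With a sensible maxNumber cap this also avoids ever shipping 20,000+ members to the client at once.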

I mentioned merging queries not only to reduce the number of source data sources, but also the target data sources that you filter with the dropdowns. It is a general best practice to keep the number of queries low.

Kind regards

Martin
