
Data binding for cascading filters - massive impact on performance?

Hi All,

I'm currently working on a project where there's a requirement for a large number of cascading filters (20 of them) working together to filter ~10 data sources. Filters are applied via dropdown boxes, with all dropdowns data bound to different dimensions of the same data source. Some of the dropdowns will be populated with large amounts of data: more than 20,000 items.

Data binding all the dropdowns appears to have had a massive negative impact on performance, to the point that the dashboard is unusable.

I've checked BO and BW server utilisation; neither is being overworked. The backend is BW on HANA, so I would have expected good performance.

Any tips or ideas on how to improve performance?

Thanks a million,
David


1 Answer

  • Aug 17, 2017 at 11:12 AM

    Hi David,

    I would recommend checking Design Studio's performance assistant. You can run it by adding the URL parameter &PROFILING=X. There are a couple of blog posts explaining the numbers, such as

    https://blogs.sap.com/2013/12/08/design-studio-tips-and-tricks-measuring-performance/
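
    For illustration, the parameter is simply appended to whatever URL you use to launch the application (the placeholder below stands in for your actual application URL, which depends on your deployment):

        <your-application-URL>&PROFILING=X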

    You might also want to check the performance best practice guide.

    In addition to that, have you already tried using the Dimension Filter component? There you can define target data sources to implement a cascading filter. Judging from your post, it sounds like you are using the standard dropdowns instead; it might be worth checking the Dimension Filter out.

    Apart from that, I would also check whether it is possible to reduce the number of queries by merging them. Lastly, have a look at parallel data source execution.

    Kind regards

    Martin


    • Hi David,

      So it seems you need to stop loading values all the time. Have you considered getting rid of data binding and populating the values using the dropdown's scripting methods instead? Of course, you will then need to update the dropdowns whenever a filter changes the result set they are based on. That shifts the workload to the front end. Apart from that, having 500+ entries in a dropdown is not very useful.
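
      As a rough sketch of what that scripting approach could look like (all component, data source and dimension names below are placeholders, not your actual objects), the On Select script of one dropdown might do something like this:

        // Apply the chosen member as a filter to a target data source
        // and to the data source that feeds the dependent dropdowns.
        var selected = DROPDOWN_COUNTRY.getSelectedValue();
        DS_SALES.setFilter("0COUNTRY", selected);
        DS_FILTERS.setFilter("0COUNTRY", selected);

        // Repopulate the dependent dropdown from the now-filtered data source,
        // capped at 500 members so the list stays usable.
        DROPDOWN_REGION.setItems(
          DS_FILTERS.getMemberList("0REGION", MemberPresentation.INTERNAL_KEY, MemberDisplay.TEXT, 500, "All"));

      The same pattern repeats for each dropdown in the cascade; the key point is that members are only fetched when a selection actually changes, not on every data binding refresh.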

      I mentioned merging queries not only to reduce the number of source data sources but also the target data sources that you filter with the dropdowns. It is a general best practice to keep the number of queries low.

      Kind regards

      Martin