Former Member

Performance: in-memory cache is used instead of pageable

Hi,

I have a dataflow which contains a nested schema (XML extraction)

When I execute the job, I see in the log that the in-memory cache is used because there is a nested schema. The problem is that after a few minutes memory usage reaches 99% (23.2GB out of 24GB) and the process becomes very slow.

Is there a way to improve the performance?

Thanks,

Amir



5 Answers

  • Former Member
    Posted on May 29, 2013 at 04:04 PM

    Hello

    There are many things you can do to improve performance; the Performance Optimization Guide might help - http://help.sap.com/bods.

    Can you tell us a little more about what else is in your dataflow and your data volumes?

    Michael


    • Former Member

      Hi Michael,

      Some additional information:

      I read from a 7GB file, ~55M rows.

      I have a string separated by "/" with an unknown number of parts, which I split using XML extraction.

      I have another string, separated by ",", which is handled in the same way.

      In addition, around 3 lookups are done for each row against another DB table containing 30M rows.

      Thanks,

      Amir

  • Former Member
    Posted on May 29, 2013 at 05:09 PM

    Hello,

    Right-click the job > Properties and change the in-memory cache to pageable cache. Also check which operations could be pushed down to the database level, and check the number of lookups involved in the transformation.

    Arun


  • Posted on May 30, 2013 at 05:47 AM

    Hi Amir,

    As far as I know, we cannot use pageable cache instead of in-memory cache for nested data.

    Two possible approaches to try:

    1 - Is it possible for you to increase the RAM size and/or the free space on your OS drive (i.e. wherever the OS and BODS are installed)?

    2 - Is it possible to split the file into pieces and process them separately? (I know it will be hard because of tag issues, but with the help of an XML modifier you can try it.)

    Please try once and update the thread 😊

    Regards,

    Shiva Sahu


  • Former Member
    Posted on May 30, 2013 at 07:42 AM

    Hi Amir,

    I don't know the full logic behind your dataflow, but one thing that comes to mind is the XML_Pipeline transform. If you are processing, for example, a very large XML document, this transform helps by splitting it and processing it in small batches, which should of course improve performance.

    See the Reference Guide for more details.

    Thanks
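
    The idea behind batching - handling one small piece of a large XML document at a time instead of materializing the whole document in memory - can be sketched in generic terms with a streaming parser. This is only an illustration of the concept, not how BODS implements XML_Pipeline; the use of Python's `xml.etree.ElementTree.iterparse` here is an assumption made for the sketch.

    ```python
    import io
    import xml.etree.ElementTree as ET

    # A small stand-in for a large XML document with many <row> elements.
    xml_doc = "<rows>" + "".join(
        f"<row><id>{i}</id></row>" for i in range(5)
    ) + "</rows>"

    count = 0
    # iterparse yields each element as soon as it is complete, so every
    # <row> can be processed and then discarded, keeping memory flat.
    for event, elem in ET.iterparse(io.StringIO(xml_doc), events=("end",)):
        if elem.tag == "row":
            count += 1
            elem.clear()  # free the processed element immediately
    print(count)  # -> 5
    ```

    The key point is that only one row is held in memory at a time, which is the same batching principle the XML_Pipeline transform applies inside a dataflow.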


  • Former Member
    Posted on May 30, 2013 at 08:35 AM

    Hi,

    One clarification - I am NOT using an XML file as input. Instead, the XML is created during dataflow execution from a delimiter-separated string whose number of parts is not known in advance: the delimiter is replaced by XML tags, and an XML structure is then extracted from the resulting string.

    Thanks,

    Amir
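
    The delimiter-to-XML step described above can be sketched as follows. This is a minimal illustration in Python of the general technique (replace the delimiter with tags, then treat the result as XML), not the actual BODS mapping; the helper name `delimited_to_xml` and the `<row>`/`<part>` tag names are assumptions.

    ```python
    import xml.etree.ElementTree as ET

    def delimited_to_xml(value, sep="/"):
        """Wrap each delimiter-separated part of `value` in an XML tag,
        so a string with an unknown number of parts becomes a nested
        structure that an XML extraction step can unnest into rows."""
        root = ET.Element("row")
        for part in value.split(sep):
            ET.SubElement(root, "part").text = part
        return ET.tostring(root, encoding="unicode")

    print(delimited_to_xml("a/b/c"))
    # -> <row><part>a</part><part>b</part><part>c</part></row>
    ```

    The same helper covers the comma-separated string by passing `sep=","`. Because every input row is wrapped and parsed independently, this step itself need not buffer the whole file; the memory pressure in the thread comes from caching the nested schema downstream.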

