Unnesting XML and memory tables

Post Author: Guy Jeffery

CA Forum: Data Integration

Hi,

I'm having issues with unnesting incoming XML messages within a real-time job. The XML is partially unnested into intermediate memory tables, which are then used to populate physical database tables. This works fine on a small scale - if I only have a few physical tables to populate, the job runs as planned. However, when I increase the complexity of the job, it begins to execute and then just stops and 'hangs'. There are no error messages, and the log for that job has a blue tick next to it - but the job never completes.

I've had a look through the technical help file supplied with BODI, and searched this forum, but can't find anything that covers the situation above. Is it likely to be a memory issue? Or is there some upper limit on the number of objects used within a real-time BODI job?

Has anyone experienced these issues before, and found a resolution? Any suggestions would be greatly appreciated, as I need a solution pretty urgently. Thanks in advance.

Guy


6 Answers

  • Posted on Nov 22, 2007 at 10:19 AM

    Post Author: bhofmans

    There is no upper limit on the number of objects that can be used, but it might be a memory issue.

You could check the memory size of the al_engine process when you run into this issue. When you're getting close to 2 GB, you will run out of memory and your job can stop (on Windows and Linux).
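This check can be scripted rather than watched by hand. A minimal Python sketch, assuming a typical `ps -eo vsz,comm` output layout (the thread itself contains no code, and the exact ps invocation and column format are assumptions for a Solaris/Linux box):

```python
# Sketch: check whether a named process (e.g. al_engine) is nearing the
# 2 GB virtual-memory ceiling of a 32-bit process, by parsing ps output.
# The column layout (VSZ in KB, then command name) is an assumption.

TWO_GB_KB = 2 * 1024 * 1024  # 2 GiB expressed in KB, as ps reports VSZ

def engine_memory_kb(ps_output, name="al_engine"):
    """Return the VSZ (in KB) of the first process matching `name`, or None."""
    for line in ps_output.strip().splitlines():
        parts = line.split(None, 1)
        if len(parts) == 2 and parts[1].strip() == name:
            return int(parts[0])
    return None

def near_limit(vsz_kb, fraction=0.9):
    """True once the process has consumed `fraction` of the 2 GB ceiling."""
    return vsz_kb >= TWO_GB_KB * fraction

# On the server this text would come from: ps -eo vsz,comm
sample = "  12345 nscd\n 1572864 al_engine\n   9876 sshd\n"
print(engine_memory_kb(sample))   # → 1572864  (KB, i.e. about 1.5 GB)
print(near_limit(1572864))        # → False (not yet at 90% of 2 GB)
```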

    In DI 11.7 we've made improvements to our memory handling by providing pageable cache options.

  • Posted on Nov 22, 2007 at 10:42 AM

    Post Author: Guy Jeffery

    Hi Ben,

    Thanks for the reply!

I was wondering how BODI uses memory tables, in an effort to reduce my memory usage. Currently, I receive the nested XML and read it into a memory table. I then use this memory table as a source for around 150 physical tables. Each of the 150 tables is populated in a separate dataflow: the memory table is the source, the data is unnested 3 levels - 1 level in each of 3 separate Queries - and the result populates the table.
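To make "unnested 3 levels" concrete, here is a minimal Python sketch of flattening three levels of XML repetition into flat rows, the way three chained Query unnests would. The element names (order/item/detail) are invented for illustration and are not from the original message schema:

```python
# Sketch of what "unnesting 3 levels" does to a nested XML message:
# each level of repetition is flattened, so one deeply nested document
# becomes flat records suitable for a physical table. Element names
# are hypothetical, for illustration only.
import xml.etree.ElementTree as ET

XML = """
<orders>
  <order id="1">
    <item sku="A">
      <detail qty="2"/>
      <detail qty="3"/>
    </item>
    <item sku="B">
      <detail qty="1"/>
    </item>
  </order>
</orders>
"""

def unnest(xml_text):
    """Flatten order -> item -> detail (3 levels) into one row per detail."""
    rows = []
    root = ET.fromstring(xml_text)
    for order in root.findall("order"):            # level 1
        for item in order.findall("item"):         # level 2
            for detail in item.findall("detail"):  # level 3
                rows.append({
                    "order_id": order.get("id"),
                    "sku": item.get("sku"),
                    "qty": int(detail.get("qty")),
                })
    return rows

print(unnest(XML))
# → [{'order_id': '1', 'sku': 'A', 'qty': 2},
#    {'order_id': '1', 'sku': 'A', 'qty': 3},
#    {'order_id': '1', 'sku': 'B', 'qty': 1}]
```

Repeating this kind of traversal once per target table, all against the same in-memory source, hints at where 150 dataflows over one memory table could get expensive.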

    Do you have any tips for ways I might be more efficient, and hopefully resolve my problem?

I've checked my memory usage, and it seems to hit an upper limit of about 1.5 GB, rather than 2 GB.

    I appreciate the helpful support!

    Thanks,

    Guy

  • Posted on Nov 22, 2007 at 11:09 AM

    Post Author: bhofmans

    Guy,

In real-time jobs, the whole job is executed in one al_engine process; this includes all data processing, but also the memory tables. It's started once and stays active the whole time to process incoming messages.

This is different for batch jobs, where each data flow is a separate process; data flows are started in series or in parallel depending on the job design, and once they finish processing they disappear again.

So for your real-time job this means the whole processing can only use a maximum of 2 GB of memory (the upper limit for 32-bit processes). Upgrading to DI 11.7 would give you more options, like using the pageable cache, which uses disk space as additional memory to overcome the 2 GB limitation. But the first thing to check would be the amount of memory used - I'm just assuming here that you need more than 2 GB; you can easily check this via the Windows Task Manager. Which DI version are you using? Is this on Windows?
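The pageable-cache idea mentioned here - spilling rows to disk once an in-memory budget is exhausted - can be sketched generically in Python. This is an illustration of the concept only, not DI's actual implementation:

```python
# Conceptual sketch of a pageable cache: entries live in memory until a
# budget is reached, then overflow to temp files on disk. This mirrors
# the idea behind DI 11.7's pageable cache, not its real mechanics.
import os
import pickle
import tempfile

class PageableCache:
    def __init__(self, max_in_memory):
        self.max_in_memory = max_in_memory
        self.hot = {}                        # in-memory portion
        self.spill_dir = tempfile.mkdtemp()  # disk overflow area

    def _path(self, key):
        return os.path.join(self.spill_dir, str(key))

    def put(self, key, value):
        if key in self.hot or len(self.hot) < self.max_in_memory:
            self.hot[key] = value
        else:
            with open(self._path(key), "wb") as f:  # page out to disk
                pickle.dump(value, f)

    def get(self, key):
        if key in self.hot:
            return self.hot[key]
        with open(self._path(key), "rb") as f:      # page in from disk
            return pickle.load(f)

cache = PageableCache(max_in_memory=2)
for i in range(5):
    cache.put(i, i * i)
print([cache.get(i) for i in range(5)])  # → [0, 1, 4, 9, 16]
```

The trade-off is the usual one: once the cache pages to disk, lookups that miss the in-memory portion pay I/O cost, but the process stays under its address-space ceiling.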

    - Ben.

  • Posted on Nov 22, 2007 at 11:49 AM

    Post Author: Guy Jeffery

    Ben,

    Thanks again for the reply, and for aiding my understanding.

    Unfortunately, we're using DI 11.5.1, and there's no option to upgrade. It's running on Solaris rather than Windows. As I said before, the al_engine process seems to reach a maximum of about 1.5 GB before hanging.

So I was hoping there would be something I could do around job design, i.e. the most efficient way of using the memory tables when I need to make repeated calls to them within one execution of the real-time job. Is this possible at all?

    Thanks,

    Guy

  • Posted on Nov 22, 2007 at 08:41 PM

    Post Author: bhofmans

No, I don't see a generic solution in job design that you could try. Please open a support case so that an engineer can investigate and make recommendations.

    I'm a little surprised you see this behavior before you reach 2 GB, there might be something else happening after all. Again, support should be able to investigate this.

  • Posted on Nov 23, 2007 at 11:44 AM

    Post Author: Guy Jeffery

    Thanks Ben, I'll look into that.

    Guy
