Former Member

SQL vs dataflow

Hi

I have a table with 5,000,000 records and 100 columns (quite a big table), and in one of my dataflows (DF) I copy all records into another table without any transformation.

I'm wondering why, when I use a DataFlow, the total time needed to copy all records is a few times longer than when I use a SQL function (select * from) or even SQL in a script (insert into).

In the DataFlow it takes 40 minutes, with the SQL function 15-20 minutes, and in the script 10 minutes.
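
For reference, the SQL in the script step is essentially just a straight insert-select, something like this (table names are placeholders):

-- copy all rows without any transformation; names are placeholders
insert into dbo.TARGET_TABLE
select *
from dbo.SOURCE_TABLE;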

Could you explain to me how it works?

Or maybe I need to change some parameter in the DataFlow properties?

And how long does it take in your environment?

My DS version: 4.0 SP3

Database version: SQL Server 2008 R2 (2 CPUs; 32 GB RAM)

Application server: quad-core; 12 GB RAM

Regards

2 Answers

  • Former Member
    Posted on May 16, 2015 at 07:17 PM

    If you are doing it with a dataflow, then you have to check the optimized SQL generated by BODS.

    If the generated SQL is an INSERT statement, then that is a push-down operation.

    So I would suggest you check whether BODS is generating optimized SQL.

    Then fire that query on the database directly and compare.
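
    For example (table and column names are placeholders), a fully pushed-down dataflow shows a single INSERT ... SELECT statement in the Optimized SQL window, something like:

    -- full push-down: the whole copy runs inside the database as one statement
    INSERT INTO dbo.TARGET_TABLE (COL1, COL2, COL3)
    SELECT COL1, COL2, COL3
    FROM dbo.SOURCE_TABLE;

    If you only see the SELECT part, the rows are being read through the job server and loaded back, which is usually much slower for 5 million wide rows.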

    Thanks,

    Swapnil

    • Former Member

      Hi,

      That is the main problem: your source and target are on different databases.

      Normally, when you transfer data from a source DB to a different target DB, it will take more time.

      So is there any DB link present between those databases? If yes, you can add that link to the datastore configuration and then check the optimized SQL.

      If the generated optimized SQL is an INSERT statement, it will be much faster compared to a normal transfer.
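
      For example, on SQL Server 2008 R2 a linked server between the two instances would allow the copy to run as one cross-server statement, which is also what a pushed-down dataflow would generate (server, database, and table names below are placeholders):

      -- run on the target instance; names are placeholders
      EXEC sp_addlinkedserver
           @server = N'SOURCE_SRV',
           @srvproduct = N'',
           @provider = N'SQLNCLI',
           @datasrc = N'source-host\SOURCEINSTANCE';

      -- with the link in place the whole copy stays inside the database engine
      INSERT INTO dbo.TARGET_TABLE
      SELECT *
      FROM SOURCE_SRV.SourceDb.dbo.SOURCE_TABLE;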

  • Former Member
    Posted on May 16, 2015 at 08:49 PM

    Hi,

    Can you increase the number of target loaders?

    Regards,

    Manoj.

    • Hi Rad,

      If you are simply comparing source and target, then for updates and inserts you can use a Table Comparison transform to improve performance.

      The other target performance options (Rows per commit, commit size) which you have used will be faster only for inserts; for updates it will always be slow. There are also some source options, like Cache, which you can enable if the source table is large.

      The other option, at the database level, would be to create indexes on the target table, which will give excellent performance.
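
      As a rough sketch (table and column names are placeholders), an index on the key column that the Table Comparison transform looks up could be created like this:

      -- index on the comparison key of the target table; names are placeholders
      CREATE NONCLUSTERED INDEX IX_TARGET_TABLE_KEY
          ON dbo.TARGET_TABLE (KEY_COLUMN);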

      Regards

      Arun Sasi
