Former Member

Table Comparison Performance

Does anyone have experience using the Table Comparison transform on large data volumes, e.g. 10 million records?

How is the performance?

Any experience and suggestions are welcome.


1 Answer

  • Posted on Jul 28, 2009 at 01:38 PM

    Performance will depend on a lot of factors, but the two big ones are:

    1. how many columns you are using in your compare (more columns will be slower)

    2. how you do the compare (sorted input is WAY faster than row-by-row comparison; see the sketch after this list)
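
    To make the sorted-input point concrete, here is a minimal sketch in plain Python of comparing two key-sorted streams in a single merge pass. The column names are made up and this is not the Data Services implementation, just the idea behind it:

        # Illustrative only -- not the Data Services internals. Two streams
        # sorted on the primary key can be compared in one merge pass,
        # instead of one target lookup per incoming row.
        def merge_compare(source_rows, target_rows, compare_cols):
            """Yield ('insert'|'update', row) operations. Both inputs must be
            iterables of dicts sorted ascending on the 'key' field."""
            source, target = iter(source_rows), iter(target_rows)
            s, t = next(source, None), next(target, None)
            while s is not None:
                if t is None or s["key"] < t["key"]:
                    yield ("insert", s)            # key not present in target
                    s = next(source, None)
                elif s["key"] > t["key"]:
                    t = next(target, None)         # target-only row: skip
                else:
                    # Keys match: compare only the chosen columns -- fewer
                    # columns means a cheaper compare (factor 1 above).
                    if any(s[c] != t[c] for c in compare_cols):
                        yield ("update", s)
                    s, t = next(source, None), next(target, None)

        src = [{"key": 1, "name": "a"}, {"key": 2, "name": "b"}, {"key": 3, "name": "c"}]
        tgt = [{"key": 1, "name": "a"}, {"key": 2, "name": "x"}]
        print(list(merge_compare(src, tgt, ["name"])))
        # [('update', {'key': 2, 'name': 'b'}), ('insert', {'key': 3, 'name': 'c'})]

    A single sequential pass like this scales to tens of millions of rows, which is why sorting the input first usually pays for itself.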

    Without knowing any details about your data or the process you are trying to implement, it's difficult to make additional recommendations. But keep in mind that there are other options - especially for really big data sets.

    For example, you can always use a two-step process that first deletes existing records matching the incoming set of primary keys, and then does a straight insert of the incoming rows. This avoids the comparison step entirely - and avoids updates, which are much slower than inserts.

    However, this only works if you are replacing the existing records. It wouldn't work if your table compare is part of a Type 2 dimension load or anything else that requires you to track history.
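
    For cases where full replacement is acceptable, here is a rough sketch of that two-step load. It uses sqlite3 only so it runs standalone, and the table and column names are invented for the example:

        # Sketch of the delete-then-insert alternative; table/column names
        # are hypothetical and sqlite3 is used only to keep it runnable.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE dim_customer (cust_id INTEGER PRIMARY KEY, name TEXT)")
        conn.execute("INSERT INTO dim_customer VALUES (1, 'old'), (2, 'stale')")

        incoming = [(1, "new"), (3, "added")]   # incoming batch keyed on cust_id

        # Step 1: delete existing rows whose keys appear in the incoming set,
        # so the load never compares columns or issues updates.
        conn.executemany("DELETE FROM dim_customer WHERE cust_id = ?",
                         [(key,) for key, _ in incoming])

        # Step 2: straight insert of the incoming rows.
        conn.executemany("INSERT INTO dim_customer VALUES (?, ?)", incoming)
        conn.commit()

        print(conn.execute("SELECT * FROM dim_customer ORDER BY cust_id").fetchall())
        # [(1, 'new'), (2, 'stale'), (3, 'added')]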
