Former Member
Oct 08, 2013 at 04:57 PM

DTP duplicate-key filter step taking much longer


Hello,

I have a DTP that has always been set to handle duplicate record keys, and loading has always been very fast. Lately we have noticed a large increase in runtime when loading the same number of records. Comparing past loads to current loads, I can see that the increase comes from the "Filter Out New Records with the Same Key" step in each data package: it went from an average of 7 seconds per package to an average of 2.5 minutes per package. There have been no changes to the transformation or the DTP; the package size, semantic key, etc. are exactly the same. What could have changed to make the filter step take so much longer?
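For context, here is a rough Python sketch of what duplicate-key filtering conceptually does within a single data package (keeping only one record per semantic key). This is only an illustration to show the kind of work that step performs; it is not the actual DTP implementation, and the field names (MATERIAL, PLANT, QTY) are made up for the example.

```python
from typing import Dict, List, Tuple

def filter_duplicate_keys(package: List[dict], key_fields: Tuple[str, ...]) -> List[dict]:
    """Keep one record per semantic key within a data package.

    Conceptual illustration only: assumes the last record with a given
    key wins, which is not necessarily how the DTP resolves duplicates.
    """
    latest: Dict[tuple, dict] = {}
    for record in package:
        key = tuple(record[field] for field in key_fields)
        latest[key] = record  # a later record overwrites an earlier duplicate
    return list(latest.values())

# Hypothetical example: two records share MATERIAL=100 / PLANT=A,
# so only the last of them survives the filter.
package = [
    {"MATERIAL": "100", "PLANT": "A", "QTY": 5},
    {"MATERIAL": "200", "PLANT": "A", "QTY": 3},
    {"MATERIAL": "100", "PLANT": "A", "QTY": 7},
]
print(filter_duplicate_keys(package, ("MATERIAL", "PLANT")))
```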

Thanks,

Alex