Data Services Jobs much slower after migration

Mar 01, 2017 at 03:45 PM

Hello community,

We are currently migrating our Data Services jobs (which are ETL processes for a data warehouse) from DS 3.2 to DS 4.2 SP7.

Most of our jobs didn't work directly after the migration, but we are making progress in fixing these errors. However, we noticed that our jobs run much slower than before the migration - roughly 3.5 to 4 times slower. With this behavior, the total runtime for all jobs is no longer acceptable.

The migrated DS runs on a server which has more CPU/memory than the 3.2 one.

I'm aware of the performance optimization guide - but I still have some questions:

1. Is it normal that the jobs are much slower in a new version?

2. Are there any "low hanging fruits" / actions which we should implement first to improve the performance?

3. Are there any tips/best practices in addition to the performance optimization guide that could be useful?

I would appreciate any help.

Regards,

Marius

Hi Marius,

Has the (default) package size changed because of the upgrade? As in, is data transferred in 1,000 records per package instead of, say, 5,000? In my experience, that makes a huge difference in performance.

cheers,

Eduard
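For illustration only, here is a minimal Python sketch (not DS code) of the effect Eduard describes: the same load committed in packages of different sizes. The ODBC DSN "DWH_STAGE" and the table "STAGE_TEST" are hypothetical placeholders.

import time
import pyodbc  # any DB-API driver would behave the same way; pyodbc is just an example

def timed_load(rows, batch_size):
    # Insert all rows, committing once per "package" of batch_size records.
    conn = pyodbc.connect("DSN=DWH_STAGE")  # hypothetical ODBC DSN
    cur = conn.cursor()
    start = time.time()
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO STAGE_TEST (id, val) VALUES (?, ?)",
                        rows[i:i + batch_size])
        conn.commit()
    conn.close()
    return time.time() - start

rows = [(i, "x" * 50) for i in range(100000)]
for size in (1000, 5000, 10000):
    print(f"{size} rows per commit: {timed_load(rows, size):.1f} s")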

Hi Eduard,

Where do I set the package size / where can I check this setting?

Regards,

Marius

2 Answers

Arun Sasi Mar 02, 2017 at 02:17 PM

Hi Marius,

Did the migration succeed properly from DS 3.2 to DS 4.2 SP7? It seems that something went wrong during the upgrade process.

Which upgrade scenario did you use? Please refer to the link below and let us know:

https://wiki.scn.sap.com/wiki/display/EIM/Best+Practices+for+upgrading+older+Data+Integrator+or+Data+Services+repositories+to+the+latest+version+of+Data+Services+4.2

As Eduard mentioned, you can try increasing the Rows per commit size under the target table options from 5000 to 10000.

Also, after the upgrade, did you do thorough regression testing of the jobs?

Regards

Arun Sasi

Marius Margraf Mar 02, 2017 at 02:54 PM

Hi Arun,

The migration succeeded as far as I can tell - there was no error when we upgraded the repository.

We are currently right in the middle of (regression) testing our jobs - that is how we noticed that they are a lot slower than before. After the migration, we had to fix some errors that came up when we executed the jobs. To improve performance, we added join ranks to our data flows, but that helped only a little.

We followed scenario 3 of the article you mentioned. In fact, we added an extra step (after step 8 of the scenario): we exported the .atl file from the upgraded repository and imported it into a new, fresh 4.2 repository, so that we don't work on an upgraded repository.

I will try increasing the rows per commit size, but I'm not sure whether it will help. The source and target databases are the same as before the upgrade/migration - so unless SAP made Data Services itself slower, it has to be something about the server running Data Services or some settings in the tool...

Regards,

Marius
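Since an .atl export of the repository already exists, one quick cross-check is to scan it for target loaders that still load in small packages. A rough Python sketch, assuming a placeholder file name and that the commit-size option appears in the export under a keyword like "rows_per_commit" - both are assumptions to verify against the actual export:

import re

ATL_FILE = "upgraded_repo_export.atl"  # placeholder file name
COMMIT_KEY = "rows_per_commit"         # assumed keyword - check how the loader option is spelled in your export

# Print every commit-size setting below 5000 so flows loading in small packages stand out.
pattern = re.compile(COMMIT_KEY + r"\s*=\s*'?(\d+)'?", re.IGNORECASE)
with open(ATL_FILE, encoding="latin-1") as f:
    for line_no, line in enumerate(f, start=1):
        match = pattern.search(line)
        if match and int(match.group(1)) < 5000:
            print(f"line {line_no}: commit size {match.group(1)}")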

What is the system configuration of the new 4.2 Job Server? I hope it has more than 32 GB of RAM and enough disk space.

It would be good if you could check the joins again so that we can rule them out. Also, could you check whether the database itself has any performance issues (thinking out loud)?

Regards

Arun Sasi

Hello Marius Margraf,

Did you find the root cause? Did you narrow it down to the data flow level to see which ones are taking a long time to execute? There must be some pattern to the issue.
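To find that pattern, it can help to put the per-data-flow runtimes from both environments side by side. A rough Python sketch, assuming the timings are exported (e.g. from the execution logs) into two CSV files with "dataflow" and "seconds" columns - the file names and layout are placeholders:

import csv

def load_times(path):
    # Assumed CSV layout: columns "dataflow" and "seconds".
    with open(path, newline="") as f:
        return {row["dataflow"]: float(row["seconds"]) for row in csv.DictReader(f)}

old = load_times("runtimes_ds32.csv")  # hypothetical export from the 3.2 environment
new = load_times("runtimes_ds42.csv")  # hypothetical export from the 4.2 environment

# Rank data flows by slowdown factor so the worst offenders surface first.
slowdown = {df: new[df] / old[df] for df in new if df in old and old[df] > 0}
for df, factor in sorted(slowdown.items(), key=lambda kv: kv[1], reverse=True)[:20]:
    print(f"{df}: {old[df]:.0f}s -> {new[df]:.0f}s ({factor:.1f}x slower)")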
