How to Improve Performance of DTP in SAP BW 7.3

Former Member

When I am running a DTP, the maximum time is spent on the rules step. The total loading time is 14 minutes across 10 data packages, and roughly 1 minute per package (around 10 minutes in total) goes to the rules part. How can I reduce this time?

Accepted Solutions (1)

JasonLax
Product and Topic Expert

This thread is currently in SCN Support, which is the support space for SAP Community Network, not SAP products. Please move it to an appropriate space where topic experts will see and reply to it: SCN Site Index

JasonLax
Product and Topic Expert

Moving thread to SAP NetWeaver Business Warehouse.

Answers (9)


If the issue still cannot be resolved, please have a look at the recently released Guided Answer for DTP performance:

Guided Answers: Data Transfer Process (DTP) performance/memory issues

Best Regards,
Jiandong

former_member753391
Discoverer

Please make sure only minimal calculations are carried out in the rules.

Prepare as much of the data and the calculations as possible in the start routine, as in the sketch below.
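
A minimal sketch of that pattern for a BW 7.x transformation, assuming a hypothetical lookup DSO /BIC/AZMAT_PRC00 with MATERIAL and PRICE fields; all names here are illustrative, not from the thread:

* Global declaration part of the start routine:
TYPES: BEGIN OF ty_price,
         material TYPE c LENGTH 18,
         price    TYPE p LENGTH 9 DECIMALS 2,
       END OF ty_price.
DATA gt_price TYPE STANDARD TABLE OF ty_price.

* Start routine body: one SELECT per data package instead of
* one SELECT per record inside the field routines.
IF source_package IS NOT INITIAL.
  SELECT material price
    FROM /bic/azmat_prc00
    INTO TABLE gt_price
    FOR ALL ENTRIES IN source_package
    WHERE material = source_package-material.
  SORT gt_price BY material.
  DELETE ADJACENT DUPLICATES FROM gt_price COMPARING material.
ENDIF.

The field routines can then read gt_price with a binary search instead of hitting the database once per record.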

mohdshadab_shaikh
Active Participant

Hi Nadeem,

You can try the following options.

1. Enable parallel processing. Several data packages are then processed at the same time, so there is no idle time while one package waits for the previous one to finish.

2. Keep the load full/delta, but decrease the package size from the SAP standard 50,000 records to under 20,000. The data is then divided into smaller chunks, so less memory is required to process each package and processing is faster. This option is very useful when you are processing millions of records, since many systems do not have enough memory to hold them all together.

3. Ensure that the indexes of the target cube are deleted before loading and re-created afterwards; see the sketch below. This can improve loading performance tremendously, and with many heavy loads the load even fails if the indexes are not deleted.
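
In a process chain, the standard Delete Index and Generate Index steps cover point 3. Done programmatically, a sketch could look like the following; the function modules and the InfoCube name ZSALES_C01 are assumptions to verify in SE37 for your release:

* Drop the cube's secondary indexes before the load.
DATA lv_cube TYPE rsinfocube VALUE 'ZSALES_C01'.

CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
  EXPORTING
    i_infocube = lv_cube.

* ... execute the DTP load here ...

* Rebuild the indexes once the load has finished.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
  EXPORTING
    i_infocube = lv_cube.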

Hope the above options help you.

Mohamed Shadab

Former Member

Dear Nadeem,

Please check the individual field mapping rules (if the transfer structure contains a large number of fields and routines, loading takes longer!).

  • Check whether any unnecessary SELECT queries run inside the routines.
  • Keep the routines (start and end routines) specific and lean, because they are executed for every data package.

Check the above points and optimize the code; a small example follows.
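
For illustration, a per-record SELECT in a field routine can usually be replaced by a read against an internal table filled once in the start routine; gt_price, ty_price, and the DSO name are hypothetical, following the buffering pattern sketched earlier in this thread (gt_price is assumed to be sorted by material):

* Slow: executed once for every record of every package.
* SELECT SINGLE price FROM /bic/azmat_prc00
*   INTO result WHERE material = source_fields-material.

* Faster: read the lookup table buffered in the start routine.
DATA ls_price TYPE ty_price.
READ TABLE gt_price INTO ls_price
     WITH KEY material = source_fields-material
     BINARY SEARCH.
IF sy-subrc = 0.
  result = ls_price-price.
ENDIF.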

Regards

Kiran N


Hi Nadeem,

In case there are custom routines (with ABAP code), you might want to check SAP Note 1847431.

Regards,

René

abdullahqureshi
Contributor

Hi Nadeem,

Check out these docs:

http://scn.sap.com/docs/DOC-30461

http://scn.sap.com/docs/DOC-31781

Regards,

Abdullah

Former Member

Semantic keys defined in the DTP will also increase the data loading performance. Enter a value under Package Size on the Extraction tab in the DTP maintenance transaction. To avoid loading a large data volume in a single DTP request, select "Get all new data request by request" on the Extraction tab.

If you are doing a full first load from a DSO to a target, always load from the active table, as it contains fewer records than the change log table. When loading to a DSO, you can eliminate duplicate records by selecting the option "Unique Data Records". If you select this option, it will overwrite a master data record if it is time-independent and create multiple entries if the master data is time-dependent.

and go through this link:

http://help.sap.com/saphelp_nw70ehp2/helpdata/en/47/e8c56ecd313c86e10000000a42189c/content.htm

chanda
Contributor

Hi Nadeem,

Check your RSBERRORLOG table entries in SE16.

Secondly, in ST14 -> Other Tables you can see the RSBERRORLOG table size.

While the DTP is running, go to SM51 and check which table the job is stuck on (last column).

If it is always the error log table, ask Basis to delete the entries in the error log table and reorganize it.

This will help improve DTP performance.

Regards,

Sudhir

Former Member

Hi,

What type of transformation mapping do you have for the transformation corresponding to this DTP?

If there are any routines, try optimising the code.

Also check the Parallel processing settings in your DTP.

Prathish