Former Member
Nov 26, 2009 at 07:49 AM

DTP Performance Issue

28 Views

Hi All,

I am trying to load historical data from the PSA to a cube using a DTP with direct update.

Some details -

Cube: 0PS_C041, DataSource: 0CO_OM_NWA_1.

The DataSource has around 6 requests, with close to 700,000 records in total.

The transformation has a start routine in which master data tables (0ACTIVITY, 0PROJECT, 0WBS_ELEMT, etc.) are read into internal tables, which are then used in the rule routines.
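For context, the lookups in the rule routines follow the usual pattern of a READ TABLE with BINARY SEARCH against the pre-sorted internal table. A minimal sketch of one such rule routine (the work-area type, the RESULT field, and the exact field names are assumptions for illustration):

```abap
* Hypothetical rule routine lookup against the prefetched master data.
* LT_ACTIVITY is assumed to be sorted by ACTIVITY in the start routine.
DATA: LS_ACTIVITY LIKE LINE OF LT_ACTIVITY.

READ TABLE LT_ACTIVITY INTO LS_ACTIVITY
     WITH KEY ACTIVITY = SOURCE_FIELDS-VORNR
     BINARY SEARCH.
IF SY-SUBRC = 0.
  RESULT = LS_ACTIVITY-COMP_CODE.  " assumed target field
ENDIF.
```

The BINARY SEARCH only works correctly because the table is sorted on the lookup key; without it, each READ would scan the table linearly per record.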

The issue is that the DTP takes far longer in the production system than in quality. From the data load details it is clear that the start routine is the bottleneck: the same start routine took about 2 minutes per packet of around 2,000 records in QA, but takes 10-15 minutes per packet in production.

I tried increasing the number of background processes, but then got a short dump ('memory space not available').

The start routine code is already optimised and has the following form:

IF NOT SOURCE_PACKAGE[] IS INITIAL.
  SELECT ACTIVITY CPR_PSGUID STATUSSYS0 COMP_CODE
         BUS_AREA PLANT
    FROM /BI0/PACTIVITY
    INTO TABLE LT_ACTIVITY
    FOR ALL ENTRIES IN SOURCE_PACKAGE
    WHERE ACTIVITY = SOURCE_PACKAGE-VORNR.
  IF SY-SUBRC = 0.
    SORT LT_ACTIVITY BY ACTIVITY.
  ENDIF.
ENDIF.
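One refinement of the select above that is sometimes suggested: drive FOR ALL ENTRIES from a deduplicated key table instead of the full SOURCE_PACKAGE, so each distinct activity is sent to the database only once. This is only a sketch under the assumption that VORNR values repeat heavily across the packet; the key type name and data element are illustrative:

```abap
* Build a deduplicated key table for FOR ALL ENTRIES, so that
* repeated VORNR values do not inflate the generated WHERE clause.
TYPES: BEGIN OF TY_KEY,
         VORNR TYPE /BI0/OIACTIVITY,  " assumed data element for 0ACTIVITY
       END OF TY_KEY.
DATA: LT_KEYS TYPE STANDARD TABLE OF TY_KEY,
      LS_KEY  TYPE TY_KEY,
      LS_SRC  LIKE LINE OF SOURCE_PACKAGE.

LOOP AT SOURCE_PACKAGE INTO LS_SRC.
  LS_KEY-VORNR = LS_SRC-VORNR.
  APPEND LS_KEY TO LT_KEYS.
ENDLOOP.
SORT LT_KEYS BY VORNR.
DELETE ADJACENT DUPLICATES FROM LT_KEYS COMPARING VORNR.

IF NOT LT_KEYS[] IS INITIAL.
  SELECT ACTIVITY CPR_PSGUID STATUSSYS0 COMP_CODE
         BUS_AREA PLANT
    FROM /BI0/PACTIVITY
    INTO TABLE LT_ACTIVITY
    FOR ALL ENTRIES IN LT_KEYS
    WHERE ACTIVITY = LT_KEYS-VORNR.
  SORT LT_ACTIVITY BY ACTIVITY.
ENDIF.
```

The IS INITIAL check before the select is essential: with an empty FOR ALL ENTRIES table, the WHERE condition is dropped and the full table would be read.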

Could anyone suggest how to improve the performance? This is a show-stopper for our go-live.

Regards,

Pritesh.