
DTP is taking a long time while loading data from DSO to cube or cube to cube



Hi ,

When I try to load data from the DSO to a cube, or from cube to cube, the DTP takes a long time.

We are using a drop/create index step in the process chain (PC).

We are loading 10 lakh records, which is taking 1.30 mins.

There is no start/end routine and no formula.

In the DTP monitor, I found that the time is being spent in the "Conversion of Characteristic Values to SIDs" step.

Could you please advise?

Regards,

Vinay Saraf

Accepted Solutions (1)


former_member214415
Active Participant

Hi Vinay,

This happens because SIDs are generated at the time of loading data into the InfoCube. It is better to have the SIDs created at the time of DSO activation instead; the cube load will then be faster, because the conversion of characteristic values to SIDs during the load is avoided.

Loading data from cube to cube is rare.

Also follow the general DTP performance tuning activities.
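The cost of the "Conversion of Characteristic Values to SIDs" step can be pictured as a per-value lookup with an expensive insert on every miss. A rough Python sketch of the idea (the names are illustrative; this is not actual BW code):

```python
# Conceptual sketch of SID (surrogate ID) conversion: each characteristic
# value must be mapped to an integer SID; a value without one triggers an
# insert into the SID table, which is the slow path during a cube load.
class SidTable:
    def __init__(self):
        self._sids = {}      # value -> SID (conceptually, the S table)
        self._next_sid = 1

    def to_sid(self, value):
        sid = self._sids.get(value)
        if sid is None:
            # Slow path: a new SID must be created (a DB insert in BW).
            # Generating SIDs during master-data load or DSO activation
            # moves this cost out of the cube load.
            sid = self._next_sid
            self._sids[value] = sid
            self._next_sid += 1
        return sid

table = SidTable()
records = ["MAT001", "MAT002", "MAT001", "MAT003"]
sids = [table.to_sid(v) for v in records]  # -> [1, 2, 1, 3]
```

The second occurrence of "MAT001" hits the fast path, which is why loads run faster when the SIDs already exist.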

Thanks,

Swapna Jain


Hi Swapna,

Your answer is valid when we load data directly from the DataSource.

Here we load into a DSO first, and from the DSO we load into the cube.

Regards,

vinay saraf

former_member214415
Active Participant

Hi Vinay,

Anyhow, you loaded the master data first and then the transaction data, so the SIDs are generated at the time of the master data load. But if the SID is missing for some characteristic values, it will be generated during the transaction load. The question is when it should be created.

You are transferring data from DSO -> cube. You load the data into the DSO first, but since you use it only as a staging layer, you can skip SID generation upon DSO activation.

But if you have reporting on the DSO, then you should enable SID generation upon activation, so that reporting performance is good.

Thanks,

Swapna Jain

Answers (2)



Hi All,

Please look at the screenshot below.

This is what I was talking about.

former_member186053
Active Contributor

Hi Vinay,

What data package size have you set in your DTP? It also impacts the DTP load performance.

Have you defined a semantic group in your DTP? This will reduce the performance.

892513 - Consulting: Performance: Data loading, number of packages, request size

Thanks,

Vengal.


Hi Vengal.

The data package size is 50,000.
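As a quick sanity check on those numbers, the package size determines how many data packages the DTP has to process for the 10 lakh records mentioned above:

```python
total_records = 1_000_000   # 10 lakh, the volume reported in this thread
package_size = 50_000       # DTP data package size

# Ceiling division: number of data packages the DTP will process.
packages = -(-total_records // package_size)
print(packages)  # -> 20
```

Twenty packages is modest, so the package count itself is unlikely to be the bottleneck here.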

We haven't used filters or semantic groups.

How should we select semantic groups, i.e. on what basis should we identify the InfoObjects to be used for semantic grouping?

Regards,

vinay saraf

shalaka_golde
Participant

Hi,

I think you should avoid creating semantic groups, because they introduce the extra overhead of sorting and grouping the data before it is loaded into the target. Please also check whether the cube has been compressed. Since indexes are created on the F fact table, if the cube is not compressed, the index creation and deletion steps will also take time.
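The sorting/grouping overhead mentioned above can be sketched in Python (purely illustrative, not BW code): with a semantic group, all records sharing the key must end up in the same data package, which forces an extra sort over the data first.

```python
from itertools import groupby

records = [
    {"material": "M2", "qty": 5},
    {"material": "M1", "qty": 3},
    {"material": "M2", "qty": 7},
    {"material": "M1", "qty": 1},
]

# Semantic grouping: records with the same key must land in the same
# package, so the data is sorted by the key first -- an extra
# O(n log n) pass that a plain load does not need.
key = lambda r: r["material"]
grouped = {k: list(g) for k, g in groupby(sorted(records, key=key), key=key)}
```

Only define a semantic group when a routine genuinely needs all records of a key in one package; otherwise it is pure overhead.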

Regards,

Shalaka

shalaka_golde
Participant

You can also check whether a routine is written in the transformation from DSO to cube or cube to cube, and whether that code has been performance tuned.

Regards,

Shalaka


Hi,

The transformation is plain.

No routines, no formulas.

Regards,

vinay saraf

shalaka_golde
Participant

Hi,

OK. Then can you check the compression status as well as the dimension-to-fact table ratio of this cube?
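Conceptually, the dimension-to-fact ratio is just the row count of a dimension table divided by the row count of the fact table (the standard report SAP_INFOCUBE_DESIGNS lists these densities per cube). A small sketch with assumed figures:

```python
fact_rows = 1_000_000   # rows in the fact table (assumed figure)
dim_rows = 250_000      # rows in one dimension table (assumed figure)

ratio = dim_rows / fact_rows  # 0.25, i.e. 25%

# Rule of thumb: dimensions much larger than roughly 10-20% of the fact
# table bloat joins and index maintenance; such a dimension is a
# candidate to be remodeled as a line-item dimension.
oversized = ratio > 0.2
```

A high ratio on this cube would also help explain the slow create index step reported later in the thread.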

Regards.

Shalaka

KodandaPani_KV
Active Contributor

Hi,

Can you increase the number of parallel processes on the DTP screen?

Increase it to 3-6 processes.

Select the DTP -> Goto menu -> Settings for Batch Manager -> increase the number of processes -> save and execute the DTP.

Check the process availability in SM50/SM51.
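The effect of raising the parallel processes can be pictured as processing several data packages concurrently; a minimal Python sketch of the idea (not BW code, and the per-package work here is just a stand-in):

```python
from concurrent.futures import ThreadPoolExecutor

def process_package(package):
    # Stand-in for the per-package DTP work (transform, SID lookup, insert).
    return sum(package)

packages = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

# With max_workers > 1, packages are processed concurrently -- but only
# if enough background work processes are actually free (check SM50/51).
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_package, packages))  # -> [6, 9, 30]
```

This is also why the suggestion only helps when spare processes exist: with one free worker, the packages simply queue.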

-Phani.


Hi Phani,

We can use this option only when we have enough background processes available.

When this DTP runs, most of the jobs will be occupied.

I want to know what this "Conversion of Characteristic Values to SIDs" step does.

This happens in most of the DTPs when we load to a cube.

Is there anything we are missing?

Regards,

Vinay Saraf

KodandaPani_KV
Active Contributor

Hi,

Have you loaded the master data for the respective transaction data?

The master data should be loaded into the respective targets before loading the transaction data.

-phani.


Hi Phani,

Since the issue is in PRD, we are loading the master data first and the transaction data later.

Regards,

vinay Saraf


Hi Phani,

The same thing happens with the drop and create index steps.

They are taking more time to complete; the create index step alone takes 45 minutes.

Can you advise?

Regards,

vinay saraf

former_member186053
Active Contributor

Hi Vinay,

Please look into the SAP notes below, which explain how compression improves your create/delete index performance.

590370 - Too many uncompressed request (f table partitions)

407260 - FAQs: Compression of InfoCubes

Regards,

Vengal.