
Performance - LDS with update mode "Delete All"

Former Member

Hi experts,

I loaded data from a BW cube using LDS with update mode "Delete All" for Reported Financial Data in the test system.

I checked the data in the totals cube and noticed that the data has been duplicated with reversed values (not really deleted).

If we do this with a huge data volume (e.g. in production), will it cause a performance problem?

Thanks.

Accepted Solutions (1)

Former Member

Hello all,

The duplicated data can be compressed, and the records with a value of 0 deleted, to overcome the performance issue.
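To illustrate the idea, here is a minimal plain-Python sketch (hypothetical record layout, not actual BW/BCS code): compression aggregates records that share a key, and zero elimination then drops the rows that net to zero.

```python
# Hypothetical sketch of compression with zero elimination (not SAP code).
from collections import defaultdict

def compress(records):
    """Aggregate (key, value) records and drop keys that net to zero."""
    totals = defaultdict(float)
    for key, value in records:
        totals[key] += value
    return [(key, value) for key, value in totals.items() if value != 0]

# The original record, its reversal, and the newly collected value:
records = [(("1000", "A"), 100.0), (("1000", "A"), -100.0), (("1000", "A"), 120.0)]
print(compress(records))  # [(('1000', 'A'), 120.0)] - only the net record survives
```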

Thanks,

Harry

Former Member

That's right, Harry.

I usually don't use the "collapse" option for new requests, so that I can analyse the data in detail. But in this case that is the best way.

Thanks.

Answers (3)

Former Member

Hi Cyber,

Adding to Dan's point:

You need to decide, based on the posting levels, whether you need an audit trail of the number of times you run the tasks or load data, and whether you want to keep only the latest run or all task runs.

If you choose to keep an audit trail, it will have an impact on the database and on performance.

Regards,

Dheeraj

Former Member

When you delete data from BCS's monitor or workbench, records are never physically deleted. They are just zeroed out (most likely by posting inverted records, as you noticed).

Former Member

Yes, I noticed that it works as below if I use update mode "Delete All".

Existing data:

Item: 1000 Partner: A Value: 100

Data collected:

Item: 1000 Partner: A Value: 120

Result after data collection with update mode "Delete All":

Item: 1000 Partner: A Value: 100

Item: 1000 Partner: A Value: -100

Item: 1000 Partner: A Value: 120

Just assume that:

existing data -> approx. 2 million records

data collected -> approx. 2 million records

result after data collection -> approx. 6 million records (including the inverted records)
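To make the arithmetic concrete, here is a minimal plain-Python simulation of that behaviour (hypothetical, not BCS code): the load keeps the existing records, posts an inverted copy of each, and then adds the newly collected records.

```python
# Hypothetical simulation of a "Delete All" load on the raw (uncompressed) data.

def delete_all_load(existing, collected):
    """Existing records stay, each gets an inverted copy, new records are added."""
    inverted = [(key, -value) for key, value in existing]
    return existing + inverted + collected

existing = [(("1000", "A"), 100.0)]
collected = [(("1000", "A"), 120.0)]
print(delete_all_load(existing, collected))
# [(('1000', 'A'), 100.0), (('1000', 'A'), -100.0), (('1000', 'A'), 120.0)]

# Scaled up: 2 million existing + 2 million inverted + 2 million collected
# = 6 million raw records, i.e. roughly triple the volume until compression runs.
```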

Therefore, after running LDS with update mode "Delete All", the data volume will roughly triple, and that will cause a performance problem.

Am I thinking correctly?

Or is there any clever BCS functionality that prevents the performance problem in this case?

Regards.

Former Member

Hi Cyber,

Maybe try to use the periodic input type, but then the data in your BW cube should be stored as periodic key figures.

Hope this helps,

Pawel

Former Member

It is the system's "Delete All" functionality that is creating more records.

The "Delete All" functionality deletes the existing values and updates the new file for the specified period, which means the system negates the existing values for the given period.

To give an example:

In a given period you uploaded a first file and then realised that you had uploaded wrong data, so you uploaded one more file.

The file details are mentioned below.

First file:

LE FS Item Values

001 FS0010 100

002 FS0020 200

Second file:

LE FS Item Values

002 FS0020 300

After uploading the second file, you can find the following entries in LOTR:

001 FS0010 100

002 FS0020 200

001 FS0010 -100

002 FS0020 -200

002 FS0020 300

If you aggregate them, you will get the following values:

001 FS0010 0

002 FS0020 300

(These negative values are created when we upload two files, with "Delete All", in the same period.)
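As a quick check of that arithmetic, here is a small plain-Python sketch (hypothetical, not BCS code) that aggregates the five totals records above:

```python
# Hypothetical check: aggregating the five records nets out the negated first file.
from collections import defaultdict

entries = [
    (("001", "FS0010"), 100),
    (("002", "FS0020"), 200),
    (("001", "FS0010"), -100),
    (("002", "FS0020"), -200),
    (("002", "FS0020"), 300),
]
totals = defaultdict(int)
for (le, fs_item), value in entries:
    totals[(le, fs_item)] += value
print(dict(totals))
# {('001', 'FS0010'): 0, ('002', 'FS0020'): 300}
```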


dan_sullivan
Active Contributor

If these are not posting level 00, then there are associated documents which are being reversed. Is that the case? If so, there are settings for deleting documents when the task is re-executed.

Former Member

Hi Cyber,

As per my understanding, this is the "Delete All" functionality at work.

Please tell me: did you load two files into the system for the same period, maybe because of a data issue?

Thanks

Kamal
