on 06-05-2009 8:15 AM
Hi experts,
I loaded data from a BW cube using LDS with update mode "Delete All" for reported financial data in the test system.
I checked the data in the totals cube and noticed that the data was duplicated with reversed values (not actually deleted).
If we do this with a huge data volume (e.g. in production), will it cause a performance problem?
Thanks.
Hello all,
The duplicated data can be compressed, and records with a zero value can be deleted, to overcome the performance issue.
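To illustrate the idea (this is just a Python sketch of the concept, not actual SAP code), compression aggregates the totals records per key, and zero-elimination then drops the entries that net to zero:

```python
from collections import defaultdict

def compress(records):
    """Aggregate totals records per (item, partner) key and drop
    entries that net to zero -- a rough model of cube compression
    with zero-elimination."""
    totals = defaultdict(float)
    for item, partner, value in records:
        totals[(item, partner)] += value
    return [(item, partner, value)
            for (item, partner), value in totals.items()
            if value != 0]

# Records left behind after a "Delete All" load:
records = [
    ("1000", "A", 100.0),   # original record
    ("1000", "A", -100.0),  # reversal posted by "Delete All"
    ("1000", "A", 120.0),   # newly collected record
]
print(compress(records))  # -> [('1000', 'A', 120.0)]
```

After compression only the net record survives, so the original and its reversal no longer take up space.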
Thanks,
Harry
Hi Cyber
Adding to Dan's point:
Depending on the posting levels, you need to decide whether you want an audit trail of how many times you run the tasks or load data, i.e. whether to keep only the latest task run or all of them.
If you choose to keep an audit trail, it will have an impact on the database and on performance.
rgds
Dheeraj
When you delete data from the BCS monitor or workbench, records are never physically deleted. They are just zeroed out (most likely by posting inverted records, as you noticed).
Yes, I noticed that it works as below if I use update mode "Delete All":
Existing Data
Item: 1000 Partner: A Value:100
Data Collected
Item: 1000 Partner: A Value:120
Result after data collection with update mode "Delete All"
Item: 1000 Partner: A Value:100
Item: 1000 Partner: A Value:-100
Item: 1000 Partner: A Value:120
Just assume that:
existing data -> approx. 2 million data records
data collected -> approx. 2 million data records
result after data collection -> 6 million data records (including the inverted records)
Therefore, after running LDS (update mode "Delete All"), the data volume will approximately triple, and it will cause a performance problem.
Am I thinking correctly?
Or is there any clever BCS functionality that prevents the performance problem in this case?
Regards.
This is the system's "Delete All" functionality, which is creating the additional records.
The "Delete All" functionality deletes the existing values and updates the new file for the specified period, which means the system negates the existing values for the given period.
To give an example:
In one period you uploaded a first file and realised that you had uploaded wrong data, so you uploaded one more file.
The file details are mentioned below.
First file :
LE FS Item Values
001 FS0010 100
002 FS0020 200
Second file:
LE FS Item Values
002 FS0020 300
After uploading the second file you can find the following entries in LOTR:
001 FS0010 100
002 FS0020 200
001 FS0010 -100
002 FS0020 -200
002 FS0020 300
If you aggregate them you will get the values:
001 FS0010 0
002 FS0020 300
(These negative values are created when we upload two files (with Delete All) in the same period.)
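For illustration only (a Python sketch, not the actual system logic), aggregating the five entries above per legal entity and FS item nets out exactly as shown:

```python
from collections import defaultdict

# The five entries found after uploading the second file:
entries = [
    ("001", "FS0010", 100),
    ("002", "FS0020", 200),
    ("001", "FS0010", -100),  # negation of the first upload
    ("002", "FS0020", -200),  # negation of the first upload
    ("002", "FS0020", 300),   # value from the second file
]

totals = defaultdict(int)
for le, fs_item, value in entries:
    totals[(le, fs_item)] += value

print(dict(totals))  # -> {('001', 'FS0010'): 0, ('002', 'FS0020'): 300}
```

So the reporting totals are correct after aggregation; only the record count in the cube grows until the zero-value rows are compressed away.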
Edited by: kamal kishore on Jun 9, 2009 9:16 AM
Hi Cyber,
As per my understanding, this is the Delete All functionality at work.
Please tell me: did you load two files into the system for the same period? Maybe because of a data issue.
Thanks
Kamal
Edited by: kamal kishore on Jun 5, 2009 2:11 PM