on 06-25-2012 7:14 AM
I have a requirement where I have to read from a HANA analytic/calculation view and dump the results to an external database. The analytic view feeds off tables connected to ECC via SLT.
Thus I have to do delta writes to the external database (I have not decided how yet, but perhaps I will use the Data Services SQL transform).
How do I do it? Since views are "on demand", i.e. have no persistency, I cannot add a column with a flag. And I do not want to write the entire contents of the view to another table in HANA (it might get as many as 250 million rows) just so I can add a flag column there.
What are some other efficient, elegant ways of doing this?
PS: Database tables don't have any order, so I don't think I can do this: have Data Services run every 15 minutes, where each run does a count and stores it in a table; the next run takes that stored count as input, does a current count, and then does a select last (count(t1) - count(t2)) * from xxxx.
Hi Rajarshi,
Do you not have a creation/changed date or timestamp field in one of the main ECC tables that you use in the join? If so, can't you base your deltas off that, much like the count-based approach you described?
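A minimal sketch of that idea, assuming the replicated table is an ECC document header such as BKPF (which carries the CPUDT entry date and AEDAT changed-on date) and that the job persists a watermark from the last run; the schema name and bind variable are assumptions:

```sql
-- Read only rows created or changed since the last extraction run.
-- "SLT_SCHEMA" and :last_run_date are placeholders for your setup.
SELECT *
FROM "SLT_SCHEMA"."BKPF"
WHERE "CPUDT" >= :last_run_date   -- entry date
   OR "AEDAT" >= :last_run_date;  -- changed-on date
```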
Thanks,
Anooj
Since it's a calculation view drawing from FAGLFLEXT, CE1xxxx and special ledger tables, finding one date stamp that could connect all three would be difficult (or rather, I have not explored it).
And now that you mention a date stamp, another thing dawns on me which I had not thought of before...
FAGLFLEXT might get updated but not CE1xxxx; then what is my analytic view exposing anyway? I guess I need to understand the requirements better.
Rajarshi - No, I am not aware of any DB-level parameters you could use for this, either at the column-table level or at the analytic-view level.
I have heard that tables of type "history column" will let you track changes using their internal timestamp column. However, I am not sure it is a good idea to convert huge transaction tables into history ones - probably a bad idea.
There are other, not-so-elegant ideas I can think of, especially if you are able to connect your BODS to the SLT system.
Hi Rajarshi,
How about writing a trigger on the table in HANA to update a timestamp column whenever data is inserted or updated? The row can be marked with the time it was inserted or updated, and that column can then be used as the delta field.
Just a radical thought.
Please note that writing a trigger (especially a row-level one) will have an impact on data-loading performance. I am sure you already know this.
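A rough sketch of the trigger idea. Note that some HANA revisions do not allow a trigger to modify its own subject table, so this variant logs the keys of changed rows into a separate change-log table instead of updating a timestamp column in place; all table and column names here are assumptions:

```sql
-- Hypothetical change-log table holding keys of changed rows plus a timestamp.
CREATE COLUMN TABLE "DELTA_LOG" (
  "RYEAR"      NVARCHAR(4),
  "RACCT"      NVARCHAR(10),
  "CHANGED_AT" TIMESTAMP
);

-- Row-level trigger on the replicated table (table and column names assumed).
CREATE TRIGGER "TRG_FAGLFLEXT_DELTA"
AFTER INSERT ON "FAGLFLEXT"
REFERENCING NEW ROW newrow
FOR EACH ROW
BEGIN
  INSERT INTO "DELTA_LOG"
    VALUES (:newrow."RYEAR", :newrow."RACCT", CURRENT_UTCTIMESTAMP);
END;
```

The delta extraction would then join DELTA_LOG back to the source table and clear the log once the rows have been shipped.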
Regards,
Ravi
Hi Ravindra,
I was wondering, can we do only "DELTA" loading using SLT?
I would want to load the history data, i.e. the "FULL" load, using BODS, and then use SLT to load only the "DELTA".
Is that possible via SLT?
I see LOAD (full load) and REPLICATE (full load + delta); I want to know if SLT offers delta-only loading.
Regards,
Krishna
Hi Rajarshi,
I want to know if there is any way we can use the HANA DB as a source for ETL processing and identify the delta records, either from tables or from views.
To improve performance we create most of our tables as column tables rather than row tables. How do we identify the delta? And in case we have to write SQL script instead of BODS to massage the data, how do we identify the delta records for processing?
One way I am thinking of is to insert an additional timestamp field into the table structure while replicating the data from ECC to HANA, and use that as the delta identifier... Any other thoughts?
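A sketch of that timestamp-field idea; the schema, table and column names are assumptions. Two caveats: all pre-existing rows receive the same default at ALTER time, so the first extraction after adding the column should be a full one, and the default only fires on insert, so updated rows would still need a trigger or similar to refresh it:

```sql
-- Add a load timestamp that HANA fills in automatically on insert.
ALTER TABLE "REPL_SCHEMA"."SOURCE_TABLE"
  ADD ("LOAD_TS" TIMESTAMP DEFAULT CURRENT_UTCTIMESTAMP);

-- Each ETL cycle then reads only rows newer than the stored watermark.
SELECT *
FROM "REPL_SCHEMA"."SOURCE_TABLE"
WHERE "LOAD_TS" > :last_watermark;
```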
Thank you.
Regards,
Vijay
A couple of ways around this:
1) The function which you wrote in the calculation view can also insert into tables; if you create a table of your own with the output structure, you can write the results to a HANA table of your own choosing.
- The idea is that the calculation view has calculations performed on the data (ideally you should have the delta directly from the source; the calculation view is a representation, and the delta should be passed on to the external database), and after those calculations the data is sent outwards.
2) Append the records; do not overwrite. Since this is in-memory, it can handle the entire storage of the data you are putting out (I assume you have some sort of 0RECORDMODE-type field, or debit/credit signed values).
3) When you send the data to the external database, remember to accumulate all the records; summing them up should give you the final delta.
This should reduce the load on the transfer and would not require a lot of coding.
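The accumulation in step 3 might look like the following, assuming a staging table of appended records with a signed amount column (all names here are assumptions):

```sql
-- Net out the appended signed records to one delta row per key.
SELECT "ACCOUNT", "PERIOD", SUM("AMOUNT") AS "NET_DELTA"
FROM "DELTA_STAGE"
GROUP BY "ACCOUNT", "PERIOD";
```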
It suddenly dawned on me that a delta from an analytic view is an illogical concept!
A delta can exist for individual rows, but not for aggregates. Another line coming in from SLT to the raw tables might get rolled up into an existing aggregate without producing any additional row in the analytic view.
aah very good point on the analytical view aggregation
On procedures: you can insert/update/delete within a stored procedure as long as it is not defined as read-only. With scripted calc views, the procedures are always created as read-only, and hence they cannot be used to insert/update/delete data.
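For contrast, a standalone SQLScript procedure created without the read-only restriction can persist view output itself; the procedure, target table, package and view names below are all assumptions:

```sql
-- A plain (non-read-only) procedure may contain DML.
CREATE PROCEDURE "SNAPSHOT_VIEW_OUTPUT"()
LANGUAGE SQLSCRIPT AS
BEGIN
  INSERT INTO "VIEW_SNAPSHOT"
    SELECT * FROM "_SYS_BIC"."mypackage/AN_LEDGER";
END;
```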
Thanks,
Anooj
Hello Rajarshi,
My thoughts:
If you are using SLT to replicate the data from the ECC system to SAP HANA, then you can see the delta records in SAP HANA, and the same in the reporting tool or during the transfer as well.
Please let me know if I have understood this wrong.
Regards,
neha
What you said is correct. SLT does (initial load) + delta, so when you execute an analytic view you see the initial load + deltas. But my requirement is to send only the delta part of the analytic view to an external database.
So:
SLT ----> initial load to HANA ----> analytic view/calc view executes with initial load ----> writes to an external DB
and what I then need is
SLT ----> delta load to HANA ----> analytic view/calc view executes with (delta load) ----> writes to external DB
or
SLT ----> delta load to HANA ----> analytic view/calc view executes with (initial + delta) load ----> writes only the delta results to external DB
So what kind of parameter should I pass to the calc view so that it filters by the delta load?
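One way to pass such a filter is a calc view input parameter, assuming the view exposes one (here called P_FROM_TS) that is mapped to a timestamp or date column on the underlying tables; the package, view and parameter names are assumptions:

```sql
-- Query the calc view with an input parameter acting as the delta cutoff.
SELECT *
FROM "_SYS_BIC"."mypackage/CV_LEDGER"
  (PLACEHOLDER."$$P_FROM_TS$$" => '2012-06-25 07:00:00');
```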