
Delta data transfer from an analytic view

rajarshi_muhuri
Active Participant
0 Kudos

I have a requirement where I have to read from a HANA analytic/calculation view and dump the results to an external database. The analytic view feeds off tables replicated from ECC via SLT.

Thus I have to do delta writes to the external database (I have not decided on the tooling yet, but will perhaps use the Data Services SQL transform).

How do I do it? Since views are "on demand", i.e. they have no persistency, I cannot add a flag column. I also do not want to write the entire contents of the view to another table in HANA (it might grow to as many as 250 million rows) just so I can add a flag column there.

What are some other efficient, elegant ways of doing this?

PS: Database tables don't have any inherent order, so I don't think I can do the following: have Data Services run every 15 minutes, store the row count at each run, then on the next run take the stored count as input, take a current count, and do something like 'select the last (count(t1) - count(t2)) rows * from xxxx'.

Accepted Solutions (1)

Former Member
0 Kudos

Hi Rajarshi,

Do you not have a creation/change date or timestamp field in one of the main ECC tables that you use in the join? If so, you could base your deltas off that, in much the same way you described doing with the counts.
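For example (a rough sketch only; the view name is made up, CPUDT/CPUTM stand in for whatever entry date/time fields your ECC tables expose as view attributes, and the literal would be the last successful run date substituted by the BODS job):

    -- pull only rows entered since the last BODS run
    SELECT "RBUKRS", "RACCT", SUM("HSL") AS "HSL"
      FROM "_SYS_BIC"."my.package/AN_GL_TOTALS"   -- hypothetical analytic view
     WHERE "CPUDT" >= '20130601'                  -- ABAP-style date as replicated by SLT
     GROUP BY "RBUKRS", "RACCT";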

Thanks,

Anooj

rajarshi_muhuri
Active Participant
0 Kudos

Since it is a calculation view drawing from FAGLFLEXT, CE1xxxx and special ledger tables, finding one date stamp that could connect all three would be difficult (or rather, I have not explored it).

And now that you mention a date stamp, something else dawns on me which I had not thought of before...

FAGLFLEXT might get updated but not CE1xxxx; in that case, what is my analytic view exposing anyway? I guess I need to understand the requirements better.

rajarshi_muhuri
Active Participant
0 Kudos

Anooj, is there any DB-level parameter that would let me do something like: select * from table where the records were inserted between dbtime = t1 and dbtime = t1 + 15 min?

Former Member
0 Kudos

Rajarshi - No, I am not aware of any DB-level parameters you could use for this, either at the column table level or at the analytic view level.

I have heard that tables of type "history column" let you track changes using an internal timestamp. However, I am not sure it is a good idea to convert huge transaction tables into history tables - probably a bad idea.
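Just to illustrate what that feature looks like (a sketch only, and given the size concerns above probably not the way to go; the table is invented and the time-travel syntax is the one documented for history tables):

    -- history column tables keep old row versions internally
    CREATE HISTORY COLUMN TABLE demo_totals (
       id     INTEGER PRIMARY KEY,
       amount DECIMAL(15,2)
    );

    -- time-travel query: read the table as it looked at an earlier point in time
    SELECT * FROM demo_totals AS OF UTCTIMESTAMP '2013-06-01 00:00:00';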

There are other, less elegant ideas I can think of, especially if you are able to connect your BODS to the SLT system.

former_member184768
Active Contributor
0 Kudos

Hi Rajarshi,

How about writing a trigger on the table in HANA that maintains a timestamp whenever data is inserted or updated? Each row would be marked with the time it was inserted or last changed, and that timestamp could then be used as the delta field.

Just a radical thought.

Please note that writing a trigger (a row-level one at that) will have an impact on data loading performance. I am sure you already know this.
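A rough sketch of what I mean (all names are invented and the key is simplified; also, since a HANA trigger cannot, at least in the revisions I have seen, modify its own subject table, this variant logs the key and a timestamp into a separate delta-log table instead of updating a column in place - a corresponding UPDATE trigger would be needed as well):

    CREATE COLUMN TABLE gl_delta_log (
       rldnr      NVARCHAR(2),
       ryear      NVARCHAR(4),
       racct      NVARCHAR(10),
       changed_at TIMESTAMP
    );

    CREATE TRIGGER trg_faglflext_insert
    AFTER INSERT ON faglflext
    REFERENCING NEW ROW newrow
    FOR EACH ROW
    BEGIN
       -- record which row arrived and when; the extraction job reads gl_delta_log
       INSERT INTO gl_delta_log
       VALUES (:newrow.rldnr, :newrow.ryear, :newrow.racct, CURRENT_UTCTIMESTAMP);
    END;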

Regards,

Ravi

rajarshi_muhuri
Active Participant
0 Kudos

Ravi: Now that sounds very complicated. Let me explore the FS more and see if anything easier can be done. I will update this thread if I find a simpler solution.

Thanks to everybody for the inputs.

former_member182302
Active Contributor
0 Kudos

Hi Ravindra,

I was wondering: can we do only "DELTA" loading using SLT?

I want to load the history data, i.e. the "FULL" load, using BODS, and use SLT to load only the "DELTA".

Is that possible via SLT?

I see LOAD (full load) and REPLICATE (full load + delta); I want to know whether SLT also offers delta-only loading.

Regards,

Krishna

Answers (3)

Former Member
0 Kudos

Hi Rajarshi,

I want to know whether there is any way we can use the HANA DB as a source for ETL processing and identify the delta records, either from tables or from views.

To improve performance we create most of the tables as column tables rather than row tables. How do we identify the delta? And if we have to write SQL script instead of using BODS to massage the data, how do we identify the delta records for processing?

One way I am thinking of is to add an additional timestamp field to the table structure while replicating the data from ECC to HANA and use that as the delta identifier... Any other thoughts?
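Roughly what I have in mind on the consuming side (everything here is invented for illustration: ZTIMESTAMP is the column added during replication, and a small watermark table remembers how far the last extraction got):

    CREATE COLUMN TABLE etl_watermark (
       source_name NVARCHAR(30) PRIMARY KEY,
       last_ts     TIMESTAMP
    );

    -- pick up only the rows that changed since the last run
    SELECT t.*
      FROM replicated_ecc_table t,
           etl_watermark        w
     WHERE w.source_name = 'REPLICATED_ECC_TABLE'
       AND t.ztimestamp  > w.last_ts;

    -- after a successful extraction, move the watermark forward
    UPDATE etl_watermark
       SET last_ts = CURRENT_UTCTIMESTAMP
     WHERE source_name = 'REPLICATED_ECC_TABLE';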

Thank you.

Regards,

Vijay

Former Member
0 Kudos

A couple of ways around this:

1) The function you wrote in the calculation view can also insert into tables; if you create a table of your own with the output structure, you can write the results to a HANA table of your choosing.

     - The idea is that the calculation view performs calculations on the data (ideally you would take the delta directly from the source, since the calculation view is only a representation, and pass that delta on to the external database), and after those calculations the data is sent outwards.

2) Append the records rather than overwriting them; since this is in-memory, it can handle storing the entire volume of data you are putting out (I assume you have some sort of 0RECORDMODE-type field, or debit/credit signed values).

3) When you send the data to the external database, remember to accumulate all the records; summing them up should give you the final delta (see the sketch below).

This should reduce the load on the transfer and would not require a lot of coding.
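For point 3, roughly (table and column names are invented; the staging table holds the appended, signed records and the final result per key is their sum):

    -- collapse all appended delta records into one row per key before sending
    SELECT company_code, account, SUM(amount) AS amount
      FROM delta_staging
     GROUP BY company_code, account;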

rajarshi_muhuri
Active Participant
0 Kudos

Zeeshan: I don't think you can write to a table from a SQL script or procedure.

rajarshi_muhuri
Active Participant
0 Kudos

It suddenly dawned on me that a delta from an analytic view is an illogical concept!

A delta can exist for individual rows, but not on aggregates. Another line arriving from SLT in the raw tables might get rolled up into an existing aggregate without adding any additional row to the analytic view.

The requirement is wrong.
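A tiny illustration of why (a standalone example, not the real tables):

    CREATE COLUMN TABLE raw_lines (account NVARCHAR(10), amount DECIMAL(15,2));
    INSERT INTO raw_lines VALUES ('100000', 50);
    INSERT INTO raw_lines VALUES ('100000', 20);

    -- the "view": one row, 100000 / 70
    SELECT account, SUM(amount) FROM raw_lines GROUP BY account;

    -- one more line arrives via SLT
    INSERT INTO raw_lines VALUES ('100000', 30);

    -- still one row, now 100000 / 100: the new data was absorbed into an
    -- existing aggregate, so "new rows in the view" is not a usable delta
    SELECT account, SUM(amount) FROM raw_lines GROUP BY account;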

Former Member
0 Kudos

Aah, very good point on the analytic view aggregation.

On procedures: you can insert/update/delete within a stored procedure as long as it is not defined as read-only. With scripted calculation views, the generated procedures are always created as read-only, and hence they cannot be used to insert/update/delete data.
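For example (a sketch only; the view, table and column names are invented), a plain procedure created outside the calc view, not marked as read-only, can persist the view output:

    CREATE PROCEDURE persist_view_snapshot ()
       LANGUAGE SQLSCRIPT
       SQL SECURITY INVOKER
    AS
    BEGIN
       -- not declared READS SQL DATA, so the INSERT is allowed
       INSERT INTO delta_staging (company_code, account, amount, loaded_at)
       SELECT company_code, account, amount, CURRENT_UTCTIMESTAMP
         FROM "_SYS_BIC"."my.package/CV_GL_RESULT";
    END;

    CALL persist_view_snapshot();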

Thanks,

Anooj

former_member182277
Contributor
0 Kudos

Hello Rajarshi,

My thoughts:

If you are using SLT to replicate the data from the ECC system to SAP HANA,

then you can see the delta records in SAP HANA, and the same in the reporting tool or during the transfer as well.

Please let me know if I have understood this wrongly.

Regards,

neha

rajarshi_muhuri
Active Participant
0 Kudos

What you said is correct: SLT does (initial load) + delta, so when you execute an analytic view you see the initial load plus the deltas. But my requirement is to send only the delta part of the analytic view to an external database.

So:

SLT ---> initial load to HANA ---> analytic view/calc view executes with initial load ---> writes to an external DB

and what I then need is

SLT ---> delta load to HANA ---> analytic view/calc view executes with (delta load) ---> writes to external DB

or

SLT ---> delta load to HANA ---> analytic view/calc view executes with (initial + delta) load ---> writes only the delta results to external DB

So what kind of parameter should I pass to the calc view so that it filters by the delta load?

former_member182277
Contributor
0 Kudos

Hello Rajarshi,

Then in that case you can achieve the same by using date/timestamp attributes.
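For example (a sketch only; the view, the CHANGED_ON attribute and the input parameter are all assumptions about how the model could be built, and the literal is the lower bound of the delta window supplied by the extraction job):

    SELECT company_code, account, SUM(amount) AS amount
      FROM "_SYS_BIC"."my.package/CV_GL_RESULT"
           ('PLACEHOLDER' = ('$$p_changed_from$$', '2013-06-01'))
     WHERE changed_on >= '2013-06-01'
     GROUP BY company_code, account;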

Regards,

neha