Former Member

SLT and HANA deleting archived data

From the SLT Operations Guide, section 5.2, I can see the following statement:

5.2 Archiving Data in Source Systems

The trigger-based replication also considers the deletion in source tables by archive activities (since it is not possible to distinguish on the database level between delete actions caused by archiving versus regular deletion of data records). As a consequence, SAP LT Replication Server will also replicate archiving activities as delete actions in the SAP HANA database.

In a typical standalone/sidecar implementation of SAP HANA, I would assume that in most cases this is not favorable behavior or a desired function. In typical DW/DataMart implementations, the data should be persisted in the target even after the source system data has been archived. I can refer back to how BW operates in this case - any new/changed data is extracted to BW, but archiving operations do not affect the already extracted data in the target system.

I know there is functionality available to load archived data into HANA, but that seems like a troublesome method to 'put the pieces back together' and get a holistic picture of all the historical data (online data + archived objects), and it would present some interesting challenges in the target (HANA).

Is there any way to disable the functionality that replicates deletions caused by archiving? Is there anyone with experience navigating around this hurdle in a standalone/sidecar scenario who can shed some light on how they handled it?

Thanks,

Justin


4 Answers

  • Best Answer
    Jul 09, 2013 at 08:01 AM

    Hi Justin,

    This is a functional gap at the kernel level, so there is no standard approach so far. But let me share my ideas.

    1. Delete the triggers before and recreate them after the archiving run

         If you can ensure that no productive data/transactions are executed during the archiving run, you can delete the triggers before the run and recreate them afterwards. The result: the triggers cannot record the archive deletes, so they will not cause deletes on the HANA system.

         Deleting triggers can be done via the expert functions of the SLT system in transaction LTRC.

    2. Define a transformation rule that excludes deletions

         You could define a rule that checks the operation type and, if it is a delete, skips the record from processing (see the sketch below). This rule has to be defined before you start the archiving run, and again you have to ensure that no productive data (that does not belong to the archiving run) is deleted in the meantime.
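
         For illustration, a rough sketch of what such a rule body could look like - please treat it as an assumption on my side: MARA is just a placeholder for the table you replicate, and the exact names of the operation flag field and the SKIP_RECORD macro can differ between DMIS versions.

         " Event-related rule for one replicated table (MARA used as a placeholder).
         " Assumption: <WA_S_MARA> is the field symbol for the source work area and
         " IUUC_OPERAT_FLAG carries the operation type ('I' insert, 'U' update, 'D' delete).
         IF <WA_S_MARA>-IUUC_OPERAT_FLAG = 'D'.
           " Skip the record so the archiving delete is not passed to the HANA target table.
           SKIP_RECORD.
         ENDIF.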

    Both options require manual steps, so we are looking to improve this - but since this is a core kernel topic, it is not an enhancement that can be achieved quickly.

    Best,

    Tobias


    • Former Member Tobias Koebler

      Hi Tobias,

      Thanks for your input. As I understand it, the process you describe will skip all deleted records.

      But if I want to skip only archived records, is there any way to set a flag against those deleted 'archived' records?

      Thanks,

      Shubhrajit Chowdhury

  • Former Member
    Jul 08, 2013 at 04:33 PM

    Hi Justin,

    I have no real experience with this scenario, but I just wanted to help you find a workable solution.

    My idea is as follows: use 2 data sources for 1 target on the HANA side - one is the "real object" and the other one the "archiving object". If some data is moved (deleted) into the archive, the result (the sum of both) is still the same on the HANA side.

    This can be done during the configuration steps in SLT, as you have lots of options to transform/filter/merge your source data, also on the HANA side (like an additional field to avoid duplicate entries).

    And you need to select the relevant tables to define a load object from your archive object anyway.

     

    I hope this will help you.

    Regards,

    Andre


  • Former Member
    Mar 25, 2015 at 02:53 PM

    Hi Tobias,

    Option 3 (having a dedicated archiving server) seems an interesting option. However, it's not clear to us how to implement it. Say we have 2 ECC servers to simplify this: one for regular activities and one for archiving. But we have only one SLT server, so how would we distinguish the archiving server in the ABAP rule? Is there any easy way to do that? Could you please elaborate?


    We have several BI projects impacted by archiving. I am sure many customers are facing this situation. Do you have any news on the long-term kernel standard solution?


    Thanks a lot,

    Christian


    • Former Member Former Member

      Hi Zsolt,

      I suggest you contact SAP AGS support and ask them to help you implement this solution. This is what we did, because the procedure is not straightforward. It required a configuration change and modifying the DB trigger in ECC, plus creating new ABAP code in SLT for each table - and this has to be done table by table. SLT DMIS SP08 makes the procedure a little simpler, but it still requires a lot of manual changes. The user-based solution also has a limitation: you can only allow ONE user in ECC to run the archiving job.

      Thanks,

      Xiaogang

  • Former Member
    Sep 17, 2015 at 07:23 PM

    Interesting solution for sidecar implementations. I could see how customers might want this functionality and SAP should look to add an easier option to ECC / SLT. For example, a "do not replicate delete operations" check box per table in SLT.

    Although, I wonder if it would just be better not to archive the data if it is that important for real-time reporting. I work with several SoH customers, and they would simply lose the data if they archived (assuming they don't copy it to another location before archiving). In my opinion this boils down to the fundamental difference between synchronous real-time operational reporting and asynchronous data warehousing. Sidecar is almost a hybrid, but it leans closer to synchronous real-time than to a traditional batch load solution.


    • Former Member Former Member

      Hopefully a future SP will include an easy button when replicating a table. Given that all the components are in place, an option to include archive data on a per-table basis could be added to SLT. Wishful thinking, I know, but why not just add another column to IUUC_REPL_TABSTG to manage this?


      However, it's still a complex issue to solve when running SoH. Merging the data (on the fly) between two tables would require a lot of calculation or SQL engine overhead. It might just be that archiving is not necessary with SAP HANA, but I am sure there are some who would disagree.