
Debugging import from data cluster

Hi all,

I'm currently facing a problem importing data from a data cluster that is driving me nuts. Is there any way to further debug an import from a data cluster?

The short version of the problem I'm having is this: I always get the exception CONNE_IMPORT_WRONG_COMP_TYPE when importing data from the data cluster (more specifically, when importing the data cluster from the database). However, I'm 99.9% sure that the structure for the import is correct. Therefore, I'd like to find out in more detail where or why exactly the exception occurs.

And here is a more elaborate description of the error. The reason I'm only 99.9% sure that the structure used for exporting to the database is equal to the structure used for importing is that I'm building the import structure dynamically in memory. Therefore, there could well be some subtle differences. But judging from all I can see in the debugger, the structures used are equivalent. Furthermore, the problem seems to be related to exporting/importing a structure that contains boxed components. The exception only occurs in cases where a structure containing some parts marked as BOXED is involved. However, I wasn't able to reproduce the problem with some small example structures. Currently, I'm only able to reproduce the exception with a quite complex structure from SAP IS-U which I'm using in a data migration scenario.
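
For context, the import structure is built dynamically via RTTC, roughly along these lines (a minimal sketch with hypothetical component names and types, not the actual IS-U migration structure):

DATA(components) = VALUE abap_component_tab(
  ( name = 'PARTNER' type = cl_abap_elemdescr=>get_c( p_length = 10 ) )
  ( name = 'AMOUNT'  type = cl_abap_elemdescr=>get_p( p_length = 8 p_decimals = 2 ) ) ).

" Build the structure type at runtime and create a matching data object
DATA(structdescr) = cl_abap_structdescr=>create( components ).
DATA import_ref TYPE REF TO data.
CREATE DATA import_ref TYPE HANDLE structdescr.

FIELD-SYMBOLS <import_data> TYPE any.
ASSIGN import_ref->* TO <import_data>.
" IMPORT ... TO <import_data> FROM DATABASE ... is then executed against this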

Any ideas or suggestions on how to solve this? Has anyone here ever seen anything similar?

Thanks in advance,
Christian



6 Answers

  • Best Answer
    Sep 21, 2017 at 01:16 PM

    Hi Enno,

    I now fixed the stupid copy-and-paste bug I had in my test program (see below). And the results are even more surprising. In my test runs, 10,000 executions of the direct export and import of the complex test structure took about 21 s. 10,000 executions of converting the data to XML, exporting the XML string, importing the XML string, and converting it back took about 8.5 s.

    That means the XML version is actually 2.5 times faster! This is, in my opinion, a really surprising result.

    A possible reason for this could be that the direct export to a data cluster needs to store type information as well (this can be seen when reading a data cluster using the cl_abap_expimp_utilities=>dbuf_import_create_data method). It seems that storing this type information causes quite some overhead, and therefore the XML-based variant is faster. If this assumption is true, it might be the case that for more complex or simpler structures the direct import is faster (maybe Horst can comment on this).
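
    One way to see that stored type information for yourself (a small sketch using T000 as a stand-in; any structure works):

    DATA buffer TYPE xstring.
    DATA(t000_line) = VALUE t000( mtext = 'demo' ).

    " The buffer now contains the data plus its type description
    EXPORT data = t000_line TO DATA BUFFER buffer.

    " Fully typed data objects can be recreated purely from the buffer,
    " which is only possible because type info is stored alongside the data
    DATA(created_objects) = cl_abap_expimp_utilities=>dbuf_import_create_data( dbuf = buffer ).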

    For me the summary is that when working with data clusters, one should always convert the data to XML first. This approach solves all possible problems with structure changes and is also much faster.

    Christian

    REPORT zcd_test_xml_data_cluster.
    
    INTERFACE lif_performance_test.
      DATA export_data TYPE /fact/prod_mig_s_alldata.
      DATA import_data TYPE /fact/prod_mig_s_alldata.
      METHODS run.
    ENDINTERFACE.
    
    CLASS lcl_direct_export DEFINITION.
      PUBLIC SECTION.
        INTERFACES lif_performance_test.
        METHODS constructor.
    ENDCLASS.
    
    CLASS lcl_direct_export IMPLEMENTATION.
      METHOD lif_performance_test~run.
        DATA key TYPE /fact/basis_s_datacst_key.
    
        key-guid = cl_system_uuid=>create_uuid_c32_static( ).
        key-version = 1.
    
        EXPORT data
          FROM me->lif_performance_test~export_data
          TO DATABASE /fact/basis_dcst(zz)
          ID key COMPRESSION ON.
    
        IMPORT data
          TO me->lif_performance_test~import_data
          FROM DATABASE /fact/basis_dcst(zz)
          ID key.
      ENDMETHOD.
    
      METHOD constructor.
        me->lif_performance_test~export_data = VALUE #( t_partner = VALUE #( ( s_globaldata-name_first = |Christian| ) ) ).
      ENDMETHOD.
    
    ENDCLASS.
    
    CLASS lcl_xml_export DEFINITION.
      PUBLIC SECTION.
        INTERFACES lif_performance_test.
        METHODS constructor.
    ENDCLASS.
    
    CLASS lcl_xml_export IMPLEMENTATION.
      METHOD lif_performance_test~run.
        DATA export_xml_string TYPE string.
        DATA import_xml_string TYPE string.
    
        DATA(key) = VALUE /fact/basis_s_datacst_key(
          guid = cl_system_uuid=>create_uuid_c32_static( )
          version = 1
        ).
    
        CALL TRANSFORMATION id
         SOURCE mig_data = me->lif_performance_test~export_data
         RESULT XML export_xml_string.
    
        EXPORT data = export_xml_string
           TO DATABASE /fact/basis_dcst(zz)
           ID key COMPRESSION ON.
    
        IMPORT data
          TO import_xml_string
          FROM DATABASE /fact/basis_dcst(zz)
          ID key.
    
        CALL TRANSFORMATION id
         SOURCE XML import_xml_string
         RESULT  mig_data = me->lif_performance_test~import_data.
    
      ENDMETHOD.
      METHOD constructor.
        me->lif_performance_test~export_data = VALUE #( t_partner = VALUE #( ( s_globaldata-name_first = |Christian| ) ) ).
      ENDMETHOD.
    ENDCLASS.
    
    PARAMETERS p_wu TYPE i OBLIGATORY DEFAULT 10.
    PARAMETERS p_run TYPE i OBLIGATORY DEFAULT 1000.
    
    START-OF-SELECTION.
      DATA(direct_export) = NEW lcl_direct_export( ).
      DATA(xml_export) = NEW lcl_xml_export( ).
    
      "Executing performance test for direct export and import
      DO p_wu TIMES.
        direct_export->lif_performance_test~run( ).
      ENDDO.
    
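      " GET RUN TIME FIELD returns the runtime in microseconds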
      GET RUN TIME FIELD DATA(direct_exp_t0).
      DO p_run TIMES.
        direct_export->lif_performance_test~run( ).
      ENDDO.
      GET RUN TIME FIELD DATA(direct_exp_t1).
    
      "Executing performance test with XML conversion
      DO p_wu TIMES.
        xml_export->lif_performance_test~run( ).
      ENDDO.
    
      GET RUN TIME FIELD DATA(xml_exp_t0).
      DO p_run TIMES.
        xml_export->lif_performance_test~run( ).
      ENDDO.
      GET RUN TIME FIELD DATA(xml_exp_t1).
    
      WRITE: / |Runtime for { p_run } imports and exports:|.
      WRITE: / |Direct export and import of the data: { direct_exp_t1 - direct_exp_t0 } |.
      WRITE: / |Export and import using XML conversion: { xml_exp_t1 - xml_exp_t0 } |.

    • That is really unexpected!

      But the measurements in "my" system differ the other way:

      Direct: 1,673,559 ms

      XML: 2,536,943 ms

      Sometimes more, sometimes less, but "direct" is always faster.

      I had to modify your code because of missing data types. I just used T000 as the data and then thought that processing complex structures might cause the difference.

      I then defined the following "complex" data structure:

        TYPES: BEGIN OF ty_data,
                 t000  TYPE t000,
                 t001w TYPE STANDARD TABLE OF t001w WITH DEFAULT KEY,
               END OF ty_data.
      [...]
          me->lif_performance_test~export_data = VALUE #(
            t000-mtext = 'EINS'
            t001w      = VALUE #( ( werks = '1000' name1 = 'Werk 1000' )
                                  ( werks = '2000' name1 = 'Werk 2000' )
                                  ( werks = '3000' name1 = 'Werk 3000' ) ) ).

      But the run time results were the same...

      And I used INDX instead...

  • Sep 19, 2017 at 07:26 AM

    There's also a program RSINDX00 for analyzing data clusters. It is undocumented and for internal use. Maybe you can give it a try, but don't ask me about details ...


  • Sep 19, 2017 at 01:10 PM

    Although the following code doesn't create exactly the original data objects, maybe it can help with troubleshooting:

    DATA: tab_cpar TYPE tab_cpar.
    tab_cpar = cl_abap_expimp_utilities=>dbuf_import_create_data( dbuf = data_buffer ).
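
    To see what was stored, the created objects can then be inspected, for example like this (a sketch, assuming the standard fields NAME and DREF of the CPAR line type):

    " List the parameter names and type names rebuilt from the buffer
    LOOP AT tab_cpar ASSIGNING FIELD-SYMBOL(<cpar>).
      DATA(typedescr) = cl_abap_typedescr=>describe_by_data_ref( <cpar>-dref ).
      WRITE: / <cpar>-name, typedescr->absolute_name.
    ENDLOOP.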

  • Sep 19, 2017 at 12:16 PM

    Importing and exporting to/from somewhere is easy and helpful, as long as you don't change anything...

    As an alternative, I suggest transforming the data structures into an XML string and saving the XML data:

    http://www.tricktresor.de/blog/daten-dynamisch-verwalten/

    The advantage is that this technique is very robust against structure changes. You can remove or add fields anywhere in the structure. Of course, they will not be considered when reading the data, but most importantly: they don't break the import.

    I know this does not help you with your current problem, but it might help to avoid this issue in the future...
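
    The effect can be sketched like this (hypothetical structures; the field that exists only on the import side is simply left initial instead of breaking the import):

    TYPES: BEGIN OF ty_old,
             field1 TYPE string,
           END OF ty_old.
    TYPES: BEGIN OF ty_new,        " structure after a change
             field1    TYPE string,
             new_field TYPE i,     " added later, not present in the XML
           END OF ty_new.

    DATA(old_data) = VALUE ty_old( field1 = `payload` ).
    DATA xml TYPE string.
    CALL TRANSFORMATION id SOURCE data = old_data RESULT XML xml.

    DATA new_data TYPE ty_new.
    " NEW_FIELD is missing in the XML and stays initial;
    " a direct IMPORT into the changed structure would fail instead
    CALL TRANSFORMATION id SOURCE XML xml RESULT data = new_data.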


  • Sep 19, 2017 at 05:50 AM

    Can you create RTTI type objects for both types and compare those?
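
    Something along these lines might already pinpoint the offending component (a sketch; export_data and import_data stand for the two structures in question):

    DATA(descr_exp) = CAST cl_abap_structdescr(
      cl_abap_typedescr=>describe_by_data( export_data ) ).
    DATA(descr_imp) = CAST cl_abap_structdescr(
      cl_abap_typedescr=>describe_by_data( import_data ) ).

    " Compare component by component to locate the first mismatch;
    " type_kind/length are compared because dynamically created types
    " get generated absolute names
    DATA(comps_exp) = descr_exp->get_components( ).
    DATA(comps_imp) = descr_imp->get_components( ).
    LOOP AT comps_exp ASSIGNING FIELD-SYMBOL(<comp_exp>).
      DATA(idx) = sy-tabix.
      READ TABLE comps_imp INDEX idx ASSIGNING FIELD-SYMBOL(<comp_imp>).
      IF sy-subrc <> 0.
        WRITE: / 'Import structure is missing component', <comp_exp>-name.
        EXIT.
      ENDIF.
      IF <comp_exp>-name <> <comp_imp>-name
          OR <comp_exp>-type->type_kind <> <comp_imp>-type->type_kind
          OR <comp_exp>-type->length <> <comp_imp>-type->length.
        WRITE: / 'First mismatch at component', <comp_exp>-name.
        EXIT.
      ENDIF.
    ENDLOOP.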


      One more thing I noticed while trying to solve the problem was the following. The structure I'm working with looks something like this:

      define type zcd_test {
          c1 : type_1;
          c2 : type_2;
          c3 : type_3;
      }

      In this example, type_2 contains the boxed component. When I now change the structure to look like this, the export and import work:

      define type zcd_test {
          c1 : type_1;
          c3 : type_3;
      }

      The same is true if I change the structure to look like this:

      define type zcd_test {
          c2 : type_1;
          c3 : type_3;
      }

      So it doesn't seem to be a problem with the boxed component itself. However, as soon as I change the structure to contain c1, c2, and c3, the import fails.

      I'll take a look at the RTTI objects tonight and try to find a difference.

      Christian

  • Sep 26, 2017 at 10:07 AM

    Hi Enno Wulff,

    We have productive code where the contents of standard SAP tables are EXPORTed into data clusters.

    We face the same problem with every SAP upgrade if the structure of the standard tables has changed. So if the EXPORT was done on the old release and the IMPORT on the new release, the latter fails :(

    After testing the XML variant, I'll stick with it for our use case. It is much simpler and less error-prone with respect to the required custom code.

    If I understand correctly, you "transform" the data to an XML stream and EXPORT the stream to the cluster table. Now when we IMPORT the XML stream (and the target structure has changed), the transformation ID simply ignores the fields that cannot be mapped.

    Am I correct in my understanding?

    BR,

    Suhas

    PS: Is it somehow possible to have some "English" tags for your blogs so that they show up in Google searches? :)
