Debugging import from data cluster

Sep 18, 2017 at 07:16 PM


Hi all,

I'm currently facing a problem importing data from a data cluster that drives me nuts. Is there any way to further debug an import from a data cluster?

The short version of the problem I'm having is this: I always get a CONNE_IMPORT_WRONG_COMP_TYPE exception when importing data from the data cluster (more specifically, when importing the data cluster from the database). However, I'm 99.9% sure that the structure for the import is correct. Therefore, I'd like to find out in more detail where or why exactly the exception occurs.

And here is a more elaborate description of the error. The reason I'm only 99.9% sure that the structure used for exporting to the database is equal to the structure used for importing is that I'm building the import structure dynamically in memory. Therefore, there could well be some subtle differences. But judging from all I can see in the debugger, the structures used are equivalent. Furthermore, the problem seems to be related to exporting/importing a structure that contains boxed components. The exception only occurs in cases where a structure containing some parts marked as BOXED is involved. However, I wasn't able to reproduce the problem with some small example structures. Currently, I can only reproduce the exception with a quite complex structure from SAP IS-U which I'm using in a data migration scenario.

Any ideas or suggestions how to solve this? Has anyone here ever seen anything similar to this?

Thanks in advance,
Christian


"I wasn't able to reproduce the problem with some small example structures."

Neither was I ...


6 Answers

Best Answer
Christian Drumm
Sep 21, 2017 at 01:16 PM

Hi Enno,

I now fixed the stupid copy-and-paste bug I had in my test program (see below). And the results are even more surprising. In my test runs, 10,000 executions of the direct export and import of the complex test structure took about 21s. 10,000 executions of converting the data to XML, exporting the XML string, importing the XML string, and converting it back took about 8.5s.

That means the XML version is actually about 2.5 times faster! In my opinion, this is a really surprising result.

A possible reason could be that the direct export to a data cluster needs to store type information as well (this can be seen when reading a data cluster using the method cl_abap_expimp_utilities=>dbuf_import_create_data). It seems that storing this type information causes quite some overhead, and therefore the XML-based variant is faster. If this assumption is true, it might be the case that for more complex or simpler structures the direct import is faster (maybe Horst can comment on this).

For me, the summary is that when working with data clusters, one should always convert the data to XML first. This approach solves all possible problems with structure changes and is also much faster.

Christian

REPORT zcd_test_xml_data_cluster.

INTERFACE lif_performance_test.
  DATA export_data TYPE /fact/prod_mig_s_alldata.
  DATA import_data TYPE /fact/prod_mig_s_alldata.
  METHODS run.
ENDINTERFACE.

CLASS lcl_direct_export DEFINITION.
  PUBLIC SECTION.
    INTERFACES lif_performance_test.
    METHODS constructor.
ENDCLASS.

CLASS lcl_direct_export IMPLEMENTATION.
  METHOD lif_performance_test~run.
    DATA key TYPE /fact/basis_s_datacst_key.

    key-guid = cl_system_uuid=>create_uuid_c32_static( ).
    key-version = 1.

    EXPORT data
      FROM me->lif_performance_test~export_data
      TO DATABASE /fact/basis_dcst(zz)
      ID key COMPRESSION ON.

    IMPORT data
      TO me->lif_performance_test~import_data
      FROM DATABASE /fact/basis_dcst(zz)
      ID key.
  ENDMETHOD.

  METHOD constructor.
    me->lif_performance_test~export_data = VALUE #( t_partner = VALUE #( ( s_globaldata-name_first = |Christian| ) ) ).
  ENDMETHOD.

ENDCLASS.

CLASS lcl_xml_export DEFINITION.
  PUBLIC SECTION.
    INTERFACES lif_performance_test.
    METHODS constructor.
ENDCLASS.

CLASS lcl_xml_export IMPLEMENTATION.
  METHOD lif_performance_test~run.
    DATA export_xml_string TYPE string.
    DATA import_xml_string TYPE string.

    DATA(key) = VALUE /fact/basis_s_datacst_key(
      guid = cl_system_uuid=>create_uuid_c32_static( )
      version = 1
    ).

    CALL TRANSFORMATION id
     SOURCE mig_data = me->lif_performance_test~export_data
     RESULT XML export_xml_string.

    EXPORT data = export_xml_string
       TO DATABASE /fact/basis_dcst(zz)
       ID key COMPRESSION ON.

    IMPORT data
      TO import_xml_string
      FROM DATABASE /fact/basis_dcst(zz)
      ID key.

    CALL TRANSFORMATION id
     SOURCE XML import_xml_string
     RESULT  mig_data = me->lif_performance_test~import_data.

  ENDMETHOD.
  METHOD constructor.
    me->lif_performance_test~export_data = VALUE #( t_partner = VALUE #( ( s_globaldata-name_first = |Christian| ) ) ).
  ENDMETHOD.
ENDCLASS.

PARAMETERS p_wu TYPE i OBLIGATORY DEFAULT 10.
PARAMETERS p_run TYPE i OBLIGATORY DEFAULT 1000.

START-OF-SELECTION.
  DATA(direct_export) = NEW lcl_direct_export( ).
  DATA(xml_export) = NEW lcl_xml_export( ).

  "Executing performance test for direct export and import
  DO p_wu TIMES.
    direct_export->lif_performance_test~run( ).
  ENDDO.

  GET RUN TIME FIELD DATA(direct_exp_t0).
  DO p_run TIMES.
    direct_export->lif_performance_test~run( ).
  ENDDO.
  GET RUN TIME FIELD DATA(direct_exp_t1).

  "Executing performance test with XML conversion
  DO p_wu TIMES.
    xml_export->lif_performance_test~run( ).
  ENDDO.

  GET RUN TIME FIELD DATA(xml_exp_t0).
  DO p_run TIMES.
    xml_export->lif_performance_test~run( ).
  ENDDO.
  GET RUN TIME FIELD DATA(xml_exp_t1).

  WRITE: / |Runtime for { p_run } imports and exports:|.
  WRITE: / |Direct export and import of the data: { direct_exp_t1 - direct_exp_t0 } |.
  WRITE: / |Export and import using XML conversion: { xml_exp_t1 - xml_exp_t0 } |.

That is really unexpected!

But the measurement in "my" system differs the other way around:

Direct: 1.673.559 µs

XML: 2.536.943 µs

Sometimes more, sometimes less, but "direct" is always faster.

I had to modify your code because of the missing data types. I just used T000 as the data and then thought that a complex structure might cause the difference.

I then defined the following "complex" data structure:

  TYPES: BEGIN OF ty_data,
           t000  TYPE t000,
           t001w TYPE STANDARD TABLE OF t001w WITH DEFAULT KEY,
         END OF ty_data.
[...]
    me->lif_performance_test~export_data = VALUE #(
      t000-mtext = 'EINS'
      t001w      = VALUE #( ( werks = '1000' name1 = 'Werk 1000' )
                            ( werks = '2000' name1 = 'Werk 2000' )
                            ( werks = '3000' name1 = 'Werk 3000' ) ) ).

But the run time results were the same...

And I used INDX instead...

Horst Keller
Sep 19, 2017 at 07:26 AM

There's also a program RSINDX00 for analyzing data clusters. It's undocumented and for internal usage. Maybe give it a try, but don't ask me about details ...


Thanks. Will give that a try as well.

Sandra Rossi Sep 19, 2017 at 01:10 PM

Although the following code doesn't create exactly the original data objects, maybe it can help with troubleshooting:

DATA: tab_cpar TYPE tab_cpar.
tab_cpar = cl_abap_expimp_utilities=>dbuf_import_create_data( dbuf = data_buffer ).
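For troubleshooting, one could then inspect what was reconstructed: the returned table pairs each parameter name with a reference to an anonymous data object. A sketch (the variable data_buffer is assumed to hold the CLUSTD content read from the cluster table):

```abap
DATA data_buffer TYPE xstring. " assumed to be filled from the cluster table
DATA tab_cpar TYPE tab_cpar.

" Recreate anonymous data objects from the data cluster buffer
tab_cpar = cl_abap_expimp_utilities=>dbuf_import_create_data( dbuf = data_buffer ).

" Each line holds the parameter name and a data reference;
" RTTI reveals the run-time type of every reconstructed object
LOOP AT tab_cpar ASSIGNING FIELD-SYMBOL(<cpar>).
  DATA(type_descr) = cl_abap_typedescr=>describe_by_data_ref( <cpar>-dref ).
  WRITE: / <cpar>-name, type_descr->absolute_name.
ENDLOOP.
```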

Hi Sandra Rossi,

If this method does what I think it does, you just rendered a lot of custom ABAP code useless. Will give this a try immediately.

Thanks a lot!
Christian


Hi Horst, Sandra,

I investigated the problem further using your suggestions. The first thing I did was to compare the RTTI objects. As far as I can tell, they are equivalent for the import and the export structure. Here are two screenshots from the debugger: the first shows the export structure, the second the import structure.

Furthermore, the sizes of the import and export structures correspond to what the report RSINDX00 returns:

The next thing I did was to test the method Sandra suggested. For this I created the following simple test program. It simply reads the cluster database and tries to convert it.

DATA cluster_data TYPE TABLE OF /fact/basis_dcst.

SELECT * FROM /fact/basis_dcst
  WHERE relid = 'DC'
    AND guid = '00505684AC1C1ED7A6A65B76016B40D9'
    AND version = '1'
  INTO TABLE @cluster_data.


DATA(converted_cluster_data) = cl_abap_expimp_utilities=>dbuf_import_create_data(
  dbuf = CONV #( cluster_data[ 1 ]-clustd ) ).

This works fine if the exported data doesn't contain the boxed structure that seems to be the root of my problem. However, as soon as the export contains the boxed structure, the call to the method cl_abap_expimp_utilities=>dbuf_import_create_data runs for a few seconds. Then the program terminates and the work process is restarted.

Do you have any further suggestions what to try out?

Thanks
Christian


Oh, a core dump :-(

That's an incident for SAP support.

(A core dump is an exception in the kernel that ends the whole work process instead of being handled by the runtime environment, which would end only the internal session with a short dump; it should never occur.)

I forwarded this to development.


Hi Christian,

Can you provide a sample program that leads to the core dump, or open a ticket? ABAP Development wants to have a look at it.

Horst


Hi Horst,

This is the ticket I opened regarding the issue: 474402 / 2017 "Error importing boxed component from data cluster".
The initial response was not very encouraging, though.

I have an example program in the customer system that can be used to reproduce the issue. I'll try to also create a small test program and add it to the ticket.

Christian


Hi Christian,

I forwarded the ticket number to the person responsible (and hope that it will be picked up soon). There should be neither import errors nor core dumps for exported boxed components. Otherwise, we would have forbidden it explicitly and documented it accordingly.

Horst


Hi all,

this is the OSS note related to the problem described above:

2550632 DBUF_IMPORT_CREATE_DATA mit Boxed Komponenten ("DBUF_IMPORT_CREATE_DATA with boxed components")

Christian


Yay!


Great, thanks!
However, after testing the XML variant I'll stick with it for our use case. It's much simpler and less error-prone with respect to the required custom code. And the performance is also quite good.

Christian

Enno Wulff Sep 19, 2017 at 12:16 PM

Import and export to/from somewhere is easy and helpful, as long as you don't change anything...

As an alternative, I suggest transforming the data structures into an XML string and saving the XML data:

http://www.tricktresor.de/blog/daten-dynamisch-verwalten/

The advantage is that this technique is very robust against structure changes. You can remove or add fields anywhere in the structure. Of course, they will not be considered when reading the data, but most importantly: they don't break the import.

I know this doesn't help you with your current problem, but it might help to avoid the issue in the future...
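A minimal sketch of this tolerance (the local types ty_old and ty_new are made up for illustration; the thread below confirms that the identity transformation ignores fields it cannot map):

```abap
TYPES: BEGIN OF ty_old,
         field1 TYPE string,
         field2 TYPE string, " will be removed from the structure later
       END OF ty_old.
TYPES: BEGIN OF ty_new,
         field1 TYPE string,
         field3 TYPE string, " added later, stays initial on import
       END OF ty_new.

DATA(old_data) = VALUE ty_old( field1 = |A| field2 = |B| ).
DATA new_data TYPE ty_new.
DATA xml TYPE string.

" Serialize with the old structure ...
CALL TRANSFORMATION id SOURCE data = old_data RESULT XML xml.

" ... and deserialize into the changed structure: field2 is dropped,
" field3 remains initial, and the import doesn't break
CALL TRANSFORMATION id SOURCE XML xml RESULT data = new_data.
```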


Hi Enno Wulff,

thanks for your reply. I already thought about this alternative as well. Do you have any experience regarding performance, especially when working with a large amount of data as in a migration scenario? In contrast to the direct export/import, I'd suspect the required XML transformation adds quite some overhead.

Christian


No, sorry. I also think that the native IMPORT/EXPORT is the fastest and easiest way to store data. Although the XML transformation works really fast, I could imagine some performance issues with a very large amount of data. Maybe using an XSTRING instead of plain XML might reduce the overhead a bit.
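The XSTRING variant could be sketched like this (reusing the ty_data type from the comment further up; serializing to XSTRING yields a byte stream instead of a character string):

```abap
TYPES: BEGIN OF ty_data,
         t000  TYPE t000,
         t001w TYPE STANDARD TABLE OF t001w WITH DEFAULT KEY,
       END OF ty_data.

DATA export_data TYPE ty_data.
DATA import_data TYPE ty_data.
DATA xml_xstring TYPE xstring.

" Serializing to an XSTRING produces a compact UTF-8 byte stream
CALL TRANSFORMATION id
  SOURCE data = export_data
  RESULT XML xml_xstring.

" The byte stream can be exported to the data cluster like any other
" data object and deserialized again after the import
CALL TRANSFORMATION id
  SOURCE XML xml_xstring
  RESULT data = import_data.
```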

btw: With RSINDX00 (thanks Horst Keller !) it should be possible to rebuild the stored data structures... If you have time... And you're bored... And you've nothing else to do... :D


Thanks for the feedback! I hadn't expected that the performance difference would be that small...

Horst Keller
Sep 19, 2017 at 05:50 AM

Can you create RTTI type objects for both types and compare those?


Hello Horst,

just one question to make sure I understand you correctly: I should create an RTTI object for the export and the import structure, right? Which part of these objects would you compare? As I'm working with deeply nested structures, AFAIK I won't see the information related to the nested elements in the top-level RTTI object.

Christian


Yep, create an RTTI object for the export and the import structure and try to find a difference.
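Such a comparison could be sketched like this (T000 merely stands in for the actual structures; RTTI descriptors are singletons per type within a session, so identical types yield identical component type references):

```abap
" Stand-ins for the actual export and import structures
DATA export_data TYPE t000.
DATA import_data TYPE t000.

DATA(export_descr) = CAST cl_abap_structdescr(
  cl_abap_typedescr=>describe_by_data( export_data ) ).
DATA(import_descr) = CAST cl_abap_structdescr(
  cl_abap_typedescr=>describe_by_data( import_data ) ).

" get_components returns name/type pairs; because descriptors are
" singletons, equal tables mean structurally identical types
IF export_descr->get_components( ) = import_descr->get_components( ).
  WRITE / 'Component lists are identical'.
ENDIF.

" Alternatively, applies_to_data checks directly whether a data
" object has exactly the described type
IF export_descr->applies_to_data( import_data ) = abap_true.
  WRITE / 'Import structure has the export type'.
ENDIF.
```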


One more thing I noticed while trying to solve the problem was the following. The structure I'm working with looks something like this:

define type zcd_test {
    c1 : type_1;
    c2 : type_2;
    c3 : type_3;
}

In this example, type_2 contains the boxed component. When I change the structure to look like this, the export and import work:

define type zcd_test {
    c1 : type_1;
    c3 : type_3;
}

The same is true if I change the structure to look like this:

define type zcd_test {
    c2 : type_2;
    c3 : type_3;
}

So it doesn't seem to be a problem with the boxed component itself. However, as soon as the structure contains c1, c2, and c3, the import fails.

I'll try to look at the RTTI objects tonight and try to find a difference.

Christian

Suhas Saha
Sep 26, 2017 at 10:07 AM

Hi Enno Wulff ,

We have productive code where the contents of standard SAP tables are EXPORTed into data clusters.

We face the same problem with every SAP upgrade if the structure of the standard tables has changed. So if the EXPORT was done on the old release and the IMPORT on the new release, the latter fails :(

"after testing the XML variant I'll stick with it for our use case. Much simpler and less error prone with respect to required custom code."

If I understand correctly, you "transform" the data to an XML stream and EXPORT the stream to the cluster table. Then, when we IMPORT the XML stream (and the target structure has changed), the transformation id simply ignores the fields which cannot be mapped.

Am I correct in my understanding?

BR,

Suhas

PS - Is it somehow possible to have some "English" tags for your blogs so that they show up in the Google searches? :)


Hi Suhas, yes, that's right. The XML transformation is very tolerant to structure changes.

I will try to also set English tags on tricktresor.de :)

Regards

Enno


In that case, in fact, you don't need EXPORT/IMPORT tables.

Simply create a custom table with a key and a RAW string.

Transform into an xstring and INSERT that into your table.
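As a sketch of that idea (the table ZXML_STORE with columns GUID and XML_DATA is hypothetical; T000 stands in for the actual data):

```abap
" Hypothetical custom table ZXML_STORE: key column GUID (CHAR32),
" payload column XML_DATA typed as RAWSTRING
DATA my_data TYPE t000.
DATA xml     TYPE xstring.

" Transform the data into an XSTRING ...
CALL TRANSFORMATION id
  SOURCE data = my_data
  RESULT XML xml.

" ... and INSERT it into the custom table under a generated key
DATA(record) = VALUE zxml_store(
  guid     = cl_system_uuid=>create_uuid_c32_static( )
  xml_data = xml ).
INSERT zxml_store FROM @record.
```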


I have proposed the XML solution to the Business Analyst and we're gonna implement it!

Thanks for the tip :-)
