
Bulk insert with native SQL INSERT statements

Hi.

We have run into a big performance problem when inserting data sets into an external Oracle database.

The identified problem is the native SQL INSERT statement itself.

It seems there is no way to execute a bulk insert with native SQL.

I have tried several ways to implement the native SQL logic, including dynamically generated subroutine pools, generated reports, ADBC classes, inline EXEC SQL statements, etc. All variants basically work, but the performance of the INSERT statement is the problem.
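
For illustration, the row-by-row pattern looks roughly like this with ADBC (a minimal sketch; the connection name 'EXT_ORA' and the target table MARA_COPY are placeholders):

" Row-by-row insert via ADBC; 'EXT_ORA' is a DBCO connection name and
" MARA_COPY a target table - both placeholders for illustration.
DATA: lt_mara TYPE STANDARD TABLE OF mara,
      ls_mara TYPE mara,
      lo_con  TYPE REF TO cl_sql_connection,
      lo_stmt TYPE REF TO cl_sql_statement,
      lx_sql  TYPE REF TO cx_sql_exception,
      lv_sql  TYPE string.

SELECT matnr mtart matkl FROM mara
  INTO CORRESPONDING FIELDS OF TABLE lt_mara UP TO 1000 ROWS.

TRY.
    lo_con  = cl_sql_connection=>get_connection( 'EXT_ORA' ).
    lo_stmt = lo_con->create_statement( ).
    LOOP AT lt_mara INTO ls_mara.
      CONCATENATE 'INSERT INTO mara_copy (matnr, mtart, matkl) VALUES ('''
                  ls_mara-matnr ''', ''' ls_mara-mtart ''', '''
                  ls_mara-matkl ''')' INTO lv_sql.
      " One statement and one network round trip per row - the bottleneck.
      lo_stmt->execute_update( lv_sql ).
    ENDLOOP.
    lo_con->commit( ).
    lo_con->close( ).
  CATCH cx_sql_exception INTO lx_sql.
    WRITE: / lx_sql->get_text( ).
ENDTRY.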

We need to export millions of data sets, and it is not practical to execute one INSERT statement per data set.

Is there any way to boost the performance of inserting data sets?

Open SQL is not possible because the target DB is not the SAP DB (multi-database connection).

Thanks in advance


8 Answers

  • Nov 22, 2010 at 03:42 PM

    Hi Hermann,

    "...so writing to the ORACLE database costs time because you do it row by row?"

    >> Yes, that is exactly the case.

    "You said you tried ADBC. With ADBC it should be possible to have bulk processing

    with native SQL. What was your issue with that?"

    >> My issue with that is that it is not possible to do a bulk insert with ADBC 😉.

    Kind Regards

    Thomas

    Edited by: Thomas Steier on Nov 22, 2010 4:44 PM


    • Hi Volker,

      >
      > ... oops, did I get something wrong?
      > I thought both DBs were Oracle?!?!?!
      > With one of them not belonging to an SAP system, but both being Oracle?

      it might be me who got it wrong. I assumed we were talking about an ORACLE and a MAXDB. With both DBs on ORACLE, your previous post should be enough to help out.

      > NOLOGGING unfortunately does not have the desired effect for single INSERTs.
      > I need to dig out a reference, but I seem to remember that the effect of
      > the LOGGING attribute only applies to array operations.

      and your memory is right (mine was wrong). On AskTom I found this nice matrix:

      Table Mode    Insert Mode    ArchiveLog Mode    Result
      -----------   ------------   ----------------   --------------
      LOGGING       APPEND         archive log        redo generated
      NOLOGGING     APPEND         archive log        no redo
      LOGGING       no append      archive log        redo generated
      NOLOGGING     no append      archive log        redo generated
      LOGGING       APPEND         noarchive log      no redo
      NOLOGGING     APPEND         noarchive log      no redo
      LOGGING       no append      noarchive log      redo generated
      NOLOGGING     no append      noarchive log      redo generated

      So NOLOGGING only avoids redo together with APPEND (which in turn works only with array operations).

      So I think with your previous post everything is said.
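
      A minimal illustration of the redo-free combination from the matrix (table names are placeholders):

      -- Direct-path insert: the APPEND hint turns INSERT ... SELECT into an
      -- array operation, so the NOLOGGING attribute can actually skip redo.
      ALTER TABLE target NOLOGGING;
      INSERT /*+ APPEND */ INTO target SELECT * FROM source;
      COMMIT;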

      Kind regards,

      Hermann

  • Nov 19, 2010 at 01:39 PM

    Thanks for the reply.

    No, I didn't try that, because the data source and the target DB are not the same.

    I want to read from the SAP DB (DDIC) and write that data to a second, additionally connected database.

    The multi-database connection was configured with transaction DBCO.


    • Former Member

      Well, shouldn't this work with an Oracle database link?

      INSERT INTO target@target_system (column1, column2) SELECT column1, column2 FROM source@sap_system
      

      You can access all SAP tables from Oracle. If you need to read the data through some SAP API function (or whatever), you could still select everything on the SAP side into a temporary table and then use an INSERT with a subquery over the database link. I have to say I never tried this myself, but it is what I would try at least.
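
      For reference, such a link would be created on the executing database roughly like this (all names and credentials are placeholders; as noted later in this thread, the required privileges are rarely granted on production systems):

      -- Placeholders throughout; requires the CREATE DATABASE LINK privilege.
      CREATE DATABASE LINK sap_system
        CONNECT TO sap_schema IDENTIFIED BY secret
        USING 'SAP_TNS_ALIAS';  -- alias maintained in tnsnames.ora

      -- The set-based copy then runs entirely inside Oracle:
      INSERT INTO target (column1, column2)
        SELECT column1, column2 FROM source@sap_system;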

  • Former Member
    Nov 19, 2010 at 12:53 PM

    Have you tried to run everything in Oracle with a single statement like

    INSERT INTO target (column1, column2) SELECT column1, column2 FROM source;
    

    ?

    Edited by: Carsten Grafflage on Nov 19, 2010 1:56 PM


  • Nov 19, 2010 at 03:25 PM

    That's a good guess. But the main problem is that each data set is written with its own statement, and as I understand your suggestion, that would be no different.

    I need something like "INSERT INTO [schema].[table] FROM TABLE [SAP DDIC table]".

    Unfortunately, the CONNECTION addition of ABAP Open SQL only works if the external database table is an exact replica of the SAP DDIC table, e.g. the MARA table must be named "MARA" and include all of its fields (a syntax sketch follows below).

    But it was a good idea.

    Thanks.
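
    For reference, the CONNECTION addition mentioned above looks roughly like this (the connection name is a placeholder; as described, it only works when the remote table exactly mirrors the DDIC definition):

    " Open SQL with a secondary DB connection maintained in DBCO.
    " 'EXT_ORA' is a placeholder; the remote database must contain a
    " table MARA that exactly mirrors the DDIC definition.
    DATA: lt_mara  TYPE STANDARD TABLE OF mara,
          lv_dbcon TYPE dbcon-con_name VALUE 'EXT_ORA'.

    SELECT * FROM mara INTO TABLE lt_mara UP TO 100 ROWS.
    INSERT mara CONNECTION (lv_dbcon) FROM TABLE lt_mara.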


  • Nov 19, 2010 at 06:37 PM

    Hi Thomas,

    can you explain what a data set is in your description?

    Is it a set of multiple rows (an internal table) to be inserted into a single target table in the remote DB?

    Or is it a single row that needs to be distributed to several tables?

    From your description I'd assume it is a single row for a single table, and you won't be able to do bulk inserts in that case.

    ... or did I get it wrong?

    Volker


  • Nov 22, 2010 at 12:06 PM

    Hi Volker,

    By "data set" I mean a single result line. E.g. if I read the whole of MARA, then a data set is one line of the MARA table with all columns. If I use a join, the data set consists of the selected columns.

    I want to "copy" a DDIC table to an external Oracle database, i.e. not the main database used by the SAP system.

    I tried the CONNECTION addition of the Open SQL INSERT statement. The problem is that it results in a short dump with the error message SAPSQL_WA_TOO_SMALL, although the target table includes all fields of the table and the fields have the data types returned by function module DB_MAP_DDFIELDS. There is a difference of 4 bytes between the two work areas.


  • Nov 22, 2010 at 03:04 PM

    Hi Thomas,

    so writing to the ORACLE database costs time because you do it row by row?

    You said you tried ADBC. With ADBC it should be possible to have bulk processing with native SQL. What was your issue with that?

    Kind regards,

    Hermann


  • Nov 24, 2010 at 08:46 AM

    Hi all.

    First, many thanks for your help. I think this is a special problem, and I have spent days researching this topic.

    Now, thanks to your help, I have found a way to do bulk inserts with native SQL, and I would like to share this knowledge with you.

    @Carsten: The DB link hint was great, but unfortunately you need permissions on the database which you will never get on a productive system, for security reasons. So this would be the easiest but most risky way.

    @Volker: Never spend time on the CONNECTION addition of the INSERT statement. I tried it in several ways, at the expense of my nerves ;). The result is that you will have more issues than results. It is a good hint if someone wants to transfer data from one SAP DDIC to another SAP DDIC, but it is no solution for transferring data from the SAP DDIC to a plain Oracle database table.

    The hint regarding tnsnames.ora and its configuration was great.

    @Hermann: OK, the ADBC framework currently does not support bulk inserts. BUT, tataaa...

    THE SOLUTION:

    Assemble the INSERT statements without the INSERT part, e.g. "INTO mara_copy (MATNR, MATART, MATKL) VALUES ('000000000000000001', 'ABC', null)" (note that each INTO clause needs the target table name; mara_copy here is a stand-in). Collect the assembled clauses in an internal table, and in portions of not more than 200 INTO clauses assemble the SQL statement as follows:

    INSERT ALL
    INTO mara_copy (MATNR, MATART, MATKL) VALUES ('000000000000000001', 'ABC', null)
    INTO mara_copy (MATNR, MATART, MATKL) VALUES ('000000000000000002', 'DEF', 'YYY')
    INTO mara_copy (MATNR, MATART, MATKL) VALUES ('000000000000000003', 'GHJ', null)
    .....
    SELECT * FROM dual

    Now you can use the ADBC framework to execute this statement (execute_update), and it will insert all your values into your table.

    I found that how fast the INSERT statement is processed depends heavily on network traffic and system load: at one moment the insert was processed in 16 seconds, at another it took 59 seconds for the same statement.

    The batch size of 200 can vary.
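
    To make the recipe concrete, here is a sketch of the batching loop with ADBC (connection name, target table, and field list are placeholders; the value quoting is deliberately naive and does not escape quotes inside the data):

    " Batch INSERT ALL via ADBC: flush one statement per 200 collected rows.
    CONSTANTS lc_batch TYPE i VALUE 200.

    DATA: lt_mara  TYPE STANDARD TABLE OF mara,
          ls_mara  TYPE mara,
          lo_con   TYPE REF TO cl_sql_connection,
          lo_stmt  TYPE REF TO cl_sql_statement,
          lx_sql   TYPE REF TO cx_sql_exception,
          lv_sql   TYPE string,
          lv_count TYPE i.

    SELECT matnr mtart matkl FROM mara
      INTO CORRESPONDING FIELDS OF TABLE lt_mara.

    TRY.
        lo_con  = cl_sql_connection=>get_connection( 'EXT_ORA' ).
        lo_stmt = lo_con->create_statement( ).
        lv_sql = 'INSERT ALL'.
        LOOP AT lt_mara INTO ls_mara.
          CONCATENATE lv_sql
            ' INTO mara_copy (matnr, mtart, matkl) VALUES ('''
            ls_mara-matnr ''', ''' ls_mara-mtart ''', '''
            ls_mara-matkl ''')' INTO lv_sql.
          lv_count = lv_count + 1.
          IF lv_count = lc_batch.
            CONCATENATE lv_sql ' SELECT * FROM dual' INTO lv_sql.
            lo_stmt->execute_update( lv_sql ).  " one round trip per 200 rows
            lv_sql = 'INSERT ALL'.
            lv_count = 0.
          ENDIF.
        ENDLOOP.
        IF lv_count > 0.  " flush the remainder
          CONCATENATE lv_sql ' SELECT * FROM dual' INTO lv_sql.
          lo_stmt->execute_update( lv_sql ).
        ENDIF.
        lo_con->commit( ).
        lo_con->close( ).
      CATCH cx_sql_exception INTO lx_sql.
        WRITE: / lx_sql->get_text( ).
    ENDTRY.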


    • Hi.

      I have written the SQL command as described in this thread above.

      I execute the INSERT ALL statement as a string with the ADBC framework.

      But my further experience is that you don't get any meaningful benefit from it; it is not faster than prepared statements. This is probably because the ADBC framework already optimizes the SQL requests.

      However, the only way I know to extract data with acceptable performance is to write the extracted values to a CSV file and push it with FTP to the target system. On the target system (where the database is located) you can read and insert the data in seconds, as sketched below.
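
      The extract step could look roughly like this (path and field list are placeholders; the load tool on the target side is not named here, but on Oracle a bulk loader such as SQL*Loader would fit):

      " Write the extracted rows as CSV to the application server; the file
      " is then transferred (e.g. via FTP) and bulk-loaded on the target host.
      DATA: lt_mara TYPE STANDARD TABLE OF mara,
            ls_mara TYPE mara,
            lv_line TYPE string.

      SELECT matnr mtart matkl FROM mara
        INTO CORRESPONDING FIELDS OF TABLE lt_mara.

      OPEN DATASET '/tmp/mara_export.csv' FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
      LOOP AT lt_mara INTO ls_mara.
        CONCATENATE ls_mara-matnr ls_mara-mtart ls_mara-matkl
          INTO lv_line SEPARATED BY ';'.
        TRANSFER lv_line TO '/tmp/mara_export.csv'.
      ENDLOOP.
      CLOSE DATASET '/tmp/mara_export.csv'.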

      I analysed two applications that specialize in data extraction, and both took the route of extracting into a file in cases where a lot of data has to be transferred.

      By the way: 5,000 entries is not so many that you need to consider performance tuning. I was talking about millions of entries.