
Upload Materials with z table

Nov 08, 2017 at 07:30 AM



Hi everyone.

This is the first time I post a question here. Some consultants have told me that what I propose is not possible, so I decided to ask whether my idea is wrong or whether it could work.

We have a lot of materials to upload to the system (500k) and we need to extend them to at least 100 centers. Every consultant tells me that we need to create a file with the information required by SAP, but my file would be 8 GB. So I thought that if we create a Z table, insert all the records into it directly at database level, and then run a custom program that reads the Z table, we would save a lot of time. Is this possible?




2 Answers

Ingo Bruss
Nov 08, 2017 at 09:05 AM

Sure, it's possible: you can create your Z table, load the data into it, and then write a program that reads this table and calls BAPI_MATERIAL_SAVEREPLICA to create the materials.

The question is whether this is feasible; you would need to code error handling, mark which Z table records are already processed, make it restartable, provide a simulation mode, maybe even parallelization, ...

Loading materials via LSMW gives you more support out of the box; you could also split your file into smaller chunks.
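The "split your file into smaller chunks" idea can be sketched as follows. This is Python for illustration only (the actual load would go through LSMW); the CSV layout, chunk size, and the `<file>.partNNN` naming scheme are assumptions, not anything SAP prescribes.

```python
# Hypothetical sketch: split one large upload file into smaller chunk files,
# each repeating the original header, so every chunk can be loaded on its own.
import csv

def write_chunk(path, part, header, rows):
    """Write one chunk file with the original header repeated."""
    with open(f"{path}.part{part:03d}", "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)

def split_file(path, chunk_size=50000):
    """Split a CSV upload file into chunk files of at most chunk_size rows each."""
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk, count = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == chunk_size:
                write_chunk(path, count, header, chunk)
                chunk, count = [], count + 1
        if chunk:  # trailing partial chunk
            write_chunk(path, count, header, chunk)
            count += 1
    return count
```

Each chunk file is then a normal, smaller input for the load tool, and a failed chunk can be rerun without touching the others.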


Thanks for your answer, Ingo.

I will try to explain what I have now.

We created a database on another server because the support team told us that SAP never, ever allows inserting data directly into a "Z table"; they told us the best way is to create a separate database.

So I created 6 tables: MARA, MARC, MLGN, MVKE, EANS, CHARACTERISTICS. Then we wrote an ABAP program that connects to the new database and reads all materials flagged as "pending". When everything goes well, it goes back to the database and changes the flag to "done"; if the material had an error, it changes the flag to "error".

The problem is that to upload 5,000 materials into SAP I need to create many rows in each table, and uploading the data takes days:

MARA: 5,000 rows

MARC: 350,000 rows

MLGN: 5,000 rows

MVKE: 5,600,000 rows (because we have 16 distribution channels per center)

EANS: 20,000 rows (approx)


So I want to change my architecture: create these tables inside the SAP database, access them with a database user, and insert all the rows I have to upload. Then the ABAP program will not need to open a connection to another database, read the first material, set the status, and close the connection for every record. I expect this to improve performance. Ingo, if you think I'm crazy, no problem, but I feel frustrated because I do not know SAP very well. Can these actions help me increase the performance?
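The pending/done/error flag logic described above can be sketched like this. It is Python rather than ABAP, just to show the control flow; `create_material` stands in for whatever actually posts the material (e.g. a BAPI call) and is an assumption, not code from the real program.

```python
# Sketch of the flag-driven loop over the staging (Z) table. Because finished
# rows keep their status, the run is restartable: a second run only touches
# rows still flagged "pending".

def process_pending(staging, create_material):
    """Process every 'pending' record and flag it 'done' or 'error'."""
    done = errors = 0
    for rec in staging:
        if rec["status"] != "pending":
            continue  # already handled in an earlier run
        try:
            create_material(rec)
            rec["status"] = "done"
            done += 1
        except Exception as exc:
            rec["status"] = "error"
            rec["message"] = str(exc)  # keep the error text for later reprocessing
            errors += 1
    return done, errors
```

Keeping the error message on the row is what makes "error" rows reprocessable later: you can fix the data, flip the flag back to "pending", and rerun.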

Another question: can SAP upload the material in one piece? What I mean is, join the 6 tables and upload the result with a single SELECT.

Really thanks Ingo.


Hi, never write directly to SAP standard tables (MARA, MARC, ...).

Z tables can be created anywhere; they do not need to be on a separate server.

You can write into your own Z tables any way you like.

That's about what I can add to this topic, regards, Ingo Bruß

Frédéric Girod Nov 14, 2017 at 07:20 AM

Hi Oscar,

500,000 entries is a big amount of data. Could you write some code in your source system? Could you send IDocs (EDI)?




Thanks, Frédéric.

Yes, I could create an IDOC and send it.

I am using Pentaho Data Integration (PDI) to move my info and can create anything.

Do you think this way will be much more efficient?



I don't know your level in SAP; maybe you already know this, but...

There are 3 ways to integrate data into SAP:

- Batch input: like a macro; full control, very slow, errors can be reprocessed

- Direct input: like an insert into the table; low control, very fast, no reprocessing

- BAPI: high control, slow, no reprocessing

All these techniques need files, which will be a problem with 500,000 entries.

IDocs use one of these techniques underneath, but they can be reprocessed, split into blocks (you can tell SAP to commit every 1,000 entries), processed in parallel, ...

If I were you, I would try to use IDocs.

But please inform the Basis team about this amount of data; it will have an impact on the database size. You must run some tests before sending the full load.
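The "commit every 1,000 entries" idea above can be sketched as follows (Python for illustration; the packet size and function name are assumptions): records are grouped into fixed-size packets so that each packet is posted and committed independently, and a failed packet can be reprocessed on its own.

```python
# Group records into fixed-size packets, mirroring how IDoc processing can be
# told to commit in blocks instead of all 500,000 entries at once.

def packets(records, size=1000):
    """Yield successive packets of at most `size` records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]
```

Smaller commit blocks also keep database locks and rollback segments small, which matters at this volume.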




Thanks Fred.

This week we will try with my interface tables (Z tables) inside the SAP database, and in the coming weeks we will try to use IDocs.

Thanks, Oskar