Migration SAP system - Export optimization

joo_migueldimas
Active Participant
0 Kudos

Hello,

I'm doing a migration/heterogeneous system copy and I'm facing one big problem. This is a BW system of about 3 TB, running on Windows Server 2012 with MS SQL Server as the database. The problem is the time the database export takes using the SWPM tool.

I have already run two tests. In the first one I performed the database export without any special options/methods, and it took 99 hours (~4 days), which is far too long!

We need to reduce this export time! Based on the system copy guide and some SCN blogs, I ran a second test using the table splitting method. I split some of the big tables; for example, I split the biggest table in the BW system, 650 GB, into 175 parts. The server has 20 CPUs, so I set 80 parallel R3load jobs. This time the export took 75 hours (~3 days). Unfortunately, that is still too long!
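(For reference, the split definitions are passed to the tool in a plain text file with one <table>%<number_of_splits> entry per line; for the 650 GB table the entry looked like the line below, with the table name replaced by a placeholder:)

/BIC/<BIGGEST_TABLE>%175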

The following image was taken during this last test (split tables test):

I need to get this database export done in one day! Is that possible? I believe it is, because I have already read some SCN posts, like the following one, where someone describes exporting a 9 TB database in 18 hours!!!

But how is that possible?! Can you please give me your opinion, your experience and your view on how to optimize the database export?

I really need to minimize the downtime of this migration.

Please give me some tips.

Best regards,

João Dimas

Accepted Solutions (0)

Answers (1)

Former Member
0 Kudos

Hello João,

You already have the answers in the link you shared. Try all the steps there, which include using the latest migration tools. Also check whether it is possible to increase the hardware resources.
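In particular, if you run the export with the standalone Migration Monitor rather than only through SWPM, its properties file gives you direct control over parallelism and package order. A minimal sketch of export_monitor_cmd.properties (parameter names as in the Migration Monitor documentation; paths and values are only illustrative, so please verify them against the guide):

# process the largest packages first so they do not pile up at the end
orderBy=size
# number of parallel R3load processes
jobNum=80
exportDirs=E:/export/ABAP
installDir=E:/migmon
ddlFile=DDLMSS.TPL
monitorTimeout=30
trace=all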

Thanks.

Best Regards,

Anita

joo_migueldimas
Active Participant
0 Kudos

Hello Anita,

Hmm... but that is not enough. The information in that link is useful, but I don't understand how to apply it; for instance, how to set up the Migration Monitor manually, or how to configure package splitting. I still have a lot of doubts about them. And yes, I have already read some documents specifically about package splitting (such as the Package Splitter tool documentation).

Regarding package splitting: is it possible to use it during the database export? And I don't understand the main difference between splitting tables and splitting packages! Can you help me, please?

BR,

João Dimas

Former Member
0 Kudos

Hello João,

You will find detailed information on package splitting, table splitting and the Migration Monitor, including their usage, in the System Copy Guide.

Regarding your question: yes, it is possible to use package splitting during the export.
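Put simply (a simplified view): package splitting works at the package (STR) level and redistributes whole tables into more, smaller export packages, while table splitting works at the row level and uses generated WHERE conditions to cut a single large table into several packages. Schematically, with illustrative names:

Package splitting:  SAPAPPL2.STR  ->  SAPAPPL2_1.STR ... SAPAPPL2_n.STR   (whole tables, regrouped)
Table splitting:    BIGTABLE      ->  BIGTABLE-1 ... BIGTABLE-n           (row ranges of one table)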

Also, the following link will help answer a few more queries you may have:

https://scn.sap.com/thread/2088749

Thanks.

Best Regards,

Anita

joo_migueldimas
Active Participant
0 Kudos

Hello Anita,

I ran another test using table splitting on our BW system (database size of 2.5 TB).

During the export I selected the table splitting option and created a text file containing the largest tables in this database, with the number of splits for each table (<table>%<nr_of_splits>). This is the content of that file:

/BIC/AZO_PERCS00%50
/BIC/AZO_SUBLA00%70
/BIC/AZO_ZEDRM00%50
/BIC/B0000073000%50
/BIC/B0000560000%90
/BIC/B0000563000%80
/BIC/B0000575000%170
/BIC/B0000589000%50
/BIC/B0000590000%50
/BIC/B0001125000%50
/BIC/B0001126000%50
/BIC/FZC_SBDIA%50
/BIC/FZC_SBDIA2%50
/BIC/FZC_SBMES%50
/BIC/FZC_SBMES2%50
/BIC/PZCHTR_ID%50
/BIC/SZCHTR_ID%50
/BIC/XZCHTR_ID%50

The export finished without errors, and the export directory now contains 3,835 items (with the split tables parallelized), as you can see below:

Then, last Thursday, I started the database import on the new database server (20 CPUs), using the export directory generated before. I set 75 R3load processes (number of parallel jobs). My goal, as I described before, is to improve the export time and, most importantly, the import time. When I started the import, SWPM scheduled 1218 jobs to run in parallel. At the beginning everything was fine: the average CPU usage was at 85%-95%, which is what I expected given the number of parallel jobs I set.

But after three days of running, with only the last 5 of the 1218 jobs remaining, I noticed that the whole process was using only a single R3load process! At this moment it is still working on the same table/package (the largest table in the database, the one I split into 170 parts): /BIC/B0000575000. As I can see in the Windows Server Resource Monitor, it simply processes one part at a time:

...

...\BIC\B0000575000-95.080

...\BIC\B0000575000-95.081

...\BIC\B0000575000-95.082

...\BIC\B0000575000-95.083

...


At this moment it is processing the part ...\BIC\B0000575000-95.081.

Each part takes about two to three hours to finish, and I can see there are 107 parts of this ...\BIC\B0000575000-95 package for the table /BIC/B0000575000!
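(Assuming the parts are numbered sequentially up to 107 and it is currently on part 081, that leaves roughly 26 parts x 2-3 hours each, so about 52-78 more hours on a single R3load process!)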


I must say: this is absurd! Why doesn't SWPM start more R3load processes, since I set 75 parallel jobs (75 R3loads), and assign one to each part of that package that still hasn't been loaded (\BIC\B0000575000-95)?! WHY?! This doesn't make any sense!

How can I solve this? I can't understand why it behaves like that!

Can anyone help me, please? I need to speed up this import process!

Thank you,

JD

Former Member
0 Kudos

Hello João,

This is indeed strange and I wish I had the answers to your glaring WHYs!

An SAP Note might hold the solution, or maybe not.
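Before that, though, one thing you could try yourself: if you drive the import with the standalone Migration Monitor, the orderBy parameter lets you start the biggest packages first, so the 170 parts of /BIC/B0000575000 begin loading at the start of the import instead of becoming a single-process tail at the end. A minimal sketch of import_monitor_cmd.properties (parameter names as in the Migration Monitor documentation; paths and values are only illustrative):

# start with the biggest packages to avoid a serial tail
orderBy=size
# parallel R3load processes
jobNum=75
importDirs=E:/export/ABAP
installDir=E:/migmon
ddlFile=DDLMSS.TPL
monitorTimeout=30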

I found the following details in the SAP System Copy Guide, which is why I would also advise you to log an incident with SAP.

Table Splitting

For copying large ABAP tables, the tool R3ta has been developed to automatically generate WHERE conditions, with which a subset of table data can be accessed. These WHERE conditions are integrated into the R3load TSK files. Using WHERE conditions may not be optimal for every database management system and therefore has to be considered carefully.

Attention

As the usage of WHERE conditions requires a lot of experience and many manual steps and because there are still some problems not yet solved, we cannot release this feature generally.

You may use the WHERE conditions feature and the R3ta tool, and in many cases it will work without problems; but if you run into problems, you cannot claim support or an immediate fix. Nevertheless, we welcome any feedback that helps us improve the tools.

The generated WHERE conditions should not cover more than one column. If R3ta calculated conditions with more columns, run the tool again with different row-count parameters.

If you decide to create WHERE conditions manually (without the R3ta tool), you must be aware that a badly chosen WHERE condition can greatly increase the total table processing time.

In addition, the consultant takes over the responsibility for the completeness of the data!

You can often reduce the export time of a table which is processed with WHERE conditions if you create a (temporary) additional index on the column used within the WHERE condition.
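For example, on SQL Server such a temporary index could look like the following (the split column here is a placeholder; use the column that appears in the generated WHERE condition, and drop the index again after the export):

-- hypothetical split column taken from the generated WHERE condition
CREATE INDEX ZZ_TMP_SPLIT ON [/BIC/B0000575000] ([RECORD]);
-- after the export has finished:
DROP INDEX ZZ_TMP_SPLIT ON [/BIC/B0000575000];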

Thanks.

Best Regards,

Anita