
MaxDB (Content Server) Export/Import using several loadercli processes

Former Member

Hi All, 

We have a 1.2 TB database, version 7.6.06.20. Of that 1.2 TB, the COMPONENTS0001 table alone accounts for more than 900 GB.

We did a test export on one of our test databases of approximately 300 GB, following the procedure in SAP Note 962019, and it took 35 hours. That means running it against our production system would take 3-4 days, which we cannot afford.

I have got some questions if anyone can answer:

1) How can we split COMPONENTS0001 into multiple parallel export streams? I have read that it can be split by record count.

2) If we split it, can we export the remaining tables individually (CONTREP, DOCUMENTS**** and all other component tables apart from COMPONENTS0001) using the command below:

EXPORT TABLE SAPR3.XXXXXXX

catalog outstream file

'/sapdb/<SID>/backup/export/SAPR3_XXXXXXX.cat'

data outstream file

'/sapdb/<SID>/backup/export/SAPR3_XXXXXXX.data' PAGES

package outstream file

'/sapdb/<SID>/backup/export/SAPR3_XXXXXXX.pack' csv

3) If we do both of the above, what happens to the SAPR3.cat and SAPR3.pack files that would otherwise be created by SDB_EXPORT.sql as per SAP Note 962019? The catalog will be generated along with every table apart from COMPONENTS0001; how do we generate it for COMPONENTS0001, and what happens to SAPR3.pack?

Also, could you confirm whether, since we are going from AIX (64-bit) to Windows (64-bit), we could use a backup/restore method instead of the above?

Any help would be greatly appreciated.

Thanks in advance,

Ajay Sehgal

Accepted Solutions (1)


thorsten_zielke
Contributor

Hi,

please let us know if you already have an open SAP support ticket for this issue, so we can avoid duplicate work on our side...

Let me recommend continuing to work through the SAP ticket (and following/testing the suggested SAP Note 1770207), as it is already assigned to a development support specialist; at the same time, I kindly ask you to share the solution/fix here so that others might benefit from it.

Thorsten

Former Member

Hi Thorsten,

Thanks for your response, and sorry for the delay; I have been busy.

Yes, we have a ticket open with SAP and are still struggling to get a proper solution. We did receive SAP Note 1770207, but I believe it still requires further refinement.

More comments are still welcome; if anyone has had the same issue, how did you overcome it? This is an open forum, after all.

Regards,

Ajay

Former Member

Hi All,

I am sharing the process for a parallel export of a Content Server database for a heterogeneous system copy using loadercli.

We tested the export on our test system, in which COMPONENTS0001 had a row count of 1031374 and DOCUMENTS0001 roughly the same.

What we did was export all the tables individually, one by one, and split COMPONENTS0001 into parallel exports based on record count.

Here is the order in which we proceeded:

1) Export only the catalog information using EXPORT_CAT.sql, which contains the following commands:

EXPORT USER

catalog outstream file '/<path of export>/SAPR3.cat'

2) Once that is done, export all tables apart from COMPONENTS0001 individually, using a separate script for each table. The tables to be exported are:

CONTREP

DOCUMENTS0001 to DOCUMENTS0006

COMPONENTS0002 to COMPONENTS0006

Create EXPORT_CONTREP.sql with the following commands:

EXPORT TABLE CONTREP

DATA OUTSTREAM FILE 'CONTREP.data'

Create one script for each DOCUMENTS000x table; for example, EXPORT_DOCU001.sql for DOCUMENTS0001:

EXPORT TABLE DOCUMENTS0001

DATA OUTSTREAM FILE 'DOCU01.data'

LOB OUTSTREAM LONG_PROPERTY 'DOCU01_PROP'

LONGFILE LONG_VALUE 'DOCU01_LONG'

     

...and so on until the last table (we had DOCUMENTS0006, so EXPORT_DOCU006.sql):

EXPORT TABLE DOCUMENTS0006

DATA OUTSTREAM FILE 'DOCU06.data'

LOB OUTSTREAM LONG_PROPERTY 'DOCU06_PROP'

LONGFILE LONG_VALUE 'DOCU06_LONG'

Create the SQL scripts for the COMPONENTS000x tables in the same way. Since only COMPONENTS0001 will be split, we exported the rest of the tables as below; for example, EXPORT_COMP002.sql for COMPONENTS0002:

EXPORT TABLE COMPONENTS0002

DATA OUTSTREAM FILE 'COMP02.data'

LONGFILE LONG_VALUE 'COMP02_LONG'

...and so on until the last table:

$ cat EXPORT_COMP006.sql

EXPORT TABLE COMPONENTS0006

DATA OUTSTREAM FILE 'COMP06.data'

LONGFILE LONG_VALUE 'COMP06_LONG'

By this point you have scripts ready for every table apart from the one to be split. For us that was COMPONENTS0001, with about 1 million rows, so we broke the table into 4 ranges as follows:

$ cat EXPORT_COMP01_01.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_1.data' START 1 250000

LONGFILE LONG_VALUE 'COMP01_LONG_1'

$ cat EXPORT_COMP01_02.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_2.data' START 250001 500000

LONGFILE LONG_VALUE 'COMP01_LONG_2'

$ cat EXPORT_COMP01_03.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_3.data' START 500001 750000

LONGFILE LONG_VALUE 'COMP01_LONG_3'

$ cat EXPORT_COMP01_04.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_4.data' START 750001 1100000

LONGFILE LONG_VALUE 'COMP01_LONG_4'
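Writing the four range scripts by hand is error-prone, so here is a small shell sketch that generates them from a total row count. The chunk count, the 1000000-row total and the 100000-row headroom on the last range are assumptions chosen to match the example scripts above; adjust them to your own table.

```shell
#!/bin/sh
# Generate the split-export scripts for COMPONENTS0001.
# TOTAL_ROWS, CHUNKS and the last-range headroom are assumptions.
TOTAL_ROWS=1000000
CHUNKS=4
SIZE=$((TOTAL_ROWS / CHUNKS))
START=1
i=1
while [ "$i" -le "$CHUNKS" ]; do
  END=$((START + SIZE - 1))
  # Let the last range run past TOTAL_ROWS so no trailing rows are missed
  [ "$i" -eq "$CHUNKS" ] && END=$((TOTAL_ROWS + 100000))
  cat > "EXPORT_COMP01_0${i}.sql" <<EOF
EXPORT TABLE COMPONENTS0001
DATA OUTSTREAM FILE 'COMP01_${i}.data' START ${START} ${END}
LONGFILE LONG_VALUE 'COMP01_LONG_${i}'
EOF
  START=$((END + 1))
  i=$((i + 1))
done
```

The generated files match the four scripts shown above, and the same approach scales to more ranges by raising CHUNKS.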

All our export scripts were now ready.

The next step was to run them in parallel, which can be done with the command below:

loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b <script>

We first exported all tables apart from COMPONENTS0001 in parallel; being small, they finished quickly.

Then we ran the scripts for COMPONENTS0001, EXPORT_COMP01_01.sql through EXPORT_COMP01_04.sql, in parallel, and all ran fine. The loadercli command is the same for every export.
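A minimal sketch of the parallel run, assuming a POSIX shell: each loadercli invocation is sent to the background and `wait` blocks until all four finish. The connection values (SERVER, DBNAME, DBUSER, DBPASS) are placeholders, not the original poster's settings, and when loadercli is not on the PATH a stub echo stands in so the control flow can be traced without a database.

```shell
#!/bin/sh
# Run the four COMPONENTS0001 export scripts in parallel.
# SERVER/DBNAME/DBUSER/DBPASS are placeholder values (assumptions).
SERVER=myserver DBNAME=CSD DBUSER=SAPR3 DBPASS=secret

run_loader() {
  if command -v loadercli >/dev/null 2>&1; then
    loadercli -n "$SERVER" -d "$DBNAME" -u "$DBUSER,$DBPASS" -b "$1"
  else
    echo "stub: would run loadercli -b $1"   # dry-run fallback
  fi
}

for s in EXPORT_COMP01_01.sql EXPORT_COMP01_02.sql \
         EXPORT_COMP01_03.sql EXPORT_COMP01_04.sql; do
  run_loader "$s" > "${s%.sql}.log" 2>&1 &   # one background job per range
done
wait   # block until every background export has finished
echo "all COMPONENTS0001 ranges exported"
```

Writing each job's output to its own log file makes it easy to check afterwards which range, if any, failed.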

We have yet to test the import. Once we have done it, I will share the import process as well.

In the meantime, please feel free to share your own experiences or the steps you took; they might benefit others like us.

Warm Regards

Ajay Sehgal

thorsten_zielke
Contributor

Hi Ajay,

I fully agree with you. Keep posting and sharing your experience; I just wanted to point out that you should tell us if you have already opened an SAP ticket for this. Other than that, it is fine to work through the forum as well.

Thorsten

Former Member

Hi Thorsten,

By the way, do you have any suggestions on this? I cannot see many people responding to my query.

I divided COMPONENTS0001 into 4 parts of 250k rows each.

I ran the first one individually and it completed in approximately 1.5 hours, whereas when I ran the remaining three together it took 8 hours, which was a bit odd.

So I am not sure whether the data volume simply differed between the first range and the other three; equal row counts need not mean equal data volume, since the component size per row can vary.

Warm Regards

Ajay
