Former Member

Is there a way to dynamically copy a live database to a duplicate database without having to stop the Sybase16 engine

Hi I would like to know if there is a way to dynamically copy a live database to a duplicate database without having to stop the Sybase16 engine. There is a need to have a copy of the production database in another (training) database, I don't want to stop the Sybase engine to copy database1 to database2. Is there a tool or script which will allow me to copy one to another automatically

cheers John


4 Answers

  • Jan 07, 2015 at 10:33 AM

    Database refreshing is a process by which a Sybase ASE database is overwritten or refreshed from an existing backup:

Most commonly, a backup of the current production database is taken and a copy of the same database in a development or testing environment is refreshed from that backup.

The database to be refreshed needs to be the same size as, or larger than, the source database, with data and log segments preferably laid out in the same way as the source (otherwise data and log segments could end up mixed after the refresh).
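As a quick way to compare the layouts (this is my own sketch, not from the original post; size is reported in pages, so scale it by the server page size), the fragment map can be listed on both the source and target servers:

select segmap, lstart, size from master..sysusages where dbid = db_id("<<DBNAME>>") order by lstart

go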

The process of refreshing has 4 steps:

1. Bcp out of the target system tables
2. Loading the prod/existing image from the existing backup
3. Deleting the loaded system tables (not all of the tables)
4. Bcp in of the tables that were taken out in step 1

    Steps:

    Check the device sizes in the source DB and create accordingly in the target machine:

    sp_helpdb <DB_Name>

    go
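For illustration only (the device names, sizes, and paths below are placeholders, not from the thread), matching devices and a database for load could be created on the target server roughly like this, sized from the sp_helpdb/sysusages output above:

disk init name = "train_data01", physname = "/sybase/devices/train_data01.dat", size = "4G"

go

disk init name = "train_log01", physname = "/sybase/devices/train_log01.dat", size = "1G"

go

create database training_db on train_data01 = "4G" log on train_log01 = "1G" for load

go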

bcp <<DBNAME>>..sysusers out sysusers.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..sysalternates out sysalternates.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..sysusages out sysusages.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..syslogins out syslogins.txt -S<<SERVER NAME>> -U<User_Name> -c

select spid, dbid, db_name(dbid), user_name(uid), hostname, program_name, physical_io, cpu, status from master..sysprocesses where db_name(dbid) = "<<DBNAME>>"

Kill any processes still using the database, or put the database into single-user mode at your own risk (a single-user sketch follows the query below):

    select "kill " + convert(varchar(10),spid),cmd,status

    from master..sysprocesses

where db_name(dbid) = "<<DBNAME>>"
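If you prefer single-user mode over killing sessions, a minimal sketch (assuming the standard sp_dboption option name; the database must be idle at that moment for the option to take effect):

use master
go
sp_dboption <<DBNAME>>, "single user", true
go
-- switch it back after the refresh is complete
sp_dboption <<DBNAME>>, "single user", false
go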

    use master

    go

load database <<DBNAME>> from "Dump path1"

stripe on "Dump path2" ...

    go

    online database <<DBNAME>>

    go

    use  <<DBNAME>>

    go

    sp_configure "allow updates",1

    go

    delete from <<DBNAME>>..sysusers

    delete from <<DBNAME>>..sysalternates

    delete from <<DBNAME>>..sysusages

    delete from <<DBNAME>>..syslogins

    go

bcp <<DBNAME>>..sysusers in sysusers.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..sysalternates in sysalternates.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..sysusages in sysusages.txt -S<<SERVER NAME>> -U<User_Name> -c

bcp <<DBNAME>>..syslogins in syslogins.txt -S<<SERVER NAME>> -U<User_Name> -c

    select count(*) from <<DBNAME>>..sysusers

    select count(*)  from <<DBNAME>>..sysalternates

    go

    sp_configure "allow updates",0

    go

    ==================================

    Sample Load Database Script:

    load Database <DB_Name> from 'compress::/<Full_Dump_Path>'

    stripe on 'compress::/<Full_Dump_Path>'

    go
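Sample Dump Database Script (the counterpart on the production side; the paths are placeholders and not from the thread, and the number of stripes should match what is loaded):

dump database <DB_Name> to 'compress::/<Full_Dump_Path>'

stripe on 'compress::/<Full_Dump_Path>'

go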

    =======================================

    Thanks & Regards

      Kiran K Adharapuram

    Product Support Engineer


    • Hello Mark,

      Good Day..!!!

I never said to drop syslogins from the master database 😢 🤯

Logins and roles are server level, i.e. they apply to all databases on the server.

If you want to dump-db on prod and load-db on test, and preserve the logins from prod, then they have to be synched from prod before the load-db. That is normal practice.

If you don't, then the users/logins/roles end up broken and you have to fix them up after the load-db.

Users are db level, but of course each user is mapped to a login.


When you load-db, you are loading the users as they were mapped to logins on prod, so if the login sync above is not taken care of, the user-to-login mappings on test end up broken.

If you plan to dump-db on prod and load-db on test, then you need the logins and roles synched beforehand. It has nothing to do with drop/create db. (A small check query for orphaned users is sketched below.)
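As a quick sanity check after the load (my own sketch, not from the original post), you can list database users on the test server whose suid no longer matches any login:

use <<DBNAME>>
go
select u.name as db_user, u.suid
from sysusers u
where u.suid > 0
  and not exists (select 1 from master..syslogins l where l.suid = u.suid)
go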

I hope that is clear, but I agree that mistakenly deleting master..syslogins would be too costly 😔 😢.

      Thanks for the catch Mark.

      Regards

      Kiran Kumar A

  • Jan 07, 2015 at 05:57 AM

    The usual method would be to use DUMP DATABASE to make a backup of the production database, then use CREATE DATABASE FOR LOAD to create the training database and then LOAD DATABASE to load the backup into the new training database.
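A minimal sketch of that sequence, with placeholder database, device, and path names (adjust the sizes to match the production database):

dump database production_db to '/dumps/production_db.dmp'
go
create database training_db on data_dev01 = '4G' log on log_dev01 = '1G' for load
go
load database training_db from '/dumps/production_db.dmp'
go
online database training_db
go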



  • Jan 07, 2015 at 10:32 AM

    John,

It's very simple: schedule a backup during off-business hours and make sure you transfer it to the target machine (through ftp or scp).

BCP out the necessary system tables to sync the users and logins, and load the database from the copied dump.

Bring the DB online on the target server. A rough outline of the whole flow is sketched below.
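This is only an illustrative sketch: the server names, database names, paths, and the SA_PWD variable are placeholders, not from the thread, and in practice the password would come from a protected file rather than an environment variable.

#!/bin/sh
# 1. Dump on production (scheduled via cron outside business hours)
isql -Usa -SPROD_ASE -P"$SA_PWD" <<EOF
dump database production_db to '/dumps/production_db.dmp'
go
EOF

# 2. Copy the dump to the training host
scp /dumps/production_db.dmp training_host:/dumps/

# 3. Load and bring online on the training server
isql -Usa -STRAIN_ASE -P"$SA_PWD" <<EOF
load database training_db from '/dumps/production_db.dmp'
go
online database training_db
go
EOF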

    Regards

    Kiran Kumar A


  • Jan 09, 2015 at 06:59 AM

    Hi John,

Did the above solutions work out?

    Regards

    Kiran Kumar A
