
Export parts of a SQL Anywhere 16 DB to another SQL Anywhere 16 DB

Dec 21, 2016 at 07:56 AM



Hello, all

I need to move data from 30+ tables from one DB to another.

The DBs have the same structure.

I need to move only some of the data according to a specific where clause.

I tried using the ISQL OUTPUT statement, but it appears I cannot pass any dynamic parameters to it.

For example, this is how I export data for one table:

//-----------------------------------------

select * from t001_table where c_key = 'key';

output using 'driver=SQL Anywhere 16;UID=dba;PWD=pwd;Server=server_name;DBN=db_name;ASTART=No;LINKS=TCPIP();CON=' into "DBA"."t001_hotel_sync_timestamp" create table off;

//-----------------------------------------

Now, I have 30 tables like this, so I thought to make the connection string a parameter and pass it to the OUTPUT statement, but I seem to fail to do that. It does not appear that the OUTPUT statement accepts variables in its parameters, only constants.

Is there any workaround to this issue?


3 Answers

Volker Barth Dec 21, 2016 at 09:34 AM

I would recommend using the built-in "Remote Data Access" facility, i.e. proxy tables, to copy the data.

Within the 2nd database, you would create a read-only remote server accessing the 1st database, then create proxy tables via CREATE EXISTING TABLE for each remote table you want to copy, and then use "SELECT * INTO MyLocalTable FROM MyRemoteTable ORDER BY Whatever" to copy the data (and to compare it afterwards).

After that, you can remove the remote tables and the remote server.
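The steps above might be sketched as follows, run within the target database. Note the server name, DSN, and table names here are placeholders of my own, not taken from your setup:

```sql
-- Hypothetical names throughout; 'source_dsn' is an assumed ODBC DSN
-- pointing at the source database.
CREATE SERVER src_server CLASS 'SAODBC'
    USING 'DSN=source_dsn' READ ONLY;

-- Proxy table that mirrors the remote table's existing definition
CREATE EXISTING TABLE DBA.t001_table_proxy
    AT 'src_server..DBA.t001_table';

-- Copy just the rows matching the filter into the local table
INSERT INTO DBA.t001_table
    SELECT * FROM DBA.t001_table_proxy WHERE c_key = 'key';
COMMIT;

-- Clean up once all 30+ tables have been copied
DROP TABLE DBA.t001_table_proxy;
DROP SERVER src_server;
```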

A sample can be found in the answer to this FAQ from the SQL Anywhere Forum:

create existing table in local db

Arcady Abramov Dec 21, 2016 at 10:00 AM

Thank you, Volker

Remote tables tend to lock the data for editing in the target DB. There are tables where I need to move 50,000+ rows, and the target DB is operational at the time; such an operation may lock out other users.

I would rather not use it if there is an alternative.

Remote tables tend to lock the data for editing in the target DB.

What do you mean exactly? Are rows in the source or in the target database "locked"? You could also use "INSERT ... SELECT" instead of "SELECT INTO", and the locking should generally depend on the isolation level chosen. And of course you can switch roles, i.e. create the remote server within the "source" database to access the "target" database and copy via "INSERT INTO MyRemoteTable SELECT * FROM MyLocalTable ..."...

Besides that, instead of the DBISQL OUTPUT command you can easily use the UNLOAD statement, which is a real SQL statement and therefore does support variables and the like.
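For example, something along these lines should work inside a batch, since the WHERE clause and, as far as I recall, the output file name can both be driven by variables (the variable names are just illustrative):

```sql
BEGIN
    DECLARE @key  CHAR(10);
    DECLARE @file LONG VARCHAR;
    SET @key  = 'key';
    SET @file = 'c:\\temp\\t001_table.dat';
    -- UNLOAD is a real SQL statement, so variables are allowed here
    UNLOAD
        SELECT * FROM t001_table WHERE c_key = @key
        TO @file;
END;
```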

Arcady Abramov Dec 21, 2016 at 10:47 AM

Does the UNLOAD statement allow exporting directly into a different DB? I found no evidence of that in the help.

Regarding remote tables: the reason we are not using them to retrieve data is exactly that, while a transaction is open, whole pages of the remote table are locked in the target DB. It has been a while since we used them for anything other than inserting small pieces of data with an immediate commit.

Does UNLOAD statement allow exporting directly into a different DB?

No, you will have to export to a file and then use LOAD TABLE or the like to import on the other database. I had simply suggested that because UNLOAD should not interfere with other transactions using the source table, particularly when using a fitting isolation level.
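A minimal sketch of that two-step transfer, with placeholder paths and names:

```sql
-- On the source database:
UNLOAD
    SELECT * FROM t001_table WHERE c_key = 'key'
    TO 'c:\\transfer\\t001_table.dat';

-- On the target database (the file must be readable by that server):
LOAD TABLE DBA.t001_table
    FROM 'c:\\transfer\\t001_table.dat';
```

Be aware, though, that LOAD TABLE itself places an exclusive lock on the target table while it runs, so on the target side it does not avoid the locking concern either.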

whole pages of the remote table are locked in the target DB

I still do not really understand what you say, possibly because in my own usage of proxy tables I have never noticed locking problems (even with millions of rows); that may be due to fewer transactions on the remote databases... The "pages" of a remote table are not part of the local database; a proxy table is more like a view on a remote object.

However, I have often used proxy tables to "fill" local temporary tables and then joined those temporary tables with the real target tables to modify the latter, so that may be a possible workaround, I hope... At least that way, there should be no difference between updating the target table like this and any other means of mass-importing data while the table is in active use by other transactions.
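That temporary-table workaround might be sketched like so (again with placeholder names; the # prefix creates a local temporary table):

```sql
-- 1. Pull the filtered remote rows into a local temporary table;
--    only the proxy/remote side is read here.
SELECT * INTO #tmp_t001
    FROM DBA.t001_table_proxy
    WHERE c_key = 'key';

-- 2. Apply the rows to the real target table in one short,
--    purely local transaction.
INSERT INTO DBA.t001_table
    SELECT * FROM #tmp_t001;
COMMIT;
```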

-------------------------

FWIW, I guess you will get more feedback on this issue in the other forum, so I'd suggest asking there...


oops, got lost in the forum navigation. Sorry....
