
Wanted: ability to export/import a schema

Former Member

Currently there is no easy way to extract and restore the objects, permissions, data, etc. for a particular schema.

Say you needed to extract a schema to be restored onto multiple IQ instances (prod -> regression, testing, and development). Doing a backup/restore of the database is not practical, as the schema may be only a few hundred megabytes while the database is terabytes in size; plus there is the whole mess of adding and removing users due to the different environments.

In Oracle, we can do this very easily with:

Export schemas:

expdp system/tiger@db11g schemas=SCOTT,USER1,USER2 directory=TEST_DIR dumpfile=schemas.dmp logfile=expdp.log

Import only the USER1 schema:

impdp system/tiger@db11g schemas=USER1 directory=TEST_DIR dumpfile=schemas.dmp logfile=impdp.log

Surely the smart people in SAP Sybase can create such a feature without too much trouble.

Including a link to Data Pump. Oracle has had this for more than a decade:

http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php

Accepted Solutions (0)

Answers (4)


markmumy
Advisor

I have heard that this was put into the latest release of IQ. Supposedly patch 20, version SP8.20, has a schema export utility. I haven't seen the release notes yet, but this was a feature that was discussed. Patch level 20 was released a few weeks ago.

Mark

former_member182258
Participant

This feature will be part of IQ 15.4 ESD #5 later this month. It did not make it into IQ 16 SP8.20, however. Most likely, it will be part of the next SP for IQ 16, in mid-2015.

Former Member

I developed a set of stored procedures that extract DDL statements for various IQ object types (tables, indexes, views, stored procedures, functions, logins, etc.). You can generate DDL statements for the whole database, for a particular login (schema), or for a set of specific objects. I have successfully used these stored procedures on a number of customer sites. You are welcome to download them for free from my website http://gvirtz-consulting.com/downloads.html. The stored procedures have been tested on IQ 12.6, 12.7, 15.2 and 15.4. They generally also work on IQ 16, but they may not support some of the new IQ 16 features (yet).
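For readers curious what such procedures do under the hood: the core of DDL extraction is formatting catalog metadata back into CREATE statements. A minimal, hypothetical Python sketch of that core step (catalog access is stubbed out, and the owner, table, and column names are all illustrative):

```python
# Hypothetical sketch of the core of DDL extraction: formatting column
# metadata (as it might be read from the IQ catalog) back into a
# CREATE TABLE statement. Catalog access itself is stubbed out; all
# names here are illustrative.

def create_table_ddl(owner, table, columns):
    """columns: list of (name, sql_type, nullable) tuples."""
    col_lines = []
    for name, sql_type, nullable in columns:
        null_sql = "NULL" if nullable else "NOT NULL"
        col_lines.append(f"    {name} {sql_type} {null_sql}")
    body = ",\n".join(col_lines)
    return f"CREATE TABLE {owner}.{table} (\n{body}\n);"

ddl = create_table_ddl("dba", "sales", [
    ("id", "INT", False),
    ("amount", "NUMERIC(12,2)", False),
    ("sold_on", "DATE", True),
])
```

The real work in a full extractor is, of course, the catalog queries and the long tail of object types (indexes, grants, views, procedures), not the formatting.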

As to the data export/import, you can use the IQ extract functionality (the TEMP_EXTRACT_NAME* options). Generating the corresponding LOAD TABLE statements from the information in the IQ catalog tables is quite feasible. It is possible to pack the whole export and import process into a single stored procedure, which can be run using an IQ event, for example.
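That extract-plus-LOAD TABLE pairing can be generated mechanically per table. A minimal, hypothetical Python sketch (the option name and the LOAD TABLE clauses should be verified against your IQ version's documentation; the table and file names are illustrative):

```python
# Hypothetical sketch: generate a matching IQ extract / LOAD TABLE pair
# for one table. Verify the TEMP_EXTRACT_NAME1 option and the LOAD
# TABLE clauses against your IQ version before relying on this; all
# names here are illustrative.

def make_extract_and_load(table, columns, out_file):
    """Return (extract_sql, load_sql) for a single table."""
    col_list = ", ".join(columns)
    extract_sql = (
        f"SET TEMPORARY OPTION TEMP_EXTRACT_NAME1 = '{out_file}';\n"
        f"SELECT {col_list} FROM {table};\n"
        "SET TEMPORARY OPTION TEMP_EXTRACT_NAME1 = '';"
    )
    load_sql = (
        f"LOAD TABLE {table} ({col_list})\n"
        f"FROM '{out_file}'\n"
        "QUOTES OFF ESCAPES OFF;"
    )
    return extract_sql, load_sql

extract_sql, load_sql = make_extract_and_load(
    "dba.sales", ["id", "amount", "sold_on"], "/tmp/sales.out")
```

Driving this from the catalog (one pair per table in the schema) is what turns it into the repeatable export/import process described above.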

I agree that IQ should provide export/import utilities out of the box. Actually, the absence of this feature is what drove me to develop the stored procedure set described above.

Former Member

Thanks Leonid!  

I'm thinking that I could extend SQL Fairy (aka SQL Translator) to handle the extraction of the schema sans data. Using SQL Fairy would also give us the ability to move the schema between different DBMSs (Oracle -> IQ, PostgreSQL -> IQ, IQ -> MariaDB, etc.).

The only shortcoming in this approach is the development time to expand the ASE and IQ plugins (producer/consumer) to handle the DBMS-specific DDL. Going from, say, IQ -> Oracle RDBMS -> IQ would lose any IQ-specific DDL where there is no equivalent in Oracle.

The more I think about it, the more SQL Fairy seems to be the answer for my particular situation in the long run. 

jason

Former Member

I like the idea of using SQLFairy.

Just today I have been adding bits to it by creating and tweaking a clone of SQL::Translator::Parser::DBI::Sybase as a new class, SQL::Translator::Parser::DBI::SQLAnywhere.

I am probably too stupid and/or lazy to do it "correctly" 😕 ... but that never stopped any idiot before!

It sure would have been nice if Sybase IQ metadata sources (stored procedures/views/etc.) conformed to the ASE variants. It would be even nicer if SAP would do the SQLFairy work, as well as release the better version of DBD::SQLAnywhere on CPAN.

FYI: I am using the Sybase/SAP-provided version of DBD::SQLAnywhere because, at last check, the one on CPAN was faulty (it never returned true for ->more_results()). In the past I ordinarily used DBD::Sybase, but AFAIK it doesn't support all of the same features (e.g. "LOAD TABLE <...> FROM CLIENT FILE ..."), and in any case SQL::Translator::Parser::DBI::Sybase still wouldn't work correctly for IQ. One thing I'm not sure about is whether SQL Anywhere is used for anything other than IQ (I suspect it is). If that is the case, then the assumptions made in SQLFairy will not quite work right. We would need to clone *::SQLAnywhere to be IQ-specific (i.e. *::SQLAnywhereIQ).

It seems to me SQLFairy (i.e. SQL::Translator*) could use a bit of enhancement, too. For instance, the sqlt program doesn't seem to have a mechanism for exporting only objects under a single user ID, or for using wildcard matches on object names/types.
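That owner/wildcard selection is easy to prototype outside sqlt. A hypothetical Python sketch of the kind of filtering being asked for (the sample catalog list is made up):

```python
# Hypothetical sketch of the owner + wildcard object selection that
# sqlt currently lacks: filter a list of catalog objects by owner and
# a shell-style name pattern. The sample catalog is made up.
from fnmatch import fnmatch

def select_objects(objects, owner=None, name_pattern="*"):
    """objects: iterable of (owner, name, obj_type) tuples."""
    return [
        (own, name, obj_type)
        for own, name, obj_type in objects
        if (owner is None or own == owner) and fnmatch(name, name_pattern)
    ]

catalog = [
    ("dba", "sales", "table"),
    ("dba", "sales_idx", "index"),
    ("app", "orders", "table"),
]
picked = select_objects(catalog, owner="dba", name_pattern="sales*")
```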

c_baker
Employee

SAP Sybase PowerDesigner is the recommended tool to properly extract (reverse engineer) all the DDL from IQ (it also supports many other DBMS engines). Forward engineering to those DBMS engines is supported as well.

PowerDesigner supports the test/dev/production cycle with linked and synchronized models for each of your environments. These models can be stored and managed in a repository for better management and control.

You can also model data movement (ETL, Replication) from your source systems as well.

http://www54.sap.com/pc/tech/database/software/model-driven-architecture/index.html

Depending on your release, you will already have a single-user, run-time copy of PowerDesigner on the IQ Windows PC Client CD that comes with your IQ Server.

Former Member

Chris, none of what you've recommended solves the problem. 

PowerDesigner does not handle data migrations above a certain size.  It will crash or freeze solid.   Sybase Central was never completed.  The current Sybase Control Center is unusable for even simple tasks. 

A simple schema export/import tool is what is needed.

c_baker
Employee

I've written extensions to PD in the past to manage IQ. I already have LOAD TABLE support plus some additional features, and have recently been considering extending it to do TEMP_EXTRACT.

I was planning to post it to this group, but will wait until I finish adding the additional capabilities.

Maybe that can meet your requirements.  Is the schema export/import all you need?

Former Member

PowerDesigner would require another box (Windows) to perform the export (if it actually finishes), and the import is excruciatingly slow. If I want to automate it, I have to come up with a PowerShell script to manage PD. Oh, and don't forget I now have to purchase a PowerDesigner license and a Windows license. Why would I want to use PD for a simple export/import of a schema? That's a bit like "I have a hammer, therefore everything is a nail".


former_member213365
Active Participant

I agree with Jason.  A fuller featured import/export is needed.

Being able to import a single table from a dump would put Sybase ahead of Oracle. As far as I know, Oracle 11g doesn't support that; 12c changes that somewhat.

Former Member

Jim, if you're talking about RMAN, you are correct. You can restore a tablespace but not a single table.

11g Data Pump will import a single table from a Data Pump export.

Two different tools providing two different services.

former_member213365
Active Participant

Yes, that's it. I don't usually cover the operations side of things, but I know that backup/restore of data warehouses in Oracle is a bit challenging.

Former Member

Yes and no. Backups in Oracle are so different from Sybase or the others; you have to wrap your brain around it. Some things are easier and others not.

I think the fallacy that many vendors fall into is limited perspective. You'll see it in open source projects as well. The KISS principle should be followed, IMO, whenever possible. (Don't ask me about running Java in the DBMS unless you want me to rant that it is one of the most idiotic things that Oracle, Sybase, etc. did.)

former_member182258
Participant

This is a feature we are considering for a future release of IQ.

Former Member

Thanks Courtney  

I'm more than a bit hesitant about relying on SAP/Sybase following through on such a feature in a future release of a product. Sybase, albeit ASE engineering, is quite famous for making promises and never following through on them. Maybe with SAP things are different.

former_member185199
Contributor

Run the free version in WINE and you can completely script it in VBS; no frontend necessary.

Former Member

Legally I would still need a Windows license and a PowerDesigner license. Running PD via Wine is not very stable at all, even with the new releases of Wine.

Any issues I ran into would be without any support from SAP Sybase, because PD is not supported under Wine. No SAP product is.

markmumy
Advisor

Jason,

Would the SA feature dbunload (http://dcx.sybase.com/1200/en/dbusage/loimpin.html and http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.help.sqlanywhere.12.0.1/dbadmin/dbunlo...) be a good starting point? It is an external executable rather than SQL, but it can do schema or data. I ask because the framework is already in place within SA/IQ to do this, which would make the time to implement that much faster than developing something 100% from scratch.
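For anyone scripting that route, the invocation might be assembled along these lines; a hypothetical sketch (the flag meanings (-c for the connection string, -n for a schema-only unload, -r for the reload script name) should be double-checked against the dbunload documentation linked above, and the connection string is illustrative):

```python
# Hypothetical sketch: assembling a dbunload command line for a
# schema-only unload. Verify the flags (-c connection, -n schema only,
# -r reload script name) against your SA/IQ version's dbunload
# documentation; the connection string here is illustrative.
import subprocess

def dbunload_schema_cmd(conn_str, reload_file):
    return [
        "dbunload",
        "-c", conn_str,     # database connection parameters
        "-n",               # unload the schema only, no data
        "-r", reload_file,  # name of the generated reload script
    ]

cmd = dbunload_schema_cmd("UID=dba;PWD=sql;ENG=myiq", "reload.sql")
# subprocess.run(cmd, check=True)  # run where dbunload is on PATH
```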

Mark

former_member182258
Participant

The SAP Sybase IQ team is proud of the extensive set of features that have been introduced into the product in the last set of releases. These include such capabilities as fully parallel bulk loading, data affinity for shared-nothing performance in a flexible shared-everything architecture, Hadoop integration, in-database analytics, full text search, an in-memory write-optimized delta store, role-based access control, and a re-architected column store for more extreme data compression. We set our priorities based on a balance of introducing innovations, satisfying new feature requests, and addressing functionality gaps. We listen to our customers and do our best to meet their needs. Sometimes we have to make difficult choices, but we are committed to making SAP Sybase IQ one of the best analytics products in the market. Your request has not fallen on deaf ears, and we are tracking it along with other requests. We assess and prioritize feature requests on a regular basis, and will do our best to accommodate as many as we can as we plan the roadmap and development activities going forward.

Former Member

Thanks Courtney, that's reassuring.

Former Member

I looked at it, but it is rather limited. The extraction would have to be a repeatable process that could be automated to the point of craziness.

Former Member

Do you have a CR # for the feature request? I have a case open with Marina Lauzier (TSE), and she's looking for an existing CR # before creating a new one.

former_member182258
Participant

The one we have been tracking is 679404, which involves backing up and restoring individual DBSpaces.  This is a more general request, but since you can assign individual database objects to particular DBSpaces, this would accomplish what you are looking for, too.

Former Member

Correct me if I'm wrong, but we can already back up and restore DBSpaces (specifically, the dbfiles involved with the DBSpace, to the same instance). The DBSpace has to be read-only, but it seems to work in 15.4. What exactly does 679404 cover? I'm just trying to understand whether 679404 is redundant or not.

former_member182258
Participant

Currently, you have to back up all R/W DBSpaces together. This would give you the option to back up and restore a single R/W DBSpace independently of the other ones.

Former Member

Would I be able to restore into another instance? For example, if I made a DBSpace backup in production, could I restore it into regression? Also, isn't having multiple user DBSpaces an add-on feature (IQ_VLDBMGMT)? That option isn't exactly cheap, so we wouldn't really be able to take advantage of the CR.

former_member182258
Participant

That is the idea.  And yes, it is currently part of the VLDB option.

Former Member

Thanks.

I'm not sure it would be a viable alternative to an official schema export/import tool, as the schemas could be anywhere from a few MB to terabytes in size, sharing the same space with dozens or more other schemas. Having a DBSpace per schema would become rather unwieldy pretty quickly. I can imagine the 'big' schemas having their own DBSpaces, but not the smaller ones.

Former Member

I completely agree with you. It would be a wonderful, powerful feature, which I would include in my disaster recovery procedure.

Last year, I asked Sybase support what I could do if I wanted to restore a 5 MB table from a 120 GB dump file (620 GB db). Their answer was: use "archive database". I told them that loading a 620 GB archive db was taking almost as long as loading the dump file (5 hours 45 minutes on my test server) and that I wanted to request a new feature. Their answer was that since ASE has functionality performing this function, it would not be pursued.

I am sure that SAP/Sybase engineers are smart enough to provide this feature.

Former Member

Just because ASE or some other Sybase product has a feature should not be a reason for that feature to be excluded from IQ, IMO.