on 08-20-2013 3:36 PM
Currently there is no easy way to extract and restore the objects, permissions, data, etc. for a particular schema.
Say you need to extract a schema so it can be restored onto multiple IQ instances (prod -> regression, testing, and development). A backup/restore of the whole database is not practical: the schema may be only a few hundred megabytes while the database is terabytes in size, and on top of that there is the whole mess of adding and removing users for the different environments.
In Oracle, we can do this very easily:
export the schemas:
expdp system/tiger@db11g schemas=SCOTT,USER1,USER2 directory=TEST_DIR dumpfile=schemas.dmp logfile=expdp.log
import only the USER1 schema:
impdp system/tiger@db11g schemas=USER1 directory=TEST_DIR dumpfile=schemas.dmp logfile=impdp.log
Surely the smart people at SAP Sybase can create such a feature without too much trouble.
Including a link to Data Pump; Oracle has had this for more than a decade:
http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php
I have heard that this was put into the latest release of IQ. Supposedly patch 20, version SP8.20, has a schema export utility. I haven't seen the release notes yet, but this feature was discussed. Patch level 20 was released a few weeks ago.
Mark
I developed a set of stored procedures that extract DDL statements for various IQ object types (tables, indexes, views, stored procedures, functions, logins, etc.). You can generate DDL statements for the whole database, for a particular login (schema), or for a set of specific objects. I have successfully used these stored procedures on a number of customer sites. You are welcome to download them for free from my website http://gvirtz-consulting.com/downloads.html. The stored procedures have been tested on IQ 12.6, 12.7, 15.2 and 15.4. They generally work on IQ 16 as well, but they may not support some new IQ 16 features (yet).
As to data export/import, you can use the IQ extract functionality (the TEMP_EXTRACT_NAME* options). Generating the corresponding LOAD TABLE statements from the information in the IQ catalog tables is entirely feasible. The whole export and import process can be packed into a single stored procedure, which can then be run from an IQ event, for example.
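To illustrate the idea, here is a minimal sketch (Python is used only to generate the SQL text) of pairing a TEMP_EXTRACT export with a matching LOAD TABLE. TEMP_EXTRACT_NAME1 and TEMP_EXTRACT_COLUMN_DELIMITER are real IQ options, but the table, columns, and file paths below are invented, and the exact LOAD TABLE delimiter syntax should be checked against your IQ version:

```python
# Hypothetical sketch: given column metadata (as you might read from the
# IQ catalog tables), emit the TEMP_EXTRACT option settings for the
# export side and a matching LOAD TABLE statement for the import side.

def build_extract_sql(table, out_file):
    """SQL that routes the next SELECT's result set to a flat file."""
    return "\n".join([
        f"SET TEMPORARY OPTION TEMP_EXTRACT_NAME1 = '{out_file}';",
        "SET TEMPORARY OPTION TEMP_EXTRACT_COLUMN_DELIMITER = '|';",
        f"SELECT * FROM {table};",
        "SET TEMPORARY OPTION TEMP_EXTRACT_NAME1 = '';",  # turn extraction back off
    ])

def build_load_sql(table, columns, in_file):
    """Matching LOAD TABLE statement for the extracted file."""
    col_list = ",\n    ".join(f"{c} '|'" for c in columns[:-1])
    col_list += f",\n    {columns[-1]} '\\n'"  # last column ends at the newline
    return (f"LOAD TABLE {table} (\n    {col_list}\n)"
            f"\nFROM '{in_file}'\nESCAPES OFF QUOTES OFF;")

print(build_extract_sql("DBA.sales", "/tmp/sales.dat"))
print(build_load_sql("DBA.sales", ["id", "region", "amount"], "/tmp/sales.dat"))
```

Wrapping both halves in a stored procedure driven by the catalog, as described above, would then just be a matter of looping over the schema's tables.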
I agree that IQ should provide export/import utilities out of the box. Actually, the absence of this feature is what drove me to develop the stored procedure set described above.
Thanks Leonid!
I'm thinking that I could extend SQL Fairy (aka SQL Translator) to handle the extraction of the schema sans data. Using SQL Fairy would also give us the ability to move the schema between different DBMSs (Oracle -> IQ, PostgreSQL -> IQ, IQ -> MariaDB, etc.).
The only shortcoming of this approach is the development time to expand the ASE and IQ plugins (producer/consumer) to handle the DBMS-specific DDL. Going from, say, IQ -> Oracle RDBMS -> IQ would lose any IQ-specific DDL that has no equivalent in Oracle.
The more I think about it, the more SQL Fairy seems to be the answer for my particular situation in the long run.
jason
I like the idea of using SQLFairy.
Just today I have been adding bits to it by creating and tweaking a clone of SQL::Translator::Parser::DBI::Sybase as a new class, SQL::Translator::Parser::DBI::SQLAnywhere.
I am probably too stupid and/or lazy to do it "correctly" 😕 ... but that never stopped any idiot before!
It sure would have been nice if the Sybase IQ metadata sources (stored procedures/views/etc.) conformed to the ASE variants. It would be even nicer if SAP would do the SQLFairy work, as well as release the better version of DBD::SQLAnywhere on CPAN.
FYI: I am using the Sybase/SAP-provided version of DBD::SQLAnywhere because, at last check, the one on CPAN was faulty (it never returned true for ->more_results()). In the past I ordinarily used DBD::Sybase, but AFAIK it doesn't support all of the same features (e.g. "LOAD TABLE <...> FROM CLIENT FILE ..."), and in any case SQL::Translator::Parser::DBI::Sybase still wouldn't work correctly for IQ. One thing I'm not sure about is whether SQLAnywhere is used for anything other than IQ (I suspect it is). If that is the case, then the assumptions made in SQLFairy will not quite work right, and we would need to clone *::SQLAnywhere to be IQ-specific (i.e. *::SQLAnywhereIQ).
It seems to me SQLFairy (i.e. SQL::Translator*) could use a bit of enhancement too. For instance, the sqlt program doesn't seem to have a mechanism for exporting only the objects under a single user id, or for using wildcard matches on object names/types.
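SQLFairy itself is Perl, but the missing filtering step is small enough to sketch in a language-neutral way: keep only the objects owned by one user id and matching a wildcard, before the schema is handed to a producer. The object names, owners, and tuple layout below are invented for illustration:

```python
# Hypothetical pre-filter for a schema object list: restrict by owner,
# shell-style name glob, and object type before export.
from fnmatch import fnmatch

def filter_objects(objects, owner=None, name_glob="*", types=None):
    """objects: iterable of (owner, name, obj_type) tuples."""
    return [o for o in objects
            if (owner is None or o[0] == owner)
            and fnmatch(o[1], name_glob)
            and (types is None or o[2] in types)]

catalog = [
    ("DBA",   "sales_2013", "TABLE"),
    ("DBA",   "sales_view", "VIEW"),
    ("USER1", "customers",  "TABLE"),
]
print(filter_objects(catalog, owner="DBA", name_glob="sales_*", types={"TABLE"}))
# -> [('DBA', 'sales_2013', 'TABLE')]
```

Hooking something like this into sqlt between the parser and the producer would cover the single-owner export case described above.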
SAP Sybase PowerDesigner is the recommended tool for properly extracting (reverse engineering) all the DDL from IQ, and it also supports many other DBMS engines. Forward engineering to those DBMS engines is supported as well.
PowerDesigner supports the test/dev/production cycle with linked and synchronized models for each of your environments. These models can be stored and managed in a repository for better management and control.
You can also model data movement (ETL, Replication) from your source systems as well.
http://www54.sap.com/pc/tech/database/software/model-driven-architecture/index.html
Depending on your release, you will already have a single-user, run-time copy of PowerDesigner on the IQ Windows PC Client CD that comes with your IQ server.
Chris, none of what you've recommended solves the problem.
PowerDesigner does not handle data migrations above a certain size. It will crash or freeze solid. Sybase Central was never completed. The current Sybase Control Center is unusable for even simple tasks.
A simple schema export/import tool is what is needed.
I've written extensions to PD in the past to manage IQ. They already have LOAD TABLE support plus some additional features, and I have recently been considering extending them to do TEMP_EXTRACT.
I was planning to post them to this group, but will wait until I finish adding the additional capabilities.
Maybe that can meet your requirements. Is schema export/import all you need?
PowerDesigner would require another (Windows) box to perform the export (if it actually finishes), and the import is excruciatingly slow. If I want to automate it, I have to come up with a PowerShell script to manage PD. Oh, and don't forget I now have to purchase a PowerDesigner license and a Windows license. Why would I want to use PD for a simple export/import of a schema? That's a bit like "I have a hammer, therefore everything is a nail".
Message was edited by Moderator : Andrew Mumford
Yes and no. Backups in Oracle are so different from Sybase or the others; you have to wrap your brain around them. Some things are easier and others are not.
I think the fallacy many vendors fall into is limited perspective. You'll see it in open source projects as well. The KISS principle should be followed whenever possible, IMO. (Don't ask me about running Java in the DBMS unless you want me to rant that it is one of the most idiotic things that Oracle, Sybase, etc. ever did.)
Jason,
Would the SA feature dbunload (http://dcx.sybase.com/1200/en/dbusage/loimpin.html and http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.help.sqlanywhere.12.0.1/dbadmin/dbunlo...) be a good starting point? It is an external executable rather than SQL, but it can unload schema or data. I ask because the framework to do this is already in place within SA/IQ, which would make the time to implement much shorter than developing something 100% from scratch.
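As a rough sketch of what wrapping dbunload for a schema-only unload might look like: the -c, -r and -n flags are taken from the SQL Anywhere documentation linked above (verify them against your version), while the connection string and paths are invented for illustration, and the command is only assembled here, not executed:

```python
# Hypothetical wrapper that builds a dbunload command line for a
# schema-only (no data) unload into a reload script.

def dbunload_cmd(conn_string, reload_script, schema_only=True):
    cmd = ["dbunload", "-c", conn_string, "-r", reload_script]
    if schema_only:
        cmd.append("-n")  # -n: unload object definitions only, no data
    return cmd

cmd = dbunload_cmd("UID=DBA;PWD=sql;ENG=my_iq_server", "/tmp/reload.sql")
print(" ".join(cmd))
```

From a script one would then hand the list to subprocess.run (or the shell) and check the exit status; the point is that the unload machinery already exists and only needs orchestration.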
Mark
The SAP Sybase IQ team is proud of the extensive set of features that have been introduced into the product in the last set of releases. These include such capabilities as fully parallel bulk loading, data affinity for shared-nothing performance in a flexible shared-everything architecture, Hadoop integration, in-database analytics, full text search, an in-memory write-optimized delta store, role-based access control, and a re-architected column store for more extreme data compression. We set our priorities based on a balance of introducing innovations, satisfying new feature requests, and addressing functionality gaps. We listen to our customers, and do our best to meet their needs. Sometimes, we have to make difficult choices, but we are committed to making SAP Sybase IQ one of the best analytics products in the market. Your request has not fallen on deaf ears, and we are tracking it along with other requests. We assess and prioritize feature requests on a regular basis, and will do our best to accommodate as many as we can as we plan the roadmap and development activities going forward.
Correct me if I'm wrong, but we can already back up and restore dbspaces (specifically the dbfiles involved with the dbspace, to the same instance). The dbspace has to be read-only, but it seems to work in 15.4. What exactly does CR 679404 cover? I'm just trying to understand whether 679404 is redundant or not.
Would I be able to restore into another instance? For example, if I made a dbspace backup in production, could I restore it into regression? Also, isn't having multiple user dbspaces an add-on feature (IQ_VLDBMGMT)? That option isn't exactly cheap, so we wouldn't really be able to take advantage of the CR.
thanks.
I'm not sure it would be a viable alternative to an official schema export/import tool, as the schemas can be anywhere from a few MB to terabytes in size, sharing the same space with dozens or more other schemas. Having a dbspace per schema would become unwieldy pretty quickly. I can imagine the 'big' schemas having their own dbspaces, but not the smaller ones.
I completely agree with you. It would be a wonderfully powerful feature, which I would include in my disaster recovery procedure.
Last year, I asked Sybase support what I could do if I wanted to restore a 5 MB table from a 120 GB dump file (620 GB db). Their answer was: use "archive database". I told them that loading a 620 GB archive db takes almost as long as loading the dump file (5 hours 45 minutes on my test server) and that I wanted to request a new feature. Their answer was that since ASE has functionality performing this function, it would not be pursued.
I am sure that SAP/Sybase engineers are smart enough to provide with this feature.
Just because ASE or some other Sybase product has a feature should not be a reason for that feature to be excluded from IQ, IMO.