
CHAPTER 13  MIGRATING TO EXADATA

requirements for exporting your database without necessarily slowing down the process. Unfortunately, the ability to compress table data on the fly was not introduced until release 11gR1. If your database is 10g and you need to compress your dumpfiles before transferring them to Exadata, you will need to do that using external tools like gzip, zip, or compress. Note that the use of the data COMPRESSION option in Data Pump requires Oracle Advanced Compression licenses.

FLASHBACK_TIME, FLASHBACK_SCN: Believe it or not, by default Data Pump does not guarantee the read consistency of your export. To export a read-consistent image of your database you must use either the FLASHBACK_SCN or the FLASHBACK_TIME parameter. If you use FLASHBACK_TIME, Data Pump looks up the nearest System Change Number (SCN) corresponding to the time you specified and exports all data as of that SCN. FLASHBACK_TIME can be passed in to Data Pump as follows:

FLASHBACK_TIME="to_timestamp('05-SEP-2010 21:00:00','DD-MON-YYYY HH24:MI:SS')"

If you choose to use FLASHBACK_SCN, you can get the current SCN of your database by running the following query:

SQL> select current_scn from v$database;

FULL, SCHEMAS, TABLES: These options are mutually exclusive and specify whether the export will be for the full database, a selection of schemas, or a selection of individual tables. Note that certain schemas, like SYS, MDSYS, CTXSYS, and DBSNMP, are never exported, even when doing a full database export.

PARALLEL: The PARALLEL parameter instructs Data Pump to split the work up into multiple parts and run them concurrently. PARALLEL can vastly improve the performance of the export process.

NETWORK_LINK: This parameter specifies a database link in the target database to be used for the export.
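The export parameters covered so far are typically collected in a Data Pump parameter file. The following is a sketch only; the directory object, dumpfile, and schema names are hypothetical and must be adapted to your environment, and COMPRESSION=ALL assumes an 11g source licensed for Advanced Compression:

$ expdp system PARFILE=exp_mig.par

Contents of exp_mig.par:

DIRECTORY=MIG_DMP_DIR
DUMPFILE=exp_mig_%U.dmp
LOGFILE=exp_mig.log
SCHEMAS=APP_OWNER
PARALLEL=4
COMPRESSION=ALL
FLASHBACK_TIME="to_timestamp('05-SEP-2010 21:00:00','DD-MON-YYYY HH24:MI:SS')"

The %U substitution variable in DUMPFILE generates one file per parallel worker, which you would then transfer to Exadata as a set.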
NETWORK_LINK allows you to export a database from a remote server, pulling the data directly through the network via the database link (in the target database) and landing the files on an Exadata file system. We see this as more of a convenience than anything else, as it saves you the extra step of transporting the dumpfiles manually at the end of the export. It is used by Grid Control to automate the migration process using the "Import From Database" process. Using this method for manual migration doesn't make much sense: if you are going to copy the data over a database link anyway, why not load it into target tables directly, using CTAS or direct-path insert, instead of dumping it to disk and reloading it later?

Now let's turn our attention to the import process. Schema-level import is usually preferable when migrating databases, because it allows you to break the process up into smaller, more manageable parts. This is not always the case, though, and there are times when a full database import is the better choice. Most of the tasks we will talk about here apply to both schema-level and full database imports. As we go along, we'll note any exceptions you will need to be aware of. If you choose not to do a full database import, be aware that system objects, including roles, public synonyms, profiles, public database links, system privileges, and others, will not be imported. You will need to extract the DDL for these objects using the SQLFILE parameter and a FULL=Y import. You can then execute the DDL in the target database to create them. Let's take a look at some of the parameters useful for migrating databases.
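The SQLFILE technique just described can also be driven from a parameter file. A sketch, with hypothetical directory, dumpfile, and output names:

$ impdp system PARFILE=imp_ddl.par

Contents of imp_ddl.par:

DIRECTORY=MIG_DMP_DIR
DUMPFILE=exp_mig_%U.dmp
FULL=Y
SQLFILE=system_ddl.sql

With SQLFILE specified, impdp writes the DDL it would have executed into system_ddl.sql in the directory object's location and imports no data at all; you can then review, edit, and run the relevant statements against the target database to create the missing system objects.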
