CHAPTER 13: MIGRATING TO EXADATA

…portable data file. This file can then be copied to Exadata and loaded into the target database using the impdp command. Data Pump made its first appearance in Oracle 10g, so if your database is version 9i or earlier, you will need to use the old Export/Import (exp/imp) utilities instead. Export and Import have been around since Oracle 7, and although they are getting a little long in the tooth, they are still very effective tools for migrating data and objects from one database to another. And even though Oracle has been talking about dropping exp and imp for years now, they are still part of the base 11.2 install. First we'll talk about Data Pump and how it can be used to migrate to Exadata. After that we'll take a quick look at ways to migrate older databases using Export and Import. Keep in mind that new features and parameters are added to Data Pump with each major release; check the Oracle documentation for capabilities and features specific to your database version.

From time to time in this chapter we'll refer to tests and timings we saw in our lab. Table 13-1 shows some of the relevant characteristics of the servers and databases we used for these tests. The LAB112 database is the source database and EXDB is the target (Exadata) database, an Exadata V2 quarter rack configuration.

Table 13-1. Lab Configuration

Database  Db Version  Platform                           Processors                                CPU Clock
LAB112    11.2.0.1    Red Hat Linux 5, 32-bit            8 Dual Core, Intel Xeon MP                2.80GHz
EXDB      11.2.0.1    Oracle Enterprise Linux 5, 64-bit  2 Quad Core, Intel Xeon E5540 (Nehalem)   2.53GHz

Here is a breakdown of the segments in my test database:

SEGMENT_TYPE               MBYTES
--------------------  ------------
CLUSTER                        63
INDEX                      13,137
INDEX PARTITION               236
LOBINDEX                       48
LOBSEGMENT                    290
TABLE                      20,662
TABLE PARTITION             1,768
TYPE2 UNDO                    142

Now let's take a look at some of the Data Pump parameters you'll want to know about. Here are some of the key parameters that are useful for migrating databases.

COMPRESSION: Data Pump compression is a relatively new feature. In 10g you had the ability to compress metadata, but in 11g this capability was extended to table data as well. Valid options are ALL, DATA_ONLY, METADATA_ONLY, and NONE. Using the COMPRESSION=ALL option, Data Pump reduced the size of our export from 13.4G to 2.5G, a compression ratio of over 5 times. That's a pretty significant savings in storage. When we ran the test with compression turned on, we fully expected it to slow down the export, but instead it actually reduced our export time from 39 minutes to just over 9 minutes. This won't always be the case, of course; on our test system the export was clearly I/O-bound. But it does point out that compression can significantly reduce the storage…
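To make the export/copy/import workflow concrete, here is a minimal sketch of a compressed, schema-level Data Pump export followed by the matching import on the Exadata side. The directory object (DMPDIR), schema name (KSO), and file names are hypothetical placeholders, and the parallel degree should be sized for your own system; only the parameters themselves (SCHEMAS, DIRECTORY, DUMPFILE, LOGFILE, COMPRESSION, PARALLEL) are standard Data Pump syntax.

$ expdp system schemas=KSO directory=DMPDIR \
    dumpfile=lab112_%U.dmp logfile=lab112_exp.log \
    compression=ALL parallel=4

# Copy the dump files to a directory visible to the target (Exadata) database,
# then import them; compressed dump files are decompressed automatically.
$ impdp system schemas=KSO directory=DMPDIR \
    dumpfile=lab112_%U.dmp logfile=lab112_imp.log \
    parallel=4

With PARALLEL=4 and the %U substitution variable, expdp writes a set of dump files rather than a single file, and the whole set must be copied to the target before running impdp.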
