In this tutorial, you learn how to migrate on-premises Oracle 11g database schemas to an 11g database in Oracle Database Cloud Service, using Data Pump conventional export and import.
Last Updated
March 2016, Release 16.1.5
Please note that Oracle develops its cloud services in an agile manner and makes rapid changes to meet our users’ needs. Therefore, this tutorial may not reflect all changes made to this service since the last update. For the latest information, see Oracle Database Cloud Service on Oracle Help Center.
Time to Complete
Approximately 40 minutes.
Background
A variety of methods exist for migrating data from on-premises databases to databases in the Database Cloud Service. These include Data Pump export and import, exporting to external tables, remote cloning, and unplugging and plugging databases. Some of these methods support migrating on-premises 11g to cloud 11g, some support migrating on-premises 12c to cloud 12c, and some support migrating and upgrading on-premises 11gR2 to cloud 12c.
This tutorial focuses on migrating on-premises 11g databases to 11g databases in the Database Cloud Service. Four methods are available for 11g to 11g migration:
Data Pump Conventional Export/Import
Within this method, you can select among four modes to control what portion of a database to export:
Full database
Tablespace(s)
Schema(s) (the method used in this tutorial)
Tables, partitions, and subpartitions
Data Pump Transportable Tablespace (can be used only if the source on-premises database character set is AL32UTF8)
RMAN Transport Tablespace (RMAN CONVERT TABLESPACE/DATAFILE and Data Pump Export, Data Pump Import)
SQL Developer using SQL*Loader or INSERT statements
Scenario
You have an on-premises Oracle 11g database. You want to migrate it to the Database Cloud Service. In this tutorial, you will use Data Pump conventional export/import and the Secure Copy (scp) command to perform the migration. You will:
Create a small sample data set in an on-premises Oracle 11g database:
Create a tablespace.
Download and run a script for this tutorial to:
Create 3 users (schemas) in the tablespace.
Create a table for each user and insert data into the table.
In the on-premises database, perform a Data Pump Export:
In the on-premises database host, create an operating system directory to contain an on-premises database export dump file.
In the on-premises database, use SQL*Plus to create a directory object that references the operating system directory.
Invoke Data Pump Export to create a dump file from the tablespace, specifying 2 of the 3 schemas.
Prepare the Oracle 11g Database Cloud Service:
On the Oracle Database Cloud Service VM, create an operating system directory that will receive the export dump file.
In the Oracle Database Cloud Service database, use SQL*Plus to create a directory object that references the operating system directory.
Create a tablespace with the same name as the tablespace in the on-premises database.
Copy the export dump file from the on-premises database to the Oracle Database Cloud Service using the Secure Copy (scp) command.
On the Oracle Database Cloud Service VM, invoke Data Pump Import to import the on-premises database schemas and data into the cloud database.
What Do You Need?
An on-premises Oracle 11g database instance
Note: Because we are using the Data Pump conventional export/import method, the on-premises Oracle 11g source database may have any character set. (In contrast, the Data Pump export/import method that uses transportable tablespaces requires that the on-premises database have the Unicode character set AL32UTF8, to match the AL32UTF8 character set of database instances in the Database Cloud Service. A query for checking your character set appears after this list.)
A copy of the SSH private key that is paired with the SSH public key that was used to create your target cloud database instance
Preferably, use passphrase-protected SSH keys for secure access. Passphrase-protected keys also help ensure that the scp command copies files successfully from the on-premises environment to the cloud environment.
An Oracle 11g database in Oracle Database Cloud Service
During the Data Pump Import step, you will be prompted for a password. You will need the “Administration” password that was specified when the Database Cloud Service instance was created.
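Note: To confirm your on-premises database character set (relevant if you are also considering the transportable tablespace method, which requires AL32UTF8), you can run this standard data dictionary query in SQL*Plus. It is an optional check, not one of the tutorial’s steps:
SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';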
Creating Data in the On-premises Oracle 11g Database
This tutorial assumes that you do not have data in your on-premises 11g database. If you already have data to migrate, you can skip this step.
Download the migrate_11g_11g.sql script file. Move this script file to the Oracle home.
Execute the migrate_11g_11g.sql script to create the tablespace, 3 schemas, tables, and records.
SQL> @migrate_11g_11g.sql
Note: The example code in this tutorial is for a database named onpremdb1. If you use this script, substitute the path in this example with your actual database name and location. The migrate_11g_11g.sql script contains the following commands:
CREATE SMALLFILE TABLESPACE migrate_11g_11g
DATAFILE
'/u01/app/oracle/database/oradata/onpremdb1/migrate_11g_11g.dbf' SIZE 52428800 AUTOEXTEND ON NEXT
1 MAXSIZE 104857600
BLOCKSIZE 8192
NOLOGGING
DEFAULT NOCOMPRESS
ONLINE
SEGMENT SPACE MANAGEMENT AUTO
EXTENT MANAGEMENT LOCAL AUTOALLOCATE
/
CREATE USER oecloud IDENTIFIED BY oecloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER oecloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to oecloud
/
CREATE TABLE oecloud.orders (id number, name varchar2(100))
/
INSERT INTO oecloud.orders (ID, NAME) VALUES ('1', 'AcmeOrd01')
/
CREATE USER shcloud IDENTIFIED BY shcloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER shcloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to shcloud
/
CREATE TABLE shcloud.shipments (id number, name varchar2(100))
/
INSERT INTO shcloud.shipments (ID, NAME) VALUES ('1', 'AcmeSh01')
/
CREATE USER hrcloud IDENTIFIED BY hrcloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER hrcloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to hrcloud
/
CREATE TABLE hrcloud.employees (id number, name varchar2(100))
/
INSERT INTO hrcloud.employees (ID, NAME) VALUES ('1', 'Smith')
/
COMMIT
/
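Optionally, before continuing, you can confirm that the sample rows were created. These checks are illustrative, not part of the original script; each query should return one row:
SELECT COUNT(*) FROM oecloud.orders;
SELECT COUNT(*) FROM shcloud.shipments;
SELECT COUNT(*) FROM hrcloud.employees;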
Exit SQL*Plus.
Exporting the Schemas from the On-premises Database
Perform the following steps on the on-premises 11g database.
Define a Data Pump export directory, DATA_PUMP_EXP, in the on-premises database.
Note: For speed and simplicity in this tutorial, we define this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point the database directory to that new directory.
CREATE OR REPLACE DIRECTORY DATA_PUMP_EXP as '/u01/app/oracle/database/admin/onpremdb1/dpdump/';
Set permissions on the Data Pump export directory, granting READ and WRITE to sys.
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_EXP to sys;
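To verify that the directory object exists and points to the intended path (an optional check, not part of the original steps), you can query the data dictionary:
SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_EXP';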
Exit SQL*Plus. At the command line prompt, run the Data Pump Export utility expdp as the SYSTEM user, or as another user with the DATAPUMP_EXP_FULL_DATABASE role. Provide the password for the SYSTEM user when prompted.
Note: For this exercise, limit your export to 2 of the 3 user schemas that were created by the script. Do this by specifying SCHEMAS=oecloud,shcloud. As a result, the oecloud and shcloud schemas are exported, and the hrcloud schema is not.
expdp system@onpremdb1 dumpfile=export.dmp logfile=export.log SCHEMAS=oecloud,shcloud directory=DATA_PUMP_EXP
When the export process is done, you will see a message similar to this:
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
/u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp
Copying the Dump File to the Cloud Database Instance
Next, use the scp command in your on-premises environment to copy the dump file from the on-premises 11g database host to the cloud virtual machine (VM) hosting the 11g database.
Before using the scp command to copy the export file to the cloud instance, prepare the SSH private key that is paired with the SSH public key that was used to create your target cloud database instance. If you will be running the scp command from your Oracle home directory, you can place your SSH private key file in that directory. In a command shell, use the chmod 600 command to set the file permissions of the private key file so that only you have access to it.
Note: For the following example, you need to swap in your own SSH private key name; this example uses a private key named ua_key_openssh.
chmod 600 ua_key_openssh
Use the scp command to copy the export file to the cloud instance.
Note: For the following example, you need to swap in your SSH private key name, your directory paths, and your public cloud IP address.
Note: For speed and simplicity in this tutorial, the scp command points to the pre-existing dpdump directory in the target Database Cloud Service VM operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory in the target Database Cloud Service VM and then point to that new directory.
Note: If this is the first time that you are connecting to the target Database Cloud Service VM operating system, you may receive a warning that the authenticity of the host can’t be established. If you are prompted to continue connecting, reply yes.
scp -i ua_key_openssh /u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp oracle@PUBLIC_CLOUD_IP_ADDRESS:/u01/app/oracle/admin/ORCL/dpdump/export.dmp
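Optionally, you can confirm that the dump file arrived on the cloud VM by listing it over ssh with the same private key. This is an optional check, not part of the original tutorial:
ssh -i ua_key_openssh oracle@PUBLIC_CLOUD_IP_ADDRESS ls -l /u01/app/oracle/admin/ORCL/dpdump/export.dmp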
Preparing the Oracle 11g Database in the Cloud
Perform the following steps in your Oracle Database Cloud Service account.
Start a SQL*Plus session, and create a Data Pump directory, DP_FROM_ONPREM, in the database.
Note: Remember to replace our generic CLOUD_DATABASE_NAME with your actual database name. (In this example, the database name is ORCL.)
Note: For speed and simplicity in this tutorial, we define this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point to that new directory.
CREATE OR REPLACE DIRECTORY DP_FROM_ONPREM as '/u01/app/oracle/admin/CLOUD_DATABASE_NAME/dpdump/';
Set permissions on the DP_FROM_ONPREM directory, granting READ and WRITE to sys.
GRANT READ, WRITE ON DIRECTORY DP_FROM_ONPREM to sys;
Create the users oecloud and shcloud in the cloud database instance.
CREATE USER oecloud IDENTIFIED BY oecloud;
CREATE USER shcloud IDENTIFIED BY shcloud;
Grant CONNECT and RESOURCE to oecloud and shcloud.
GRANT CONNECT, RESOURCE to oecloud;
GRANT CONNECT, RESOURCE to shcloud;
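Optionally, verify that both users were created (an optional check, not part of the original steps):
SELECT username FROM dba_users WHERE username IN ('OECLOUD', 'SHCLOUD');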
Create a tablespace with the same name as the tablespace in the on-premises database, migrate_11g_11g.
CREATE SMALLFILE TABLESPACE migrate_11g_11g
DATAFILE
'/u02/app/oracle/oradata/ORCL/migrate_11g_11g.dbf' SIZE 52428800 AUTOEXTEND ON NEXT 1
MAXSIZE 104857600
BLOCKSIZE 8192
NOLOGGING
DEFAULT NOCOMPRESS
ONLINE
SEGMENT SPACE MANAGEMENT AUTO
EXTENT MANAGEMENT LOCAL AUTOALLOCATE;
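To confirm that the tablespace was created and is online (an optional check, not part of the original steps), you can query the data dictionary:
SELECT tablespace_name, status FROM dba_tablespaces WHERE tablespace_name = 'MIGRATE_11G_11G';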
Exit from SQL*Plus, and import the oecloud and shcloud schemas from the export.dmp dump file into the Database Cloud Service database, using the impdp Data Pump Import command.
Note: Remember to replace LOCAL_DATABASE_NAME with your 11g cloud database name.
Note: When you invoke the impdp command, you are prompted for a password. Provide the “Administration” password that was specified when the Database Cloud Service database instance was created.
Note: The 2 errors listed at the end of the import can be ignored.
impdp system@LOCAL_DATABASE_NAME dumpfile=export.dmp SCHEMAS=oecloud,shcloud DIRECTORY=dp_from_onprem
Verifying that the 11g On-Premises Database Data Migrated to the 11g Cloud Database
Verify that the tables and data from the 11g on-premises database migrated successfully to the 11g cloud database.
Log in to SQL*Plus in the 11g cloud database to verify that the data from the on-premises 11g database has been imported. The schema tables and records should now be accessible in the Database Cloud Service database, as the sample queries below illustrate.
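For example (these verification queries are illustrative, not part of the original tutorial), each of the following should return the single row inserted on premises, while queries against the hrcloud schema should fail because that schema was excluded from the export:
SELECT * FROM oecloud.orders;
SELECT * FROM shcloud.shipments;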