Purpose
In this tutorial, you learn how to migrate on-premises Oracle 11g database schemas to an 11g database in Oracle Database Cloud Service, using Data Pump conventional export and import.
Last Updated
March 2016, Release 16.1.5
Please note that Oracle develops its cloud services in an agile manner and makes rapid changes to meet our users’ needs. Therefore, this tutorial may not reflect all changes made to the service since the tutorial was last updated. For the very latest information, see Oracle Database Cloud Service on Oracle Help Center.
Time to Complete
Approximately 40 minutes.
Background
A variety of methods exist for migrating data from on-premises databases to databases in the Database Cloud Service. These include Data Pump export and import, exporting to external tables, remote cloning, and unplugging and plugging databases. Some of these methods support migrating on-premises 11g to cloud 11g, some support migrating on-premises 12c to cloud 12c, and some support migrating and upgrading on-premises 11gR2 to cloud 12c.
This tutorial focuses on migrating on-premises 11g databases to 11g databases in the Database Cloud Service. Four methods are available for 11g to 11g migration:
- Data Pump Conventional Export/Import
- Within this method, you can select among four modes to control what portion of a database to export (see the example expdp parameters after this list):
- Full database
- Tablespace(s)
- Schema(s) (the method used in this tutorial)
- Tables, partitions, and subpartitions
- Data Pump Transportable Tablespace (can be used only if the source on-premises database character set is AL32UTF8)
- RMAN Transport Tablespace (RMAN CONVERT TABLESPACE/DATAFILE and Data Pump Export, Data Pump Import)
- SQL Developer using SQL*Loader or INSERT statements
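As a rough sketch of how the four export modes map to Data Pump Export parameters, the expdp command lines look approximately as follows. The directory object, dump file names, schema, tablespace, and table names here are placeholders for illustration only; they are not objects used in this tutorial.
# Full database mode
expdp system FULL=y DIRECTORY=my_dp_dir DUMPFILE=full_db.dmp
# Tablespace mode
expdp system TABLESPACES=my_ts DIRECTORY=my_dp_dir DUMPFILE=my_ts.dmp
# Schema mode (the mode used in this tutorial)
expdp system SCHEMAS=schema1,schema2 DIRECTORY=my_dp_dir DUMPFILE=schemas.dmp
# Table mode (tables, partitions, and subpartitions)
expdp system TABLES=schema1.table1,schema1.table2:part1 DIRECTORY=my_dp_dir DUMPFILE=tables.dmp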
Scenario
You have an on-premises Oracle 11g database. You want to migrate it to the Database Cloud Service. In this tutorial, you will use Data Pump conventional export/import and the Secure Copy (scp) command to perform the migration. You will:
- Create a small sample data set in an on-premises Oracle 11g database:
- Create a tablespace.
- Download and run a script for this tutorial to:
- Create 3 users (schemas) in the tablespace.
- Create a table for each user and insert data into the table.
- In the on-premises database, perform a Data Pump Export:
- On the on-premises database host, create an operating system directory to contain the export dump file.
- In the on-premises database, use SQL*Plus to create a directory object that references the operating system directory.
- Invoke Data Pump Export to create a dump file from the tablespace, specifying 2 of the 3 schemas.
- Prepare the Oracle 11g Database Cloud Service:
- On the Oracle Database Cloud Service VM, create an operating system directory that will receive the export dump file.
- In the Oracle Database Cloud Service database, use SQL*Plus to create a directory object that references the operating system directory.
- Create a tablespace with the same name as the tablespace in the on-premises database.
- Copy the export dump file from the on-premises database to the Oracle Database Cloud Service using the Secure Copy (scp) command.
- On the Oracle Database Cloud Service VM, invoke Data Pump Import to import the on-premises database schemas and data into the cloud database.
What Do You Need?
- An on-premises Oracle 11g database instance
- Note: Because we are using the Data Pump conventional export/import method, the on-premises Oracle 11g source database may have any character set. (In contrast, the Data Pump export/import method that uses transportable tablespaces requires that the on-premises database have the Unicode AL32UTF8 character set, to match the AL32UTF8 character set of database instances in the Database Cloud Service.) If you are not sure which character set your source database uses, you can check it with the query shown after this list.
- A copy of the SSH private key that is paired with the SSH public key that was used to create your target cloud database instance
- Preferably use passphrase-protected SSH keys for secure access (use of passphrase-protected SSH keys might also best ensure that the scp command will copy files successfully from on-premises environments to cloud environments)
- An Oracle 11g database in Oracle Database Cloud Service
- During the Data Pump import step, you will be prompted for a password. You will need the “Administration” password that was specified when the Database Cloud Service instance was created.
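If you want to confirm the character set of your on-premises source database, an optional check such as the following, run from SQL*Plus on the source database, should suffice; nls_database_parameters is a standard data dictionary view.
SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';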
This tutorial assumes that you do not have data in your on-premises 11g database. If you already have data to migrate, you can skip this step.
- Download the migrate_11g_11g.sql script file. Move this script file to the Oracle home.
- Log in to SQL*Plus as sys with sysdba privileges.
- Execute the migrate_11g_11g.sql script to create the tablespace, 3 schemas, tables, and records.
SQL> @migrate_11g_11g.sql;
Note: The example code in this tutorial is for a database named onpremdb1. If you use this script, substitute the path in this example with your actual database name and location. The migrate_11g_11g.sql script contains the following commands:
CREATE SMALLFILE TABLESPACE migrate_11g_11g
DATAFILE
'/u01/app/oracle/database/oradata/onpremdb1/migrate_11g_11g.dbf' SIZE 52428800 AUTOEXTEND ON NEXT
1 MAXSIZE 104857600
BLOCKSIZE 8192
NOLOGGING
DEFAULT NOCOMPRESS
ONLINE
SEGMENT SPACE MANAGEMENT AUTO
EXTENT MANAGEMENT LOCAL AUTOALLOCATE
/
CREATE USER oecloud IDENTIFIED BY oecloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER oecloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to oecloud
/
CREATE TABLE oecloud.orders (id number, name varchar2(100))
/
INSERT INTO oecloud.orders (ID, NAME) VALUES ('1', 'AcmeOrd01')
/
CREATE USER shcloud IDENTIFIED BY shcloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER shcloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to shcloud
/
CREATE TABLE shcloud.shipments (id number, name varchar2(100))
/
INSERT INTO shcloud.shipments (ID, NAME) VALUES ('1', 'AcmeSh01')
/
CREATE USER hrcloud IDENTIFIED BY hrcloud
DEFAULT TABLESPACE migrate_11g_11g
TEMPORARY TABLESPACE TEMP
/
ALTER USER hrcloud QUOTA UNLIMITED ON migrate_11g_11g
/
GRANT CONNECT, RESOURCE to hrcloud
/
CREATE TABLE hrcloud.employees (id number, name varchar2(100))
/
INSERT INTO hrcloud.employees (ID, NAME) VALUES ('1', 'Smith')
/
COMMIT
/
- Exit SQL*Plus.
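Optionally, before moving on to the export, you can confirm that the script created what you expect. A minimal sanity check from the same sys SQL*Plus session (assuming only that the script above ran successfully) might look like this; each COUNT(*) should return 1, matching the single row inserted into each table.
SELECT username, default_tablespace FROM dba_users WHERE username IN ('OECLOUD', 'SHCLOUD', 'HRCLOUD');
SELECT COUNT(*) FROM oecloud.orders;
SELECT COUNT(*) FROM shcloud.shipments;
SELECT COUNT(*) FROM hrcloud.employees;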
Perform the following steps on the on-premises 11g database.
- Define a Data Pump export directory, data_pump_exp, in the on-premises database.
Note: For speed and simplicity of this tutorial, we are defining this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point the database directory to that new directory.
CREATE OR REPLACE DIRECTORY DATA_PUMP_EXP as '/u01/app/oracle/database/admin/onpremdb1/dpdump/';
- Set permissions on the Data Pump export directory, granting READ and WRITE to sys.
GRANT READ, WRITE ON DIRECTORY DATA_PUMP_EXP to sys;
- Exit SQL*Plus. At the command-line prompt, run the Data Pump Export utility expdp as the SYSTEM user, or as another user with the DATAPUMP_EXP_FULL_DATABASE role. Provide the password for the SYSTEM user when prompted.
Note: For this exercise, limit your export to 2 of the 3 user schemas that were created by the script. Do this by specifying SCHEMAS=oecloud,shcloud. As a result, the oecloud and shcloud schemas are exported, and the hrcloud schema is not.
expdp system@onpremdb1 dumpfile=export.dmp logfile=export.log SCHEMAS=oecloud,shcloud directory=DATA_PUMP_EXP
When the export process is done, you will see a message similar to this:
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
/u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp
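If you want to double-check the export before copying it, you can list the dump file and search the export log for ORA- messages on the on-premises host. The commands below simply reuse the dpdump path shown above.
ls -lh /u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp
grep -Ei 'ORA-|error' /u01/app/oracle/database/admin/onpremdb1/dpdump/export.log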
Next, use the scp command in your on-premises environment to copy the dump file from the on-premises 11g database host to the cloud virtual machine (VM) hosting the 11g database.
- Before using the scp command to copy the export file to the cloud instance, prepare the SSH private key that is paired with the SSH public key that was used to create your target cloud database instance. If you will be running the scp command from your Oracle home directory, you can place your SSH private key file in that directory. In a command shell, use the chmod 600 command to set the file permissions of the private key file so that only you have access to it.
Note: For the following example, you need to swap in your SSH private key name; this example uses a private key named ua_key_openssh.
chmod 600 ua_key_openssh
- Use the scp command to copy the export file to the cloud instance.
Note: For the following example, you need to swap in your SSH private key name, your directory paths, and your public cloud IP address.
Note: For speed and simplicity of this tutorial, we are pointing the scp command at the pre-existing dpdump directory in the target Database Cloud Service VM operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory in the target Database Cloud Service VM and then point to that new directory.
Note: If this is the first time that you are connecting to the target Database Cloud Service VM operating system, you may receive a warning that the authenticity of the host can’t be established. If you are prompted to continue connecting, reply yes.
scp -i private_key.ppk /u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp oracle@PUBLIC_CLOUD_IP_ADDRESS:/u01/app/oracle/admin/ORCL/dpdump/export.dmp
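Optionally, you can confirm that the dump file arrived on the Database Cloud Service VM before moving on. A minimal check, reusing the same private key, public IP address placeholder, and target path from the scp example above, might be:
ssh -i private_key.ppk oracle@PUBLIC_CLOUD_IP_ADDRESS ls -lh /u01/app/oracle/admin/ORCL/dpdump/export.dmp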
Perform the following steps in your Oracle Database Cloud Service account.
- Start a SQL*Plus session, and create a Data Pump directory, DP_FROM_ONPREM, in the database.
Note: Remember to replace our generic CLOUD_DATABASE_NAME with your actual database name. (In this example, the database name is ORCL.)
Note: For speed and simplicity of this tutorial, we are defining this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point to that new directory.
CREATE OR REPLACE DIRECTORY DP_FROM_ONPREM as '/u01/app/oracle/admin/CLOUD_DATABASE_NAME/dpdump/';
- Set permissions on the DP_FROM_ONPREM directory, granting READ and WRITE to sys.
GRANT READ, WRITE ON DIRECTORY DP_FROM_ONPREM to sys;
- Create the users oecloud and shcloud in the cloud database instance.
CREATE USER oecloud IDENTIFIED BY oecloud;
CREATE USER shcloud IDENTIFIED BY shcloud;
- Grant CONNECT and RESOURCE to oecloud and shcloud.
GRANT CONNECT, RESOURCE to oecloud;
GRANT CONNECT, RESOURCE to shcloud;
- Exit from SQL*Plus, and import the oecloud and shcloud schemas from the export dump file into the Database Cloud Service database, using the impdp Data Pump import command. (If the migrate_11g_11g tablespace does not yet exist in the cloud database, see the note after this step.)
Note: Remember to replace LOCAL_DATABASE_NAME with your 11g cloud database name.
Note: When you invoke the impdp command, you are prompted for a password. Provide the “Administration” password that was used to create the Database Cloud Service database instance.
Note: The 2 errors listed at the end of the import can be ignored.
impdp system@LOCAL_DATABASE_NAME dumpfile=export.dmp SCHEMAS=oecloud,shcloud DIRECTORY=dp_from_onprem
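As the Scenario section notes, the cloud database needs a tablespace with the same name as the on-premises tablespace so that the imported tables have somewhere to be created. If migrate_11g_11g does not already exist in your cloud database, a minimal sketch, run as sys in SQL*Plus before the impdp step, might look like the following. The datafile path is only an illustration, not a path taken from this tutorial; adjust it to your cloud database's actual file layout, or omit the DATAFILE clause if your instance uses Oracle Managed Files.
-- Hypothetical sketch: create the target tablespace before the import.
-- Replace the datafile path to match your Database Cloud Service layout.
CREATE SMALLFILE TABLESPACE migrate_11g_11g
DATAFILE '/u02/app/oracle/oradata/CLOUD_DATABASE_NAME/migrate_11g_11g.dbf'
SIZE 52428800 AUTOEXTEND ON MAXSIZE 104857600;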
Verify that the tables and data from the 11g on-premises database migrated successfully to the 11g cloud database.
- Log in to SQL*Plus in the 11g cloud database to verify that the data from the on-premises 11g database has been imported into the 11g cloud database. The schema tables and records should now be accessible in the Database Cloud Service database.
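For example, queries along the following lines, run as system in the cloud database, should return the rows that the sample script inserted on the on-premises database; the hrcloud schema, which was excluded from the export, should not appear in the user list.
SELECT * FROM oecloud.orders;
SELECT * FROM shcloud.shipments;
SELECT username FROM dba_users WHERE username IN ('OECLOUD', 'SHCLOUD', 'HRCLOUD');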