
Migrating Oracle Database into AWS

Oracle Database migration using Data Pump from an external server into an Amazon EC2 instance or Amazon RDS

You can use Oracle Data Pump for several scenarios in AWS:

  • Import data from an Amazon EC2 instance with an Oracle database to an Oracle DB instance
  • Import data from a database on an Oracle DB instance to another Oracle DB instance
  • Import data from a database on an Oracle DB instance in a VPC to another Oracle DB instance with or without a VPC
  • Import data from a local Oracle database to an Amazon RDS DB instance

The following process uses Oracle Data Pump and the DBMS_FILE_TRANSFER package. The process connects to the source Oracle instance and exports the data with Oracle Data Pump. It then uses the DBMS_FILE_TRANSFER.PUT_FILE procedure to copy the dump file from the source instance to the DATA_PUMP_DIR directory on the target DB instance, which is reached through a database link. The final step imports the data from the copied dump file into the RDS instance.

The process has the following requirements:

  • You must have execute privileges on the DBMS_FILE_TRANSFER package
  • The target DB instance must be version 11.2.0.2.v6 or later
  • You must have write privileges to the DATA_PUMP_DIR directory on the source DB instance
  • You must ensure that you have enough storage space for the dump file on both the source instance and the target DB instance (a sizing query sketch follows this list)
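To estimate how large the dump file might be before exporting, you can total the schema's segment sizes on the source. This is a rough upper bound rather than an exact prediction; the owner TEST1 matches the user created in Step 1:

Estimate the schema size

SQL> select round(sum(bytes)/1024/1024) as size_mb from dba_segments where owner = 'TEST1';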

Note

This process copies a dump file into the DATA_PUMP_DIR directory, a pre-configured directory on all Oracle DB instances. This directory is located on the same storage volume as your data files. When you import the dump file, the existing Oracle data files will use more space, so make sure that your DB instance can accommodate that additional usage as well. Note that the imported dump file is not automatically deleted or purged from the DATA_PUMP_DIR directory; use UTL_FILE.FREMOVE to remove it.

The import process using Oracle Data Pump and the DBMS_FILE_TRANSFER package has the following steps:

  • Step 1: Grant privileges to user on source database
  • Step 2: Use DBMS_DATAPUMP to create a dump file
  • Step 3: Create a database link to the target DB instance
  • Step 4: Use DBMS_FILE_TRANSFER to copy the exported dump file to an Amazon RDS DB instance
  • Step 5: Create the necessary tablespace on the target instance
  • Step 6: Use Data Pump to import the data file on the DB instance
  • Step 7: Clean up

Step 1: Grant privileges to user on source database

Use SQL*Plus or Oracle SQL Developer to connect to the Oracle instance that contains the data to be imported. If necessary, create a user account and grant the necessary permissions.

The following commands create a new user and grant the necessary permissions:

Create user and grant privileges

SQL> create user test1 identified by test123;

SQL> grant create session, create table to test1;

SQL> alter user test1 quota 100M on users;

SQL> grant read, write on directory data_pump_dir to test1;

SQL> grant execute on dbms_datapump to test1;

You can use your own table, or you can create one to test the process, as sketched below.
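For example, the following statements create a small test table in the new schema and add a row. The table name tab1 and its columns are illustrative only:

Create a test table

SQL> create table test1.tab1 (id number primary key, name varchar2(50));

SQL> insert into test1.tab1 values (1, 'sample row');

SQL> commit;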

Step 2: Use DBMS_DATAPUMP to create a dump file

Use SQL*Plus or Oracle SQL Developer to connect to the Oracle instance and use the Oracle Data Pump utility to create a dump file. The following script creates a dump file named newtab.dmp in the DATA_PUMP_DIR directory.

Create a dump file in the directory

DECLARE
  new1 NUMBER;
BEGIN
  new1 := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE(handle => new1, filename => 'newtab.dmp', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => new1, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(new1, 'SCHEMA_EXPR', 'IN (''TEST1'')');
  DBMS_DATAPUMP.START_JOB(new1);
END;
/
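START_JOB returns as soon as the export is launched, and the job then runs in the background. One way to watch its progress, sketched below, is to poll the data dictionary; the exp.log file declared above also records the details:

Check the export job status

SQL> select owner_name, job_name, state from dba_datapump_jobs;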

Step 3: Create a database link to the target DB instance

Next, create a database link between your source instance and your target DB instance. Note that your local Oracle instance must have network connectivity to the DB instance in order to create a database link and to transfer your export file.

If you are creating a database link between two DB instances inside the same VPC or peered VPCs, the two DB instances should have a valid route between them. The security group of each DB instance must allow ingress to and egress from the other DB instance. The security group inbound and outbound rules can refer to security groups from the same VPC or a peered VPC. For more information, see Adjusting Database Links for Use with DB Instances in a VPC.

The following command creates a database link named to_rds that connects as the user USER2 on the target DB instance:


create database link to_rds connect to USER2 identified by user2pwd
using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<dns or ip address of remote db>)(PORT=<listener port>))(CONNECT_DATA=(SID=<SID of the target database>)))';
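Before copying the file, it is worth verifying that the link works. A quick check, assuming the remote credentials and connection details are correct, is to query the target through the link:

Test the database link

SQL> select sysdate from dual@to_rds;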

Step 4: Use DBMS_FILE_TRANSFER to copy the exported dump file to an Amazon RDS DB instance

Next, use DBMS_FILE_TRANSFER to copy the dump file from the source database instance to the target DB instance. The following script copies the dump file newtab.dmp from the source instance, over the database link to_rds created in the previous step, to a file named tab1_copied.dmp on the target:


BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'newtab.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'tab1_copied.dmp',
    destination_database         => 'to_rds');
END;
/
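To confirm that the copy arrived, you can connect to the target RDS instance and list the contents of DATA_PUMP_DIR using the same RDSADMIN.RDS_FILE_UTIL.LISTDIR utility shown later in the clean-up step:

List the copied file on the target

SQL> select filename, filesize, mtime from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR'));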

Step 5: Create the necessary tablespace on the target instance

You must create the tablespace before you can import the data. For more information, see Creating and Sizing Tablespaces.
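As a minimal sketch, the following creates a dedicated tablespace for the imported schema. The name test_data and the sizes are illustrative; on Amazon RDS, Oracle Managed Files places the data file for you:

Create a tablespace on the target

SQL> create tablespace test_data datafile size 100M autoextend on maxsize 1G;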

Step 6: Use Data Pump to import the data file on the DB instance

Use Oracle Data Pump to import the schema into the DB instance. The first line below shows the general format of the import statement, and the second imports the copied data file tab1_copied.dmp. Note that additional options such as REMAP_TABLESPACE might be required, as sketched after the listing.


impdp <username>@<TNS_ENTRY> DUMPFILE=<dump file name> DIRECTORY=DATA_PUMP_DIR full=y

 

impdp copy1@copy1 DUMPFILE=tab1_copied.dmp DIRECTORY=DATA_PUMP_DIR full=y
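If the source tablespace does not exist on the target, a REMAP_TABLESPACE clause redirects the imported segments. The tablespace names below are illustrative:

impdp copy1@copy1 DUMPFILE=tab1_copied.dmp DIRECTORY=DATA_PUMP_DIR REMAP_TABLESPACE=users:test_data full=y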

You can verify the data import by viewing the table on the DB instance.


SQL> select count(*) from test1.tab1;

Step 7: Clean up

After the data has been imported, you can delete the files you no longer want to keep. You can list the files in the DATA_PUMP_DIR using the following command:


select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

Note

RDSADMIN.RDS_FILE_UTIL.LISTDIR is not available in version 11.2.0.2.

The following command can be used to delete files in the DATA_PUMP_DIR that you no longer require:


exec utl_file.fremove('DATA_PUMP_DIR','[file name]');

For example, the following command deletes the file named test_dbms_lob.txt:


exec utl_file.fremove('DATA_PUMP_DIR','test_dbms_lob.txt');
