
Database Migration from On-Premise to AWS RDS

Table of Contents

Export the data at On-Premise Database using Data pump
Database Link creation for transfer of export dump to AWS RDS
To transfer the export dump file to the RDS environment
To monitor the data pump job progress for the file transfer
To connect to a DB instance using SQL Developer
Connecting to Your DB Instance Using SQL*Plus
To check the data pump file details in the RDS environment
Create new user and grant privileges in Amazon RDS
To import the export dump file to the RDS environment
To monitor the data pump job progress for the Import

Export the data at On-Premise Database using Data pump
Use the following Data Pump export command to export the data for the required schemas at the source (on-premise) database.

expdp system/<Password> schemas=<Schema Name> directory=DATA_PUMP_DIR dumpfile=<dump file name> logfile=<Log file name> version=<database version of the import target>
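
For illustration, a filled-in command might look like the following (the schema name, file names, and target version are hypothetical):

expdp system/<Password> schemas=HR directory=DATA_PUMP_DIR dumpfile=hr_export.dmp logfile=hr_export.log version=12.1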

Database Link creation for transfer of export dump to AWS RDS


We need to create a database link from the source on-premise environment to the target RDS environment. This database link will be used to transfer the export dump file from the on-premise environment to the data pump directory of the RDS environment.

create database link <database link name> connect to <master_user_account> identified by <password>
using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=<DNS name or IP address of the RDS instance>)(PORT=<listener port>))(CONNECT_DATA=(SID=<remote SID>)))';
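
For example, with a hypothetical link name, master user, and RDS endpoint, the statement could look like this:

create database link to_rds connect to admin identified by <password>
using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=mydb.abcdefg12345.us-east-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCL)))';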

Test the database link using the command below.

Select sysdate from dual@<database link name>;

To transfer the export dump file to the RDS environment


To transfer the export dump file from the on-premise environment to the RDS environment, execute the following DBMS_FILE_TRANSFER block over the database link.

BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => '<export dump file name>',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => '<destination export dump file name>',
    destination_database         => '<db link name>'
  );
END;
/

To monitor the data pump job progress for the file transfer
The following query shows the status of the data pump jobs on the source database.

select * from dba_datapump_jobs;
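
The DBMS_FILE_TRANSFER copy itself is not a data pump job; its progress may also be visible in V$SESSION_LONGOPS on the source database. A minimal sketch (the filter condition is illustrative):

select opname, sofar, totalwork, units, time_remaining
from v$session_longops
where totalwork > 0 and sofar <> totalwork;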

Now we need to connect to the RDS environment for the import-related activities.

To connect to a DB instance using SQL Developer


1. Start Oracle SQL Developer.
2. On the Connections tab, choose the add (+) icon.

3. In the New/Select Database Connection dialog box, provide the information for your DB
instance:
- For Connection Name, type a name that describes the connection, such as Oracle-RDS.
- For Username, type the name of the database administrator for the DB instance.
- For Password, type the password for the database administrator.
- For Hostname, type or paste the DNS name of the DB instance.
- For Port, type the port number.
- For SID, type the Oracle database SID.

4. Click Connect.
5. You can now start creating your own user and running queries against your DB instance and
databases as usual. To run a test query against your DB instance, do the following:
a. In the Worksheet tab for your connection, type the following SQL query:


SELECT NAME FROM V$DATABASE;

b. Click the execute icon to run the query.

SQL Developer returns the database name.

Connecting to Your DB Instance Using SQL*Plus
You can use a utility like SQL*Plus to connect to an Amazon RDS DB instance running Oracle. To connect to your DB instance, you need its DNS name and port number; both are shown on the DB instance details page in the Amazon RDS console.

Example: To connect to an Oracle DB instance using SQL*Plus

In the following examples, substitute the DNS name for your DB instance, and then include the port
number and the Oracle SID. The SID value is the name of the DB instance's database that you specified
when you created the DB instance, and not the name of the DB instance.

For Linux, OS X, or Unix:


sqlplus 'mydbusr@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dns_name)(PORT=port))(CONNECT_DATA=(SID=database_name)))'

For Windows:


sqlplus mydbusr@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dns_name)(PORT=port))(CONNECT_DATA=(SID=database_name)))
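
For instance, with a hypothetical endpoint, master user, and SID, and the default listener port, the Linux command might look like this:

sqlplus 'admin@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=mydb.abcdefg12345.us-east-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCL)))'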

You will see output similar to the following.

SQL*Plus: Release 12.1.0.2.0 Production on Mon Aug 21 09:42:20 2017

After you enter the password for the user, the SQL prompt appears.

SQL>

To check the data pump file details in the RDS environment
The following query lists the details of the dump files transferred to the DATA_PUMP_DIR directory of the RDS environment.

SELECT filename, filesize, to_char(mtime,'DD-MON-YYYY HH24:MI:SS')
FROM table(rdsadmin.rds_file_util.listdir('DATA_PUMP_DIR'));

Create new user and grant privileges in Amazon RDS

Create a separate user, apart from the master user that was created during RDS instance creation, and grant it the required privileges as shown below.

create user <user name> identified by <password>;
grant create session, resource to <user name>;
alter user <user name> quota <required space> on <tablespace name>;
grant read, write on directory DATA_PUMP_DIR to <user name>;
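
A filled-in illustration (the user name and tablespace are hypothetical):

create user app_user identified by <password>;
grant create session, resource to app_user;
alter user app_user quota unlimited on users;
grant read, write on directory DATA_PUMP_DIR to app_user;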

To import the export dump file to the RDS environment


To import the dump file into the RDS environment through the Data Pump API, execute the following block. The METADATA_REMAP call for REMAP_SCHEMA is needed only if the target schema is not the same as the source schema.

DECLARE
  hdnl NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => '<Dump File Name>', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  DBMS_DATAPUMP.METADATA_FILTER( hdnl, 'SCHEMA_EXPR', 'IN (''<Source schema name>'')');
  DBMS_DATAPUMP.METADATA_REMAP( hdnl, 'REMAP_SCHEMA', '<Source schema>', '<Target schema>');
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
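
Optionally, a log file can be attached to the same job so that import messages are written to DATA_PUMP_DIR; the file name below is illustrative:

-- add inside the block above, before DBMS_DATAPUMP.START_JOB
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'import.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);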

To monitor the data pump job progress for the Import


The following command provides the status of the data pump job.

select * from dba_datapump_jobs;

Once the import is complete, connect as the target user in the RDS environment and check whether all the objects have been imported correctly.
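
As a quick check, object counts per type in the target schema can be compared with the source, and the import log (if one was attached to the job) can be read directly from DATA_PUMP_DIR; the log file name is whatever was specified for the job:

select object_type, count(*) from dba_objects
where owner = upper('<Target schema>')
group by object_type order by object_type;

select * from table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','<import log file name>'));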

