Oracle Schema backup on AWS
Specification: Oracle Data Pump Schema Migration Between AWS Instances
1. Prerequisites
- Source and target Oracle databases running in AWS (RDS, EC2, or similar).
- Network connectivity from your admin workstation to both databases (or run everything from an EC2 host that can reach both).
- Admin privileges to create and use DIRECTORY objects on both source and target.
- Oracle client tools (`expdp`, `impdp`) installed where you run the export/import (on an EC2 server or on your local workstation if it can connect).
- Sufficient disk space on both source and target for the dump files.
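Before starting, it can be worth confirming that the client tools and connectivity are in place. A minimal check, using Easy Connect syntax with placeholder host and service names (not values from this guide):

# Verify the Data Pump client tools are installed and on the PATH
expdp help=y
impdp help=y

# Verify you can reach both databases (placeholder connect strings)
sqlplus your_db_user/your_password@//source-host:1521/source_db_service
sqlplus your_db_user/your_password@//target-host:1521/target_db_service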
2. Create a DIRECTORY Object
You need a DIRECTORY object in both source and target databases.
On each database, as a DBA:
CREATE OR REPLACE DIRECTORY dpdir AS '/tmp/dpdump';
GRANT READ, WRITE ON DIRECTORY dpdir TO your_db_user;
- The /tmp/dpdump directory must exist and be writable by the Oracle OS user if you are using EC2.
- On Amazon RDS you can only use pre-defined directories such as DATA_PUMP_DIR.
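To confirm the directory object exists and points where you expect, a quick check against the standard dictionary view (run as a DBA):

SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name IN ('DPDIR', 'DATA_PUMP_DIR');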
3. Export the Schema with Data Pump (`expdp`)
On the Source DB:
expdp your_db_user/your_password@source_db_service \
schemas=YOUR_SCHEMA \
directory=DPDIR \
dumpfile=your_schema_2024.dmp \
logfile=your_schema_2024_exp.log
- If using Amazon RDS, connect using the instance endpoint and use DATA_PUMP_DIR as the directory.
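For example, the same export against an RDS instance might look like this (the endpoint and service name are placeholders); note that expdp runs on the client but writes the dump file server-side, into DATA_PUMP_DIR on the RDS instance:

expdp your_db_user/your_password@//your-rds-endpoint.rds.amazonaws.com:1521/ORCL \
schemas=YOUR_SCHEMA \
directory=DATA_PUMP_DIR \
dumpfile=your_schema_2024.dmp \
logfile=your_schema_2024_exp.log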
4. Retrieve the Dump File
A. On EC2-hosted Oracle:
- The dump file is in the EC2 server's /tmp/dpdump directory.
- Use scp or sftp to transfer the file to your workstation, or directly to the target server.
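A minimal transfer sketch, assuming SSH access to both hosts (host names and key path are placeholders):

# Pull the dump file from the source EC2 host
scp -i ~/.ssh/your-key.pem ec2-user@source-ec2-host:/tmp/dpdump/your_schema_2024.dmp .
# Push it into the target EC2 host's dump directory
scp -i ~/.ssh/your-key.pem your_schema_2024.dmp ec2-user@target-ec2-host:/tmp/dpdump/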
B. On Amazon RDS:
- Use the RDS S3 integration feature to copy the file to and from an S3 bucket (the instance needs the S3_INTEGRATION option and an IAM role with access to the bucket):
- Use the rdsadmin.rdsadmin_s3_tasks PL/SQL package to copy files between DATA_PUMP_DIR and S3.
- Example (the bucket name and prefix are placeholders):
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
  p_bucket_name    => 'your-bucket',
  p_s3_prefix      => 'dumps/',
  p_prefix         => 'your_schema_2024.dmp',
  p_directory_name => 'DATA_PUMP_DIR'
) AS task_id FROM dual;
- To import, use rdsadmin.rdsadmin_s3_tasks.download_from_s3 on the target RDS instance.
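A matching download sketch on the target instance (again, the bucket name is a placeholder); both calls return a task ID and run asynchronously, so wait for the transfer to finish before importing:

SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
  p_bucket_name    => 'your-bucket',
  p_s3_prefix      => 'dumps/your_schema_2024.dmp',
  p_directory_name => 'DATA_PUMP_DIR'
) AS task_id FROM dual;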
5. Import the Schema on the Target DB
A. Make the dump file available in the target's dump directory:
- On EC2: copy the file into the target's dump directory (e.g., /tmp/dpdump).
- On RDS: download it from S3 into DATA_PUMP_DIR as described above.
B. Grant necessary permissions to the import user:
GRANT CREATE SESSION, CREATE TABLE, CREATE SEQUENCE, CREATE VIEW, ... TO your_db_user;
GRANT READ, WRITE ON DIRECTORY dpdir TO your_db_user;
C. Run the Import (`impdp`):
impdp your_db_user/your_password@target_db_service \
schemas=YOUR_SCHEMA \
directory=DPDIR \
dumpfile=your_schema_2024.dmp \
logfile=your_schema_2024_imp.log
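If the schema or tablespace names differ between source and target, impdp can remap them during the import; a sketch with placeholder target names:

impdp your_db_user/your_password@target_db_service \
schemas=YOUR_SCHEMA \
remap_schema=YOUR_SCHEMA:TARGET_SCHEMA \
remap_tablespace=SOURCE_TBS:TARGET_TBS \
directory=DPDIR \
dumpfile=your_schema_2024.dmp \
logfile=your_schema_2024_imp.log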
6. Post-Import Tasks
- Recompile invalid objects if needed:
EXEC DBMS_UTILITY.compile_schema('YOUR_SCHEMA');
- Check object counts, grants, and data.
- Resolve any external dependencies (DB links, directory paths, etc.).
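A couple of checks along those lines, using standard dictionary views (run as a DBA, or substitute all_objects):

-- Compare object counts per type with the source database
SELECT object_type, COUNT(*) FROM dba_objects
WHERE owner = 'YOUR_SCHEMA'
GROUP BY object_type ORDER BY object_type;

-- List anything still invalid after recompilation
SELECT object_name, object_type FROM dba_objects
WHERE owner = 'YOUR_SCHEMA' AND status = 'INVALID';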
7. Clean Up
- Remove dump/log files from /tmp or S3 as appropriate.
- Drop the DIRECTORY objects if no longer needed (optional).
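On RDS, where there is no OS shell, one way to remove the server-side dump file is UTL_FILE (a sketch; the file name matches the example above):

-- Remove the dump file from DATA_PUMP_DIR on an RDS instance
BEGIN
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'your_schema_2024.dmp');
END;
/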
Special Notes for AWS RDS
- Use only Amazon RDS allowed directories (DATA_PUMP_DIR).
- Use S3 integration for file movement.
- Procedures: RDS Oracle Data Pump and S3
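To see what is currently sitting in DATA_PUMP_DIR on an RDS instance (the rdsadmin.rds_file_util.listdir function ships with RDS for Oracle):

SELECT filename, filesize, mtime
FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'))
ORDER BY mtime;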
Summary Table
| Step | EC2-Hosted Oracle | Amazon RDS Oracle |
|---|---|---|
| Directory | Custom path (e.g., /tmp/dpdump) | DATA_PUMP_DIR only |
| File move | scp/sftp | RDS S3 integration |
| Tools | expdp/impdp CLI | expdp/impdp CLI or SQL*Plus |