<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.sdmxcloud.org/index.php?action=history&amp;feed=atom&amp;title=Oracle_Schema_backup_on_AWS</id>
	<title>Oracle Schema backup on AWS - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.sdmxcloud.org/index.php?action=history&amp;feed=atom&amp;title=Oracle_Schema_backup_on_AWS"/>
	<link rel="alternate" type="text/html" href="https://wiki.sdmxcloud.org/index.php?title=Oracle_Schema_backup_on_AWS&amp;action=history"/>
	<updated>2026-04-13T15:29:54Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.32.0</generator>
	<entry>
		<id>https://wiki.sdmxcloud.org/index.php?title=Oracle_Schema_backup_on_AWS&amp;diff=7662&amp;oldid=prev</id>
		<title>Dbagley: /* Summary Table */</title>
		<link rel="alternate" type="text/html" href="https://wiki.sdmxcloud.org/index.php?title=Oracle_Schema_backup_on_AWS&amp;diff=7662&amp;oldid=prev"/>
		<updated>2025-05-28T10:54:28Z</updated>

		<summary type="html">&lt;p&gt;‎&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;Summary Table&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 10:54, 28 May 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l140&quot; &gt;Line 140:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 140:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;----&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;----&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;''Need sample scripts for S3 file transfer or want a checklist for a particular AWS scenario? Just let me know!''&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Dbagley</name></author>
		
	</entry>
	<entry>
		<id>https://wiki.sdmxcloud.org/index.php?title=Oracle_Schema_backup_on_AWS&amp;diff=7659&amp;oldid=prev</id>
		<title>Dbagley: Created page with &quot;= Specification: Oracle Data Pump Schema Migration Between AWS Instances =  == 1. Prerequisites ==  * '''Source and target Oracle databases''' running in AWS (RDS, EC2, or sim...&quot;</title>
		<link rel="alternate" type="text/html" href="https://wiki.sdmxcloud.org/index.php?title=Oracle_Schema_backup_on_AWS&amp;diff=7659&amp;oldid=prev"/>
		<updated>2025-05-28T10:42:33Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;= Specification: Oracle Data Pump Schema Migration Between AWS Instances =  == 1. Prerequisites ==  * &amp;#039;&amp;#039;&amp;#039;Source and target Oracle databases&amp;#039;&amp;#039;&amp;#039; running in AWS (RDS, EC2, or sim...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;= Specification: Oracle Data Pump Schema Migration Between AWS Instances =&lt;br /&gt;
&lt;br /&gt;
== 1. Prerequisites ==&lt;br /&gt;
&lt;br /&gt;
* '''Source and target Oracle databases''' running in AWS (RDS, EC2, or similar).&lt;br /&gt;
* '''Network connectivity''' between your admin workstation and both databases (or work entirely on AWS EC2 if using that).&lt;br /&gt;
* '''Admin privileges''' to create and use DIRECTORY objects on both source and target.&lt;br /&gt;
* '''Oracle client tools (`expdp`, `impdp`)''' installed where you run the export/import (on an EC2 server or on your local workstation if it can connect).&lt;br /&gt;
* '''Sufficient disk space''' on both source and target for the dump files.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 2. Create a DIRECTORY Object ==&lt;br /&gt;
&lt;br /&gt;
You need a DIRECTORY object in both source and target databases.&lt;br /&gt;
&lt;br /&gt;
On each database, as a DBA:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CREATE OR REPLACE DIRECTORY dpdir AS '/tmp/dpdump';&lt;br /&gt;
GRANT READ, WRITE ON DIRECTORY dpdir TO your_db_user;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* The &amp;lt;code&amp;gt;/tmp/dpdump&amp;lt;/code&amp;gt; directory must exist and be writable by the Oracle OS user if using EC2.&lt;br /&gt;
* '''On Amazon RDS:''' You cannot run &amp;lt;code&amp;gt;CREATE DIRECTORY&amp;lt;/code&amp;gt; directly; use the pre-created &amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt;, or create a directory with &amp;lt;code&amp;gt;rdsadmin.rdsadmin_util.create_directory&amp;lt;/code&amp;gt;.&lt;br /&gt;
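On an EC2 host, preparing the OS directory might look like the following (the &amp;lt;code&amp;gt;oracle:oinstall&amp;lt;/code&amp;gt; owner is a typical default and may differ on your install):&lt;br /&gt;
&amp;lt;pre lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
sudo mkdir -p /tmp/dpdump&lt;br /&gt;
sudo chown oracle:oinstall /tmp/dpdump&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;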
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 3. Export the Schema with Data Pump (`expdp`) ==&lt;br /&gt;
&lt;br /&gt;
On the Source DB:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
expdp your_db_user/your_password@source_db_service \&lt;br /&gt;
  schemas=YOUR_SCHEMA \&lt;br /&gt;
  directory=DPDIR \&lt;br /&gt;
  dumpfile=your_schema_2024.dmp \&lt;br /&gt;
  logfile=your_schema_2024_exp.log&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* If using '''Amazon RDS''', connect using the endpoint and use &amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt; as the directory.&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 4. Retrieve the Dump File ==&lt;br /&gt;
&lt;br /&gt;
=== A. On EC2-hosted Oracle: ===&lt;br /&gt;
* The dump file is on the EC2 server’s &amp;lt;code&amp;gt;/tmp/dpdump&amp;lt;/code&amp;gt; directory.&lt;br /&gt;
* Use &amp;lt;code&amp;gt;scp&amp;lt;/code&amp;gt; or &amp;lt;code&amp;gt;sftp&amp;lt;/code&amp;gt; to transfer the file to your workstation, or directly to the target server.&lt;br /&gt;
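An illustrative transfer from the source host (host name and key path are placeholders):&lt;br /&gt;
&amp;lt;pre lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
scp -i ~/.ssh/your-key.pem /tmp/dpdump/your_schema_2024.dmp \&lt;br /&gt;
  ec2-user@target-host:/tmp/dpdump/&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;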
&lt;br /&gt;
=== B. On Amazon RDS: ===&lt;br /&gt;
* Use the AWS console or [https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html RDS procedures] to copy the file to and from an S3 bucket:&lt;br /&gt;
** Use the &amp;lt;code&amp;gt;rdsadmin.rdsadmin_s3_tasks&amp;lt;/code&amp;gt; PL/SQL package to copy files between &amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt; and S3 (the instance needs the &amp;lt;code&amp;gt;S3_INTEGRATION&amp;lt;/code&amp;gt; option and an IAM role with access to the bucket).&lt;br /&gt;
** Example:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(&lt;br /&gt;
  p_bucket_name    =&amp;gt; 'your-bucket',&lt;br /&gt;
  p_s3_prefix      =&amp;gt; 'dumps/',&lt;br /&gt;
  p_prefix         =&amp;gt; 'your_schema_2024.dmp',&lt;br /&gt;
  p_directory_name =&amp;gt; 'DATA_PUMP_DIR')&lt;br /&gt;
  AS task_id FROM dual;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
** To import, use &amp;lt;code&amp;gt;download_from_s3&amp;lt;/code&amp;gt; on the target RDS instance.&lt;br /&gt;
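** For example, on the target instance (bucket and file names are placeholders):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(&lt;br /&gt;
  p_bucket_name    =&amp;gt; 'your-bucket',&lt;br /&gt;
  p_s3_prefix      =&amp;gt; 'dumps/your_schema_2024.dmp',&lt;br /&gt;
  p_directory_name =&amp;gt; 'DATA_PUMP_DIR')&lt;br /&gt;
  AS task_id FROM dual;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;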
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 5. Import the Schema on the Target DB ==&lt;br /&gt;
&lt;br /&gt;
=== A. Make the dump file available in the target's dump directory: ===&lt;br /&gt;
* On EC2: Copy file into the target’s dump directory (e.g., &amp;lt;code&amp;gt;/tmp/dpdump&amp;lt;/code&amp;gt;).&lt;br /&gt;
* On RDS: Download from S3 to &amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt; as above.&lt;br /&gt;
&lt;br /&gt;
=== B. Grant necessary permissions to the import user: ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GRANT CREATE SESSION, CREATE TABLE, CREATE SEQUENCE, CREATE VIEW, ... TO your_db_user;&lt;br /&gt;
GRANT READ, WRITE ON DIRECTORY dpdir TO your_db_user;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== C. Run the Import (`impdp`): ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
impdp your_db_user/your_password@target_db_service \&lt;br /&gt;
  schemas=YOUR_SCHEMA \&lt;br /&gt;
  directory=DPDIR \&lt;br /&gt;
  dumpfile=your_schema_2024.dmp \&lt;br /&gt;
  logfile=your_schema_2024_imp.log&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 6. Post-Import Tasks ==&lt;br /&gt;
&lt;br /&gt;
* Recompile invalid objects if needed:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
EXEC DBMS_UTILITY.compile_schema('YOUR_SCHEMA');&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* Check object counts, grants, and data.&lt;br /&gt;
* Resolve any external dependencies (DB links, directory paths, etc.).&lt;br /&gt;
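A quick check for leftover invalid objects (run as a DBA; the schema name is a placeholder):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
SELECT object_type, COUNT(*)&lt;br /&gt;
  FROM dba_objects&lt;br /&gt;
 WHERE owner = 'YOUR_SCHEMA'&lt;br /&gt;
   AND status = 'INVALID'&lt;br /&gt;
 GROUP BY object_type;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;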
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== 7. Clean Up ==&lt;br /&gt;
&lt;br /&gt;
* Remove dump/log files from &amp;lt;code&amp;gt;/tmp&amp;lt;/code&amp;gt; or S3 as appropriate.&lt;br /&gt;
* Drop DIRECTORY objects if no longer needed (optional).&lt;br /&gt;
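On RDS, dump files left in &amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt; can be removed with &amp;lt;code&amp;gt;UTL_FILE&amp;lt;/code&amp;gt; (file name is a placeholder):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'your_schema_2024.dmp');&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;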
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Special Notes for AWS RDS ==&lt;br /&gt;
&lt;br /&gt;
* Use only Amazon RDS allowed directories (&amp;lt;code&amp;gt;DATA_PUMP_DIR&amp;lt;/code&amp;gt;).&lt;br /&gt;
* Use '''S3 integration''' for file movement.&lt;br /&gt;
* Procedures: [https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html RDS Oracle Data Pump and S3]&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
== Summary Table ==&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Step&lt;br /&gt;
! EC2-Hosted Oracle&lt;br /&gt;
! Amazon RDS Oracle&lt;br /&gt;
|-&lt;br /&gt;
| Directory&lt;br /&gt;
| Custom path (e.g. /tmp/dpdump)&lt;br /&gt;
| Use DATA_PUMP_DIR only&lt;br /&gt;
|-&lt;br /&gt;
| File move&lt;br /&gt;
| SCP/SFTP&lt;br /&gt;
| RDS S3 integration&lt;br /&gt;
|-&lt;br /&gt;
| Tools&lt;br /&gt;
| expdp/impdp CLI&lt;br /&gt;
| expdp/impdp CLI or SQL*Plus&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Dbagley</name></author>
		
	</entry>
</feed>