OCI-GovCloud Migration: Data Pump Import

We are migrating from AWS to Oracle Cloud Infrastructure (OCI) during the fall/winter of 2023. Our checklist for success involves the following:

• A multi-domain custom URL that permits users to log into the correct applications and gives developers the access they need for database tools, including APEX
• Migrate a database pump file from AWS to OCI
• Install a database pump file on OCI into specific schemas
• Have functioning applications (kind of obvious, huh?)
• Be able to send email from Oracle APEX to users

The objective during this stage of our migration involves migrating the database pump file from AWS to OCI.

    OCI GovCloud

    As mentioned in the lead article in this series, we are migrating multiple commercial applications from AWS to Oracle Cloud Infrastructure using its GovCloud environment. GovCloud meets the U.S. federal government’s standards outlined in FedRAMP, the Federal Risk and Authorization Management Program. You can read more here: https://www.gsa.gov/technology/government-it-initiatives/fedramp

    Regrettably, my colleague Eli Duvall and I have struggled to find detailed instructions on what works and what does not in OCI GovCloud. We therefore rely on Oracle’s blogs, Oracle’s documentation, and blogs written by others. Many of the techniques we found failed. We think our failures are due to the stricter security parameters that come with FedRAMP/GovCloud.

    Our process throughout is to use the most common techniques, preferably those native to OCI. If that fails, find another solution, but without introducing third-party tools, apps, or such. If that fails, kick it old school and do it by hand.

    Regrettably, installing a database pump file from an external database such as our AWS instance has required a significant number of manual steps. We look longingly at the cool shortcuts as we step past them to do it the hard way. So don’t get all judge-y about our efforts. If simple worked, we would do simple.

    Do This First, It Is Better

    …and it did not work for us…

    Our OCI landing page for our Autonomous Database has a Data Pump feature.

    When you click in, you then select a Bucket. There is a great article written by That Jeff Smith on this technique.

    We were not successful after several tries. Please try this technique first; it is better, faster, and more efficient.

    The Hard And Slow Way

    Step 0 – Confirmation you are ready

    We left off this adventure knowing that our database dump file is located locally in our data_pump_dir on our ADB. If unsure, jump back to this article: OCI Migration: Copy from S3 to OCI. The test for moving forward with this step is confirming that the desired backup file (*.dmp) exists in the data_pump_dir.

    select * from dbms_cloud.list_files('DATA_PUMP_DIR');

    Note that this file needs to be the same size as the original file. The copy process can be slow, and you can watch the file size increase minute by minute.
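
    Here is a minimal sketch for watching the copy, using the standard columns returned by dbms_cloud.list_files (object_name, bytes, last_modified); rerun it until bytes stops growing and matches the source file:

    -- List dump files and their current size; rerun until bytes matches the source.
    select object_name, bytes, last_modified
      from dbms_cloud.list_files('DATA_PUMP_DIR')
     where object_name like '%.DMP';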

    Step 1 – Prepare your Import Job

    Back in the old days, we ran the Data Pump feature from the command line of the underlying operating system. With ADB, we do not have access to the operating system. Instead, the folks at Oracle provide a PL/SQL package called dbms_datapump. Go enjoy this read:

    https://docs.oracle.com/en/database/oracle/oracle-database/19/arpls/DBMS_DATAPUMP.html#GUID-AEA7ED80-DB4A-4A70-B199-592287206348

    In short, we are going to create a script that generates a data pump job that runs immediately in the background. This script goes through several actions, including:

    1. Create the database import job with a generated job name
    2. Filter, as needed
    3. Open the backup file (*.DMP) from the data_pump_dir directory
    4. Provide the name of the import log file
    5. Perform any remapping that is needed
      • Tablespace remapping
      • Schema remapping
    6. Start the job

    What you’ll need

    You’ll need the following data elements for the text file/SQL script:

    • Username/Schema name
    • Old (source) schema name
    • Old (source) tablespace name
    • Dump file name
    • Log file name

    Step 2 – Copy Script to Notepad

    Copy this script to your favorite text editor. Modify the variables at the top to suit your needs.

    set serveroutput on
    declare
    	l_username		varchar2(30)	:= 'MYAPP';
    	l_old_ts_name		varchar2(50)	:= 'MY_TS';
    	l_old_schema		varchar2(50)	:= 'OLD_SCHEMA';
    	l_dump_file		varchar2(50)	:= 'MYAPP_20231107.DMP';
    	l_log_file		varchar2(50)	:= 'MYAPP_20231107.log';
    	l_dp_handle		number;
    	l_result		varchar2(32767);
    begin
      -- Open a schema import job.
      l_dp_handle := dbms_datapump.open(
        operation   => 'IMPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => dbms_scheduler.generate_job_name);
    
      -- Specify the schema to be imported. This filter matches the schema
      -- name as recorded in the dump file, so use the old (source) schema.
      dbms_datapump.metadata_filter(
        handle => l_dp_handle,
        name   => 'SCHEMA_EXPR',
        value  => '= ''' || l_old_schema || '''');
    
      -- Specify the dump file name and directory object name.
      dbms_datapump.add_file(
        handle    => l_dp_handle,
        filename  => l_dump_file,
        directory => 'DATA_PUMP_DIR');
    
      -- Specify the log file name and directory object name.
      dbms_datapump.add_file(
        handle    => l_dp_handle,
        filename  => l_log_file,
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    
      -- Perform a REMAP_TABLESPACE from the old tablespace to DATA.
      dbms_datapump.metadata_remap(
        handle     => l_dp_handle,
        name       => 'REMAP_TABLESPACE',
        old_value  => l_old_ts_name,
        value      => 'DATA');
    		
      -- Perform a REMAP_SCHEMA from the old schema to the new username.
      dbms_datapump.metadata_remap(
        handle     => l_dp_handle,
        name       => 'REMAP_SCHEMA',
        old_value  => l_old_schema,
        value      => l_username);		
    
      dbms_datapump.start_job(l_dp_handle);
    
      dbms_datapump.wait_for_job (
        handle     => l_dp_handle,
        job_state  => l_result);
      
      dbms_output.put_line('l_result=' || l_result);
    end;
    /
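
    If all goes well, wait_for_job hands back a final state of COMPLETED in l_result. Anything else (such as STOPPED) means it is time to pull the import log file from the data_pump_dir and read it.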

    Step 3 – Create your Schema

    Everybody does this a bit differently. Do it your way. It ought to include:

    • create user xxx identified by xpasswordx
    • grants as needed

    If you need help, do a quick search on “oracle create user”. I’ve got to assume that if you are this deep into running this migration yourself, you likely saved your old scripts for creating users, or you can just whip them up.
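
    As a minimal sketch, assuming the MYAPP schema name and the DATA tablespace used in the import script; the password, roles, and quota are placeholders to adapt:

    -- Hypothetical example: adjust name, password, roles, and quota to your needs.
    create user myapp identified by "Change_Me_2023";
    grant create session, resource to myapp;
    alter user myapp quota unlimited on data;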

    Step 4 – Run Script

    As the ADMIN user in SQL Developer Web, connected to your target ADB, paste the script and run it. Maybe cross your fingers?
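
    Because the script ends with wait_for_job, your session blocks until the import finishes. From a second session, you can watch progress with a query like this (assuming you are connected with ADMIN privileges):

    -- Watch the running Data Pump job from another session.
    select owner_name, job_name, operation, state
      from dba_datapump_jobs;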

    Step 5 – Confirmation and Clean up

    As with any database migration, we run into problems with grants, synonyms, database links, and stuff we forgot we did. We have to budget a half-day to a full day to confirm that all of our views, packages, and everything else compile successfully.
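
    A sketch of the confirmation pass we mean, again assuming the MYAPP schema: list whatever failed to compile, then recompile the schema in dependency order:

    -- Find anything in the imported schema that did not compile.
    select object_type, object_name
      from all_objects
     where owner = 'MYAPP'
       and status = 'INVALID'
     order by object_type, object_name;

    -- Recompile invalid objects in dependency order.
    begin
      dbms_utility.compile_schema(schema => 'MYAPP', compile_all => false);
    end;
    /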

    Posted by Christina Moore

    First database application during undergrad years: 1985. First full-time work in Oracle db: 1995 at FedEx in the land of international shipping and regulatory compliance. First APEX application: 2008 with a U.S.-based wholesale grocery distribution outfit (logistics, finance, and regulatory compliance). First commercial APEX application: 2012 (time tracking/invoices, grant management/finances/regulatory compliance). There seems to be a trend with finance, regulatory compliance, and logistics.