In this blog post I demonstrate how to import Oracle Database schemas from dump files into Autonomous Data Warehouse (ADW) using the SQL Developer Data Import Wizard.
Quick links to other posts in this series:
- Export Data into Dump Files
- Transfer Data Dump Files over to Oracle Cloud Object Storage
- Import Data Dump Files into ADW Using SQL Developer (this post)
A couple of prerequisites before we run the SQL Developer Data Pump Import Wizard:
- Dump files have already been transferred to Oracle Cloud Object Storage. Refer to this blog post on how to do this for large files that would time out in an internet browser.
- Create ADW credentials to connect to the Object Store bucket:
- Create an Authentication Token in Object Storage in order to be able to access Object Storage from ADW.
The Authentication Token is specific to an Oracle Cloud user, so from the upper-left menu of the Infrastructure Dashboard, navigate to Identity and then Users.
Select the user. In my case it is: firstname.lastname@example.org
Under Resources, select Auth Tokens and click the Generate Token button.
Make sure to copy the generated token when it is displayed, because it won't be accessible again. We will use this token in the next step as the password when creating the ADW credential for the Object Store.
- Run the following script against ADW to create the credential for the Object Store. Note that the password is the Authentication Token generated in the previous step, not the Oracle Cloud user's password.

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ADW_CREDENTIAL',
    username        => 'email@example.com',
    password        => '<Enter the Authentication token here>' );
END;
/
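After running the script, it's worth a quick sanity check that the credential was created and is enabled. A minimal query against the standard DBA_CREDENTIALS dictionary view (the credential name below assumes the one used in the script above):

```sql
-- Sanity check: confirm the credential exists and is enabled
SELECT credential_name, username, enabled
  FROM dba_credentials
 WHERE credential_name = 'ADW_CREDENTIAL';
```

You can also exercise the credential end-to-end with DBMS_CLOUD.LIST_OBJECTS, passing the credential name and your bucket URI, to confirm ADW can actually reach the Object Store.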
SQL Developer ADW Data Pump Import Wizard
To access the Data Pump Import Wizard in SQL Developer, first display the DBA window from the View menu. Add the connection to the ADW database you want to import into. Then, under that connection, navigate down to Data Pump -> Import Jobs. Right-click and select Run Data Import Wizard.
There are a few different types of data pump imports. In this example, I chose to do a Schema import. This allows me to merge several schemas from different source databases into the same target ADW.
The ADW credential created earlier (the second prerequisite in this post) is required so that ADW can access Oracle Object Storage.
In addition, it is assumed that at this point the DMP file(s) to be imported into ADW are already available on Oracle Cloud Object Storage.

For SQL Developer to access the DMP files from Object Storage, we need to create a Pre-authenticated Request for each file that will be used in the import. Creating a Pre-authenticated Request for an object in Oracle Object Storage produces a URI to that object, which we can then use in the SQL Developer Import Wizard under "Choose Input Files".

To create a Pre-authenticated Request for an object, in this case a DMP file, navigate to Oracle Object Storage and click on the bucket that contains the files. Next to the DMP file that will be imported, click on the menu and select "Create Pre-authenticated Request". Make sure to copy the generated URI so you can use it in SQL Developer. If you are importing from multiple DMP files simultaneously, repeat the Pre-authenticated Request process for each DMP file and note down each file's URI.
Back in the SQL Developer Import Wizard, set the location of the DMP file(s) by selecting the credential you created from the drop-down list and providing the pre-authenticated URI of the dump file(s). If the database dump consists of multiple files, add a separate row for each DMP file under "Choose Input Files" and enter the corresponding pre-authenticated URI for each one.
Example of a pre-authenticated object URI:
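The console generates the actual URI for you; as a sketch, pre-authenticated request URIs generally follow this pattern (every segment below is a placeholder, not a real value):

```
https://objectstorage.<region>.oraclecloud.com/p/<PAR-token>/n/<namespace>/b/<bucket>/o/<filename>.dmp
```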
The credential connection is successful if you see the DMP schema(s) listed on the following screen. Select the source schema(s) that you would like to import into ADW.
In the following screen, I chose not to re-map the schema because I wanted the target schema to be created with the same name as the source schema. Note that it's better not to create the schema manually before running the Import Wizard; let the import job create it. I mapped the source tablespace to the tablespace I want to use in ADW, which in this case is called "DATA".
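For reference, the wizard builds a Data Pump job roughly equivalent to running impdp against ADW with a credential and a tablespace remap. The sketch below is illustrative only: the parameter names are standard impdp options, but the service name, URI segments, and source tablespace are placeholders, and the credential name assumes the one created earlier in this post:

```
impdp admin@<adw_service>_high \
  credential=ADW_CREDENTIAL \
  dumpfile=https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<dumpfile>.dmp \
  remap_tablespace=<source_tablespace>:DATA \
  transform=segment_attributes:n \
  partition_options=merge
```

The remap_tablespace parameter mirrors the tablespace mapping chosen in the wizard, and transform=segment_attributes:n drops on-premises storage clauses that don't apply in ADW.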
While the import job is running, we can easily monitor the log by clicking on the import job name in the DBA Import Jobs list. You can set a refresh interval on the Log File window to watch for errors and see which tables are being loaded and how much data has been transferred.
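Besides the Log File window, running jobs can also be checked from SQL using the standard DBA_DATAPUMP_JOBS dictionary view:

```sql
-- List Data Pump jobs and their current state
SELECT owner_name, job_name, operation, job_mode, state, degree
  FROM dba_datapump_jobs;
```

The STATE column shows values such as EXECUTING or NOT RUNNING, which is a quick way to confirm the import is still progressing.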
The import job duration depends on how large the dump file(s) are. SQL Developer doesn’t have to remain open for the import to continue as the job is already executing on ADW itself and sourcing data from the Object Store.