10 Exporting and Importing Metadata and Data

This chapter describes how to export (unload) data and metadata from, and import (load) them into, Oracle Database XE. You can export and import metadata (database object definitions), data, or both. Data can be exported for later importing into another Oracle database or into a non-Oracle database.
Data that has been unloaded from a non-Oracle database can be loaded into an Oracle database, provided the data is in a suitable format for loading. For convenience and the range of features available, you may want to use SQL Developer for export and import operations unless you need to use another tool (command-line utility).

Using SQL Developer for Exporting and Importing

SQL Developer provides convenient wizards for exporting and importing metadata and data:

To export metadata or data, or both, use the Export Wizard: click Tools, then Database Export.

To import metadata or data, or both, use an appropriate method depending on how the material to be imported was created, or on the format of the data to be imported.
This method might be running a script file, or using the Data Import Wizard to import from a data file (such as a .csv file or a Microsoft Excel .xls file). See the following examples of using SQL Developer for performing export and import operations.

Example: Exporting Metadata and Data for a Table

Figure 10-1 Export Wizard: Source/Destination

Accept the default values for the Source/Destination page options, except as follows:

Connection: Select HR.

Show Schema: Deselect (uncheck) this option, so that the HR schema name is not included in the CREATE and INSERT statements in the .sql script file that will be created.
(This enables you to re-create the table in a schema with any name, not just one named HR.)

Save As location: Enter or browse to a desired folder on your local hard drive, and specify the file name for the script file. (In the figure, this file is C:\temp\export.sql.) The script file containing the CREATE and INSERT statements will be created in this location.

Note: For explanations of the options on this or any other wizard page, click the Help button. For example, Format has other possible values besides the default insert, which causes SQL INSERT statements to be included to insert the data. Other values include loader, which causes SQL*Loader files to be created, and xls, which causes a Microsoft Excel .xls file to be created.

Click Next.

On the Types to Export page, deselect Toggle All, then select only Tables (because you want to export only a table).

Click Next.

On the Specify Objects page, click Lookup, then double-click the REGIONS table on the left to move it to the right-hand column.
Figure 10-2 shows the result of these actions.

Figure 10-2 Export Wizard: Specify Objects

Click Next.

On the Specify Data page, accept the defaults and click Next. By default, all data from the specified table or tables is exported; however, if you want to limit the data to be exported, you can specify one or more WHERE clauses in the bottom part of this page.

On the Summary page, review the information; if it is what you want, click Finish. (Given what you specified, this causes the export script to be created as C:\temp\export.sql.) If you need to make any changes, go back to the appropriate page or pages, make them, and then move forward to the Summary page again.
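The generated script might look similar to the following sketch. The exact DDL that SQL Developer emits depends on your version and options, but the REGIONS column definitions and the four rows shown here are those of the HR sample schema:

```sql
-- Hypothetical contents of C:\temp\export.sql. Show Schema was
-- deselected, so no "HR." prefix appears before the table name.
CREATE TABLE REGIONS
  ( REGION_ID   NUMBER NOT NULL,
    REGION_NAME VARCHAR2(25)
  );

INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (1, 'Europe');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (2, 'Americas');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (3, 'Asia');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (4, 'Middle East and Africa');
COMMIT;
```

Because the script contains no schema prefix, it can be run from any connection to re-create the table in that connection's schema.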
Example: Importing Metadata and Data Using a Script File

Assume that you want to re-create the REGIONS table that you exported, but in a different schema. This other schema can be an existing one or one that you create. For example, assume that you created a user named NICK following the instructions in. To re-create the REGIONS table in the schema of user NICK by invoking the script in C:\temp\export.sql, follow these steps using SQL Developer:

If you have not already created a database connection for NICK, create the connection.

Open the NICK connection.

In the SQL Worksheet for the NICK connection, type the following:

@c:\temp\export.sql

Click the Run Script icon. The Script Output pane shows that the REGIONS table has been created and four rows have been inserted.

In the Connections navigator, expand the Tables node under the NICK connection.
You now see the REGIONS table. Optionally, click the REGIONS table in the Connections navigator and examine the information under the Columns and Data tabs in the main display area.

Example: Exporting Data to a Microsoft Excel File

Assume that you want to export only the data from the REGIONS table, which is part of the HR sample schema, so that the data can be imported into a table with the same column definitions. This might be a REGIONS table in another schema (either in the same Oracle database or in another Oracle database). You use the same Database Export wizard, but export only the data, not the DDL (Data Definition Language statements for creating database objects).

To export the data in the REGIONS table:

In SQL Developer, click Tools, then Database Export. Figure 10-3 shows the first page of the Export Wizard, with entries reflecting selections that you will make.
Figure 10-3 Export Wizard: Source/Destination Specifying Data Export Only

Accept the default values for the Source/Destination page options, except as follows:

Connection: Select HR.

Export DDL: Deselect (uncheck) this option. If a .sql script file is generated (which will not happen in this example), it will contain only INSERT statements, not CREATE statements.

Format: Select xls to have the data saved to a Microsoft Excel .xls file.

Save As location: Enter or browse to a desired folder on your local hard drive, and specify the file name for the .xls file. (In the figure, this file is C:\temp\export.xls.)

Click Next.

On the Types to Export page, deselect Toggle All, then select only Tables (because you want to export data only for a table).

Click Next.

On the Specify Objects page, click Lookup, then double-click the REGIONS table on the left to have it appear in a row in the bottom part of the page. Figure 10-4 shows the result of these actions.

Figure 10-4 Export Wizard: Specify Objects for Exporting Data

By default, all data from the specified table or tables is exported; however, if you want to limit the data to be exported, you can specify one or more WHERE clauses in the bottom part of this page.

Click Next.

On the Summary page, review the information; if it is what you want, click Finish. (Given what you specified, this causes the data in the REGIONS table to be exported to the file C:\temp\export.xls.) If you need to make any changes, go back to the appropriate page or pages, make them, and then move forward to the Summary page again.
Example: Importing Data from a Microsoft Excel File

Assume that you want to import the data that was exported in, into a new table that has the same column definitions as the original (REGIONS) table. For example, assume that you created a user named NICK following the instructions in. This user wants to take the exported data, add one row in the Excel file, and import it into a new table that has the same column definitions as the REGIONS table.

Figure 10-5 Microsoft Excel File with Exported Data (Modified)

Save and close the Microsoft Excel .xls file.

In SQL Developer, in the Connections navigator display for NICK, right-click the NEWREGIONS table and select Import Data.

In the dialog box that is displayed, navigate to the c:\temp folder, select export.xls, and click Open.

In the Data Import Wizard, accept all the defaults; click Next on each page until the Summary page, and click Finish there. (For information about the options on any wizard page, click the Help button.) The data from the .xls file is loaded into the NEWREGIONS table and is committed.

Table 10-2 Import/Export Scenarios and Recommended Options

Scenario: You have to load data that is not delimited. The records are fixed length, and field definitions depend on column positions.
Recommended option: SQL*Loader

Scenario: You have tab-delimited text data to load, and there are more than 10 tables.
Recommended option: SQL*Loader

Scenario: You have text data to load, and you want to load only records that meet certain selection criteria (for example, only records for employees in department number 3001).
Recommended option: SQL*Loader

Scenario: You want to import or export an entire schema from or to another Oracle database. There is no XMLType data in any of the data.
Recommended option: Data Pump Export and Data Pump Import

Scenario: You want to import or export data from or to another Oracle database. The data contains XMLType data and contains no FLOAT or DOUBLE data types.
Recommended option: Import (imp) and Export (exp)

Loading Data with SQL*Loader

SQL*Loader loads data from external datafiles into tables of an Oracle database. A particular datafile can be in fixed record format, variable record format, or stream record format (the default).

The input for a typical SQL*Loader session is a control file, which controls the behavior of SQL*Loader, and some data, located either at the end of the control file itself or in a separate datafile.

The output of a SQL*Loader session is an Oracle database (where the data is loaded), a log file, a "bad" file, and potentially a discard file. The log file contains a detailed summary of the load, including a description of any errors that occurred during the load. The bad file contains records that were rejected, either by SQL*Loader or by the Oracle database. The discard file contains records that were filtered out of the load because they did not match any record-selection criteria specified in the control file.
Methods SQL*Loader Uses to Load Data

SQL*Loader uses three different methods to load data, depending on the situation: conventional path, direct path, and external tables.

Conventional Path

A conventional path load is the default loading method. It executes SQL INSERT statements to populate tables in an Oracle database. This method can sometimes be slower than other methods because extra overhead is added as SQL statements are generated, passed to Oracle, and executed.
It can also be slower because, when SQL*Loader performs a conventional path load, it competes equally with all other processes for buffer resources.

Direct Path

A direct path load does not compete with other users for database resources. It eliminates much of the Oracle database overhead by formatting Oracle data blocks and writing them directly to the database files, bypassing much of the data processing that normally takes place. Therefore, a direct path load can usually load data faster than a conventional path load.
However, there are several restrictions on direct path loads that may require you to use a conventional path load. For example, direct path load cannot be used on clustered tables or on tables with pending transactions. See for a complete discussion of situations in which direct path load should and should not be used.

External Tables

An external table load creates an external table for data that is contained in a datafile. The load executes INSERT statements to insert the data from the datafile into the target table.
An external table load allows modification of the data being loaded by using SQL functions and PL/SQL functions as part of the INSERT statement that is used to create the external table. See for more information on external tables.

SQL*Loader Features

You can use SQL*Loader to do the following:

Load data across a network.

Example: Using SQL*Loader

In the following example, a new table named dependents will be created in the HR sample schema. It will contain information about dependents of employees listed in the employees table of the HR schema.
After the table is created, SQL*Loader will be used to load data about the dependents from a flat data file into the dependents table. This example requires a data file and a SQL*Loader control file, which you will create in the first two steps.

Create the data file, dependents.dat, in your current working directory. You can create this file using a variety of methods, such as a spreadsheet application, or by simply typing it into a text editor. It should have the following content:

100,"Susan, Susie",Kochhar,17-JUN-1997,daughter,101,NULL,
102,David,Kochhar,02-APR-1999,son,101,NULL,
104,Jill,Colmenares,10-FEB-1992,daughter,119,NULL,
106,"Victoria, Vicki",Chen,17-JUN-1997,daughter,110,NULL,
108,"Donald, Donnie",Weiss,24-OCT-1989,son,120,NULL,

This file is a CSV (comma-separated values) file in which the commas act as delimiters between the fields. The field containing the first name is enclosed in double quotation marks in cases where a variant of the official name is also provided; that is, where the first name field contains a comma.

Create the SQL*Loader control file, dependents.ctl, in your current working directory. You can create this file with any text editor.
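The control file content is not shown in this excerpt. A minimal sketch might look like the following; the column names are assumptions chosen to match the seven fields per record in dependents.dat, since the dependents table definition is not shown here:

```sql
-- Hypothetical dependents.ctl; the column names are assumptions.
LOAD DATA
INFILE 'dependents.dat'
INTO TABLE dependents
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( dependent_id,
  first_name,
  last_name,
  birth_date DATE "DD-MON-YYYY",
  relationship,
  employee_id,
  benefits )
```

You would then invoke SQL*Loader from the operating system command line, for example: sqlldr hr CONTROL=dependents.ctl LOG=dependents.log (SQL*Loader prompts for the password). Rejected records, if any, are written to dependents.bad by default.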
Exporting and Importing with Data Pump Export and Data Pump Import

The Data Pump Export utility exports data and metadata into a set of operating system files called a dump file set. The Data Pump Import utility imports an export dump file set into a target Oracle database. A dump file set is made up of one or more disk files that contain table data, database object metadata, and control information.
The files are written in a proprietary, binary format, which means that the dump file set can be imported only by the Data Pump Import utility. The dump file set can be imported to the same database or it can be moved to another system and loaded into the Oracle database there.Because the dump files are written by the database, rather than by the Data Pump client application, you must create directory objects for the directories to which files will be written.
A directory object is a database object that is an alias for a directory in the host operating system's file system.

Data Pump Export and Import enable you to move a subset of the data and metadata. This is done by using Data Pump parameters to specify export and import modes, as well as various filtering criteria.

You can also perform exports and imports over a network. In a network export, the data from the source database instance is written to a dump file set on the connected database instance. In a network import, a target database is loaded directly from a source database with no intervening dump files. This allows export and import operations to run concurrently, minimizing total elapsed time.

Data Pump Export and Import also provide a set of interactive commands so that you can monitor and modify ongoing export and import jobs.
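As a sketch, creating a directory object and then exporting a schema and importing it under another name might look like the following. The directory path, user names, and file names are placeholders; many databases already have a predefined DATA_PUMP_DIR directory object you can use instead:

```sql
-- Run as a privileged user; the path is a placeholder.
CREATE DIRECTORY dp_dir AS '/home/oracle/dumpfiles';
GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;

-- Then, from the operating system command line:
--   expdp hr SCHEMAS=hr DIRECTORY=dp_dir DUMPFILE=hr.dmp LOGFILE=hr_exp.log
--   impdp system DIRECTORY=dp_dir DUMPFILE=hr.dmp REMAP_SCHEMA=hr:hrdev
```

The REMAP_SCHEMA parameter loads the objects exported from one schema into a schema with a different name, which is the pattern used in the example that follows.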
Example: Using Data Pump Export and Data Pump Import

In this example, suppose that you want to make some changes to the HR sample schema and then test those changes without affecting the current HR schema. You could export the HR schema and then import it into a new HRDEV schema, where you could perform development work and conduct testing.

Exporting and Importing with the Export and Import Utilities

Note: The Export and Import utilities do not support the FLOAT and DOUBLE data types. If your data contains these types and does not contain XMLType data, you must use Data Pump Export and Import, described in.

When you run the Export utility against an Oracle database, objects (such as tables) are extracted, followed by their related objects (such as indexes, comments, and grants), if any. The extracted data is written to an export dump file.
The dump file is an Oracle binary-format dump file that can be read only by the Import utility. The version of the Import utility cannot be earlier than the version of the Export utility used to create the dump file.

Note: Dump files generated by the Export (exp) utility can be imported only by the Import (imp) utility; they cannot be imported with the Data Pump Import (impdp) utility.

As with Data Pump Export and Import, data exported with the Export utility can be imported with the Import utility into the same or a different Oracle database. See for further information about the Export and Import utilities and for examples of how to use them.
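As a sketch (user names, file names, and the table choice are placeholders), a table-mode export with exp and an import of that table into another schema with imp might look like this, run from the operating system command line:

```
# Placeholders throughout; exp and imp prompt for passwords if omitted.
exp hr TABLES=regions FILE=regions.dmp LOG=regions_exp.log
imp system FILE=regions.dmp FROMUSER=hr TOUSER=nick LOG=regions_imp.log
```

The FROMUSER and TOUSER parameters direct imp to load objects exported from one schema into a different schema.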
1: I have downloaded the mysql-connector-java-5.1.24-bin.jar

Okay.

2: I have created a lib folder in my project and put the jar in there.

Wrong. You need to drop the JAR in the /WEB-INF/lib folder. You don't need to create any additional folders.

3: properties of project - build path - add JAR and selected the JAR above.

Unnecessary. Undo it all to avoid possible conflicts.

4: I still get java.sql.SQLException: No suitable driver found for jdbc:mysql//localhost:3306/mysql

This exception can have 2 causes:

1. The JDBC driver is not in the runtime classpath. This is to be solved by doing 2) the right way.
2. The JDBC URL is not recognized by any of the loaded JDBC drivers.

Indeed, the JDBC URL is wrong: as per the documentation, there should be another colon between the scheme and the host.
Yes, for a Java EE web application the JAR file must physically be dropped in the /WEB-INF/lib folder of the project, which should already have been prepared by the IDE if you created the project the right way. Just one step: drop the JAR in /WEB-INF/lib. No need to create the folder structure yourself. No need to fiddle with the project's properties, which would possibly make things worse.
By the way, do you now get a ClassNotFoundException or an SQLException? Those are entirely different exceptions, each with a very clear cause of its own. – Apr 1 '13 at 19:45
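To illustrate the corrected URL, here is a minimal sketch. The host, port, and database name are the ones from the question; the class, method, and credentials are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class JdbcUrlExample {

    // Builds a MySQL JDBC URL. Note the colon after "jdbc:mysql",
    // which was missing in the question's URL (jdbc:mysql//...).
    static String buildUrl(String host, int port, String database) {
        return "jdbc:mysql://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) {
        String url = buildUrl("localhost", 3306, "mysql");
        System.out.println(url); // jdbc:mysql://localhost:3306/mysql

        // With mysql-connector-java in /WEB-INF/lib (and thus on the
        // runtime classpath), this is all that is needed; a JDBC 4.0+
        // driver is discovered automatically, no Class.forName() call:
        //
        // try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
        //     // ... use the connection ...
        // } catch (SQLException e) {
        //     e.printStackTrace();
        // }
    }
}
```

If the colon is missing, no registered driver recognizes the URL and DriverManager throws "No suitable driver found", exactly as in step 4 of the question.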