Moving Data

Methods:

  • SQL*Loader
  • External Tables
  • Data Pump

SQL*Loader

An entry-level bulk-load utility that takes data from a flat (feeder) file and loads it into an Oracle table.

  • Conventional path: standard INSERT statements; the load is committed like any other transaction
  • Direct path: data blocks are written directly into the table above the high-water mark (much faster); enabled CHECK and foreign-key constraints are not enforced during the load, and only minimal UNDO is generated

SQL*Loader produces several output files: a log file describing the run, a bad file for rows rejected by Oracle (for example, because of a constraint violation or a datatype error), and a discard file for rows that were filtered out by a WHEN clause and therefore never loaded.
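To illustrate where discarded rows come from, a hypothetical control-file fragment (not part of the regions example below) with a WHEN clause might look like this — rows that fail the test end up in the discard file, not the bad file:

sales.ctl
load data
infile 'sales.txt'
badfile 'sales.bad'
discardfile 'sales.dsc'
append
into table sales
when (region = 'EU') -- rows not matching this go to sales.dsc
fields terminated by ','
(region, amount)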

sql*plus — create table for the example
create table regions (
id number(2),
regname varchar2(20)
);

reg.txt file
01, North
02, South
03, East
04, West

reg.ctl
load data
infile 'reg.txt' -- file from which the data is read
badfile 'reg.bad'
discardfile 'reg.dsc'
append -- add to existing rows
into table regions
fields terminated by ','
trailing nullcols -- any missing fields in the file are loaded as NULL
(id integer external(2),
regname) -- describing the columns

In CMD we navigate to the directory containing the feeder and control files and invoke SQL*Loader with the appropriate parameters. The connecting user needs the CREATE SESSION privilege (SYSDBA or SYSOPER is not required).
O:\feeder>sqlldr userid=userName/userPassword@serverID control=reg.ctl direct=true

The user also needs INSERT permission on the table into which the data is loaded.

Directory Objects

These are Oracle database objects that point to file-system directories (folders).
They are created with the CREATE DIRECTORY statement; the legacy UTL_FILE_DIR initialization parameter serves a similar purpose, but for the UTL_FILE package only.
They are necessary for the UTL_FILE package, external tables, and Data Pump.
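A minimal sketch of creating a directory object and granting access to it (hypothetical path and user name; CREATE DIRECTORY requires the CREATE ANY DIRECTORY privilege):

SQL> create directory dump_dir as 'O:\dump';
SQL> grant read, write on directory dump_dir to scott;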

External Tables

It presents an alternative to SQL*Loader for exposing external data as a table.
The table definition references a directory object that points to the data file. One can query an external table but cannot create indexes on it, which hurts performance when the table holds a lot of data. DML statements against external data are not allowed, although an external table created with the ORACLE_DATAPUMP access driver can be written once via CREATE TABLE AS SELECT.
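Because external tables cannot be indexed, a common pattern for large data sets is to copy the external table into a regular (heap) table and index the copy. A sketch, assuming a hypothetical external table named ext_depts already exists:

SQL> create table depts_heap as select * from ext_depts;
SQL> create index depts_heap_idx on depts_heap (deptno);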

Data Pump

It is mostly used for high-performance database-to-database movement of data.
It is started with the client utilities expdp and impdp.
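For completeness, a sketch of the export side (hypothetical dump-file and log-file names; assumes a directory object such as test_dir already exists and the user has been granted access to it):

OS> expdp testuser/***PASSWORD***@orcl DIRECTORY=test_dir DUMPFILE=testuser_exp.dmp SCHEMAS=testuser LOGFILE=expdpTestuser.log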

An example of DataPump import:

SQL> CREATE USER testuser IDENTIFIED BY ***PASSWORD***
DEFAULT TABLESPACE TEST
QUOTA UNLIMITED ON TEST;

SQL> CREATE OR REPLACE DIRECTORY test_dir AS '/newdrive/dump';

SQL> GRANT READ, WRITE ON DIRECTORY test_dir TO testuser;

SQL> GRANT CONNECT TO testuser;

SQL> GRANT RESOURCE TO testuser;

OS> impdp testuser/***PASSWORD***@orcl DIRECTORY=test_dir DUMPFILE='staging_db_expdp_dbplan_20130628.dmp' SCHEMAS=testuser
logfile='impdpStaging.log' remap_tablespace=DATA_BG:TEST remap_tablespace=SRCTSMALL:TEST

Example of external table

checking existing utl_file
SQL> show parameter utl_file

setting utl_file (the parameter is not dynamically modifiable, so it requires scope=spfile and an instance restart)
SQL> alter system set utl_file_dir='O:\feeder' scope=spfile;

checking existing directories
SQL> select * from dba_directories;

creating directory
SQL> create directory feeder as 'O:\feeder';

depts.txt
01, accounting
02, IT
03, admin
04, marketing
05, maintenance

SQL
create table depts (
deptno number(2),
deptname varchar2(16))
organization external (
type oracle_loader
default directory feeder
access parameters
(records delimited by newline
badfile 'depts.bad'
discardfile 'depts.dsc'
logfile 'depts.log'
fields terminated by ','
missing field values are null)
location ('depts.txt'));
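Once created, the external table can be queried like any other table; each query reads the file through the access driver (the result depends on the current contents of depts.txt):

SQL> select deptno, deptname from depts order by deptno;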

Example of Data Pump

SQL
create table dptable as select * from all_users;

oem -> Data Movement -> Export to Export files -> Tables…

The Data Pump API is exposed through the PL/SQL package DBMS_DATAPUMP.
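A minimal sketch of driving a schema-mode export through the API (hypothetical file names; assumes the test_dir directory object from the earlier example and the appropriate grants):

SQL
DECLARE
  h NUMBER;
  state VARCHAR2(30);
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump file and log file go to the test_dir directory object
  DBMS_DATAPUMP.ADD_FILE(h, 'testuser.dmp', 'TEST_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'testuser.log', 'TEST_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- restrict the job to a single schema
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', '= ''TESTUSER''');
  DBMS_DATAPUMP.START_JOB(h);
  -- block until the job reaches a terminal state
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
END;
/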
