
Thread: Oracle 10g Datapump.

  1. #1
    Oracle Administrator
Join Date: Dec 2011
Posts: 92

    Oracle 10g Datapump.

To work with Oracle Data Pump, we first have to unlock the SCOTT user and create a directory object. A directory object is only a pointer to a physical directory; creating it does not actually create the physical directory on the file system.

First, start up the database and its associated services.
Then, from the root prompt, do the following.

Code:
# switch to the oracle OS user and point at the instance
su - oracle
export ORACLE_SID=orcl

# connect as SYSDBA
sqlplus / as sysdba

-- unlock the SCOTT demo account
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;

-- create the directory object and let SCOTT use it
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/orcl';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
Note: the directory must already exist on the file system before the directory object is used; CREATE DIRECTORY does not create it for you.
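If the path does not yet exist, create it on the file system first (a minimal sketch; the path simply matches the directory object above):

Code:
$ su - oracle
$ mkdir -p /u01/app/oracle/oradata/orcl
$ ls -ld /u01/app/oracle/oradata/orcl    # verify ownership and permissions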

TABLE-BASED IMPORT AND EXPORT

Code:
$ export ORACLE_SID=orcl
$ expdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expEMP_DEPT.log

$ export ORACLE_SID=orcl
$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impEMP_DEPT.log

CONTENT={ALL | DATA_ONLY | METADATA_ONLY}
# ALL loads the metadata as well as the data from the source dump file (the default).
# DATA_ONLY loads only row data into existing tables; no database objects are created.
# METADATA_ONLY creates database objects only; no data is inserted.
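For example, to recreate only the table definitions without loading any rows (a sketch reusing the dump file above, assuming EMP and DEPT do not already exist in the target schema):

Code:
$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp content=METADATA_ONLY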

SCHEMA-BASED IMPORT AND EXPORT

    Code:
    $ export ORACLE_SID=orcl
    $ expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expSCOTT.log
    
    $ export ORACLE_SID=orcl
    $ impdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impSCOTT.log
The CONTENT parameter described in the table-based example applies here as well.
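Longer command lines are easier to manage with a parameter file. A minimal sketch (the file name exp_scott.par is just an illustration):

Code:
$ cat exp_scott.par
schemas=SCOTT
directory=TEST_DIR
dumpfile=SCOTT.dmp
logfile=expSCOTT.log

$ expdp scott/tiger parfile=exp_scott.par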

FULL DATABASE IMPORT AND EXPORT

For this to work, you must log in as the SYSTEM user, not as the SYS user.

    Code:
    $ export ORACLE_SID=orcl
    $ expdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expDB10G.log
    
    $ export ORACLE_SID=orcl
    $ impdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impDB10G.log

In the above commands you can also add an additional parameter, JOB_NAME, to give the job a name of your choice.

Example syntax:

$ expdp system/password full=y directory=TEST_DIR dumpfile=DB10G.dmp job_name=qq



You can view the Data Pump jobs from the following view:

SQL> SELECT * FROM dba_datapump_jobs;
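If you only want the key columns, a narrower query works too (all of these columns exist in DBA_DATAPUMP_JOBS):

SQL> SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;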


Even if you press Ctrl+C at the command prompt, exit the prompt, or otherwise disconnect the client, the Data Pump job will continue to run on the server.


To reattach to the running job, enter the following command.

    $ expdp system/password attach=qq

If the import or export job needs to be stopped temporarily, press Ctrl+C and then type the following at the interactive prompt:

Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y



To resume the job, reattach as the user who started it and continue:

$ export ORACLE_SID=orcl
$ expdp system/password attach=qq
Export> CONTINUE_CLIENT



Note: after reattaching to the job, a DBA can also kill it by typing KILL_JOB instead, if the export is not to be continued.
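For example (the confirmation prompt mirrors the STOP_JOB example above):

Export> KILL_JOB
Are you sure you wish to stop this job ([y]/n): y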

    IMPORT SCENARIOS:

    REMAP_SCHEMA: loads all the objects from the source schema into the target schema.
$ expdp system/password schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp
$ impdp system/password remap_schema=SCOTT:HR directory=TEST_DIR dumpfile=SCOTT.dmp
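To confirm the remap worked, check that the objects now belong to HR (a quick sanity check run as a DBA user, not part of the import itself):

SQL> SELECT object_name, object_type FROM dba_objects WHERE owner = 'HR';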

  2. #2
    Expert Oracle Administrator
Join Date: Oct 2011
Location: New Delhi, India
Posts: 427
    Why are directory objects needed?

    They are needed to ensure data security and integrity. Otherwise, users would be able to read
    data that they should not have access to and perform unwarranted operations on the server.

    What makes Data Pump faster than original Export and Import?

There are three main reasons that Data Pump is faster than original Export and Import. First,
the Direct Path data access method (which permits the server to bypass SQL and go right to
the data blocks on disk) has been rewritten to be much more efficient, and it now supports both
Data Pump Import and Export. Second, because Data Pump does its processing on the server
rather than in the client, much less data has to be moved between client and server. Finally,
Data Pump was designed from the ground up to take advantage of modern hardware and
operating system architectures in ways that original Export and Import cannot. These factors
combine to produce significant performance improvements for Data Pump over original
Export and Import.

    How much faster is Data Pump than the original Export and Import utilities?

    For a single stream, Data Pump Export is approximately 2 times faster than original Export
    and Data Pump Import is approximately 15 to 40 times faster than original Import. Speed
    can be dramatically improved using the PARALLEL parameter.
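For example, a parallel full export might look like this (a sketch; the %U wildcard makes Data Pump generate one dump file per substitution so the parallel workers do not contend for a single file):

$ expdp system/password full=y parallel=4 directory=TEST_DIR dumpfile=DB10G_%U.dmp logfile=expDB10G.log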

    Why is Data Pump slower on small jobs?

    Data Pump was designed for big jobs with lots of data. Each Data Pump job has a master
    table that has all the information about the job and is needed for restartability. The overhead
    of creating this master table makes small jobs take longer, but the speed in processing large
    amounts of data gives Data Pump a significant advantage in medium and larger jobs.

    Are original Export and Import going away?

Original Export is deprecated as of the Oracle Database 11g release. Original Import
will always be supported so that dump files from earlier releases (release 5.0 and later) can
still be imported. Original and Data Pump dump file formats are not compatible.

    Are Data Pump dump files and original Export and Import dump files compatible?

    No, the dump files are not compatible or interchangeable. If you have original Export dump
    files, you must use original Import to load them.

    How can I monitor my Data Pump jobs to see what is going on?

    In interactive mode, you can get a lot of detail through the STATUS command.
    In SQL, you can query the following views:
• DBA_DATAPUMP_JOBS – all active Data Pump jobs and the state of each job
• USER_DATAPUMP_JOBS – summary of the current user's active Data Pump jobs
• DBA_DATAPUMP_SESSIONS – all active user sessions that are attached to a Data Pump job
• V$SESSION_LONGOPS – progress of each active Data Pump job
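For example, a rough progress query (Data Pump rows in V$SESSION_LONGOPS are named after the job, e.g. SYS_EXPORT_FULL_01, so the LIKE pattern below is only an illustration):

SQL> SELECT opname, sofar, totalwork,
            ROUND(sofar / NULLIF(totalwork, 0) * 100, 1) AS pct_done
     FROM v$session_longops
     WHERE opname LIKE 'SYS_EXPORT%' AND sofar <> totalwork;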

    Can I use Oracle Enterprise Manager with Data Pump?

    Yes, OEM supports a fully functional interface to Data Pump.

    Can I use gzip with Data Pump?

    Because Data Pump uses parallel operations to achieve its high performance, you cannot pipe
    the output of Data Pump export through gzip. Starting in Oracle Database 11g, the
    COMPRESSION parameter can be used to compress a Data Pump dump file as it is being
    created. The COMPRESSION parameter is available as part of the Advanced Compression
    Option for Oracle Database 11g.
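An 11g example (shown only as an illustration; on 10g, Data Pump offers only COMPRESSION=METADATA_ONLY):

$ expdp system/password full=y directory=TEST_DIR dumpfile=DB11G.dmp compression=ALL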


Interactive Command-Line Mode - Commonly used options (see the sample session after this list)

    • See the status of the job. All of the information needed to monitor the job’s execution is available.
    • Add more dump files if there is insufficient disk space for an export file.
    • Change the default size of the dump files.
    • Stop the job (perhaps it is consuming too many resources) and later restart it (when more resources become available).
• Restart the job. If a job was stopped for any reason (system failure, power outage), you can attach to the job and then restart it.
    • Increase or decrease the number of active worker processes for the job. (Enterprise Edition only.)
    • Attach to a job from a remote site (such as from home) to monitor status.
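A minimal interactive session tying several of these commands together (the job name qq is carried over from post #1; DB10G_extra.dmp is an illustrative file name):

$ expdp system/password attach=qq
Export> STATUS
Export> ADD_FILE=DB10G_extra.dmp
Export> PARALLEL=4
Export> CONTINUE_CLIENT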

  3. #3
    Oracle Administrator
Join Date: Feb 2012
Location: New Delhi, India
Posts: 53
    Thanks .....
