File=cmis565_lab8.htm; updated July 11, 2013


Oracle Database Administration – Lab #8

Backup and Recovery (10 points)


Situation:  You will demonstrate your proficiency as a new DBA in the area of backup and recovery procedures.  You may complete the laboratory individually, or you may team up with another laboratory partner.  If you team up with a partner, you will select your own partner, you will be graded as a group, and you will submit a single document as evidence of your team's work.


Use Notepad, Wordpad, or Microsoft Word to document your solution to the tasks in the laboratory. 


Note on Data Pump Export Utility:

·        Sometimes, in order for the Data Pump Export utility to work, it may be necessary to allocate memory to the STREAMS_POOL_SIZE cache parameter.  You can set this cache to any value other than zero.  Normally this is not necessary if you are using the SGA_TARGET parameter and allowing Oracle to allocate cache memory automatically.

·        If you get an error message regarding the streams pool, ensure your database is shut down and edit your init.ora file to add the parameter setting:  STREAMS_POOL_SIZE = 1024

·        This sets the cache to 1 KB (Oracle will round the value up to the minimum granule size at startup).  Restart the database.




Tasks for this Lab


1.  Perform two database exports (2 points).


a.      Create a subdirectory off of your /u02/student subdirectory named exports.  Set the permission level for the exports subdirectory (use chmod 775 exports) – example: 


mkdir /u02/student/dbockstd/exports


b.      Ensure your database is open, then exit SQL*Plus.   Use the Data Pump Export utility to complete an export of your entire database, including data and metadata.   Show all commands you executed to set up the export, and include all output displayed on the computer screen as a result of running the export.

c.      Print out a copy of the contents of the parameter file used for the export.


IMPORTANT: Confirm that the export.dmp file exists in your exports subdirectory and that you received the message indicating the export terminated successfully.
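
The export setup can be sketched as follows.  The directory object name export_dir, the user name dbock, and the file names are assumptions for illustration; substitute your own:

```sql
-- Run from SQL*Plus as SYSDBA before exiting: create a directory object that
-- maps to the exports subdirectory, and let your user read/write through it
-- (export_dir is a hypothetical name)
CREATE OR REPLACE DIRECTORY export_dir AS '/u02/student/dbockstd/exports';
GRANT READ, WRITE ON DIRECTORY export_dir TO dbock;
```

From the operating system prompt, a full export could then be run with a command along the lines of:  expdp dbock FULL=y DIRECTORY=export_dir DUMPFILE=export.dmp LOGFILE=export.log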



2.  Complete a full cold backup (2 points).  You will simulate a backup of the complete system by backing up to disk.  Your database should be running in NOARCHIVELOG mode.


a.      Use disk drive /u02 as your backup disk drive.  Create a subdirectory named backup.  Set the permission level for the backup subdirectory (use chmod 775 backup) – example:


mkdir /u02/student/dbockstd/backup


Inside your backup directory create three directories named u01, u02, and u03, and set the permission level to 775 for each of these directories.


b.      List all files that comprise your database to your lab report by using the commands provided in the class notes to SELECT the information from the data dictionary (for example, you can execute a SELECT from the V$DATAFILE view to create part of the listing).  Include the disk drive and directory location for each file – this listing is ABSOLUTELY CRITICAL.  The listing should NOT include scripts, dumps, etc – only include the control files, datafiles (tablespaces), redo log files, and parameter files.  Do NOT use the operating system level ls command.
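
The dictionary queries follow this pattern; a minimal sketch (the parameter file location is not in these views, so list it separately):

```sql
-- Full paths of the control files, datafiles, and online redo log members
SELECT name   FROM v$controlfile;
SELECT name   FROM v$datafile;
SELECT member FROM v$logfile;
```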

c.      Shut down the database so as to provide a consistent shutdown (this step is critical; if you skip it, your backup will not be valid).

d.      Copy all required and recommended files to the appropriate directory within the backup subdirectory by using the Linux cp (short for copy) command.  Show all of your cp commands.  This example shows the command used to copy all files located on disk drive /u01 to the backup directory named u01 that is located on disk drive /u02.   This command will copy all directories in the oradata directory of /u01.  You can delete directories that will not be needed (such as adump, archive, etc.) from your backup directory.


cp -R /u01/student/dbockstd/oradata /u02/student/dbockstd/backup/u01


e.      Confirm that the backup took place.  You might also note any differences in ownership between the files in your original subdirectory (or subdirectories), where your database files are located, and those in the backup subdirectory.  The backup subdirectory files all belong to your login account, whereas the database files in your original subdirectories belong to the owner named oracle.  Since both you and oracle are members of the group named dba, you must ensure that the group dba has read and write privileges on the database files if you use the backup.


3.  Perform an online partial database backup (2 points). This enables a partial backup while the database is available to users.  Use a hot backup approach to back up the tablespace named DATA01.  Note that this tablespace may include one or more datafiles in your database.


a.      Exit the database to the operating system (do not use host).  Create a subdirectory named partialbackup in disk drive /u02 to hold partial database backups. Set the permission level for the partialbackup subdirectory (use chmod 775 partialbackup) – example:


mkdir /u02/student/dbockstd/partialbackup


b.      Put the database into ARCHIVELOG mode.  This may require you to modify your init.ora file.  Connect as SYS or as the DBA of your database and start up the database.  Verify the ARCHIVELOG mode by querying the V$DATABASE view.  You MUST be in ARCHIVELOG mode to continue.
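
One way to make the change, sketched from SQL*Plus connected as SYSDBA:

```sql
-- The mode change must be made while the database is mounted but not open
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;
-- Verify the change; this should return ARCHIVELOG
SELECT log_mode FROM v$database;
```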


c.      Begin the partial database backup with the ALTER TABLESPACE command.




Host to the operating system and copy the datafile(s) that comprise the DATA01 tablespace to your partialbackup subdirectory.  You may need to set permission access privileges for this new subdirectory if you forgot to do so earlier. Verify that the file(s) were copied satisfactorily.


Exit back to SQL*Plus and issue the command to end the backup.
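
The bracketing commands for this step can be sketched as follows:

```sql
-- Put DATA01 into backup mode before copying its datafile(s)
ALTER TABLESPACE data01 BEGIN BACKUP;
-- (host to the operating system and copy the datafile(s) here)
-- Take DATA01 out of backup mode once the copy is complete
ALTER TABLESPACE data01 END BACKUP;
```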




d.      Back up the control files to protect against loss of all copies of the control file.  This is necessitated by any command that changes the database configuration.  Back up your control file to a file named backup.ctl in the partialbackup subdirectory with the ALTER DATABASE command.
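
A minimal sketch of the command, using the partialbackup path from step 3a:

```sql
-- Write a binary copy of the current control file to the partialbackup directory
ALTER DATABASE BACKUP CONTROLFILE TO '/u02/student/dbockstd/partialbackup/backup.ctl';
```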


Note: To recover a control file: (1) shut down the instance in either normal or immediate mode, (2) copy the good control file using a new name, (3) edit the pfile to delete the name of the damaged control file and add the filename of the copy, and (4) restart the instance. You will not recover a control file during this laboratory.

4.  Perform a Database Import (2 points).  Now you will assume that part of the database has been damaged, specifically the Invoice and Invoice_Details tables.  You will drop these damaged tables and recover by importing the tables from the database export that was created earlier with the Data Pump Export utility.


a.      Connect as the DBA for your database because the DBA user created and therefore owns the tables that are damaged.   To demonstrate that the tables are restored, first alter the tables as they currently exist by inserting some new rows.  Insert a new test row in the Invoice table and a corresponding row in the Invoice_Details table.  Query these two tables to display the data you entered.


b.      Now you will assume that the tables are damaged - execute commands to drop the Invoice and Invoice_Details tables.  This will also drop any indexes that exist for these two tables, but it will not drop the cluster index.  Drop the tables in the order shown here because of referential integrity constraints.


DROP TABLE Invoice_Details;

DROP TABLE Invoice;


c.      Select a user to do the import (do not use SYSDBA).  I selected the user dbock.  Grant this user the DATAPUMP_IMP_FULL_DATABASE role. 
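
Sketched as a single statement, run as SYSDBA (dbock is the example user from the text):

```sql
-- Allow the chosen user to run a full-database Data Pump import
GRANT DATAPUMP_IMP_FULL_DATABASE TO dbock;
```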


d.      Exit to the Linux prompt and use the impdp (Data Pump Import) utility to import the Invoice and Invoice_Details tables.


Create a parameter file with the parameter values used to specify this import job.  I named my file imp4d.par and used these parameters.  The first line gives the job a name.  The second line specifies the tables to be imported.  The third line specifies the directory object (internal to Oracle) that stores the export file; this directory was created above in 1b.  The fourth line specifies the name of the dumpfile that contains the export; this was the full export of the database that you created above in 1b.  You may need to set the permissions on the parameter file to 775.
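
A parameter file matching that description might look like this; the directory object name export_dir is an assumption, so use whatever name you created in 1b:

```
JOB_NAME=imp4d
TABLES=Invoice,Invoice_Details
DIRECTORY=export_dir
DUMPFILE=export.dmp
```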







Issue the command to do the import.  This example shows the command for a user named dbock.  The utility will prompt for the password for dbock interactively.  Show all of the output displayed to your computer screen as the import executes.


impdp dbock parfile=imp4d.par


e.      Connect to the database as the DBA user.  Use SELECT statements to query the two tables – the rows you added earlier in part 4a above should not be present – this demonstrates that you have recovered to a specific point in time.


5.  Destroy and restore the database (2 points).  You will simulate a catastrophic failure, such as a fire. 


a.      Destroy your entire database by deleting all of the files that are part of the database in your original subdirectories – this includes the following:  all control files, all tablespace datafiles, all online redo log files (not the archived ones), and the init.ora file.


List the contents of your directories in your lab report to show they are empty.  Don't forget to shut down your database instance.  You can use a shutdown abort since you've destroyed the database.


b.      Restore the backup files to their original subdirectories.  An example command (typed on one line) to restore one of the files is shown here.  Ensure that you restore all of the files:


cp /u02/student/dbockstd/backup/u01/oradata/USER350system01.dbf /u01/student/dbockstd/oradata/USER350system01.dbf



c.      Set permissions for the restored files on your original subdirectories to 775 to enable Oracle to access them if this is necessary.

chmod 775 *


d.      Connect to your database as SYS and startup the database in open mode.  Show the output of the startup command to demonstrate the database is restored.


Any errors at this point will be due to missing files, inadequate operating system level permissions, or similar errors.




6.  Shut down the database (-1 point if you forget to shut it down).


Do not leave your database running between laboratory sessions.