Database 11g

Monday, March 14, 2016

Mining Particular Records of a Table in Oracle Using LogMiner.


We will walk through a scenario that MINES the DELETED records from the archived redo logs using the LogMiner utility.
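
>> Quick pre-check (my addition, not part of the original demo): mining archived logs only works if the database runs in ARCHIVELOG mode, so confirm that first.

SQL> archive log list

>> The "Database log mode" line should read "Archive Mode"; if it says "No Archive Mode", this scenario will not work.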


>> Check the total number of records in the EMP table before deleting.

SQL> select count(*) from emp;

  COUNT(*)
----------------
      6144

>> Now delete records from the EMP table.
>> 2048 records are deleted from the EMP table of the SCOTT schema.


SQL> delete from emp where ename='SCOTT';

2048 rows deleted.

>> Now check the total number of records after the delete.

SQL> select count(*) from emp;

  COUNT(*)
------------
      4096


> In your production environment, the developer will give you the date and time at which the rows were deleted.


Let's start the LogMiner scenario.


>> Create a directory at the OS level. I created one called 'demo1'; it will be used by the DBMS_LOGMNR_D.BUILD procedure as the location for the LogMiner dictionary file.


C:\Users\oracle> mkdir demo1

C:\Users\oracle> sqlplus / as sysdba

SQL*Plus: Release 11.2.0.3.0 Production on Mon Feb 29 09:44:46 2016
Copyright (c) 1982, 2011, Oracle.  All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options


>> Map the 'demo1' directory at the database level by setting the initialization parameter 'utl_file_dir', then bounce the database.


SQL> alter system set utl_file_dir='C:\Users\oracle\demo1' scope=spfile;

System altered.

SQL> startup force;

ORACLE instance started.

Total System Global Area      8351150080 bytes
Fixed Size                               2271832 bytes
Variable Size                           1577059752 bytes
Database Buffers                      6761218048 bytes
Redo Buffers                            10600448 bytes
Database mounted.
Database opened.
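
>> (Optional check, my addition:) after the bounce you can confirm the parameter is in effect; the VALUE column should show C:\Users\oracle\demo1.

SQL> show parameter utl_file_dir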


>> By default, Oracle logs only the information required for recovery. To analyze the redo log files with LogMiner you need the extra column data that supplemental logging provides, so enable it.


SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

Database altered.
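
>> (Extra verification, not in the original steps:) you can confirm that minimal supplemental logging is now enabled by querying V$DATABASE; the column should return YES.

SQL> select supplemental_log_data_min from v$database;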


>> Run this PL/SQL procedure: the LogMiner engine needs a copy of the data dictionary, which will be extracted into a flat file inside the 'demo1' directory you created.


SQL> exec dbms_logmnr_d.build('dfile','C:\Users\oracle\demo1');

PL/SQL procedure successfully completed.
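
>> For reference, the same call can be written with named parameters and the explicit flat-file option; as far as I can tell this is equivalent to the positional form above.

SQL> exec dbms_logmnr_d.build(dictionary_filename => 'dfile', dictionary_location => 'C:\Users\oracle\demo1', options => dbms_logmnr_d.store_in_flat_file);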


> We deleted the rows at about 10:26 AM; the redo log sequence numbers covering that time are 1575, 1576 and 1577, as shown in this query output.


SQL> select  to_char(first_time,'dd-mon-yy hh24:mi:ss'),sequence#,thread# from v$log_history;


TO_CHAR(FIRST_TIME)  SEQUENCE#    THREAD#
------------------- ---------- ----------
07-mar-16 11:00:40        1565          1
07-mar-16 11:01:40        1566          1
07-mar-16 11:42:48        1567          1
07-mar-16 11:42:51        1568          1
08-mar-16 15:27:54        1569          1
09-mar-16 10:10:04        1570          1
09-mar-16 17:01:43        1571          1
12-mar-16 19:12:52        1572          1
12-mar-16 19:37:57        1573          1
12-mar-16 19:38:42        1574          1
13-mar-16 10:24:28        1575          1
13-mar-16 11:30:33        1576          1
13-mar-16 11:55:14        1577          1


>> You can see that between 10:24 and 11:55, three archived logs were generated, each with its own sequence number:

1. O1_MF_1_1575_CGHDVX_.ARC
2. O1_MF_1_1576_CG7IDVX_.ARC
3. O1_MF_1_1577_CG8jEVX_.ARC
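
>> (A shortcut I would suggest here:) instead of reading the file names off disk, you can pull the full paths straight from V$ARCHIVED_LOG for those sequence numbers.

SQL> select name from v$archived_log where thread# = 1 and sequence# between 1575 and 1577;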

>> Add these archived log files to start the mining process.


SQL> exec dbms_logmnr.add_logfile('K:\fast_recovery_area\otstest\ARCHIVELOG\2016_03_13\O1_MF_1_1575_CGHDVX_.ARC');

PL/SQL procedure successfully completed.


SQL> exec dbms_logmnr.add_logfile('K:\fast_recovery_area\otstest\ARCHIVELOG\2016_03_13\O1_MF_1_1576_CG7IDVX_.ARC');

PL/SQL procedure successfully completed.


SQL> exec dbms_logmnr.add_logfile('K:\fast_recovery_area\otstest\ARCHIVELOG\2016_03_13\O1_MF_1_1577_CG8jEVX_.ARC');

PL/SQL procedure successfully completed.
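
>> (Optional sanity check, my addition:) before starting the miner you can list the log files registered for this session.

SQL> select filename from v$logmnr_logs;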



>> Now start LogMiner; Oracle will populate the contents of the archived log files supplied through ADD_LOGFILE into the V$LOGMNR_CONTENTS view.


SQL> exec dbms_logmnr.start_logmnr(dictfilename=>'C:\Users\oracle\demo1\dfile');
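
>> (Optional, not used in this demo:) if you only want rows from committed transactions in the output, you can start LogMiner with the COMMITTED_DATA_ONLY option instead.

SQL> exec dbms_logmnr.start_logmnr(dictfilename=>'C:\Users\oracle\demo1\dfile', options=>dbms_logmnr.committed_data_only);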


>> Now query the deleted records that LogMiner has populated into the V$LOGMNR_CONTENTS view.


SQL> select username,table_name,sql_redo from v$logmnr_contents where seg_name='EMP';

USERNAME   TABLE_NAME   SQL_REDO
---------- ------------ ----------------------------------------------------------------
SCOTT      EMP          delete from "SCOTT"."EMP" where "EMPNO" = '7844' and
                        "ENAME" = 'TURNER' and "JOB" = 'SALESMAN' and "MGR" = '7698'
                        and "HIREDATE" = TO_DATE('08-SEP-81', 'DD-MON-RR') and
                        "SAL" = '1500' and "COMM" = '0' and "DEPTNO" = '30' and
                        ROWID = 'AAASZHAAEAAAACXAAJ';

SCOTT      EMP          delete from "SCOTT"."EMP" where "EMPNO" = '7499' and
                        "ENAME" = 'ALLEN' and "JOB" = 'SALESMAN' and "MGR" = '7698'
                        and "HIREDATE" = TO_DATE('20-FEB-81', 'DD-MON-RR') and
                        "SAL" = '1600' and "COMM" = '300' and "DEPTNO" = '30' and
                        ROWID = 'AAASZAAACXAAB';


>> You can see here how SCOTT deleted the records. Sorry guys, I can't show you all 2048 records because of space, so I have picked only two of them.

>> Now spool the complete query output to a text file and hand it over to the developers; they will check the constraints, data types and indexes, and then insert these records back into the table.
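
>> A tip that can save the developers some work (my addition to the original post): the SQL_UNDO column of V$LOGMNR_CONTENTS already holds the INSERT statements that reverse each delete, so you can spool those directly and run them back in. A rough sketch; the spool file name is just an example, and remember to end the LogMiner session when you are finished.

SQL> rem the spool file name below is only an example
SQL> spool C:\Users\oracle\demo1\emp_undo.sql
SQL> select sql_undo from v$logmnr_contents where seg_name='EMP' and operation='DELETE';
SQL> spool off
SQL> exec dbms_logmnr.end_logmnr;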



Happy Learning....        

                                                                                                     
