Tuesday, August 7, 2012

Logminer: A useful Oracle utility


Another hectic day in the life of a DBA. Whenever a critical issue comes up, we are all on our toes.

Well, the problem description: the system was generating archive logs at an alarming rate. Even though archive_lag_target was set to 900 seconds, the archive logs were being generated very frequently and with huge sizes, causing the mountpoint on which the archives were stored to fill up repeatedly, thereby halting the database.
We dug into the database to find out whether there were any long-running queries, and to our surprise there weren't any. Then we went on to check for deadlocks in the system. All in vain: no deadlocks either. We tried almost all the usual first-hand tactics to analyze the database, but with no luck. So why were so many archives being generated, and at such an alarming rate? Something had to be going on in the database. We therefore decided to analyze the archives themselves.

Archived logs are not readable to the naked eye; they are stored in a proprietary binary format. Hence you need to use the Oracle-provided LOGMINER utility to analyze the online redo log files and archived log files. I won't go into much detail explaining the LOGMINER utility itself; there is plenty of documentation on the internet. Just google it.

In this blog I will concentrate on how to use this utility to analyze the log files.
The first and foremost requirement for using this utility is to check whether the database is in archivelog mode or not. To use the logminer utility, the DB must be in archivelog mode.
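A quick way to verify this from SQL*Plus is something like:

SQL> SELECT log_mode FROM v$database;

If it returns ARCHIVELOG, you are good to proceed; otherwise the database needs to be switched to archivelog mode first.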


Then get the list of all the archives present in your database. Here I am altering the session to the required date format; it makes analysis easier.
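Something along these lines should do (adjust the date format and time window to your incident):

SQL> ALTER SESSION SET NLS_DATE_FORMAT = 'DD-MON-YYYY HH24:MI:SS';
SQL> SELECT name, first_time, next_time FROM v$archived_log ORDER BY first_time;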


Now set the initialization parameter UTL_FILE_DIR in the initialization parameter file. For example, to set UTL_FILE_DIR to use "C:\app\dict" as the directory where the dictionary file is placed, enter the following in the initialization parameter file. Please note that UTL_FILE_DIR is a static parameter; hence SCOPE needs to be set to SPFILE and a DB restart is needed for the change to take effect. Nonetheless, it is advisable to do all the analysis on a TEST server: just copy the required archive log files from Production to the Test setup and analyze them there.
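Assuming an spfile is in use, the change looks like this, followed by a restart:

SQL> ALTER SYSTEM SET UTL_FILE_DIR = 'C:\app\dict' SCOPE=SPFILE;
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP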

Now that the parameter is set, we move on to the actual part.
Execute the PL/SQL procedure DBMS_LOGMNR_D.BUILD, specifying a filename for the dictionary and a directory path name for the file. This procedure creates the dictionary file. For example, enter the following to create the file dict.ora in "C:\app\dict". Make sure the directory already exists before creating the dictionary file.
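The call is along these lines:

SQL> EXECUTE DBMS_LOGMNR_D.BUILD('dict.ora', 'C:\app\dict', DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);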

Now register all the log files that you need to analyze.
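The first file is added with the NEW option and every subsequent file with ADDFILE. The file names below are just placeholders; substitute the actual paths of the archives you copied over:

SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => 'C:\app\arch\ARC00001.ARC', OPTIONS => DBMS_LOGMNR.NEW);
SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => 'C:\app\arch\ARC00002.ARC', OPTIONS => DBMS_LOGMNR.ADDFILE);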

Check whether all the archives you want to analyze are registered in the database:
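A simple query does the trick:

SQL> SELECT filename FROM v$logmnr_logs;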


Now start the logminer session.
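Point it at the dictionary file we created earlier:

SQL> EXECUTE DBMS_LOGMNR.START_LOGMNR(DICTFILENAME => 'C:\app\dict\dict.ora');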


Finally, you have done it. Note that the contents analyzed by logminer are exposed through the view V$LOGMNR_CONTENTS, which is session-specific. So before you disconnect from the session, create a permanent table from this view so that you can analyze the contents later on too.
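For example (logmnr_analysis is just a name I picked; use whatever suits you), and end the logminer session once the table is created:

SQL> CREATE TABLE logmnr_analysis AS SELECT * FROM v$logmnr_contents;
SQL> EXECUTE DBMS_LOGMNR.END_LOGMNR();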


Voila! The table is created. Now you can query this table and find the relevant information. In my case, some nasty module of a user was continuously firing DML statements. Each statement lasted only a very short duration, hence they were not visible to any of the tricks we used for analysis before.
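A query along these lines (the columns come from V$LOGMNR_CONTENTS; grouping by user and segment is just one convenient way to spot the noisy culprit) quickly shows who is generating all that redo:

SQL> SELECT username, seg_owner, seg_name, operation, COUNT(*)
     FROM logmnr_analysis
     GROUP BY username, seg_owner, seg_name, operation
     ORDER BY COUNT(*) DESC;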

