Monitoring Oracle Change Data Capture Queues

I shared a shell script that checks Oracle CDC queues to make sure they aren't going stale. If your CDC consumer has not picked up changes in 48-72 hours, something may be wrong. The script can be customized to alert at any interval; hopefully this will serve as a good 'nudge' to get you going in the right direction.

Everything you need is here.
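The heart of the check is a staleness query. A minimal sketch, assuming 10g-style Change Data Capture where the subscription views record when each subscriber last extended its window (the 2-day threshold is arbitrary):

-- subscriptions that have not moved their window in over 48 hours
select handle, set_name, username, last_extended
  from dba_subscriptions
 where last_extended < sysdate - 2;

Any rows returned represent subscriptions that probably deserve a closer look.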

If you are an OEM user you can probably grab the SQL I shared and modify it for use with OEM User Defined Metrics or alerting.


Unloading data from Oracle?

I recently fielded a question about getting data out of Oracle quickly. Without much detail on the systems involved, here is the exchange:


I need to migrate data from Oracle to MySQL quickly (in less than 1 day). What are my options?

My answer:

Oracle does not supply an out-of-the-box unload utility. Keep in mind that without comprehensive information about your environment (Oracle version? Server platform? How much data? What datatypes?), everything here is YMMV, and you will want to try it on your own system to gauge performance and timing.
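For modest volumes, the lowest-tech option is often good enough: spool a delimited file from SQL*Plus and load it into MySQL with LOAD DATA INFILE. A sketch using the demo EMP table (substitute your own columns and delimiter):

set heading off pagesize 0 feedback off trimspool on linesize 4000
spool emp.csv
select empno||','||ename||','||to_char(hiredate,'YYYY-MM-DD')||','||sal
  from emp;
spool off

For larger volumes you would look at splitting the work across several sessions, or at third-party unload tools.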

Oracle IN Condition and Multi Column Subqueries

I keep coming across a construct in some legacy SQL that has been causing all kinds of performance issues for us. I guess you could call it using the IN condition with multi-column subqueries. I located the syntax for the IN condition here but it doesn't really get into much detail about using this construct. Here is an example of the subquery (it returns the lowest salary and department_id for each department):
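Using the standard HR sample schema, the subquery looks something like this:

select department_id, min(salary)
  from employees
 group by department_id;

And the full statement wraps it in a multi-column IN condition:

select *
  from employees
 where (department_id, salary) in
       (select department_id, min(salary)
          from employees
         group by department_id);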

Quickie script to run dbv on your database...

This script will generate dbv commands into a shell script and then execute the shell script. run-dbv.sql:
set head off
set lines 200
set feedback off
set define off

-- the generated script name is arbitrary
spool dbv_commands.sh
select 'dbv file='||name||' blocksize='||block_size||
       ' LOGFILE=FILE-'||FILE#||'.LOG' from v$datafile;
spool off

host chmod 755 dbv_commands.sh
spool dbv_results.log
host ./dbv_commands.sh
spool off
Output will be created as separate log files. You can run it and review results like this:
$ sqlplus "/ as sysdba" @run-dbv.sql

Recording Oracle System Stats for historical analysis...

If you are experimenting with gathering system statistics it might be helpful to archive your current settings and any intermediate settings you come up with along the way.

There is a way to save stats to a table created with DBMS_STATS.CREATE_STAT_TABLE and to gather into that table with DBMS_STATS.GATHER_SYSTEM_STATS, but the format is cryptic and it is nice to have the descriptive parameter names tagging along with the data. (In a future post I will cover the CREATE_STAT_TABLE format.)

The current system stats info is held in the sys.aux_stats$ table. Since the format is a little wacky, I came up with the following table to hold stats and the following insert statement to populate it after every gathering of system stats.
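A sketch along those lines; the layout mirrors the sys.aux_stats$ columns (SNAME, PNAME, PVAL1, PVAL2, as of 10g) with a GATHER_DATE column added:

create table system_stats_history (
  gather_date  date,
  sname        varchar2(30),
  pname        varchar2(30),
  pval1        number,
  pval2        varchar2(255)
);

-- run after each gathering of system stats
insert into system_stats_history
  select sysdate, sname, pname, pval1, pval2
    from sys.aux_stats$;
commit;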

Now you can easily query the values of old stats in the SYSTEM_STATS_HISTORY table:
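For example, to see how CPUSPEED has drifted across gatherings (assuming the history table carries the aux_stats$ PNAME/PVAL1 columns plus a gather date):

select gather_date, pval1 cpuspeed
  from system_stats_history
 where pname = 'CPUSPEED'
 order by gather_date;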

Orion IO Test Tool

I ran across Orion in a Kevin Closson blog post. From the OTN site:

"ORION (Oracle I/O Calibration Tool) is a standalone tool for calibrating the I/O performance for storage systems that are intended to be used for Oracle databases. The calibration results are useful for understanding the performance capabilities of a storage system, either to uncover issues that would impact the performance of an Oracle database or to size a new database installation. Since ORION is a standalone tool, the user is not required to create and run an Oracle database."
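Getting started is straightforward: list the devices (or files) to exercise in a .lun file named after the test, then point orion at it. A sketch, where the binary name and the device are placeholders for your platform:

echo /dev/sdc > mytest.lun
./orion -run simple -testname mytest -num_disks 1

The simple run exercises small random and large sequential I/O separately, and results land in summary and CSV files named after the test.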

Taking TKProf to the next level ... TVD$XTAT...

For the past few weeks I have been using the Trivadis eXtended Tracefile Analysis Tool (TVD$XTAT) as a replacement for TKPROF.

TKPROF is available with all Oracle installations, but it has some drawbacks. Early versions of TKPROF do not provide any information about bind variables. It also tends to lose detail because it aggregates data, and when performing analysis it is unwise to make inferences about detailed execution based on aggregates. TVD$XTAT provides aggregates but also provides detail on each SQL statement so you can zero in on the statement of interest.
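Both tools work on an extended SQL trace, and for bind variable detail the trace has to capture binds in the first place. On 10g and up that can be done with DBMS_MONITOR (the SID and serial# values here are placeholders):

exec dbms_monitor.session_trace_enable(session_id => 123, serial_num => 456, waits => TRUE, binds => TRUE);
-- run the workload of interest, then:
exec dbms_monitor.session_trace_disable(session_id => 123, serial_num => 456);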

Old versions of Oracle... don't forget about Oracle eDelivery

I was recently looking for old (not ancient, just old) install media to support a database migration. The collection of download links on OTN usually has only the last two releases. If you need an older version (in this case 9i), hit up the eDelivery site and you should be able to find it.

If you need a much older version (like 8i or, gasp!, 7), I have heard you may be able to obtain it through an SR if you have a current support contract, but luckily I have not had to test that route.

Copying Oracle JVM Permissions

Now that I am working in an environment where I am responsible for more database instances, it seems like data is always moving around. Machines are being retired, platforms are changing, all kinds of fun stuff. I often have to move data using the regular old Oracle import/export utilities. These utilities aren't the best with some of the more esoteric Oracle features, so some 'backfilling' of objects or permissions may be required. Here are some queries that are helpful when moving a schema that has Oracle Java Virtual Machine (JVM) dependencies:
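One approach is to generate the grants from the source database's DBA_JAVA_POLICY view. A sketch; SCOTT is a placeholder schema, and you would review the output before running it on the target:

set heading off feedback off linesize 200
select 'exec dbms_java.grant_permission('''||grantee||''', '''||
       type_name||''', '''||name||''', '''||action||''');'
  from dba_java_policy
 where grantee = 'SCOTT'
   and kind = 'GRANT'
   and enabled = 'ENABLED';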

oracle.sysman.emSDK.emd.comm.OperationException: could not write to targets.xml.bak file

I had some trouble adding a new database to an OEM Grid installation this weekend.

I didn't find too many references to the above error so I wanted to post about it here. Of course Metalink was all doom and gloom. Note 745795.1 says to stop the agent, run fsck on the AGENT_HOME filesystem, then start the agent. After that you should be able to add new targets.

targets.xml lives in AGENT_HOME/sysman/emd. At the OS prompt I changed to that directory and was able to create new files there, but for some reason the above error kept popping up when Grid tried to save a new target.
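The note's recipe, sketched as shell; the mount point and device are placeholders, and fsck generally wants the filesystem unmounted:

emctl stop agent
umount /u01
fsck /dev/sdb1
mount /u01
emctl start agent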

