Monday 29 October 2007

Database auditing

Finding out how your database is performing and what activities took place has traditionally been an historical activity. By that I mean actions against the database have been recorded in the logs, and then later – perhaps the following day – those logs have been examined to find out exactly what happened. The advantage of this approach is that you have a fairly good record of all the activities that occurred, and it doesn’t use up too many valuable MIPS. The downside is that you never know what is happening right now, and your logs may not record enough detail about what did happen.

The alternative is to run trace utilities – and for DB2, for example, there are a number of traces that can be run. The good thing about traces is that they can help to identify where a performance problem is occurring. However, they also have a high CPU overhead. Not that you would, but if you ran DB2’s global trace with all the audit classes started, IBM reckons this would add 100% CPU overhead. Even running just the audit trace classes adds an estimated 5% CPU overhead.
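For illustration, the audit trace on DB2 for z/OS is started and stopped with operator commands like the following; the classes and destination shown here are just one plausible choice, and your own audit requirements would dictate which classes you actually start:

```
-START TRACE(AUDIT) CLASS(*) DEST(SMF)
-STOP TRACE(AUDIT)
```

Starting all the audit classes, as the first command does, is the scenario behind that estimated 5% figure – which is why sites tend to start only the classes their auditors actually need.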

So why are we worried about auditing what’s going on in our database? It’s the growth in regulations. In the USA there’s the Sarbanes-Oxley Act (SOX) and also the Payment Card Industry Data Security Standard (PCI-DSS). Both of these can affect what a company needs to audit. An audit is meant to identify whether procedures are in place, whether they are functioning as required, and whether they are being updated as necessary. In the event that one of these is not happening, the audit should be able to make recommendations for improvement.

It’s also important that database auditing software isn’t controlled by the DBA or anyone else who maintains the database. Pretty obviously, if the DBA were making changes to the data, or browsing records he wasn’t authorized to look at, and he also ran the auditing software, he could remove all record of those activities and no-one would be any the wiser.

So, to summarize, a successful database auditing tool would have to work in real time, not historically. It would have to avoid impacting performance. It would have to comply with the latest regulations. And it would have to be able to audit the actions of the DBA and other super users.

There’s one other characteristic that would be useful. Having identified, in real time, actions that violate corporate policies (like changing the payroll data!), it should then respond with a policy-based action – like raising an alert.
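The idea can be sketched very simply. The event and policy shapes below are hypothetical – real auditing products define their own policy languages – but this minimal Python fragment shows the basic pattern of matching an audit event against a set of policies and raising an alert on a violation:

```python
def check_event(event, policies):
    """Return the alerts raised by one audit event against a set of policies."""
    alerts = []
    for policy in policies:
        # A policy names a table and the actions that are forbidden on it.
        if (event["table"] == policy["table"]
                and event["action"] in policy["forbidden_actions"]):
            alerts.append(
                f"ALERT: {event['user']} performed {event['action']} "
                f"on {event['table']}"
            )
    return alerts

# Example policy: no-one may update or delete payroll data directly.
policies = [{"table": "PAYROLL", "forbidden_actions": {"UPDATE", "DELETE"}}]

# An audit event arriving in real time - even from the DBA himself.
event = {"user": "DBA01", "action": "UPDATE", "table": "PAYROLL"}
print(check_event(event, policies))
```

The point of evaluating each event as it arrives, rather than scanning the logs the next day, is exactly the real-time requirement discussed above.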
