IT security teams and IT auditors typically demand that the database audit trace functionality be enabled. This requirement often collides with real-world IT, because its implementation and management cost is heavily underestimated. To understand the reasons for the high implementation cost of this particular function, together with the security benefits of database audit traces, let's review some background information.


Databases are built to organize, store and retrieve data extremely fast. When IT talks of databases, it typically means a certain type: the relational DBMS (RDBMS) with schema-based data organization. These databases are the backbone of most business applications today and are therefore in focus for IT security, because they hold the raw data of the business.
From early on, database vendors' main focus had to be speed and scalability. Filtering and combining ever larger numbers of database records at rising complexity required every trick available: faster hardware, data caching and indexing, and query, schema and datatype optimization, to name a few.
Database security functions such as fine-grained access control (defining who has read, write and delete access to data records) exist in all major database products. However, the recording of database usage, the audit trace, is typically turned off by default for three reasons:

  1. the performance impact
  2. the potential audit data volume
  3. the usefulness of unfiltered access data

It is not all the database vendors' fault, though. Application vendors often assume that implementing an audit trail at the application layer is enough, and typically fail to provide instructions on how to secure the database layer. No wonder: it is a lot of documentation work, all while supporting multiple database vendors and customized client installations.
Let’s summarize:

Database audit must be tailored to each application's unique data design, the vendor-provided tools are hard to use, and there is typically no information from application vendors on how to audit the database schemas they designed.

The cost of database audit

To highlight the possible negative impact of database audit, here is a quote from Adrian Lane's article Database security best practices: Tuning database audit tools:

Database audit tools have their quirks, and each one of them can perform miserably without taking the time to properly plan for the auditing process. It’s not uncommon for auditing or tracing to cause more than 50% degradation in database performance. That means seemingly simple auditing will end up slowing down the database, filling up table space, collecting too many events and creating problems for yourself in maintenance and report generation.

So, simply turning on the database audit function with a default to record *everything* will not help security; it becomes a problem for database operations and application usability instead. I have seen it happen myself, regardless of whether the database vendor was Oracle, Microsoft or another.

If we can’t have everything, we must filter. This requires somebody to determine which database events involve which risks and are therefore worth recording. Such decisions require detailed knowledge of the database schema, the users, and the expected data access and update patterns. They involve security considerations about the risks for that particular application: examples are evaluating SQL injection risks through web access and reviewing direct end-user database access through data-mining tools. This database audit tuning process also needs revisiting from time to time, especially when database schemas are updated or the application usage changes.
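As an illustration of such filtering, a policy along the following lines records only high-risk changes instead of everything. It is sketched here in Oracle's unified auditing syntax (available from 12c onward); the `hr.payroll` table and the user names are hypothetical:

```sql
-- Illustrative only: audit UPDATE and DELETE on a hypothetical payroll table,
-- instead of recording every statement in the database.
CREATE AUDIT POLICY payroll_changes_pol
  ACTIONS UPDATE ON hr.payroll,
          DELETE ON hr.payroll;

-- Enable the policy, excluding the application's trusted batch account
-- so that routine bulk jobs do not flood the audit trail.
AUDIT POLICY payroll_changes_pol EXCEPT app_batch_user;
```

Even a small policy like this encodes decisions – which tables, which actions, which users to exclude – that only someone familiar with the application can make, which is exactly the tuning effort described here.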

Let’s summarize:

Database audit setup and its tuning, the refinement of which events are relevant, is a manual and repeated process with a learning curve – for each application.

Establishing a security requirement for database audit traces, with their time-consuming setup and tuning process, is therefore a problem for many companies. With the RDBMS being the standard data backend for most business applications, it is typical to have an increasingly large number of databases around. Spending the time to analyse, implement and refine database audit rules for 20, 30 or more applications is a huge, often impossible task.

Addressing the Issue technically?

What about recent attempts to address this issue technically? For example Imperva’s approach with the SecureSphere Database Activity Monitoring product? There, the database auditing task is “outsourced” from the native database to a separate software agent that either “listens” to network traffic or sits inline to the database access.
However, it is a comprehensive software package that creates new complexity: its many tempting features require focusing on databases and on the Imperva product in too much detail. Could the product overshoot by being more than a simple tool that helps address the performance and complexity issues? Is the efficiency gained in dealing with multi-vendor database audit functions wasted on managing an additional, complex application?

Addressing the Issue

The only sure way to address the database audit effort is to set priorities:

  1. Which applications are critical (typically those in regulatory audit scope)?
  2. What degree of audit trail is the minimum required baseline?
  3. How do I ensure audit data is reviewed for security violations?


It is a hard decision where to put the money: into a new third-party software product that requires extra integration work, or into diligently working through the list of prioritized databases using the vendors' native audit functions. Typically, the latter approach is tried first, because it is freely available and the extra cost is hidden.
Spend time on defining a database audit baseline that describes the mandatory events to be recorded, and establish a process to implement this baseline when new applications are rolled out.
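A minimal baseline of this kind could look as follows, sketched in SQL Server's server audit syntax; the file path and audit names are hypothetical, and other vendors offer equivalent mechanisms:

```sql
-- Illustrative baseline: record failed logins and schema changes
-- for every database on the server.
CREATE SERVER AUDIT baseline_audit
  TO FILE (FILEPATH = 'D:\AuditLogs\');

CREATE SERVER AUDIT SPECIFICATION baseline_audit_spec
  FOR SERVER AUDIT baseline_audit
    ADD (FAILED_LOGIN_GROUP),
    ADD (DATABASE_OBJECT_CHANGE_GROUP)
  WITH (STATE = ON);

ALTER SERVER AUDIT baseline_audit WITH (STATE = ON);
```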
Plan time for the database operations team to review audit data and audit filters, and for the learning curve involved in analyzing each application's data design.
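Such a review could start with a simple query against the vendor's audit trail view, for example Oracle's `UNIFIED_AUDIT_TRAIL` (the `HR` schema name is hypothetical):

```sql
-- Illustrative review query: who touched objects in the application
-- schema during the last 24 hours?
SELECT event_timestamp, dbusername, action_name, object_name
  FROM unified_audit_trail
 WHERE object_schema = 'HR'
   AND event_timestamp > SYSTIMESTAMP - INTERVAL '1' DAY
 ORDER BY event_timestamp;
```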