Effective Data Auditing for Regulatory Compliance: Options and Considerations

Auditing can do more than just help you meet a host of new regulations. Last week we outlined several significant business benefits. This week we explore your options. (Second in a two-part series.)

Last week I explained how a data auditing solution can provide benefits beyond simply meeting government regulations. This week I'll explain three of the major approaches to data auditing, and suggest the preferred audit approach—non-trigger tracking at the data source.

Data Auditing Options

Current approaches to data auditing are subject to common pitfalls that can, over time, create compliance risk or increase the cost of complying with business requirements and regulations. The most common approaches include application modification, mid-tier portals, and trigger-based collection at the data source.

Application modification:

This entails changing the source code of every application that might be used to access the data of interest. Each application is changed so that it captures data modification and viewing information and stores it for further processing.

The application modification approach has several implications. First, each application must be modified (or, where that is not possible, as with many legacy applications, replaced). Unfortunately, planning, implementing, and testing these changes is costly and time-consuming, and it is difficult to guarantee complete coverage.

Furthermore, access outside the modified applications (e.g., via a database administrative console) is not captured, leaving coverage incomplete, and changes to permissions and schema cannot be captured by this means.
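
To make the burden concrete, here is a minimal sketch of application-level auditing in Python with SQLite; the employee and audit_log tables and the update_salary function are invented for the example. Every application that modifies the data needs equivalent instrumentation in every relevant code path, and any access that bypasses these applications is never recorded.

    # Minimal sketch of application-level auditing (hypothetical schema and
    # function names). The audit write is extra code that must be added, and
    # kept correct, in every application and code path that touches the data.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, salary REAL);
        CREATE TABLE audit_log (occurred_at TEXT, app_user TEXT, action TEXT, detail TEXT);
        INSERT INTO employee VALUES (1, 'Alice', 50000);
    """)

    def update_salary(app_user, emp_id, new_salary):
        """A business operation instrumented to record who changed what, and when."""
        old = conn.execute("SELECT salary FROM employee WHERE id = ?", (emp_id,)).fetchone()[0]
        conn.execute("UPDATE employee SET salary = ? WHERE id = ?", (new_salary, emp_id))
        # The auditing itself: one more statement the application must not forget.
        conn.execute(
            "INSERT INTO audit_log VALUES (?, ?, ?, ?)",
            (datetime.now(timezone.utc).isoformat(), app_user,
             "UPDATE employee.salary", f"id={emp_id} old={old} new={new_salary}"),
        )
        conn.commit()

    update_salary("jsmith", 1, 52000)
    print(conn.execute("SELECT * FROM audit_log").fetchall())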

Mid-tier portal:

Some application architectures funnel access to data through a shared portal that is responsible for backend access. This portal could be modified to capture and store data access information.

While it avoids modifying individual applications, the portal approach has substantial drawbacks. It works only for portal-enabled applications: access that does not pass through the portal cannot be captured, leaving a gaping back-door vulnerability. Nor can the portal capture changes to permissions and schema.
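
As an illustration only, the following Python sketch shows the portal pattern: a hypothetical audited decorator wraps each of the portal's data-access handlers and writes to an in-memory audit trail (both invented for the example). Only requests routed through the portal layer are recorded; a direct database connection never touches this code.

    # Minimal sketch of capture at a mid-tier portal (hypothetical handler and
    # store). Only requests routed through the portal are recorded; a direct
    # database connection bypasses this layer entirely.
    import functools
    from datetime import datetime, timezone

    AUDIT_TRAIL = []  # stand-in for the portal's audit store

    def audited(action):
        """Decorator applied to each of the portal's data-access handlers."""
        def wrap(handler):
            @functools.wraps(handler)
            def inner(user, *args, **kwargs):
                AUDIT_TRAIL.append({
                    "at": datetime.now(timezone.utc).isoformat(),
                    "user": user,
                    "action": action,
                    "args": args,
                })
                return handler(user, *args, **kwargs)
            return inner
        return wrap

    @audited("view_customer_record")
    def view_customer_record(user, customer_id):
        # ...portal code that queries the backend database goes here...
        return {"customer_id": customer_id}

    view_customer_record("jsmith", 42)
    print(AUDIT_TRAIL)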

Trigger-based collection at the data source:

Most users dread the traditional way of capturing data modifications: triggers (special-purpose application logic) on the database. Triggers have a number of drawbacks (a sketch of a typical audit trigger follows this list):

  • they are often hard to write correctly

  • they add substantial runtime performance overhead (because they execute in line with transactions, reducing throughput)

  • fear of this overhead leads DBAs to minimize the number of modifications recorded or the period over which they are recorded

  • they cannot capture data views or changes to schema and permissions
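
Here is the sketch referred to above: a minimal illustration of the trigger approach using SQLite from Python, with hypothetical employee and employee_audit tables. The trigger body executes inside every modifying transaction, which is exactly where the overhead comes from, and it records nothing about who viewed data or changed permissions or schema.

    # Minimal sketch of trigger-based capture (SQLite syntax, hypothetical
    # tables). The trigger fires inside each modifying transaction (the source
    # of the runtime overhead) and cannot see reads, permission changes, or
    # schema changes.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, salary REAL);
        CREATE TABLE employee_audit (
            changed_at TEXT DEFAULT CURRENT_TIMESTAMP,
            emp_id INTEGER, old_salary REAL, new_salary REAL
        );
        CREATE TRIGGER trg_employee_salary
        AFTER UPDATE OF salary ON employee
        BEGIN
            INSERT INTO employee_audit (emp_id, old_salary, new_salary)
            VALUES (OLD.id, OLD.salary, NEW.salary);
        END;
        INSERT INTO employee VALUES (1, 'Alice', 50000);
    """)

    conn.execute("UPDATE employee SET salary = 52000 WHERE id = 1")
    conn.commit()
    print(conn.execute("SELECT * FROM employee_audit").fetchall())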

A Preferred Audit Approach: Non-Trigger Tracking at the Data Source

In this approach, non-trigger audit agents are associated with each database server containing important data. These audit agents are responsible for harvesting information about data-related activity, and because they operate at the database server, they capture all relevant data activity, regardless of the application used, including direct backdoor access. Applications need not be modified to accommodate this approach.

The audit agent harvests information through two primary means. It can read the database transaction log, which each database maintains in the normal course of its operation, to gather data modifications and other activity. Using the transaction log does not interfere with the timely execution of transactions, because the analysis can be time-shifted or carried out on machines other than the one hosting the target database. In addition, the agent can use the database’s built-in event notification mechanism to obtain additional information, such as permission changes and data viewing activities.
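
The sketch below is purely conceptual and is not how any real product reads a transaction log: actual logs are binary, vendor-specific structures accessed through vendor log-reading or log-mining interfaces, so the text file and parsing here are stand-ins. What it illustrates is the pattern described above: the agent harvests activity out of band, on its own schedule and from its own saved position, without adding any work to the transaction path.

    # Purely conceptual sketch of an out-of-band audit agent. Real transaction
    # logs are binary, vendor-specific structures read through vendor APIs or
    # log-mining interfaces; the plain-text file here is a stand-in used only
    # to show the pattern: harvest after the fact, outside the transaction path.
    import time

    def harvest(log_path, position):
        """Read whatever has been appended to the (stand-in) log since the last pass."""
        with open(log_path, "r", encoding="utf-8") as log:
            log.seek(position)
            new_text = log.read()
            position = log.tell()
        return new_text.splitlines(), position

    def run_agent(log_path, interval_seconds=60):
        """Periodically harvest new records and forward them for central analysis."""
        position = 0
        while True:
            records, position = harvest(log_path, position)
            for record in records:
                # A real agent would decode the vendor's log format here and ship
                # the event to a central, tamper-evident repository for reporting
                # and alerting; printing stands in for that step.
                print("audit event:", record)
            time.sleep(interval_seconds)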

The non-trigger audit agent approach offers maximum coverage without disruption to applications and systems. Unlike application modifications, mid-tier portals, and triggers, this approach is easy and cost-effective to install and maintain.

Conclusion

An effective solution for auditing data activity depends on a sound data-capture capability. The best approach minimizes performance overhead while consolidating a complete audit of data access across multiple servers and providing active monitoring and alerting. A number of approaches may be considered, but many have shortcomings that degrade system performance or require additional technical resources.

In parallel with the development of policies, procedures, and technical requirements for data integrity, it is critical to identify a solution that effectively captures and audits data access. Enterprises considering development and deployment of a data auditing solution should evaluate multiple issues before an approach is selected:

  • Is data-access capture complete, eliminating backdoors through which users (including DBAs) may access data without being detected?

  • What is the cost of deployment, especially across multiple servers?

  • Does the solution use triggers, which affect performance?

  • Does the solution require time-consuming, expensive, and often incomplete application modification?

  • Is it easy to administer and maintain?

  • Is there a single console for configuration and scheduling across multiple database platforms?

  • Is there a common repository and long-term archival support?

  • Does it provide a complete modification history?

  • Does the solution “alert” on critical database changes (schema and permissions)?

  • Does the approach support multiple platforms?

  • Is the approach flexible enough to meet evolving requirements from regulators and business partners?

Enterprise-class database audit software provides the ability to capture a wide range of data-related activity, consolidate and manage this information across multiple servers, review and analyze it in a variety of ways, create reports about the activity at various levels of detail, and send timely notifications about certain kinds of detected activity. The non-trigger audit agent approach offers maximum coverage without disruption to your applications and systems. Prudent organizations are implementing these strong solutions to meet today’s demanding data auditing requirements.

About the Author

As Lumigent's founding CTO, Dr. Mazer co-developed the company's vision and products and helped raise its capital. He has 20 years of experience at early-stage and established companies and is an inventor and expert witness in several software technology areas. Dr. Mazer has led R&D programs for the Defense Advanced Research Projects Agency (DARPA), OSF, and Digital Equipment Corporation. He received his Ph.D. in computer science from the University of Toronto.
