Improving data quality without breaking the bank


Today, most corporations are tightening their corporate purse strings. In an attempt to conserve funds while still reaching stated revenue goals, businesses are turning to their existing customer bases to generate incremental revenue. This is a logical step, as the cost of cross-selling additional products and services to existing customers is far lower than the cost of attracting new ones. It also puts tremendous pressure on the accuracy of those customer databases.

As one-to-one relationship marketing becomes the norm, databases are expanding at exponential rates. In fact, corporate databases double in size roughly every six to nine months. These two factors alone should be sufficient stimulus for businesses to get their databases in order in a cost-effective manner. The average business database contains a staggering amount of bad data: 15% to 40% of its records. Taken at the midpoint of that range, that means roughly one in four pieces of marketing material mailed is worthless.

Take a look at upfront costs first. The right data-quality tool will save a company a considerable amount of money throughout the marketing process, so the pivotal element in the decision-making process is the software. The question is, which software? A sound database is the foundation for all subsequent business activities. Opting for the “Wal-Mart route” in the software selection process, while initially inexpensive, might not be the most prudent choice. Data-quality software varies greatly, and gaps in a bargain product will almost certainly surface later, making it even more costly on the back end. So don’t be penny-wise and pound-foolish when selecting data-quality software. Select the solution that best addresses your specific needs in both the short and the long term.

Once you’ve decided on the software, several strategies can further reduce the cost of building and maintaining a fully optimized database.

A company demanding the absolute highest level of security can install the software behind the corporate firewall on its own servers. While the company will ultimately realize savings in the long term with an on-premises solution, other related costs need to be factored into the equation, such as head count, capital expenditures, power and insurance. Some of those costs, such as additional head count, can be offset with no compromise in quality by outsourcing database management to a partner.

The most cost-effective method, however, is to host your database at a “hardened” remote facility and access the data via a SaaS (software as a service) solution. In this scenario, virtually all capital expenditure is eliminated, and cost is instead calculated on a usage basis. This hosted approach, sometimes called the ASP (application service provider) model, is becoming more attractive to companies because it offers high security combined with low overhead.
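To see how usage-based pricing can come out ahead, here is a rough back-of-the-envelope comparison. This is a minimal sketch; every dollar figure and rate in it is a hypothetical assumption for illustration, not a number from this article, and should be replaced with your own estimates.

```python
# Hypothetical, illustrative cost comparison: on-premises vs. hosted SaaS.
# None of these dollar amounts come from the article; adjust to your situation.

# On-premises: up-front capital expenditure plus fixed annual overhead.
onprem_capex = 250_000          # servers, licenses, installation (assumed)
onprem_annual_fixed = 180_000   # head count, power, insurance, maintenance (assumed)

# Hosted SaaS: no capex; pay per record processed (assumed rate).
records_per_year = 40_000_000   # e.g., a 10 million-record base touched quarterly
saas_rate_per_record = 0.004    # assumed usage-based price per record

years = 3
onprem_total = onprem_capex + onprem_annual_fixed * years
saas_total = saas_rate_per_record * records_per_year * years

print(f"3-year on-premises cost: ${onprem_total:,.0f}")
print(f"3-year hosted SaaS cost: ${saas_total:,.0f}")
```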

Once the front end is fixed, the real savings begin. Assume a company’s 10 million-record database contains 15% bad data. The company uses the database to conduct synchronized direct-response mail and one-to-one relationship marketing campaigns, touching the entire customer base quarterly. That’s an annual total of 40 million pieces of printed material at an average unit cost of roughly $1.05 each (labor, material and postage). Shaving 10 percentage points off the bad-data rate would translate into gross savings exceeding $4 million, on top of the savings already realized on the front end.
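The math behind that figure is straightforward. A minimal sketch of the calculation, using only the numbers cited above (10 million records, quarterly touches, $1.05 per piece, bad data reduced from 15% to 5%):

```python
# Annual waste avoided by improving data quality, using the figures above.
records = 10_000_000        # customer records in the database
touches_per_year = 4        # quarterly campaigns reach the full base
cost_per_piece = 1.05       # labor, material and postage per mail piece

pieces_per_year = records * touches_per_year          # 40 million pieces
bad_rate_before = 0.15                                # 15% bad data
bad_rate_after = 0.05                                 # after shaving 10 points

wasted_before = pieces_per_year * bad_rate_before * cost_per_piece
wasted_after = pieces_per_year * bad_rate_after * cost_per_piece
gross_savings = wasted_before - wasted_after

print(f"Annual gross savings: ${gross_savings:,.0f}")  # $4,200,000
```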

When viewed in the abstract, improving data quality without breaking the bank isn’t so much about money spent for data quality as it is about money saved with data quality.

Bob Orf is president-CEO of DataMentors Inc., a data-quality and business intelligence analytics software development company.
