Ask any direct marketer to pinpoint what makes a successful campaign and the likely answer will be the quality of the data. It sounds simple in theory, but in practice it's a different story, if recent research is anything to go by. A survey from data services provider QAS revealed that the UK is the worst-performing country when it comes to data quality, with the retail and financial services sectors named as the biggest culprits.
What's more, in separate findings, only eight per cent of the UK organisations polled validate the information they collect, while 34 per cent admit that they fail to validate any of the data they gather, despite recognising the negative impact this has on their businesses, brands and budgets.
More than half of those surveyed said they validated address and postcode information prior to putting it on their systems, but admitted that most other customer contact information is stored unchecked.
Evidence of such poor data quality is inevitably putting a strain on the data-cleaning sector, as increasingly inaccurate name and address information becomes harder to match against a base source of data. Martin Doyle, managing director of data-cleaning software provider DQ Global, believes the biggest challenge facing the sector is a general apathy and an abdication of responsibility for data quality.
"Clearly there is something wrong when two out of 100 responses to a marketing campaign is deemed acceptable - this equates to a 98 per cent failure rate," he says.
Public confidence in data security, too, is at an all-time low, fuelled by a spate of data protection failures by government organisations such as HMRC and the Ministry of Defence. The potential impact of these security lapses on data quality should not be underestimated, and the fear is that a growing number of consumers will be unwilling to part with personal details, or at least will become more wary of doing so.
"Recent security issues have contributed to consumers' reluctance to part with their personal data, making it increasingly difficult to locate volumes of quality data," says Richard Lees, chairman of data-cleaning supplier Database Group. "As an industry, finding a reliable, quality resource that can be replenished has become a real challenge."
Poor data quality is also taking its toll on the environment, contributing to an increase in postal waste. While many acknowledge that the data-cleaning sector is playing an important role in helping businesses meet the DM industry's recycling targets, others say more companies need to recognise the importance of investing in data quality.
Data integrity
Steve Day, managing director of data bureau UKChanges, says: "Data cleansing and management - from address quality and completeness, through suppression and enhancement and on to targeting and analysis - must be fundamental processes for all communication campaigns if the DM industry is to challenge its reputation as a polluter."
So, with data-cleaning specialists having greater difficulty in substantiating the accuracy of names, how are they rising to the challenge? Some providers argue that the best way to tackle poor data quality is to make sure data is correct at the point of entry.
"If data is accurately captured and complies with a set of business rules before entering a database, it is much more likely to meet data-quality standards," says Colin Rickard, managing director of data-quality provider DataFlux. "Such rules act as a firewall around a real-time, operational CRM system and flag up any inconsistencies, problems or duplicates before the information is allowed into the wider system."
The problem, however, appears to be defining what these data-quality standards are. The Data Protection Act requires data owners to keep their databases accurate, but does not specify how this should be done. It's not surprising, then, that different techniques are used by clients and suppliers to assess levels of data quality, meaning a 'one-size-fits-all' formula is virtually impossible to achieve and maintain (see box, page 28).
Clients, meanwhile, believe that data-cleaning suppliers need to demonstrate greater honesty in how often and how robustly they clean their data.
"Data quality is utterly transparent: a client can run data through industry suppression files themselves and identify exactly how many records are picked up in each," says one financial services direct marketing manager. "To develop a sustainable client relationship, it is not in the interests of suppliers to bump saleable volumes up through a failure to cleanse regularly - it's a false economy."
There are also calls for greater validation, both at the original point of capture and at ongoing intervals, as well as proof of data accuracy.
"Cleansing is not just about ad-hoc or periodic hygiene exercises," says Mark Chipperfield, head of data management at BBC TV Licensing. "More validation at the original point of capture and regular ongoing validation, with a source that can be proven, needs to be improved upon."
Chipperfield adds that commercial data vendors should publish their data-quality measures to demonstrate what they are doing to maintain and improve the quality of the data they supply. "The cost equation must now include considerations of the environmental impact of wastage (from dirty data) as well as basic return on investment and reputation. All these are now closely aligned."
Improved quality
The suppression market plays a pivotal role as data quality continues to come under the spotlight, but do existing products go far enough? In January, the Supply of Information (Register of Deaths) Regulations came into force, enabling death registration data to be used for the prevention of fraud and to protect consumers against identity theft. The DMA says it will continue to lobby for this data to be used for general suppression purposes (see box, left).
For the moment at least, the industry consensus on suppression files is that they do offer good coverage. With a plethora of products available, however, and the same information packaged up and sold in different ways, it can be difficult for clients to make the right choice.
"There are too many suppression products for end users in the marketplace," says Richard Webster, group communications director at data owner DLG. "This is causing confusion for clients, who may end up using them all, even if they don't actually need to."
Jon Cano-Lopez, managing director of bureau and data services provider Ai, says that accuracy figures on suppression files should be made publicly available to help clients decide which products are best for them.
"With the number of products available, clients occasionally do wonder about their levels of precision," he says. "Accuracy figures should be published - no file is going to be 100 per cent correct, but if it could prove it was 98 or 99 per cent accurate, this would certainly help."
Some clients believe that keeping in regular touch with customers is often the best way of maintaining accurate records and suppressing the names of those who do not want to be contacted. "Brands need to keep in contact with customers and make it easy for them to see and update the data you hold on them," says BBC TV Licensing's Chipperfield. "Make it an easy process and ideally offer them some incentive. Poorly directed communications impact on reputation, and that includes not just names and addresses, but what you are assuming about a customer's or prospect's profile and previous transactions."
With the spotlight on data quality likely to remain for some time yet, it's clearly an issue that few in the data-cleaning industry can afford to ignore. A code of practice with defined and recommended data-quality standards may well give both clients and suppliers some degree of comfort, as well as improving the overall reputation of the sector, but putting this into practice remains a grey area.
What is clear is that growing consumer awareness makes DM data quality a 'must-have' rather than a 'nice-to-have'. With Electoral Roll opt-outs on the increase, resulting in nearly 40 per cent of voters choosing not to have their details available for commercial purposes, and a growing number of consumers unwilling to share their personal details with third parties, there's little doubt that those in the business of data cleaning have their work cut out for them.
POWER POINTS
- Poor quality data is a problem in the UK, fuelled by firms passing the buck and general apathy
- A number of high-profile data breaches have dented public confidence in the security of personal data
- Both suppliers and clients are looking at ways of improving data, including the setting up of minimum quality standards
THE PUBLIC DEATH REGISTER: BEST FOR SUPPRESSION?
NO - Mark Roy, chief executive, The REaD Group
"On the face of it, using publicly available information on registered deaths for suppression looks very appealing - receiving mail for someone who has died can be very distressing and it makes the DM industry look dumb, uncaring and insensitive.
But what happens if the data falls into the wrong hands? If fraudsters were able, for example, to access the data, crack encryption codes and assume the identity of deceased people, it would be manna from heaven for them. Such misuse of data would have a severe impact on the suppression industry.
Furthermore, the Public Death Register itself isn't an absolute set of data - around 21 per cent of all deaths, for example, are registered with a coroner and not all the data is fed through as quickly as it can be.
Recent legislation that came into force enabling the Public Death Register to be used for list-cleaning purposes as a means to prevent fraud is welcome but it should not be used for suppression purposes. There are perfectly good products available for that function."
YES - Rosemary Smith, chairwoman, DMA and managing director, RSA Direct
"Public death registration data should be released for suppression purposes - it's what the DMA has been lobbying for over 20 years. It is as near to an absolute data set as you can get - in terms of its reach it's the holy grail. Any other solution would be incomplete.
The restrictions that have been applied since the law came into force (stipulating that the Register can only be used to prevent fraud) are disappointing and there's also a very steep price tag attached, which rules out all but the biggest users, such as financial services providers.
But it is a great step forward and the DMA will seek to determine the degree to which the restrictions are immovable - there may be another change in law to permit the data being used for wider purposes. There are suppression products on the market that are good and which do their job well, but it's a shame that death registration data cannot be used in a more liberal environment."
A STANDARD FOR DATA QUALITY - PIPEDREAM OR REALITY?
There's little doubt that both clients and data cleaning practitioners are in favour of a standard for data quality. Yet the recent demise of the Business List Audit, a scheme created by the DMA to tackle the high rates of decay in business-to-business data, will have done little to instil confidence that standards in data quality can be achieved.
The scheme folded in January, with the DMA citing a lack of support as the main cause. A mere 12 business-to-business data suppliers signed up to the audit out of an estimated 250 providers, despite promotion through advertising, events and PR.
Clients in the business-to-consumer space are heavily reliant on the Postcode Address File (PAF). The comprehensive postcode data file does, however, have its flaws. "PAF doesn't elaborate on a number of areas, such as new builds and blocks of flats," says the head of database management at one large broadcaster. "A universal standard would be the ideal."
A one-size-fits-all methodology has been mooted in the past but is a non-starter, according to Steve Day, managing director of data bureau UK Changes. "The DMA has looked at benchmarking performance among suppliers. Unfortunately the varying nature of project types, data uses, collection methods and the order in which processes are done have made it extremely difficult to establish and maintain a relevant grading scale," he says.
Many companies also employ procedures to remove joke names and swear words from their mailers, which can produce entirely different results among companies that undertake this process.
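As a purely illustrative sketch of why results diverge, assume two companies screen the same records against different blocklists; the names and lists below are invented for the example.

```python
# Illustrative only: two companies screening the same records against different
# blocklists end up with different mailable volumes. Names and lists are invented.

def screen(records, blocklist):
    """Keep only records whose name contains no blocked word."""
    kept = []
    for name in records:
        words = set(name.lower().split())
        if not words & blocklist:
            kept.append(name)
    return kept

records = ["Donald Duck", "Mary Berry", "Mickey Mouse"]

company_a = {"duck", "mouse"}   # stricter list
company_b = {"mouse"}           # narrower list

print(len(screen(records, company_a)))  # 1 record survives
print(len(screen(records, company_b)))  # 2 records survive
```

The same input yields different mailable volumes depending on the list applied, which is exactly why a single benchmark is hard to agree on.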
The difficulty of ensuring compliance among the myriad of data owners is seen as another barrier to developing a standard for data quality.
But perhaps the core reason why a data quality standard remains an unrealistic prospect is that every database would need to have a common structure and format.
"A standard could be difficult to implement due to the differing techniques that companies or their data cleaning bureaux employ," says Dale Pilling, general manager at data services company Blueberry Wave.
"If there were standards imposed surrounding the quality of a delivery address, it would be difficult to define these, as address verification software packages have their own definition of whether or not a record is actually mailable, making an acceptable standard hard to define in the first place."