Market research refuseniks are a very real problem. Growing pressure on consumers' time, a lack of trust in those who commission studies and mounting cynicism toward marketing in general mean that the number of people prepared to take part in surveys is dwindling.
Clients, meanwhile, have their own problems: they need swifter and more cost-effective solutions. So it's hardly surprising - given rising web penetration and reduced telecoms costs - that online research is rapidly gathering converts.
Industry bible Inside Research has been tracking the rise of online market research spend for eight years in the US and six in Europe. It predicts growth of 18% this year in both regions, to £515m in the US and £73m in Europe.
Global expenditure on all forms of online research is expected to hit £674m this year, up 20% on 2003.
Consumers seem to react more favourably to this type of survey. Control and convenience play a part - respondents can choose to opt in when they want to take part and opt out when they don't.
As for clients, some regard online panels as nothing more than enhanced online databases. But there are certain features that set them apart.
'They are groups of respondents who have been selected, and the relationship is managed on a daily basis,' says Martin Oxley, a director at TNS Interactive Europe. 'They are opt-in respondents who are willing to undertake research.'
There are two types of panel. Access panels provide samples for questionnaire-style information rather than metered data. Online panellists are invited by email to take part, with a link to the web survey. Proprietary panels, meanwhile, are set up or commissioned by a client firm and usually consist of customers of that company. Many are managed as online panels.
Doug Read, executive director at Metro, has been using online panels for the past 18 months. The paper's 'Urban Life' survey has proved valuable on several levels. It has been particularly useful as a sales support tool, allowing Read to offer existing or potential clients selected access to 4100 predominantly ABC1 white-collar adults aged 18-44 - a notoriously difficult segment to survey. The panels are also useful as story generators, with those who take part promised that their responses will be reported in the paper.
Every month, 'Urban Life' respondents can win a £1000 prize or one of 300 HMV vouchers. Response rates of between 80% and 95% have convinced Read of the panel's value. 'This is a combination of how enthusiastic people are to take part and what we do to maintain this enthusiasm,' he says.
The best online panels make respondents feel they belong to a club where their input can make a difference. Unlike their offline counterparts, these panels cover a broad range of topics rather than focusing on just one area. They often stem from a number of different sources, including internet service providers and mailings. 'They are scalable,' says Oxley, 'whereas using a single source means group sizes are limited.'
These panels are more durable and often more responsive. But are they representative? This issue is worrying the industry so much that Esomar, the European market research body, has set up an international working party to draft an additional section on online panels for its guidelines on conducting marketing and opinion research. This, it hopes, will help to identify properly recruited and maintained panels.
John O'Brien, a member of Esomar's professional standards board, is chairing the party. He has two main concerns. 'One is whether as an industry we treat respondents properly and don't abuse their willingness to participate. The other is whether we are using scientifically sound methods and are sure that statistical samples are not misrepresented,' he says.
According to O'Brien, some members of the party would like to see monitoring of quality control standards. 'But Esomar is not minded to audit, qualify or check,' he adds. The organisation can deal with breaches of its code, but it has to be notified of misdemeanours, as it doesn't have the manpower or budget to police the industry on its own. As a result, market research companies need self-regulatory procedures in place to ensure accurate and usable results.
Nunwood Consulting is working for a major retail company that wants greater insight into customer spending patterns and motivations. It has more than 200 people involved in its online panel at any one time. 'Best practice can be ensured by asking attitudinal and demographic questions,' says Clare Bruce, Nunwood's managing director. 'We ask panel members to answer questions on topics unrelated to the subject to guarantee we get the right people. Once we have secured those with a range of attitudes, we conduct the research.'
Survey junkies
Further problems can arise in trying to discriminate between good and bad panels. As Dr Tamsin Addison, head of research at RSM Robson Rhodes Business Consulting, points out: 'We are often struggling to ensure that the sample not only contains the right type of quota respondents, but also avoids the internet survey junkie.'
The issue of representation has been studied by Research International, along with sister companies Millward Brown and online panel operation Lightspeed Online Research - all units of WPP's Kantar group. They have been investigating how online users differ from the general population.
'If we're talking about chocolate or detergents, there is no difference,' says David Walker, a director at Research International. 'But if we look at technology categories, such as use of digital cameras or the internet, then online users cannot represent use on a national basis.'
Clients may, however, want to speak only to sub-groups of respondents who have access to this sort of technology. 'Here,' Walker continues, 'online research offers an extremely effective method of access'. This type of research offers a direct route to many niche audiences - but not all. For example, over-50s web users, or 'silver surfers', are still not sufficiently numerous or representative in this country, though in the US there is anecdotal evidence that healthcare websites have become more popular than online pornography, due to the number of elderly visitors.
Such limitations can present problems. 'We recently had to interview men with erectile problems,' says Terry Prue, a senior partner at HPI Research. 'The web offers anonymity, but no panel could provide a representative sample.'
Multiple sources
Recruitment is the key issue. Andrew Cayton, senior vice-president for business development at Lightspeed, agrees that it is important not to recruit from a single source. He uses multiple partners, testing them to assess how successful they are in bringing people into a panel and how much these consumers participate. Panellists, meanwhile, are screened extensively, with the ensuing information securely captured and kept confidential.
Panel management is also important. If panel members have a different demographic profile to the overall population, 'this bias is corrected by data modelling such as quota controls and weighting on the final data', says Paul Milsom, a director at BMRB International.
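BMRB does not publish its modelling approach, but a minimal sketch of the kind of correction Milsom describes - weighting responses so that the panel's profile matches known population proportions - might look like the following. All group names, proportions and survey figures are hypothetical.

```python
# Minimal sketch of demographic weighting to correct panel bias.
# All proportions and survey figures below are hypothetical.

# Known population profile (e.g. from census data) and the panel's own profile.
population = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}
panel      = {"18-34": 0.45, "35-54": 0.40, "55+": 0.15}  # panel skews young

# Weight each group by how under- or over-represented it is on the panel.
weights = {group: population[group] / panel[group] for group in population}

# Hypothetical survey result: share of each age group saying they would buy a product.
would_buy = {"18-34": 0.60, "35-54": 0.50, "55+": 0.30}

unweighted = sum(panel[g] * would_buy[g] for g in panel)
weighted   = sum(panel[g] * weights[g] * would_buy[g] for g in panel)

print(f"Unweighted estimate: {unweighted:.1%}")  # skewed by the young-heavy panel
print(f"Weighted estimate:   {weighted:.1%}")    # rebalanced to the population profile
```

In this illustration the raw panel overstates purchase intent because younger respondents are over-represented; weighting pulls the estimate back towards what the population profile implies.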
It all goes to show that a light, but thorough, touch is needed to ensure quality research online. Even the bad press and new regulations surrounding spam won't necessarily affect online panels.
'Typically these are households or individuals recruited on the basis that they give their permission for survey-related mail,' explains Lucy Green, international marketing and communications director of Nielsen//NetRatings.
Online panels may have their limitations - they'll never be as effective for qualitative or new-product work where visual contact is vital - but they do fit the lifestyle of today's consumers. Not to mention the budgets of clients.
CASE STUDY - NORWICH UNION
Mike Harmer, head of customer insight at Norwich Union, admits that he had qualms when he first started using online panels. As he points out, these methodologies are in their infancy, and as a researcher he needed proof.
Norwich Union is unlikely to abandon traditional studies into customer satisfaction and product testing, but it is always looking for new routes to deliver its requirements. Online panels have given the insurance company a quick and effective way of conducting customer research and opening up a new communication channel.
'We wanted a two-way dialogue, which would ultimately help us to understand our customers better,' he says. 'We also wanted a strategic understanding of their current needs and the way insurance can provide security and peace of mind in their lives.'
In the past, such studies were conducted offline. But the telephone surveys used were prone to flaws and could take a long time to deliver results.
Online research is cheaper and can mean real-time results.
In addition, the 500-strong access panel, recruited and managed by Lightspeed, delivers accurate and actionable responses, so that ensuing decisions rest on a sound foundation.
'Because our main survey demands a general understanding of customer needs, we don't need any specialised respondents,' continues Harmer. 'But I know Lightspeed can find different segments and I can see how that may be useful in future research.'
Online panels have won him over in the past six months. 'We are identifying ways in which we can provide great insight into the business, quickly and efficiently,' he says.
ONLINE PANELS - PROS AND CONS
PROS
1. Clients and data analysts can see results being compiled in real time.
2. Online surveys save time and money compared with face-to-face interviews.
3. Consumers welcome surveys that they can fill in when they want, and often need no incentive to do so.
4. A more relaxed environment leads to better-quality, honest and reasoned responses.
5. Panellist background data allows immediate access to key target audiences unrestricted by geography.
CONS
1. Online panels' demographic profile can differ from that of the general population.
2. If questionnaires take longer than 20 minutes to fill in, quality can suffer or they might not be fully completed.
3. Poor recruitment and badly managed panels can damage the data.
4. Technical problems such as browser incompatibility can mean panellists give up.
5. Programming costs are higher than for offline questionnaires.