
Know Your Audience: chapter 8
Mail surveys

Sometimes the most appropriate way to do a survey is by mail. If most of the following conditions apply, a mail survey could be the best type to use:

It's generally not worthwhile to do a mail survey with the general public. Most people simply won't answer, so you won't be able to determine how representative your results are. But when small specialized populations are to be surveyed, mail surveys can be very effective.

The biggest problem with mail surveys is a low response rate. In my experience, the minimum response rate for producing valid results is about 60%, but many mail surveys achieve less than 30% return. To overcome this, you need to make it easier and more rewarding for people to respond.

Most of this chapter also applies to other types of self-completion questionnaires - such as those distributed at events (see the chapter on event surveys), and questionnaires left by interviewers for respondents to fill in, to be collected when the interviewer returns.

Making it easy

People who are willing to return a mail questionnaire may not get around to doing so without some prompting. For this reason it's normal to offer respondents some encouragement to mail their questionnaires back.

1. Include a return envelope

The first method of encouragement is an easy way to get the questionnaire back: a business reply or freepost envelope, addressed to the originator of the survey. Freepost licences are easy to obtain (in most countries), and the only other costs are for printing the envelopes (in a format strictly specified by the postal authority). In Australia, freepost letters cost 2 cents for each letter returned, on top of the 45 cents postage. In several other countries, the proportions are much the same. The freepost charge is paid only when an envelope goes through the mail, so if you send out 1000 questionnaires and only get 500 back, you are only charged for 500.

If you put stamps on all the return envelopes, this normally produces a slightly higher response rate than freepost (because some people don't like a stamp to be wasted), but it will cost you a lot more than using freepost.
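If you want to compare the two approaches before deciding, the arithmetic is simple enough to sketch. Here is a rough calculation in Python, using the Australian figures quoted above and assuming, purely for illustration, 1000 questionnaires sent out and a 50% return rate:

# Rough cost comparison for return postage: stamps on every envelope
# versus a freepost licence. The figures (45c postage, 2c freepost fee
# per returned letter, 50% return rate) are illustrative only.
sent = 1000
returned = 500
postage = 0.45        # dollars per letter
freepost_fee = 0.02   # extra fee per returned freepost letter

stamped_cost = sent * postage                        # you pay for every envelope, returned or not
freepost_cost = returned * (postage + freepost_fee)  # you pay only for envelopes that come back

print(f"Stamped return envelopes:  ${stamped_cost:.2f}")
print(f"Freepost return envelopes: ${freepost_cost:.2f}")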

Another freepost method is for respondents to supply their own envelopes and address them to the freepost address themselves. This saves them only the cost of a stamp. However, it is much more convenient to include a return envelope with the questionnaire - otherwise respondents use envelopes of many shapes and sizes, and the time spent unfolding and flattening the questionnaires can cost more than the envelopes would have. Also, if you supply a pre-addressed freepost envelope with the questionnaire, this is one less reason for respondents to delay returning their questionnaires.

2. Give a deadline

The second form of encouragement seems trivial, but we have found it to be surprisingly effective. Simply print, near the beginning of the questionnaire, something like this:

=====================================================

Please try to return this questionnaire within 7 days

=====================================================

The shorter the request, the better it seems to work. Though some people ignore such appeals, many take notice of this one, and it will help you get more questionnaires back, sooner.

3. Offer an incentive

Surveys that don't use interviewers tend to have much lower response rates than surveys where the interviewer speaks to the respondent. It's much easier to ignore a questionnaire that comes in the mail than to ignore a real interviewer. Therefore, mail surveys need to use incentives, to boost the response rate. There are two types of incentive, which I call psychological and financial.

A psychological incentive is a way of making people feel good about filling in a questionnaire - e.g. "If you like to use our products, please help us improve them by completing this questionnaire."

A financial incentive is money or goods given to the respondent. This can be in two forms: either every respondent is given a small gift, or every respondent is given what amounts to a lottery ticket, and a chance to win a large prize. In some countries, the lottery incentive is illegal. In others, a special licence must be obtained from the authorities - which may take months.

After experimenting with incentives of various types and sizes, I have reached two conclusions:

1. A small chance of winning a large amount works better than the certainty of a small amount. It's also much less work to give out one large prize than lots of tiny ones.

Judging the size of the incentive is something of an art: if the incentive is too small, many people won't bother to respond. But if you're offering a huge prize for one lucky respondent, some respondents will send in multiple questionnaires - probably not with identical answers, some even answered at random - simply to try to win the prize. Once I worked on a project with a prize of a holiday in Europe. When we sorted the file in order of respondents' addresses, we found that some had made multiple entries with slightly different versions of their address. In other households, improbably large numbers of people had returned questionnaires - and some of their names looked suspiciously like cats and dogs!
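If you suspect this kind of multiple entry, a quick computer check can save a lot of manual sorting. The sketch below, in Python, groups returned questionnaires by a roughly normalised address and flags households with an improbable number of entries; the file name, column names, and threshold are all hypothetical and would need adapting to your own data.

# Sketch: flag households that may have returned suspiciously many questionnaires.
# Assumes a CSV of returned questionnaires with hypothetical columns
# "name" and "address"; adjust the names and the threshold to suit your data.
import csv
import re
from collections import defaultdict

def normalise(address):
    """Lower-case, strip punctuation and extra spaces, so near-identical
    versions of the same address group together."""
    address = address.lower()
    address = re.sub(r"[^a-z0-9 ]", "", address)
    return re.sub(r"\s+", " ", address).strip()

entries = defaultdict(list)
with open("returns.csv", newline="") as f:
    for row in csv.DictReader(f):
        entries[normalise(row["address"])].append(row["name"])

THRESHOLD = 3  # more respondents than this at one address is worth a closer look
for address, names in sorted(entries.items()):
    if len(names) > THRESHOLD:
        print(f"{len(names)} entries from '{address}': {', '.join(names)}")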

Very large prizes produce diminishing returns: when the prize is doubled, the response rate rises only a few percent. An amount that I've found effective is the value of about a day's wages for the average respondent. For most people, this would be a pleasant amount to win, but would not make them distort their answers.

Offering several small prizes doesn't work as well as one large prize - unless respondents can clearly see that they have a higher chance of winning one of the smaller prizes - for example, one prize in each village surveyed.

Take care that the prize offered is something which won't discourage potential respondents who already have one. A poorly chosen prize can affect the accuracy of a survey. For example, a survey in the late 1980s set out to measure the musical taste of the Australian population. A compact disc player was offered as a prize. As the people who were most interested in music probably owned a CD player already, the survey would have underestimated Australians' interest in music.

In wealthy countries, the most effective kinds of prizes are often small luxury items that cannot be stored up. Vouchers for restaurant meals are often very effective. An incentive that worked well in a survey of rich businessmen was the chance to meet others of their kind: we offered a meal in a high-priced restaurant for 10 respondents. They were so keen to meet each other that some even offered to pay for their meals if they didn't win a prize!

Don't begrudge spending money on rewards: the cost is usually more than recovered, because fewer questionnaires need to be printed and mailed out to achieve the same number of responses.

2. It's best to use two different kinds of reward at the same time: psychological incentives as well as financial ones. By psychological incentives, I mean reasoned appeals to complete a questionnaire. These arguments can appeal to either self-interest or philanthropy - sometimes both. For example:

Because people who don't use a service much also tend not to respond to surveys about it, it's a good idea to add another emotional appeal, perhaps something like this:

Another type of psychological incentive is a promise to tell respondents about the survey's results. This is simplest to fulfil when you have regular contact with respondents, and notifying them of results could be as simple as putting an article in their next newsletter. One point in favour of this type of incentive is that it can work with people who are unaffected by other types of incentive.

Psychological incentives work well with regular users of a media service, but non-users don't usually have any emotional attachment. Non-users usually respond well to financial incentives, but regular users respond little to financial incentives - unless the prize is very large. That's why using both types of incentive will produce a better balanced sample than using only one type.

Questionnaire design for mail surveys

Compared with questionnaires for telephone surveys (which have to be read and completed only by interviewers), self-completion questionnaires need much more care taken with their design.

Before having a questionnaire printed and mailed out, it's essential to test it thoroughly with 5 to 10 respondents: some old, some young, some not very bright. Don't use your office staff, or your friends or relatives: they know too much about your intentions. Go to strangers, who have never heard of your survey before - but if the questionnaire is only for listeners to your station, obviously the strangers should also be listeners. Sit beside them while they fill in the draft version, and ask them to think aloud.

You need to make sure that:

You'll find that even after you have produced many mail questionnaires, occasional problems still occur. Having 5 to 10 testers fill in your questionnaire is an excellent way of improving the survey - as long as you take notice of any problems they have, and any misunderstandings. If the testing results in extensive changes to the questionnaire, find a new lot of testers - 5 is usually enough, the second time.

Tell them exactly how to answer

In every question, make it clear to respondents how many answers are expected, using instructions like these - placed after the question, but before the possible answers:

Please tick one box.

Please tick all answers that apply to you.

Please give at least one answer, but no more than three.

Use a consistent method for answering. If you ask respondents to answer some questions by ticking, and others by circling code numbers, you'll find that some put ticks where they should have circled codes — and vice versa. To make it easy for most respondents, use ticks wherever possible.

In some countries (including Australia, Japan, and Ethiopia) a tick means "Yes" and a cross means "No." In other countries, such as the USA, a cross means "This applies" - which is almost the same as Yes. People from countries where a cross means No can get confused and give the opposite answers to the questionnaire writer's intention - so before printing a questionnaire for a country you don't know well, be certain which convention applies there.

One method I've used that works surprisingly well is to ask respondents to write in code numbers. Most of them write legible numbers, and this greatly speeds up the data entry. When you have a series of questions, all with the same set of possible answers, this method works well. Here's an example.

Here are some questions about programs on FM99. Please give your opinion of each program by writing one of these numbers on the line after its name:
  1 if you like it very much
  2 if you like it a little
  3 if you don't like it
  4 if you have no opinion about it
Program 1 ___
Program 2 ___
Program 3 ___

If this isn't clear to your questionnaire testers, you can add an example, before the first program name, e.g.

For example, if you have never heard of Program 2, write 4 on the line, like this:

Program 2    4

The problem with giving examples in the questionnaire is that sometimes this encourages people to give the answer shown in the example. For that reason, it's best to choose some sort of non-answer (e.g. "does not apply") for the example - and only add an example if some testers don't understand the question.

A common mistake is to use two sets of scales that run in opposite directions. In the above example, an answer code of 1 meant "like it very much". What would happen if that set of questions was followed by this one?

These questions are about different radio stations. Please rate each station on a scale of 1 to 10, where 1 means you never listen, and 10 means you listen whenever possible.

After answering the previous set of questions, respondents may have forgotten the meanings attached to the answer codes, but they will remember that 1 was the best answer. Now they find a set of questions where 1 is the worst answer. This is guaranteed to confuse people! You can see it happening when respondents cross out their first answers and write a reversed set of numbers beside them. Though some respondents realize they've made a mistake, others probably don't. So take care to keep the answer scales running in the same direction - the numbers themselves are quite arbitrary.

Also, avoid getting people to rank items, e.g.

Please write a number beside each announcer, to show how much you prefer that announcer. Put 1 beside the one you like most, 2 beside the one you like second best, and so on, down to 10 for the one you like least.

Respondents hate this! It's very easy to forget one item, and then have to change all the rest to fit in. The more items there are to be ranked, the more difficult it is. Though most people might be able to give their top few and bottom few choices, the middle ranks are often arbitrary. It's much better to use a rating scale - e.g. 1 = like very much, 2 = like a little, 3 = don't like... and so on. With a rating scale, respondents need to make only one judgement at a time, without looking at all the other items.

The problem with rating scales is that some people give different patterns of rating. Women often give higher ratings than men do, and older people are more lenient than young people. But do women and older people really like most things more than men do - or are they simply more generous when filling in questionnaires? Of course, it's impossible to say; so you might as well use the rating scale, because it's easier for respondents. (Always remember the importance of high response rates in mail surveys.)

Open-ended questions

Leave plenty of space for open-ended answers. Respondents find it annoying to squeeze an answer into too small a space, and valuable information can be lost. Using dotted lines (or other faint lines) for respondents to write on helps make the comments more readable. Space these lines about 6mm apart vertically. For most people, this is not quite enough for comfort, and it seems to force them to write more legibly than if more space is allowed.

There's also a danger of leaving too much space for open-ended answers. This encourages some respondents to write at great length - and usually miss the point of the question. Three or four dotted lines is usually plenty.

Multi-column layouts

If you're trying to reduce the number of pages in a questionnaire, you'll notice that a lot of space is wasted on questions with a long list of short answer choices. For example:

Which of these age groups are you in?
Please tick one box.
[ ] 10-14
[ ] 15-19
[ ] 20-24
[ ] 25-34
[ ] 35-44
[ ] 45-54
[ ] 55-64
[ ] 65 or over
[ ] Can't remember

That's 11 lines: a large proportion of a page. You can save space by putting several answers on one line. But do it like this:

Which of these age groups are you in? Please tick one box.
[ ] 10-14 [ ] 35-44
[ ] 15-19 [ ] 45-54
[ ] 20-24 [ ] 55-64
[ ] 25-34 [ ] 65 or over
  ↗       [ ] Can't remember

Note the arrow at the foot of the first column: it is used to draw the respondent's eye upwards and to the right. This doesn't matter with a question like age group - because people know their age and will be looking for the group, but for other types of question they might not notice some of the possible answers. If all a question's answers are different lengths, you'll probably have to use a single line for each - don't change the number of columns within a single question. Too confusing!

Don't lay a question out like this:

Which of these age groups are you in? Please tick one box.

[ ] 10-14 [ ] 15-19 [ ] 20-24 [ ] 25-34 [ ] 35-44 [ ] 45-54 [ ] 55-64 [ ] 65 or over [ ] Can't remember

You save a little space, but it's hard to read. There's too much danger of somebody ticking the box on the right of the age group instead of the box on the left.

Code numbers

In the above examples, I didn't include code numbers for each possible answer. In the single-line layout, adding code numbers would have made it even messier, but the two-column layout can take code numbers without losing clarity. I recommend that answer codes always be printed on questionnaires; this will ensure more accurate data entry. Using a regular 2-column layout, the code numbers would look like this:

Which of these age groups are you in?
Please tick one box.
1 [ ] 10-14      5 [ ] 35-44
2 [ ] 15-19      6 [ ] 45-54
3 [ ] 20-24      7 [ ] 55-64
4 [ ] 25-34      8 [ ] 65 or over
  ↗              9 [ ] Can't remember

The code numbers don't need to be large, and they can be in a different typeface from the rest of the questionnaire. As long as the layout looks clear, respondents hardly notice code numbers.

Multi-page questionnaires

If the questionnaire is printed on both sides of a single piece of paper, some people won't notice the second side. You might think that one way to avoid this problem is to have a half-finished question at the foot of page 1, and to continue it at the top of page 2 - but I've tried this, and it doesn't work well. When a question is on one page and its possible answers on another, some people don't bother answering it. The best method to encourage people to turn the page is to put "More..." below the last question on a page - but don't put this word in a corner where it can be overlooked. Have it in bold type, immediately after the preceding question - not out near the right-hand margin.

Missing a page is only a problem with 2-page questionnaires. When a questionnaire has more than 2 pages, respondents can tell from its thickness that there are more pages after the first.

Order of questions

Make sure the first few questions are interesting, and easy to answer. Multiple-choice questions asking about people's opinions are often good to put at the beginning. If the questions seem interesting enough, and not difficult, many people will begin to complete the questionnaire as soon as they receive it.

Open-ended questions shouldn't be too near the beginning, unless the answers will need very little thinking. If people have to think too hard about an answer, they'll often put the questionnaire away to fill in later - and perhaps never return to it.

If there are any questions which some people might be reluctant to answer, or might find difficult, put them somewhere near the middle of the questionnaire - but preferably not at the top of a page. Some people read questionnaires from the back to the front, while others look first at the top of a page. With a self-completion questionnaire (unlike a spoken interview) you have no control over the order in which respondents read the questions.

Make sure the return address is printed on the questionnaire, so that a respondent who has lost the business reply envelope is still able to return the questionnaire. This address should go at the end of the questionnaire: as soon as the respondent finishes filling it in, the address is immediately visible.

Style of presentation

The standard of design and printing of the questionnaire conveys a message to respondents. The questionnaire should look thoroughly professional, but not so lavishly produced that respondents will assume it is advertising material and throw it away unread. To help readers with poor eyesight, or reading in poor light, the type should be at least 10 points, and there should be strong contrast between the ink and the paper colour. Using high-quality type and coated paper can compensate for small lettering. A long questionnaire should be made to appear shorter and easier to fill in than it really is.

The writing style of a questionnaire is important to get right. It shouldn't be boring or official in tone, but nor should it be so flippant as to encourage incorrect answers. The best style seems to be more of a spoken style, rather than a written one. A mail questionnaire should sound like spoken conversation when read aloud.

And remember that you're writing a questionnaire, not a form. Write questions, not commands. A typical official form, designed to save paper, is often ambiguous. What do you make of this "question" - which I recently saw in an amateur questionnaire:

Children ......

What should you answer?

If the question you want answered is "How many children under 16 live in your household?" write exactly that. Saving a little space doesn't compensate for wrong answers.

Another common error is "Name …….."

Name of what? It will surprise you how many sorts of names people will write on mail questionnaires. If you mean "Your name," why not say so? Most of these problems can be eliminated by pilot testing, if you ask the testers to try and deliberately misunderstand the questions.

Identifying questionnaires

If you already have some data about each respondent, which you want to combine with answers from his or her questionnaire, you will need to identify who returned each questionnaire. The usual way of doing this is to print or write a serial number on each questionnaire before sending it out. You can buy stamping machines, resembling date stamps, which automatically increase the number every time you stamp them.

Another way is to stick a name and address label on each questionnaire, and use window envelopes for sending questionnaires out, so that the same label serves two purposes. When you do this, test the placement of the label on the questionnaire, and where the paper is folded - otherwise part of the name and address may not be visible through the window in the envelope.

For some types of question, respondents may not answer honestly if they know their responses are not anonymous. We have found that it's safe to encourage people to obliterate their name and/or serial number if they are worried about anonymity: usually less than 2% take up this offer. It's also a good idea to mention your privacy policy on the questionnaire, e.g. "We will not give out your personal details to any other organization."

There can be problems with unidentified questionnaires. For example, somebody with strong opinions might photocopy 100 questionnaires and return them all. But if each questionnaire has a unique number, this kind of copying is easily discovered. Also, anonymous questionnaires are generally not filled in as carefully as numbered ones. For highly sensitive questions, anonymity is desirable, but then mail surveys may not be the best method.

An important reason for numbering questionnaires is that, without identification, any reminders become much less effective and more expensive. If every questionnaire is numbered, you can check it off when it is returned, and send a reminder letter to people who have not returned their questionnaires. When questionnaires are not numbered, you will need to send reminder letters to the whole sample, even though most people may have already returned questionnaires.

The covering letter

Don't post out a questionnaire with no explanation of why you're sending it. You'll get a better response if you include a covering letter, supplying this information:

It helps if the covering letter is signed by a person the respondents have heard of. For example, when members of an organization are surveyed, the letter could be signed by the organization's president.

Research has found it makes almost no difference to the response rate if the covering letter is personally signed (as opposed to a printed signature) or personally addressed (as opposed to "Dear club member", or whatever). As both of these personal touches create a lot of extra effort, they might as well be avoided.

Printing and posting questionnaires

Mail surveys can involve handling formidable amounts of paper. When sending out a typical questionnaire, each outgoing envelope usually contains three items: the questionnaire, the covering letter, and a freepost or business reply envelope to return the completed questionnaire in. Attention paid to details - such as the exact way in which questionnaires are folded - can save a lot of time later.

If a questionnaire is printed on A4 size paper, and folded into three equal parts vertically (to become 99 by 210 mm), it will fit easily into the largest size envelope which still attracts postage at the basic rate: 120 by 237 mm. (This applies in Australia, but many other countries' postal regulations are similar.) The reply envelope should be intermediate in size between these two: small enough to fit in the outgoing envelope without folding, but large enough that the completed questionnaire can go in it without re-folding. We use 110 by 220 mm freepost envelopes, which are just right in this situation.

Packaging and posting out questionnaires can be a lot of effort. If you have plenty of helpers but little money, you can call for volunteers to do the mailing, and create a production line: one person numbering the questionnaires, one folding, one packing the envelopes, and so on. Fewer mistakes are made this way.

If money is no problem, there's a much easier method: use a mailing house - a company which organizes mail advertising. All you need to do, when you use a mailing house, is supply one master copy of the questionnaire and one covering letter, plus a list of recipients' names and addresses. The mailing house will print the questionnaires and covering letters, do the packing (often using machines), and even post it all out for you. The cost of the printing and enveloping is usually about half the cost of the outgoing postage.

Reminder letters

To get a decent response rate in mail surveys, some system of reminders is usually essential. It's best to prepare for reminders from the start: they need to be budgeted for, and ready to go at the right moment.

To know when to send a reminder, keep a record of the number of completed questionnaires arriving in the mail each day. At some point, usually about two weeks after the initial mailing (depending on the efficiency of your country's postal system), the daily number will start to fall away markedly. Now is the time to send out reminders.

There are two methods of sending reminders. If the questionnaires are not individually identified (e.g. numbered), you have no way of knowing who has sent one back and who hasn't. Therefore you have to send a reminder letter to everybody in the sample. For example:

We posted a questionnaire to you on 31 May. If you haven't already filled it in and returned it, please do so by 30 June, so that your answers can be included in the survey results. If you have lost the questionnaire, please ring us on 826-953, and we'll send you another copy.

If the questionnaires are identified, it's easier and cheaper. You can send a specific letter only to the people from whom you haven't yet received a completed questionnaire:

We posted a questionnaire to you on 31 May, but we haven't yet received it back from you. Please fill it in and return it by 30 June, so that your answers can be included in the survey results. If you have lost the questionnaire, please ring us on 826-953, and we'll send you another copy. If you have already sent it in, we thank you.

Notice the difference between this and the earlier reminder: "we haven't yet received it back from you" - much more specific (and more effective) than the more loosely worded reminder above. But to be able to do this, you must keep track of which questionnaires have come back, and which haven't.

One of the easiest ways to do this is to produce a large sheet of paper with every questionnaire number on it. You can easily fit 500 numbers on one piece of A4 paper: 10 numbers across (if they are short), and 50 lines down. When each questionnaire is returned, cross out its number. At any time, the numbers not crossed out are the ones which need reminders.

Alternatively, you can use a computer database program - such as Epi Info or FileMaker Pro. If you are already using a database to print the names and addresses, it's easy to add a few more fields: the date the questionnaire was returned, the date of the first reminder, the date of the second reminder, the date a replacement questionnaire was sent, and so on. However, if you don't already have a database, it's quicker to cross the numbers off a piece of paper.
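If you do decide to keep the record on a computer, the handful of fields described above is easy to set up in almost any database. Here is a minimal sketch using SQLite from Python - the table and field names are my own invention, not the layout of Epi Info or FileMaker Pro:

# Sketch: a minimal return-tracking table, roughly equivalent to the paper
# master sheet. Field names are illustrative only.
import sqlite3

con = sqlite3.connect("mail_survey.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS respondents (
        questionnaire_no INTEGER PRIMARY KEY,
        name             TEXT,
        address          TEXT,
        date_returned    TEXT,   -- NULL until the questionnaire comes back
        date_reminder_1  TEXT,
        date_reminder_2  TEXT,
        date_replacement TEXT    -- date a replacement questionnaire was sent
    )
""")

def mark_returned(questionnaire_no, date):
    """Record that a numbered questionnaire has come back."""
    con.execute("UPDATE respondents SET date_returned = ? WHERE questionnaire_no = ?",
                (date, questionnaire_no))
    con.commit()

def needs_reminder():
    """Questionnaire numbers with nothing returned yet."""
    rows = con.execute("SELECT questionnaire_no FROM respondents WHERE date_returned IS NULL")
    return [r[0] for r in rows]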

If you have phone numbers of most potential respondents, and enough helpers available, a telephone reminder will work more quickly than a mailed one.

Several weeks later, there may still be a lot of questionnaires outstanding, so it could be time for a second reminder. At this stage, the numbers involved are relatively small, and the chances are the non-respondents have lost their questionnaires and/or return envelopes. So it's usual to send out a new questionnaire and return envelope with the second reminder.

A reminder questionnaire normally has the same identification number (or name) as the original one, but there should be some way of distinguishing it from the original questionnaire. For example, the reminder questionnaire could be printed on a different colour of paper, or have a suffix added to the identification number. This prevents any person who returns two questionnaires from being counted twice.

Each successive reminder produces fewer and fewer responses, so at some stage you will need to give up — or, if you're very keen to get responses, to try a telephone call, or a visit to the household, or some other non-mail means of contact.

Receiving completed questionnaires

At the other end of a survey, when the completed questionnaires start arriving in the mail, mass paper-handling is again required.

The stages are:

1. Count the number of questionnaires coming back each day.

2. Open the envelopes, and unfold the questionnaires. This sounds too trivial to mention, but it can take many hours if the sample is large. If you delay this stage, you won't be able to send out reminder letters on time.

3. If the questionnaires are numbered, check off each one - e.g. cross its number off the master list, showing that this respondent won't need a reminder letter. If you are using a database program, enter the details of the questionnaires that have just arrived.

4. Type the answers from each completed questionnaire into the computer program you are using for data entry. You could leave this stage till all the questionnaires have come in - but the sooner you do it, the sooner you'll find any problems which need to be fixed.

5. Store the questionnaires in numerical order. If some respondents have obliterated or cut off their questionnaire numbers, give these questionnaires a new number, above the highest number sent out. For example, if the highest numbered questionnaire was 1000, assign new numbers starting from 2000. Usually, no more than about 3% of respondents obliterate their numbers, even if they are told they can do so. If this possibility is not mentioned in the covering letter, less than 1% obliterate the numbers.

At some stage you have to call an end to the survey, and process the results. A principle I've found useful is to stop when the response rate increases by less than 1% in a week. Of course, if there is some external reason for a deadline, you may have to stop processing newly arrived questionnaires long before the full response is reached.
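That stopping rule is easy to automate if you are already counting returns. Here is a small sketch in Python, with entirely made-up figures, that applies the "less than 1% gain in a week" rule to cumulative weekly totals:

# Sketch: decide when to close the survey, using the rule of thumb that
# it can stop once the response rate rises by less than 1 percentage
# point in a week. The figures below are hypothetical.
mailed_out = 1000
weekly_totals = [180, 420, 550, 598, 605]  # cumulative returns at the end of each week

for week in range(1, len(weekly_totals)):
    gain = (weekly_totals[week] - weekly_totals[week - 1]) / mailed_out * 100
    print(f"Week {week + 1}: response rate {weekly_totals[week] / mailed_out:.0%}, "
          f"gained {gain:.1f} points")
    if gain < 1.0:
        print("Gain under 1 point this week - time to close off the survey.")
        break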

Mail surveys: summary

The mail survey technique is one of the easiest ways of getting valid information, but only when all the conditions are right, when questionnaires are clear and easy to follow, and when there are enough incentives to produce a satisfactory response rate. Unless the response rate is over 60%, the information from the survey will probably be useless, because you won't know how different the non-respondents are. It's therefore vital to make it as easy as possible for respondents to complete and return the questionnaires, and to follow up the non-respondents with reminder letters.

Fax surveys

I've recently had good success doing surveys by fax. Of course, this is only feasible when nearly the whole surveyed population has fax machines, but most businesses in developed countries can now receive faxes.

If you have a computer with fax software and mail-merge software, you don't even need to print out the questionnaires. You can prepare the questionnaire with a word processing program, set up a mailing list of recipients' fax numbers, and send all the questionnaires from your computer. If you then make another copy of the mailing list, and delete entries from it whenever a questionnaire is returned, you can use this shorter mailing list to send reminders. As a short fax is (in most countries) cheaper than sending a letter, it's economic to send quite a lot of reminders, thus increasing response rates.
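Keeping that shrinking reminder list accurate is the only fiddly part. Here is a minimal sketch in Python, assuming the fax numbers are kept in plain text files, one number per line (the file names are made up):

# Sketch: maintain a reminder list of fax numbers. Start it as a copy of
# the full mailing list, then remove each number as soon as that
# questionnaire comes back; whatever is left is exactly who still needs
# a reminder. File names ("all_fax_numbers.txt", "reminder_list.txt")
# are hypothetical.

def load_numbers(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def save_numbers(path, numbers):
    with open(path, "w") as f:
        f.write("\n".join(sorted(numbers)) + "\n")

# Once, at the start of the survey:
# save_numbers("reminder_list.txt", load_numbers("all_fax_numbers.txt"))

def record_return(fax_number):
    """Cross one returned respondent off the reminder list."""
    remaining = load_numbers("reminder_list.txt")
    remaining.discard(fax_number.strip())
    save_numbers("reminder_list.txt", remaining)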

Some people get annoyed when unwanted faxes use up their paper, so these surveys should be kept short: preferably one page. As it's then very easy for somebody to fill in the questionnaire, and fax it straight back to you, the responses for fax surveys come back much more quickly than for mail surveys.

Perhaps this is because fax surveys are a novelty. If they become common, the response rates could quickly become very low indeed. The secret is to keep the questionnaires short and interesting, and to keep sending reminders.