
Clear & Simple: Developing Effective Print Materials for Low-Literate Readers

  • Updated: 02/27/2003

Step 5: Pretest and Revise Draft Materials

Pretesting is a qualitative measure of audience response to a product. As previous sections have suggested, it is critical to pretest draft messages and visuals with members of the intended audience. Pretesting helps ensure that materials are well understood, responsive to audience needs and concerns, and culturally sensitive. Leonard and Cecilia Doak discuss pretesting, which they call "learner verification," as a tool for identifying elements in a communication that need to be changed to make it more appropriate for the intended audience.

Although funds may not be available for extensive pretesting, some pretesting is essential to ensure that materials are culturally relevant and understandable to the target audience.

Pretesting is an important step in developing materials for any audience, and references such as Making Health Communication Programs Work explain general principles and provide "how-to" information. The focus of this section, however, is the special requirements for pretesting low-literacy materials.

Pretesting Variables: What To Test For

According to Leonard and Cecilia Doak, the basic purpose of pretesting materials with low-literate audiences is to find out whether the readers understand your messages. In addition to comprehension, three other factors are important: audience attraction to the product, its acceptability to them, and their personal involvement with the material. In discussing materials with participants, key issues to probe for include the following:


Does the respondent understand what the material is recommending and how and when to do it? Is anything unclear, confusing, or hard to believe? What meaning does the respondent attach to key words? To symbols and abbreviations? To visuals? Important aspects include:
  • Suitability of the words used: "What does the educational piece mean when it says to eat balanced meals? How do you do that?"
  • Distinguishing key details: "Which vegetables have lots of fiber?"
  • Meaning or relationship of visuals to text: "Looking at this picture, how will you cut down on fat in your soups or stocks when cooking?"


What kind of feelings does the material generate: enthusiasm? Just OK? A "turnoff"? For example, "Are the people in the material attractive to you? Is there anything you don't like about the people (or pictures) in this material? How about the color and the layout of the material?"


Is the material compatible with local culture? Realistic? Would it offend people in any way? Are the hairstyles, clothing, etc., appropriate?
  • Suitability for both sexes and all ages: "Is this mostly for men (women)?"
  • Supportive of ethnic practices: "Do you think your friends and neighbors would be willing to cook their foods this way?"


Is the material personally relevant? Can the respondent see himself or herself carrying out the actions called for in the materials?
The Doaks also stress that not every word, picture, or idea has to be tested. The most important points to verify are that the reader understands what he/she should do and how to do it. Comprehension often depends on understanding fundamental vocabulary terms and visuals. For example, the word "diet" may be understood by different people to mean different things; applying the wrong definition could lead to the wrong behavior. When key content is conveyed in word/picture association or tables/lists, it is critical to test comprehension of these formats and to ensure that the reader has the skills needed to use them.

Organizing a Pretest

Pretesting with low-literate audiences involves some unique logistical considerations. This section discusses some issues to keep in mind while organizing your pretest.

When to Pretest

Pretest at either or both of these stages of product development:
  • Rough draft of copy and graphic concepts using a manuscript of the product, with a few examples of potential illustrations.
  • Preliminary typeset, laid-out version of the product with rough graphics in place.

Where to Pretest

  • Clinic/hospital waiting rooms.
  • Doctor's office.
  • Client's home.
  • Community facilities (e.g., church, senior center).
  • Adult Basic Education and English as a Second Language classes.
  • Agency/organization's facilities (e.g., Social Security office, WIC center, job training centers).


Some researchers believe it is best to test a product in the same type of environment in which a reader will be using the material. For example, if a patient will be reading a factsheet in a noisy, busy clinic, be sure that test readers have the same distractions.


How to Pretest

  • Individual interviews (10-20 minutes each)
Advantages: Provide the "cleanest" results; less chance of one respondent biasing another; good for short materials.
Disadvantages: Scheduling may not allow speedy completion.
  • Group interviews (8 to 10 people, 30-60 minutes)
Advantages: Better for longer pieces (booklets, kits); group discussion may elicit valuable information not contained in interview questions.
Disadvantages: Need to organize in advance; require a trained group facilitator to conduct. Session must be paced to maintain attention.

Number of Participants

Ideally, at least 25 to 50 members of the intended audience should review your product. If time or budget is limited, however, it is far better to test the material with 5 to 10 people than not to test it at all.
You can feel comfortable with fewer participants if the product is:
  • Very simple.
  • Very short, needing only a brief attention span to complete.
  • Directed at one homogeneous target group.
You need more participants if the product is relatively:
  • Complex.
  • Long, demanding a sustained attention span.
  • Culturally diverse.

Determining the Reading Level of Pretest Participants

Are you pretesting your materials with the right readers? Having pretest participants who share the characteristics of the low-literate audience you are trying to reach is critical to the validity of your pretest results. Recruiting participants through groups or settings that include people with limited literacy skills is a logical starting point. But the only way to be sure your pretest volunteers read at the same level as your intended audience is to test their reading skills. The Wide Range Achievement Test (WRAT) is used to measure reading levels, and the Cloze technique is used to measure comprehension.

To avoid offending or causing discomfort to those whose reading ability you are testing, you can integrate a WRAT or a Cloze test into the pretest interview. For example, in a recent pretest conducted by the National Cancer Institute, the interviewers introduced the WRAT as the last part of the pretest. They stated, "Thank you for helping with the questions on the chemotherapy booklet. We need your help with one last part: a word list. This will take only a few minutes. The word list will help us know how difficult the words are in the chemotherapy booklet." This integrated approach spared participants the pressure or potential embarrassment of "failing a reading test."

The Wide Range Achievement Test

The Wide Range Achievement Test is based on word recognition and does not measure comprehension or vocabulary. An efficient way to determine reading levels, it takes only a short time to administer.

The WRAT has the reader look at a written word and say that word out loud. Pronouncing the word correctly shows that the reader recognizes the word. The WRAT focuses on recognition because, at the most basic level, if a person does not recognize a word, comprehension is impossible.

The test involves listening to the participant read from a prepared list of words, arranged in increasing order of difficulty (see sample list below). The test is over after the reader mispronounces ten words, and the test administrator notes the level at which the last mispronunciation occurred. The "stop" level equates to a grade level of reading skills. You can compare this level with the reading level of your intended audience to see if your pretest readers are a good representative "match." The Low Literacy Publications Section at the end of this guide provides contact information for obtaining WRAT testing instruments.
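The stop rule described above is mechanical enough to sketch in code. The Python sketch below is illustrative only; the function name and the (word, pronounced-correctly) data shape are assumptions for this example, not part of the WRAT materials, and converting the stopping point to a grade level still requires the published WRAT norms.

```python
def wrat_stop_position(responses, max_errors=10):
    """Apply the WRAT-style stop rule to a graded word list.

    `responses` is a list of (word, pronounced_correctly) pairs in the
    order the list presents them (easiest first). Testing stops after
    `max_errors` mispronunciations. Returns the 1-based position of the
    last mispronounced word (0 if the reader missed no words), which the
    administrator then maps to a grade-level equivalent via the WRAT norms.
    """
    errors = 0
    last_miss = 0
    for position, (word, correct) in enumerate(responses, start=1):
        if not correct:
            errors += 1
            last_miss = position
            if errors == max_errors:
                break  # stop rule reached; remaining words are not tested
    return last_miss
```

In practice the administrator simply follows the printed list and counts misses; the sketch just makes the stopping logic explicit.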

The Cloze Technique

This procedure measures the reader's ability to comprehend a written passage. Because it requires readers to process information, it may take up to 30 minutes to administer.

In a Cloze test, text appears with every fifth word omitted. The reader tries to fill in the blanks. This task demonstrates how well he/she understands the text. The reader's ability to identify the correct word also reflects his/her familiarity with sentence structure.
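Because a Cloze passage is built by a simple rule, blanking every fifth word, the construction can be sketched in a few lines. The Python sketch below is a hypothetical helper (the function name and sample passage are made up for illustration); see the Doaks' book, cited below, for how to build and score a real Cloze test.

```python
def make_cloze(text, gap=5):
    """Blank out every `gap`-th word of a passage for a Cloze test.

    Returns the gapped passage and the answer key (the omitted words,
    in order), against which the reader's fill-ins are scored.
    """
    gapped, answers = [], []
    for i, word in enumerate(text.split(), start=1):
        if i % gap == 0:
            answers.append(word)
            gapped.append("_____")
        else:
            gapped.append(word)
    return " ".join(gapped), answers

passage = ("Eat plenty of vegetables and fruits every day. "
           "They are good sources of fiber and vitamins.")
gapped_text, answer_key = make_cloze(passage)
```

The reader's completed passage is then compared word for word against the answer key.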

While packaged Cloze tests are available, Leonard and Cecilia Doak's Teaching Patients With Low Literacy Skills explains how to make up and score a Cloze test yourself, based on the materials you are pretesting. Their book also discusses use of the WRAT to assess reading levels.

Conducting Pretest Interviews: Special Considerations

Just as effective low-literacy products are designed to meet audience needs, effective pretesting also must be tailored to participants with limited reading skills. Those experienced in pretesting with a low-literate audience offer the following suggestions:
  • Give only one product to each individual or group even if you are testing a series of publications. This ensures the respondent's maximum concentration level for each product.
  • Distance yourself from the product and assure participants that you want their honest assessment. "If you don't," one pretester explains, "this audience is more likely than higher level readers to avoid criticizing. Perhaps it's because they aren't sure they understand or perhaps they aren't comfortable explaining negative responses."
  • Make sure participants understand that they are not being tested, the materials are. Explain this clearly when you introduce the materials and the participant's role. People concerned about their literacy skills need this reassurance.
  • Pay attention to audience interest in the proceedings. Change the pace, if necessary, to maintain attention levels. When attention was waning in a large group discussing product text, for example, the facilitator split the group in half to allow more one-to-one interaction and switched the focus to illustrations.
  • Choose recruiters and interviewers who are culturally sensitive and have good social skills. Unless potential participants feel at ease with the interviewing staff, they may not agree to take part.

Using Pretest Results To Revise Materials

Pretest results, usually in the form of a report with recommendations, are not a blueprint for revisions. They raise issues and point out problems; the solutions are up to you. Common issues about low-literacy products include the following:

Should the product be scrapped and begun again when pretesting raises concerns?

Much of the time, simple to extensive revisions can "fix" problems that pretesting uncovers. You should start over only when the majority of responses indicate fundamental problems, such as:
  • Most readers were completely lost and could not identify the key behavioral actions the material was designed to convey.
  • The medium interferes with the message (e.g., a large "chart" poster is too complicated to understand).
  • The format is altogether unappealing or off target (e.g., readers do not want a booklet, they want a video; a tip sheet gives only "how-to" information, but readers really want background information).
  • The product is culturally inappropriate, with no relevance to the audience.

If one or only a few respondents raise a concern, are revisions necessary?

This is a judgment call. Your answer will depend on:
  • How many respondents you had. If you tested the material with only 10 people, each response needs to be considered seriously; with 50 participants, percentages can be relied on more comfortably.
  • The nature of the comment. Any remark that shows lack of understanding of a key concept should receive careful attention. If one person did not grasp a point, others may have problems as well. Idiosyncratic comments about a product's appeal or personal relevance are less of a concern, however. No format will please everyone in any audience.

Should respondents' suggestions for revising a product be followed?

Pretest participants are "experts" in what they can understand and accept in a product; they are not experts in materials' design. Most of the time, professional judgment is needed to devise an effective way to address reader concerns.

Many writers like to attend pretest group sessions or read interview questionnaires themselves rather than relying only on the pretest report. To get the most useful results, writers often contribute to questionnaire development as well. That way, they can draw special attention to specific wording or format choices they want an audience to validate.

Questions You May Have About Pretesting

My timeframe is very tight. How can I fit pretesting into the schedule?

Pretesting need not be an elaborate, time-consuming research project. How long it takes depends on how quickly you work, how elaborate your internal review process is, how large your test group sample is, and the schedules of the sites or organizations involved. The best approach is to include time for pretesting in the initial product development schedule. If time is short, plan a small-scale test and begin the preparatory steps right away. Much of the preliminary work in pretesting can be completed long before the actual testing period. For example, arranging with a group or site to get access to audience representatives can begin during concept development. You can begin to draft your questionnaire while a product is still going through internal review, with any needed modifications made later.
"At first I had to educate my organization about the critical role pretesting plays in low-literacy product development," one writer remembers. "Now that everyone has seen how valuable it is, pretesting is no longer a 'refinement' we consider using in special circumstances. We make time for it in every low-literacy product development process."

How can I get access to people from the target audience for pretesting?

Many agencies work with the literacy organizations listed in the Resources section of this guide. State directors of Adult Basic Education and English as a Second Language programs often are good sources.

You also can cooperate with groups that represent your particular target audience. In pretesting booklets about HIV and AIDS, for example, the National Institute for Allergy and Infectious Diseases (NIAID) worked with organizations such as People With AIDS and community clinics that are part of NIAID's research network. Several professionals interviewed for this guide suggested that offering incentives, such as small cash payments or reduced waiting time for an appointment, can help motivate individual participation. Overworked and underfunded literacy programs also are more able to provide timely assistance if your agency can compensate them for their recruiting time.

Professional intermediaries distribute our pamphlets to their clients. Should they be part of pretesting?

Professional intermediaries and advisory groups can be an important source of insights about your product's utility. Product developers interviewed for this guide cautioned, however, that responses of audience members should take precedence over professional views if you receive conflicting opinions. It also is important not to rely on professional review alone; no matter how close professionals are to their clients, they are not low-literate themselves.