Usability Testing for Survey Research

Authors: Emily Geisen, Jennifer Romano Bergstrom

Language: English

Keywords

Accuracy; Accuracy efficiency; Agile analysis approach; Assessment testing; Attention; Cognitive testing; Concurrent; Conditional probes; Confusion; Constructed frame; Context of use; Coverage error; Debriefing; Detailed analysis approach; Dry run; Ease of use; Efficiency; Existing frame; Exploratory testing; Eye tracking; Fixation count; Fixation duration; Formative testing; Goals; Implicit data; In-the-field; Iterative testing; Key components of usability; Laboratory; Learnability; Logging observations; Measurement error; Measures; Mental model; Mobile sled; Moderating; Moderator's guide; Navigation; Neutral feedback; Nonresponse error; Note-taker; Note-takers; Observational data; Observers; Paper prototype; Participant incentives; Participant selection; Performance measures; Pilot testing; Pretesting; Prioritizing findings; Probe; Probing; Product; Prompt; Qualitative; Qualitative data; Quantitative; Quantitative data; Recall; Recruitment; Remote; Respondent-survey interaction; Response formation model; Retrospective; Saccades; Sample size; Sampling error; Satisfaction; Satisficing; Scenario; Screen recording; Screen sharing; Screening criteria; Script; Scripted probes; Self-report data; Spontaneous probe; Summative testing; Survey feedback; Task; Testing focus; Think-aloud; Usability; Usability metrics; Usability model for surveys; Usability testing; Usability testing continuum; Users; Validation testing; Verification testing; Visual design and layout; Wireframe

Usability Testing for Survey Research provides researchers with a guide to the tools necessary to evaluate, test, and modify surveys iteratively during the survey pretesting process. It includes examples that apply usability testing to any type of survey at any stage of development, along with tactics for tailoring usability testing to meet budget and scheduling constraints.

The book's authors distill their experience to provide tips on how usability testing can be applied to paper surveys, mixed-mode surveys, interviewer-administered tools, and additional products.

Readers will gain an understanding of usability and usability testing and why they are needed for survey research; guidance on how to design and conduct usability tests and how to analyze and report findings; ideas for tailoring usability testing to meet budget and schedule constraints; and knowledge of how to apply usability testing to other survey-related products, such as project websites and interviewer-administered tools.

Contents

1. Usability and Usability Testing
2. Respondent–Survey Interaction
3. Adding Usability Testing to the Survey Process
4. Planning for Usability Testing
5. Developing the Usability Testing Protocol
6. Think Aloud and Verbal-Probing Techniques
7. Conducting Usability Sessions
8. Analyzing and Reporting Results

Emily Geisen is the manager of RTI's cognitive/usability laboratory and specializes in designing and evaluating survey instruments to improve data quality and reduce respondent burden. In addition, Ms. Geisen teaches a graduate course on Questionnaire Design at the University of North Carolina (UNC), Chapel Hill. In her tenure at RTI, she has conducted hundreds of usability tests on a variety of projects, from the Survey of Graduate Students and Postdoctorates to the 2020 Census questionnaires. She was the 2010 conference chair for the Southern Association for Public Opinion Research (SAPOR) and the 2009–2011 secretary of the Survey Research Methods Section of the American Statistical Association. She is the 2016–18 American Association for Public Opinion Research (AAPOR) Membership and Chapter Relations Communications sub-chair. Ms. Geisen developed a short course on Usability Testing for Survey Researchers that has been taught at the 2011 annual SAPOR conference; the 2016 AAPOR annual conference; the 2016 International Conference on Questionnaire Design, Development, Evaluation and Testing; and UNC's Odum Institute. She also teaches an Introduction to Focus Groups course at the Odum Institute. Ms. Geisen received her B.A. in Psychology and Statistics from Mount Holyoke College and her M.S. in Survey Methodology in 2004 from the University of Michigan's Program in Survey Methodology, where she was an Angus Campbell fellow. While attending the University of Michigan, she also worked at the Institute for Social Research.
Jennifer Romano Bergstrom has over a decade of experience planning, conducting, and managing user-centered research projects. At Facebook, she leads user experience (UX) research for Privacy and Safety Check (previously Facebook Lite and Videos). She leads, conducts, and manages UX studies across multiple teams simultaneously and collaborates across disciplines to understand the user experience. Jen specializes in experimental design and implicit learning.

  • Explains how to design and conduct usability tests and analyze and report the findings
  • Includes examples of how to conduct usability testing on any type of survey, from a simple three-question survey on a mobile device to a complex, multipage establishment survey
  • Presents real-world examples from leading usability and survey professionals, including a diverse collection of case studies and considerations for using and combining other methods
  • Discusses the facilities, materials, and software needed for usability testing, including in-lab testing, remote testing, and eye tracking