E-book: Advances in Questionnaire Design, Development, Evaluation and Testing

Edited by Paul C. Beatty, Debbie Collins, Lyn Kaye, José-Luis Padilla, Gordon B. Willis, and Amanda Wilmot
  • Format: EPUB+DRM
  • Publication date: 24-Oct-2019
  • Publisher: John Wiley & Sons Inc
  • Language: English
  • ISBN-13: 9781119263647
  • Format: EPUB+DRM
  • Price: 114,15 €*
  • * This is the final price; no additional discounts are applied.
  • This e-book is for personal use only. E-books cannot be returned, and purchases of e-books are non-refundable.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software in order to unlock and read it. To read this e-book, you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android).

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

A new and updated definitive resource for survey questionnaire testing and evaluation

Building on the success of the first Questionnaire Development, Evaluation, and Testing (QDET) conference in 2002, this book brings together leading papers from the Second International Conference on Questionnaire Design, Development, Evaluation, and Testing (QDET2) held in 2016. The volume assesses the current state of the art and science of QDET; examines the importance of methodological attention to the questionnaire in the present world of information collection; and considers how the QDET field can anticipate new trends and directions as information needs and data collection methods continue to evolve.

Featuring contributions from international experts in survey methodology, Advances in Questionnaire Design, Development, Evaluation and Testing includes the latest insights on question characteristics, usability testing, web probing, and other pretesting approaches, as well as:

  • Recent developments in the design and evaluation of digital and self-administered surveys
  • Strategies for comparing and combining questionnaire evaluation methods
  • Approaches for cross-cultural and cross-national questionnaire development
  • New data sources and methodological innovations during the last 15 years
  • Case studies and practical applications

Advances in Questionnaire Design, Development, Evaluation and Testing serves as a forum to prepare researchers to meet the next generation of challenges, making it an excellent resource for researchers and practitioners in government, academia, and the private sector. 

List of Contributors
Preface
Part I Assessing the Current Methodology for Questionnaire Design, Development, Testing, and Evaluation
1 Questionnaire Design, Development, Evaluation, and Testing: Where Are We, and Where Are We Headed?
Gordon B. Willis
1.1 Current State of the Art and Science of QDET
1.2 Relevance of QDET in the Evolving World of Surveys
1.3 Looking Ahead: Further Developments in QDET
1.4 Conclusion
References
2 Asking the Right Questions in the Right Way: Six Needed Changes in Questionnaire Evaluation and Testing Methods
Don A. Dillman
2.1 Personal Experiences with Cognitive Interviews and Focus Groups
2.2 My 2002 Experience at QDET
2.3 Six Changes in Survey Research that Require New Perspectives on Questionnaire Evaluation and Testing
2.4 Conclusion
References
3 A Framework for Making Decisions About Question Evaluation Methods
Roger Tourangeau
Aaron Maitland
Darby Steiger
Ting Yan
3.1 Introduction
3.2 Expert Reviews
3.3 Laboratory Methods
3.4 Field Methods
3.5 Statistical Modeling for Data Quality
3.6 Comparing Different Methods
3.7 Recommendations
References
4 A Comparison of Five Question Evaluation Methods in Predicting the Validity of Respondent Answers to Factual Items
Aaron Maitland
Stanley Presser
4.1 Introduction
4.2 Methods
4.3 Results
4.4 Discussion
References
5 Combining Multiple Question Evaluation Methods: What Does It Mean When the Data Appear to Conflict?
Jo d'Ardenne
Debbie Collins
5.1 Introduction
5.2 Questionnaire Development Stages
5.3 Selection of Case Studies
5.4 Case Study 1: Conflicting Findings Between Focus Groups and Cognitive Interviews
5.5 Case Study 2: Conflicting Findings Between Eye-Tracking, Respondent Debriefing Questions, and Interviewer Feedback
5.6 Case Study 3: Complementary Findings Between Cognitive Interviews and Interviewer Feedback
5.7 Case Study 4: Combining Qualitative and Quantitative Data to Assess Changes to a Travel Diary
5.8 Framework of QT Methods
5.9 Summary and Discussion
References
Part II Question Characteristics, Response Burden, and Data Quality
6 The Role of Question Characteristics in Designing and Evaluating Survey Questions
Jennifer Dykema
Nora Cate Schaeffer
Dana Garbarski
Michael Hout
6.1 Introduction
6.2 Overview of Some of the Approaches Used to Conceptualize, Measure, and Code Question Characteristics
6.3 Taxonomy of Question Characteristics
6.4 Case Studies
6.5 Discussion
Acknowledgments
References
7 Exploring the Associations Between Question Characteristics, Respondent Characteristics, Interviewer Performance Measures, and Survey Data Quality
James M. Dahlhamer
Aaron Maitland
Heather Ridolfo
Antuane Allen
Dynesha Brooks
7.1 Introduction
7.2 Methods
7.3 Results
7.4 Discussion
Disclaimer
References
8 Response Burden: What Is It and What Predicts It?
Ting Yan
Scott Fricker
Shirley Tsai
8.1 Introduction
8.2 Methods
8.3 Results
8.4 Conclusions and Discussion
Acknowledgments
References
9 The Salience of Survey Burden and Its Effect on Response Behavior to Skip Questions: Experimental Results from Telephone and Web Surveys
Frauke Kreuter
Stephanie Eckman
Roger Tourangeau
9.1 Introduction
9.2 Study Designs and Methods
9.3 Manipulating the Interleafed Format
9.4 Discussion and Conclusion
Acknowledgments
References
10 A Comparison of Fully Labeled and Top-Labeled Grid Question Formats
Jolene D. Smyth
Kristen Olson
10.1 Introduction
10.2 Data and Methods
10.3 Findings
10.4 Discussion and Conclusions
Acknowledgments
References
11 The Effects of Task Difficulty and Conversational Cueing on Answer Formatting Problems in Surveys
Yfke Ongena
Sanne Unger
11.1 Introduction
11.2 Factors Contributing to Respondents' Formatting Problems
11.3 Hypotheses
11.4 Method and Data
11.5 Results
11.6 Discussion and Conclusion
11.7 Further Expansion of the Current Study
11.8 Conclusions
References
Part III Improving Questionnaires on the Web and Mobile Devices
12 A Compendium of Web and Mobile Survey Pretesting Methods
Emily Geisen
Joe Murphy
12.1 Introduction
12.2 Review of Traditional Pretesting Methods
12.3 Emerging Pretesting Methods
References
13 Usability Testing Online Questionnaires: Experiences at the U.S. Census Bureau
Elizabeth Nichols
Erica Olmsted-Hawala
Temika Holland
Amy Anderson Riemer
13.1 Introduction
13.2 History of Usability Testing Self-Administered Surveys at the US Census Bureau
13.3 Current Usability Practices at the Census Bureau
13.4 Participants: "Real Users, Not User Stories"
13.5 Building Usability Testing into the Development Life Cycle
13.6 Measuring Accuracy
13.7 Measuring Efficiency
13.8 Measuring Satisfaction
13.9 Retrospective Probing and Debriefing
13.10 Communicating Findings with the Development Team
13.11 Assessing Whether Usability Test Recommendations Worked
13.12 Conclusions
References
14 How Mobile Device Screen Size Affects Data Collected in Web Surveys
Daniele Toninelli
Melanie Revilla
14.1 Introduction
14.2 Literature Review
14.3 Our Contribution and Hypotheses
14.4 Data Collection and Method
14.5 Main Results
14.6 Discussion
Acknowledgments
References
15 Optimizing Grid Questions for Smartphones: A Comparison of Optimized and Non-Optimized Designs and Effects on Data Quality on Different Devices
Trine Dale
Heidi Walsoe
15.1 Introduction
15.2 The Need for Change in Questionnaire Design Practices
15.3 Contribution and Research Questions
15.4 Data Collection and Methodology
15.5 Main Results
15.6 Discussion
Acknowledgments
References
16 Learning from Mouse Movements: Improving Questionnaires and Respondents' User Experience Through Passive Data Collection
Rachel Horwitz
Sarah Brockhaus
Felix Henninger
Pascal J. Kieslich
Malte Schierholz
Florian Keusch
Frauke Kreuter
16.1 Introduction
16.2 Background
16.3 Data
16.4 Methodology
16.5 Results
16.6 Discussion
References
17 Using Targeted Embedded Probes to Quantify Cognitive Interviewing Findings
Paul Scanlon
17.1 Introduction
17.2 The NCHS Research and Development Survey
17.3 Findings
17.4 Discussion
References
18 The Practice of Cognitive Interviewing Through Web Probing
Stephanie Fowler
Gordon B. Willis
18.1 Introduction
18.2 Methodological Issues in the Use of Web Probing for Pretesting
18.3 Testing the Effect of Probe Placement
18.4 Analyses of Responses to Web Probes
18.5 Qualitative Analysis of Responses to Probes
18.6 Qualitative Coding of Responses
18.7 Current State of the Use of Web Probes
18.8 Limitations
18.9 Recommendations for the Application and Further Evaluation of Web Probes
18.10 Conclusion
Acknowledgments
References
Part IV Cross-Cultural and Cross-National Questionnaire Design and Evaluation
19 Optimizing Questionnaire Design in Cross-National and Cross-Cultural Surveys
Tom W. Smith
19.1 Introduction
19.2 The Total Survey Error Paradigm and Comparison Error
19.3 Cross-Cultural Survey Guidelines and Resources
19.4 Translation
19.5 Developing Comparative Scales
19.6 Focus Groups and Pretesting in Cross-National/Cultural Surveys
19.7 Tools for Developing and Managing Cross-National Surveys
19.8 Resources for Developing and Testing Cross-National Measures
19.9 Pre- and Post-Harmonization
19.10 Conclusion
References
20 A Model for Cross-National Questionnaire Design and Pretesting
Rory Fitzgerald
Diana Zavala-Rojas
20.1 Introduction
20.2 Background
20.3 The European Social Survey
20.4 ESS Questionnaire Design Approach
20.5 Critique of the Seven-Stage Approach
20.6 A Model for Cross-National Questionnaire Design and Pretesting
20.7 Evaluation of the Model for Cross-National Questionnaire Design and Pretesting Using the Logical Framework Matrix (LFM)
20.8 Conclusions
References
21 Cross-National Web Probing: An Overview of Its Methodology and Its Use in Cross-National Studies
Dorothee Behr
Katharina Meitinger
Michael Braun
Lars Kaczmirek
21.1 Introduction
21.2 Cross-National Web Probing - Its Goal, Strengths, and Weaknesses
21.3 Access to Respondents Across Countries: The Example of Online Access Panels and Probability-Based Panels
21.4 Implementation of Standardized Probes
21.5 Translation and Coding Answers to Cross-Cultural Probes
21.6 Substantive Results
21.7 Cross-National Web Probing and Its Application Throughout the Survey Life Cycle
21.8 Conclusions and Outlook
Acknowledgments
References
22 Measuring Disability Equality in Europe: Design and Development of the European Health and Social Integration Survey Questionnaire
Amanda Wilmot
22.1 Introduction
22.2 Background
22.3 Questionnaire Design
22.4 Questionnaire Development and Testing
22.5 Survey Implementation
22.6 Lessons Learned
22.7 Final Reflections
Acknowledgments
References
Part V Extensions and Applications
23 Regression-Based Response Probing for Assessing the Validity of Survey Questions
Patrick Sturgis
Ian Brunton-Smith
Jonathan Jackson
23.1 Introduction
23.2 Cognitive Methods for Assessing Question Validity
23.3 Regression-Based Response Probing
23.4 Example 1: Generalized Trust
23.5 Example 2: Fear of Crime
23.6 Data
23.7 Discussion
References
24 The Interplay Between Survey Research and Psychometrics, with a Focus on Validity Theory
Bruno D. Zumbo
José-Luis Padilla
24.1 Introduction
24.2 An Over-the-Shoulder Look Back at Validity Theory and Validation Practices with an Eye toward Describing Contemporary Validity Theories
24.3 An Approach to Validity that Bridges Psychometrics and Survey Design
24.4 Closing Remarks
References
25 Quality-Driven Approaches for Managing Complex Cognitive Testing Projects
Martha Stapleton
Darby Steiger
Mary C. Davis
25.1 Introduction
25.2 Characteristics of the Four Cognitive Testing Projects
25.3 Identifying Detailed, Quality-Driven Management Approaches for Qualitative Research
25.4 Identifying Principles for Developing Quality-Driven Management Approaches
25.5 Applying the Concepts of Transparency and Consistency
25.6 The 13 Quality-Driven Management Approaches
25.7 Discussion and Conclusion
References
26 Using Iterative, Small-Scale Quantitative and Qualitative Studies: A Review of 15 Years of Research to Redesign a Major US Federal Government Survey
Joanne Pascale
26.1 Introduction
26.2 Measurement Issues in Health Insurance
26.3 Methods and Results
26.4 Discussion
26.5 Final Reflections
References
27 Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey
Robin L. Kaplan
Brandon Kopp
Polly Phipps
27.1 Introduction
27.2 The Sleep Gap
27.3 The Present Research
27.4 Study 1: Behavior Coding
27.5 Study 2: Cognitive Interviews
27.6 Study 3: Quantitative Study
27.7 Study 4: Validation Study
27.8 General Discussion
27.9 Implications and Future Directions
References
28 Questionnaire Design Issues in Mail Surveys of All Adults in a Household
Douglas Williams
J. Michael Brick
W. Sherman Edwards
Pamela Giambo
28.1 Introduction
28.2 Background
28.3 The NCVS and Mail Survey Design Challenges
28.4 Field Test Methods and Design
28.5 Outcome Measures
28.6 Findings
28.7 Summary
28.8 Discussion
28.9 Conclusion
References
29 Planning Your Multimethod Questionnaire Testing Bento Box: Complementary Methods for a Well-Balanced Test
Jaki S. McCarthy
29.1 Introduction
29.2 A Questionnaire Testing Bento Box
29.3 Examples from the Census of Agriculture Questionnaire Testing Bento Box
29.4 Conclusion
References
30 Flexible Pretesting on a Tight Budget: Using Multiple Dependent Methods to Maximize Effort-Return Trade-Offs
Matt Jans
Jody L. Herman
Joseph Viana
David Grant
Royce Park
Bianca D.M. Wilson
Jane Tom
Nicole Lordi
Sue Holtby
30.1 Introduction
30.2 Evolution of a Dependent Pretesting Approach for Gender Identity Measurement
30.3 Analyzing and Synthesizing Results
30.4 Discussion
Acknowledgments
References
Index
PAUL C. BEATTY is Chief of the Center for Behavioral Science Methods at the U.S. Census Bureau.

DEBBIE COLLINS is a Senior Research Director at the National Centre for Social Research, UK.

LYN KAYE is a consultant in survey research methods and was previously a Senior Researcher at Statistics New Zealand.

JOSÉ-LUIS PADILLA is Professor of Methodology of Behavioral Sciences at the University of Granada, Spain.

GORDON B. WILLIS is a Cognitive Psychologist at the National Cancer Institute, National Institutes of Health, USA.

AMANDA WILMOT is a Senior Study Director at Westat, USA.