Nonsampling errors and their implication for estimates of current cancer treatment using the Medical Expenditure Panel Survey

Jeffrey M. Gonzalez, PhD, Office of Survey Methods Research, U.S. Bureau of Labor Statistics
Lisa B. Mirel, MS, Office of Analysis and Epidemiology, National Center for Health Statistics, Centers for Disease Control and Prevention
Nina Verevkina, PhD, Department of Health Policy & Administration, The Pennsylvania State University


Survey nonsampling errors refer to the components of total survey error (TSE) that result from failures in data collection and processing procedures. Evaluating nonsampling errors can lead to a better understanding of their sources, which, in turn, can inform survey inference and assist in the design of future surveys. Data collected via supplemental questionnaires offer a means for evaluating nonsampling errors because they may provide additional information on survey nonrespondents and/or measurements of the same concept over repeated trials on the same sampling unit. We used a supplemental questionnaire administered to cancer survivors to explore potential nonsampling errors, focusing …
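
Where a supplemental questionnaire yields a second measurement of the same item for the same sampling unit, one simple descriptive summary of measurement inconsistency is the gross difference rate. The sketch below is purely illustrative and not the authors' method; the variable names and toy data are invented.

```python
# Illustrative sketch (not the authors' method): comparing a main-interview
# report with a supplemental-questionnaire report of the same concept for
# the same sampling unit, summarized as a gross difference rate.

def gross_difference_rate(main_reports, supp_reports):
    """Share of units whose two reports of the same item disagree.

    Both inputs are dicts keyed by a unit identifier; only units that
    responded to both instruments contribute to the rate.
    """
    common = main_reports.keys() & supp_reports.keys()
    if not common:
        return float("nan")
    disagreements = sum(main_reports[k] != supp_reports[k] for k in common)
    return disagreements / len(common)


if __name__ == "__main__":
    # Toy data: 1 = reported current cancer treatment, 0 = did not.
    main = {"A": 1, "B": 0, "C": 1, "D": 1}
    supp = {"A": 1, "B": 1, "C": 1, "E": 0}   # "D" did not answer the supplement
    print(f"Gross difference rate: {gross_difference_rate(main, supp):.2f}")
```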



Question Order Experiments in the German-European Context

Henning Silber, GESIS - Leibniz-Institute for the Social Sciences, Germany
Jan Karem Höhne, University of Göttingen, Germany
Stephan Schlosser, University of Göttingen, Germany


In this paper, we investigate the context stability of questions on political issues in cross-national surveys. For this purpose, we conducted three replication studies (N1 = 213; N2 = 677; N3 = 1,489) based on eight split-ballot design experiments with undergraduate and graduate students to test for question order effects. The questions, taken from the Eurobarometer (2013), covered perceived performance and identification. Respondents were randomly assigned to one of two experimental groups, which received the questions either in the original or the reversed order. In all three studies, respondents answered the questions about Germany and the …
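
As a rough illustration of the split-ballot logic described above (not the authors' implementation), the sketch below randomly assigns respondents to the original or reversed question order and compares the answer distributions of one item across the two groups with a chi-square test; the respondent IDs and counts are invented.

```python
import random

from scipy.stats import chi2_contingency


def assign_orders(respondent_ids, seed=42):
    """Randomly assign each respondent to the 'original' or 'reversed' ballot."""
    rng = random.Random(seed)
    return {rid: rng.choice(["original", "reversed"]) for rid in respondent_ids}


# Hypothetical answer counts for one target item, cross-classified by ballot:
# rows = question order, columns = response categories (e.g. a 4-point scale).
observed = [
    [52, 40, 15, 8],   # original order
    [35, 47, 22, 11],  # reversed order
]
chi2, p_value, dof, _ = chi2_contingency(observed)

print(assign_orders(range(6)))
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```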



The effect of interviewers’ motivation and attitudes on respondents’ consent to contact secondary respondents in a multi-actor design

Jette Schröder, GESIS – Leibniz Institute for the Social Sciences, Germany
Claudia Schmiedeberg, University of Munich (LMU), Germany
Laura Castiglioni, University of Munich (LMU), Germany


In surveys using a multi-actor design, data are collected not only from sampled ‘primary’ respondents, but also from related persons such as partners, colleagues, or friends. For this purpose, primary respondents are asked for their consent to survey such ‘secondary’ respondents. The existence of interviewer effects on unit nonresponse of sampled respondents in surveys is well documented, and research increasingly focuses on interviewer attributes in the nonresponse process. However, research regarding interviewer effects on unit nonresponse of secondary respondents, more specifically on primary respondents’ consent to include secondary respondents in the survey, is sparse. We use the German Family Panel (pairfam) …
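
One common way to quantify how strongly an outcome such as consent clusters within interviewers, not necessarily the approach taken in this paper, is an interviewer-level intraclass correlation. The sketch below computes a one-way ANOVA ICC on an invented toy dataset as a rough, linear-probability indicator; a full analysis would typically use a multilevel model with interviewer covariates.

```python
from collections import defaultdict


def interviewer_icc(records):
    """ICC(1) for a 0/1 consent outcome clustered by interviewer.

    `records` is an iterable of (interviewer_id, consent) pairs.
    """
    groups = defaultdict(list)
    for interviewer_id, consent in records:
        groups[interviewer_id].append(consent)

    a = len(groups)                                    # number of interviewers
    n_total = sum(len(g) for g in groups.values())
    grand_mean = sum(sum(g) for g in groups.values()) / n_total

    # Between- and within-interviewer sums of squares.
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups.values())
    ssw = sum(sum((y - sum(g) / len(g)) ** 2 for y in g) for g in groups.values())
    msb = ssb / (a - 1)
    msw = ssw / (n_total - a)

    # Average workload size, adjusted for unbalanced interviewer workloads.
    n0 = (n_total - sum(len(g) ** 2 for g in groups.values()) / n_total) / (a - 1)
    return (msb - msw) / (msb + (n0 - 1) * msw)


if __name__ == "__main__":
    toy = [("I1", 1), ("I1", 1), ("I1", 1), ("I1", 1),
           ("I2", 0), ("I2", 0), ("I2", 0), ("I2", 1),
           ("I3", 1), ("I3", 0), ("I3", 1), ("I3", 1)]
    print(f"Interviewer-level ICC for consent: {interviewer_icc(toy):.3f}")
```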



A Case Study of Error in Survey Reports of Move Month Using the U.S. Postal Service Change of Address Records

Mary H. Mulry, U.S. Census Bureau, Washington, DC
Elizabeth M. Nichols, U.S. Census Bureau, Washington, DC
Jennifer Hunter Childs, U.S. Census Bureau, Washington, DC


Correctly recalling where someone lived as of a particular date is critical to the accuracy of the once-a-decade U.S. decennial census. The data collection period for the 2010 Census spanned several months, from February to August, with some evaluation operations occurring up to 7 months after that. The assumption was that respondents could accurately remember moves and move dates on and around April 1st up to 11 months afterwards. We show how statistical analyses can be used to investigate the validity of this assumption by comparing self-reports and proxy-reports of the month of a move in …
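
The comparison described above can be illustrated with a small sketch (not the Census Bureau's actual methodology): each reported move month is matched against the month on a U.S. Postal Service change-of-address record, and the signed recall error in months is summarized. The matched pairs below are invented, and real work would handle unmatched cases and proxy reports far more carefully.

```python
from datetime import date


def month_error(reported, recorded):
    """Signed difference in months between reported and recorded move dates."""
    return (reported.year - recorded.year) * 12 + (reported.month - recorded.month)


matched_pairs = [
    # (reported move month, USPS change-of-address effective month)
    (date(2010, 4, 1), date(2010, 4, 1)),   # exact agreement
    (date(2010, 3, 1), date(2010, 4, 1)),   # reported one month early
    (date(2010, 6, 1), date(2010, 4, 1)),   # reported two months late
    (date(2010, 4, 1), date(2010, 5, 1)),   # reported one month early
]

errors = [month_error(rep, rec) for rep, rec in matched_pairs]
exact = sum(e == 0 for e in errors) / len(errors)
mean_abs = sum(abs(e) for e in errors) / len(errors)
print(f"Exact-month agreement: {exact:.0%}; mean absolute error: {mean_abs:.2f} months")
```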



Measuring the survey climate: the Flemish case

Sara Barbier, Centre for Sociological Research, University of Leuven, Belgium
Geert Loosveldt, Centre for Sociological Research, University of Leuven, Belgium
Ann Carton, Research Centre of the Flemish Government, Belgium


Researchers in several countries have regularly reported decreasing response rates for surveys and the need for increased effort to attain an acceptable response rate: two developments that can be seen as signs of a worsening survey climate. At the same time, differences between countries and surveys with regard to the actual level and evolution of response rates have also been noted. Some of these differences are probably linked to differences in survey content or design. This may hinder the study of the evolving survey climate over time, based on different surveys in different countries, because more readily …
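
As a purely illustrative sketch (not drawn from the paper), the snippet below computes response rates for repeated waves of a hypothetical survey and fits a simple least-squares trend, one minimal way to track the response-rate component of the survey climate over time; all wave years and counts are invented.

```python
def response_rate(completes, eligible):
    """Simple response rate: completed interviews over eligible sample units."""
    return completes / eligible


waves = {
    # year: (completed interviews, eligible sample units) -- toy numbers
    2009: (1450, 2000),
    2011: (1380, 2000),
    2013: (1290, 2000),
    2015: (1210, 2000),
}

years = sorted(waves)
rates = [response_rate(*waves[y]) for y in years]

# Ordinary least-squares slope of the response rate on the survey year.
n = len(years)
mean_y, mean_r = sum(years) / n, sum(rates) / n
slope = (sum((y - mean_y) * (r - mean_r) for y, r in zip(years, rates))
         / sum((y - mean_y) ** 2 for y in years))

for y, r in zip(years, rates):
    print(f"{y}: {r:.1%}")
print(f"Estimated change per year: {slope * 100:+.2f} percentage points")
```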



Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.