The effect of question preface on response rates to a telephone survey of sexual experience

1994 ◽  
Vol 23 (2) ◽  
pp. 203-215 ◽  
Author(s):  
Michael W. Wiederman ◽  
David L. Weis ◽  
Elizabeth Rice Allgeier

2021 ◽
Vol 1 (1) ◽  
pp. 21-32
Author(s):  
Robert Tortora

This paper reviews response trends over 24 consecutive quarters of a national random-digit-dial telephone survey. Trends in response rates and refusal rates are studied, as well as the components of the response rate, namely the contact, cooperation, and completion rates. In addition, other rates, including the answering machine, busy, and no-answer rates, are studied. While refusal rates declined over the six-year period, contact and cooperation rates declined significantly, causing response rates to decline. Answering machine rates and busy rates also showed a significant increase over time. Finally, correlations among the variables of interest are presented: the response rate is negatively correlated with the busy rate, the answering machine rate, and the no-answer rate. Implications of these trends are discussed.
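The decomposition of the response rate into contact, cooperation, and completion rates can be sketched as follows. This is a minimal illustration with simplified definitions and made-up disposition counts, not the paper's exact calculations:

```python
# Sketch of the response-rate decomposition into its components.
# Definitions are simplified and the counts are hypothetical.

def contact_rate(contacted, eligible):
    """Share of eligible numbers where a person was reached."""
    return contacted / eligible

def cooperation_rate(agreed, contacted):
    """Share of contacted persons who agreed to be interviewed."""
    return agreed / contacted

def completion_rate(completed, agreed):
    """Share of started interviews that were finished."""
    return completed / agreed

def response_rate(completed, eligible):
    return completed / eligible

# Hypothetical quarterly disposition counts
eligible, contacted, agreed, completed = 1000, 700, 350, 315

rr = response_rate(completed, eligible)
product = (contact_rate(contacted, eligible)
           * cooperation_rate(agreed, contacted)
           * completion_rate(completed, agreed))
assert abs(rr - product) < 1e-12  # the components multiply to the response rate
```

Under this decomposition, a falling response rate can be traced to whichever component (here, contact or cooperation) is driving the decline.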


Mathematics ◽  
2021 ◽  
Vol 9 (17) ◽  
pp. 2035
Author(s):  
Álvaro Briz-Redón

Respondent burden refers to the effort required of a respondent to answer a questionnaire. Although this concept was introduced decades ago, few studies have focused on the quantitative detection of such a burden. In this paper, a face-to-face survey and a telephone survey conducted in Valencia (Spain) are analyzed. The presence of burden is studied in terms of both item non-response rates and careless response rates. In particular, two moving-window statistics, based on the coefficient of unalikeability and the average longstring index, are proposed for characterizing careless responding. Item non-response and careless response rates are modeled for each survey using mixed-effects models that include respondent-level and question-level covariates, as well as temporal random effects to assess the presence of respondent burden over the course of the questionnaire. The results suggest that the sociodemographic characteristics of the respondents and the typology of the question affect item non-response and careless response rates. Moreover, the estimates of the temporal random effects indicate that item non-response and careless response rates are time-varying, suggesting the presence of respondent burden. In particular, an increasing trend in item non-response rates was found in the telephone survey, which supports the burden hypothesis. Regarding careless responding, some temporal variation is present, but no clear trend was identified.
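The two careless-responding statistics named above can be sketched in a few lines. This uses one common formulation of the coefficient of unalikeability (one minus the sum of squared category proportions) and the standard longstring idea (longest run of identical consecutive answers); the paper's exact windowed definitions may differ, and the answer sequence below is made up:

```python
from collections import Counter

def unalikeability(window):
    """Coefficient of unalikeability: 1 minus the sum of squared
    category proportions. 0 means all responses in the window are
    identical; values near 1 mean highly varied responses."""
    n = len(window)
    return 1.0 - sum((c / n) ** 2 for c in Counter(window).values())

def longstring(window):
    """Longest run of identical consecutive responses in the window."""
    best = run = 1
    for prev, cur in zip(window, window[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def moving_stats(responses, width=5):
    """Slide a window over the answer sequence and compute both
    statistics for each position."""
    return [(unalikeability(responses[i:i + width]),
             longstring(responses[i:i + width]))
            for i in range(len(responses) - width + 1)]

# Hypothetical sequence of Likert-style answers; the run of 5s at the
# end is the kind of pattern these statistics are meant to flag.
answers = [3, 4, 2, 5, 5, 5, 5, 5, 5, 1]
stats = moving_stats(answers, width=5)
```

Low unalikeability together with a long longstring value in late windows is the signature of careless, straight-lining behavior that the moving-window approach is designed to detect.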


1994 ◽  
Vol 23 (2) ◽  
pp. 200-206 ◽  
Author(s):  
Sharon I. Gripp ◽  
A.E. Luloff ◽  
Robert D. Yonkers

Response rates are one indicator of a survey's data quality, and a great deal of importance has been placed on the mail survey's response rate. However, a telephone survey's response rate usually is not reported, and even when one is reported, the numbers used in the calculation are rarely defined, making the response rate difficult to interpret. Using a recent telephone survey of Pennsylvania dairy managers, this paper demonstrates how telephone survey data should be reported. Essentially, every research report should include a discussion of how the survey was conducted, a disposition table, and well-defined formulas for calculating response rates.
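The reporting practice recommended above, a disposition table plus explicitly defined formulas, can be sketched as follows. The disposition categories and formulas here are simplified illustrations loosely modeled on AAPOR-style definitions, not the authors' exact calculations, and the counts are hypothetical:

```python
# A disposition table as a simple mapping from call outcome to count.
# Categories and counts are hypothetical.
disposition = {
    "completed": 412,
    "refused": 133,
    "no_answer": 240,    # eligibility unknown
    "ineligible": 215,   # e.g. business or disconnected numbers
}

def response_rate_min(d):
    """Conservative rate: unknown-eligibility cases (no answer)
    are all counted as eligible non-respondents."""
    return d["completed"] / (d["completed"] + d["refused"] + d["no_answer"])

def response_rate_max(d):
    """Liberal rate: unknown-eligibility cases are excluded."""
    return d["completed"] / (d["completed"] + d["refused"])

print(f"RR(min) = {response_rate_min(disposition):.3f}")
print(f"RR(max) = {response_rate_max(disposition):.3f}")
```

Reporting the table together with the formulas lets a reader see exactly which cases entered each numerator and denominator, which is the transparency the paper argues for.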


2013 ◽  
Vol 66 (12) ◽  
pp. 1417-1421 ◽  
Author(s):  
Renee N. Carey ◽  
Alison Reid ◽  
Timothy R. Driscoll ◽  
Deborah C. Glass ◽  
Geza Benke ◽  
...  

2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Jorge Matías-Guiu ◽  
Pedro Jesús Serrano-Castro ◽  
José Ángel Mauri-Llerda ◽  
Francisco José Hernández-Ramos ◽  
Juan Carlos Sánchez-Alvarez ◽  
...  

Descriptive epidemiology research involves collecting data from large numbers of subjects. Obtaining these data requires approaches designed to achieve maximum participation or response rates among respondents possessing the desired information. We analyze participation and response rates in a population-based epidemiological study conducted through a telephone survey and identify factors implicated in consenting to participate. The rates found exceeded those reported in the literature, and they were higher for afternoon calls than for morning calls. Women and subjects older than 40 years were the most likely to answer the telephone. The study identified geographical differences, with higher response rates in districts of southern Spain that are not considered urbanized. This information may be helpful for designing more efficient community epidemiology projects.


2020 ◽  
Vol 228 (1) ◽  
pp. 14-24 ◽  
Author(s):  
Tanja Burgard ◽  
Michael Bošnjak ◽  
Nadine Wedderhoff

Abstract. A meta-analysis was performed to determine whether response rates to online psychology surveys have decreased over time, and to estimate the effects of specific design characteristics (contact mode, burden of participation, and incentives) on response rates. The meta-analysis is restricted to samples of adults with depression or generalized anxiety disorder. Time and study design effects are tested using mixed-effects meta-regressions as implemented in the metafor package in R. The mean response rate of the 20 studies fulfilling our inclusion criteria is approximately 43%. Response rates are lower in more recently conducted surveys and in surveys employing longer questionnaires. Furthermore, we found that personal invitations, for example via telephone or face-to-face contact, yielded higher response rates than e-mail invitations. As predicted by reinforcement sensitivity theory, no effect of incentives on survey participation in this specific group (scoring high on neuroticism) could be observed.
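The meta-analytic machinery referenced above lives in R's metafor package; as a rough illustration of the underlying idea, the random-effects pooling of response rates can be sketched in Python using the DerSimonian-Laird estimator. The study rates and sample sizes below are made up, and a mixed-effects meta-regression would additionally regress the effects on moderators such as survey year or questionnaire length:

```python
# Random-effects pooling of response rates via DerSimonian-Laird.
# Hypothetical (rate, sample size) pairs, one per study.
studies = [(0.48, 250), (0.41, 400), (0.39, 600), (0.45, 320)]

# Effect sizes: raw proportions with binomial sampling variances
y = [p for p, n in studies]
v = [p * (1 - p) / n for p, n in studies]

# DerSimonian-Laird estimate of between-study variance tau^2
w = [1 / vi for vi in v]
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights and pooled mean response rate
w_re = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
```

The pooled estimate always lies within the range of the study-level rates; the between-study variance tau² inflates each study's weight denominator, pulling the weights toward equality when heterogeneity is large.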

