Facilitators and barriers to successful recruitment into a large comparative effectiveness trial: a qualitative study

2019 · Vol 8 (10) · pp. 815-826
Author(s): Stephanie Behringer-Massera, Terysia Browne, Geny George, Sally Duran, Andrea Cherrington, ...

Background: Recruitment of participants into research studies, especially individuals from minority groups, is challenging; lack of diversity may lead to biased findings. Aim: To explore beliefs about research participation among individuals who were approached and eligible for the GRADE study. Methods: In-depth qualitative telephone interviews with randomized participants (n = 25) and eligible individuals who declined to enroll (n = 26). Results: Refusers and consenters differed in trust and perceptions of risk, benefits and burden of participation. Few participants understood how comparative effectiveness research differed from other types of trials; however, some features of comparative effectiveness research were perceived as lower risk. Conclusion: We identified facilitators and addressable barriers to participation in research studies.

2011 · Vol 25 (3) · pp. 191-209
Author(s): Maria C. Katapodi, Laurel L. Northouse

The increased demand for evidence-based health care practices calls for comparative effectiveness research (CER), namely the generation and synthesis of research evidence to compare the benefits and harms of alternative methods of care. A significant contribution of CER is the systematic identification and synthesis of available research studies on a specific topic. The purpose of this article is to provide an overview of methodological issues pertaining to systematic reviews and meta-analyses for investigators conducting CER. A systematic review or meta-analysis is guided by a research protocol, which includes (a) the research question, (b) inclusion and exclusion criteria with respect to the target population and studies, (c) guidelines for obtaining relevant studies, (d) methods for data extraction and coding, (e) methods for data synthesis, and (f) guidelines for reporting results and assessing for bias. This article presents an algorithm for generating evidence-based knowledge by systematically identifying, retrieving, and synthesizing large bodies of research studies. Recommendations for evaluating the strength of evidence, interpreting findings, and discussing clinical applicability are offered.
