Discussion: Evaluating Research Questions

Post a critique of the research study in which you:

Evaluate the research questions and hypotheses.

The Research Questions and Hypotheses Checklist serves as a guide for your evaluation; please do not respond to the checklist in a yes/no format when writing your Discussion post.

Identify the type of quantitative research design used and explain how the researchers implemented the design.
Analyze alignment among the theory, problem, purpose, research questions and hypotheses, and design.
Be sure to support your Main Issue Post and Response Post with reference to the week’s Learning Resources and other scholarly evidence in APA Style.

ABSTRACT

Background: Effective teaching is key in preparing students to become successful evidence-based healthcare professionals. The effectiveness of graduate evidence-based practice (EBP) pedagogy is not often a subject of research studies.

Purpose: The purpose of this study was to determine how faculty from the 50 top graduate nursing schools in the United States perceived the effectiveness of EBP courses for graduate nursing students.

Methods: A descriptive cross-sectional design was used to explore faculty perception of the effectiveness of EBP courses. A web-based survey was used for data collection. A total of 45 questionnaires were subjected to statistical analysis.

Results: The mean perception of the effectiveness of EBP courses for the whole sample, on a scale from 1 to 7, was 5.58 (min. 4.29; max. 6.73), a higher score signifying higher perceived effectiveness. The highest rated item concerned a school’s access to different databases. The strongest correlations were found between the total score and the scores for items describing students’ opportunities to strengthen and apply their EBP skills (r_s = .66). The internal consistency of the Perception of Effectiveness of EBP Courses scale, based on standardized Cronbach’s alpha, was .84, which signifies strong internal consistency. Faculty perceived themselves as most competent at the following EBP skills: (a) “Asking questions regarding patients’ care” (6.56), (b) “Considering patient preferences when implementing EBP” (6.40), and (c) “Critically appraising the relevant body of evidence to address clinical questions” (6.40).
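For context, the standardized Cronbach’s alpha reported above is conventionally defined in terms of the number of scale items and their mean inter-item correlation. The formula below is a general textbook definition offered for readers less familiar with the statistic, not a formula stated by the study’s authors:

\[
\alpha_{\text{standardized}} = \frac{k\,\bar{r}}{1 + (k - 1)\,\bar{r}}
\]

where k is the number of items on the Perception of Effectiveness of EBP Courses scale and r̄ is the average correlation among the items. Values of roughly .80 and above are commonly interpreted as indicating strong internal consistency, which matches how the authors characterize the .84 they report.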

Discussion: To strengthen the effectiveness of EBP courses, students should have more opportunities to implement their EBP knowledge and skills after completing EBP courses.

Linking Evidence to Action: Evaluation of faculty perceptions of the effectiveness of EBP courses can help to guide the development of nursing school curricula that better integrate EBP. Further evaluation of the psychometric properties of the instrument used to measure perception of the effectiveness of EBP courses is required along with objective measures of faculty knowledge and skills in teaching EBP.

BACKGROUND AND SIGNIFICANCE

Worldwide, evidence-based practice (EBP) has emerged as a major healthcare initiative (Thiel & Ghosh, 2008). One of the most consistent findings in health service research is the gap between best practice (as determined by scientific evidence) and actual clinical care (Flores-Mateo & Argimon, 2007). To accelerate the translation of research findings into clinical practice, two major outcomes must be achieved: (a) Advanced practice and direct care nurses must acquire sufficient EBP knowledge and skills as well as strong beliefs about the value of EBP in clinical settings, and (b) educators must teach their students the EBP process to instill in them lifelong skills and the motivation to deliver the highest quality of care (Melnyk, Fineout-Overholt, Feinstein, Sadler, & Green-Hernandez, 2008).

Findings from a recent national survey by Melnyk, Fineout-Overholt, Gallagher-Ford, and Kaplan (2012) indicated that nurses surveyed across the country are ready for and do value EBP. The majority of participants who responded to the survey reported wanting to gain more knowledge and skills in order to deliver evidence-based care in their institutions. Nurses cited education as a top requirement for helping them implement EBP in daily practice.

For students to become evidence-based healthcare professionals, the teaching of EBP has to be effective (Spek, Wolf, Dijk, & Lucas, 2012). Little has been published about teaching EBP to nursing students (Stiffler & Cullen, 2010). Though there are systematic reviews and meta-analyses of teaching EBP in schools of medicine, there is a dearth of research in nursing, especially in regard to graduate-level EBP pedagogy. According to Fineout-Overholt and Johnston (2005), further research is needed to assess effective teaching and evaluation strategies for EBP.

Teaching EBP to nursing students is usually based on the basic steps of EBP. Melnyk and Fineout-Overholt (2011) added two more steps to the five basic steps of EBP, for a total of seven: (a) cultivate a spirit of inquiry; (b) ask the burning clinical question in PICOT (P = patient population; I = intervention or area of interest; C = comparison intervention or group; O = outcomes; and T = time) format; (c) search for and collect the most relevant best evidence; (d) critically appraise the evidence; (e) integrate the best evidence with one’s clinical expertise and patient preferences and values in making a practice decision or change; (f) evaluate outcomes of the practice decision or change based on evidence; and (g) disseminate the outcomes of the EBP decision or change. The seven steps of EBP can serve as a structure around which to build an EBP curriculum for graduate nursing students.

The effectiveness of graduate EBP pedagogy is not often a subject of research studies. One of the problems is the complexity of the EBP process and the difficulty of assessing all aspects of its effectiveness. Shaneyfelt and colleagues (2006) performed a systematic review of EBP instruments. Their results showed that the majority of instruments targeted students and postgraduate trainees, whereas nonphysicians were rarely evaluated. The available instruments most commonly evaluated EBP skills (predominantly focusing on the critical appraisal of evidence), knowledge, attitudes, and behaviors. Most instruments are designed for specific purposes, such as the evaluation of theoretical EBP courses (instruments to assess cognitive skills), or the evaluation of EBP in clinical practice (instruments to assess performance-based skills and application; Ilic, 2009).