Development and implementation of an online hybrid model for teaching evidence-based practice to health professions: processes and outcomes from an Australian experience.

Kumar S, Perraton L, Machotka Z - Adv Med Educ Pract (2010)

Bottom Line: In 2006, the overall student satisfaction rating was 62.07, in 2007 it was 65.8, and in 2008 it was 55.7. Qualitative findings also supported these quantitative findings, indicating improvements in the structure and process of the new course. The outcomes from the evaluation of the remodeled course provide evidence of a consistent quality learning experience for students, and support the concept of using research evidence to guide the development of teaching and learning practices in the training of health professionals.


Affiliation: International Centre for Allied Health Evidence, School of Health Sciences, University of South Australia, Adelaide, South Australia.

ABSTRACT
Evidence-based practice is now considered to be a vital element of health care service delivery. The call to use evidence to inform other areas, such as teaching and learning, is growing. This paper reports on the processes used to integrate best evidence into teaching practices within an undergraduate health science program. An existing course within this program at an Australian tertiary institution was remodeled by a newly appointed course coordinator in response to critical feedback from student cohorts. A systematic, iterative, five-step approach was used in the development of the new course. The process of development was influenced by current research evidence, an audit of the existing course, and critical feedback from students. The new course was evaluated using quantitative and qualitative research methods for five study periods. In 2005, prior to implementing the changes, the overall student satisfaction rating for the course was zero (representing the lowest possible score). In 2006, the overall student satisfaction rating was 62.07, in 2007 it was 65.8, and in 2008 it was 55.7. Qualitative findings also supported these quantitative findings, indicating improvements in the structure and process of the new course. The outcomes from the evaluation of the remodeled course provide evidence of a consistent quality learning experience for students, and support the concept of using research evidence to guide the development of teaching and learning practices in the training of health professionals.

No MeSH data available.


f3-amep-1-001: Course evaluation instrument. This evaluation form may consist of two types of questions. The first type asks you to respond to a series of statements by indicating your agreement or disagreement with each of the statements. It is important that you respond to each item. The second type is a straightforward question to which you respond with text. You do not have to complete the text response items. Abbreviations: SA, strongly agree; A, agree; N, neutral; D, disagree; SD, strongly disagree.

Mentions: The revamped course was evaluated at the end of five study periods between 2006 and 2008. The anonymous course evaluation was conducted online using the educational institution’s Course Evaluation Instrument (CEI) and Student Experience of Teaching (SET) tools, shown in Figures 3 and 4. These evaluation tools are routinely used for every course offered at the institution, which allows data to be compared within and across courses. Quantitative evaluation using the CEI from 2006, 2007, and 2008 indicated improvements in students’ satisfaction with the course. In 2005, prior to implementing the hybrid model, the mean score for question 10 in the CEI (“Overall I was satisfied with the quality of this course”) was 0. No other evaluation data for 2005 were made available to the new course coordinator for ethical and confidentiality reasons. In 2006 the mean score for this question was 62.07, in 2007 it was 65.8, and in 2008 it was 55.7 (data from only one study period were available in 2008). Quantitative evaluation using the SET from 2006, 2007, and 2008 also indicated improvements in students’ satisfaction with the educator for this course. In 2006, the mean score for question 10 in the SET (“Overall, I was satisfied with the performance of this staff member”) was 82.7, in 2007 it was 86.2, and in 2008 it was 86.5 (data from only one study period were available in 2008).

