
A controlled trial of the effectiveness of internet continuing medical education

Abstract

Background

The internet has had a strong impact on how physicians access information and on the development of continuing medical education activities. Evaluation of the effectiveness of these activities has lagged behind their development.

Methods

To determine the effectiveness of a group of 48 internet continuing medical education (CME) activities, case vignette surveys were administered to US physicians immediately following participation, and to a representative control group of non-participant physicians. Responses to case vignettes were analyzed based on evidence presented in the content of the CME activities. An effect size for each activity was calculated using Cohen's d to determine the difference between the two groups in the likelihood of making evidence-based clinical decisions, expressed as the percentage of non-overlap between the two groups. Two activity formats were compared.

Results

In a sample of 5621 US physicians, drawn from the more than 100,000 physicians who participated in 48 internet CME activities and from a matched control group of non-participants, the average effect size was 0.75, corresponding to a 45% greater likelihood that participants made evidence-based choices in response to clinical case vignettes. This likelihood was higher for interactive case-based activities (51%; effect size 0.89) than for text-based clinical updates (40%; effect size 0.63). Effectiveness was also higher among primary care physicians than among specialists.

Conclusion

Physicians who participated in selected internet CME activities were more likely to make evidence-based clinical choices than non-participants in response to clinical case vignettes. Internet CME activities show promise in offering a searchable, credible, available on-demand, high-impact source of CME for physicians.

Background

The internet has had a strong impact on how physicians access information, and many have reported the influence of this information on their medical decision making [1, 2]. The internet offers a platform for addressing healthcare quality and patient safety by assisting with diagnosis and patient management, and facilitating the free flow of information [3]. The internet also offers opportunities to facilitate improvement in the quality of care through physician maintenance of certification [4, 5].

Rapid growth of the internet has altered continuing education for health professionals by allowing access to more varied, individualized, and systematic educational opportunities. In 2008, 300 sites offered more than 16,000 CME activities [6]. Internet CME activities offer advantages over traditional methods of CME delivery; internet CME is a credible 'any time, any place' form of education, providing increased accessibility to busy physicians [7–11]. Other advantages may include increased engagement in the educational process, ease of use, cost effectiveness, hyperlinked navigation, and the ability to view content that may be continually updated.

The evaluation of internet CME activities has not kept pace with their development; evaluation has principally focused on participant satisfaction and increases in knowledge [12, 13]. Only a few studies have examined physician performance and patient health associated with participation in internet CME activities, and the results have been mixed [14–18]. Evaluation studies of internet CME activities have been limited by the lack of systematic evaluation across different clinical subject matter areas [12]. The purpose of this study was to use a consistent approach to evaluate the effectiveness of internet CME activities across various clinical topics by examining the amount of difference in the evidence-based clinical practice choices of participants compared with a control group of non-participants. Based on a recent meta-analysis of the effectiveness of CME activities [19], we hypothesized that physicians participating in internet CME activities would make evidence-based clinical practice choices more frequently than physicians who did not participate, and that the percentage of non-overlap in evidence-based choices between the two groups would be at least 10%.

Methods

A controlled trial was designed to measure the effectiveness of a group of 48 internet CME activities. Physicians were eligible for inclusion if they had participated in one of these activities, matched its target audience, and completed the case vignette self-assessment questions following participation. A random sample of eligible participants was drawn for each activity. A random sample of non-participant physicians of similar specialties was identified as a control group and asked to complete the same self-assessment questions. Average evidence-based response rates were calculated for the participant and non-participant samples for each activity, and an effect size was calculated for each activity. An overall effect size was also calculated, as well as effect sizes for text-based and case-based activities and for primary care and specialist participants.

A consistent assessment approach was developed that included 1) using case vignettes to assess clinical practice choices, 2) using a standard hypertext mark-up language programming approach to presenting assessment questions at the end of selected internet activities, 3) applying this assessment approach to specific content reflected in each individual activity, 4) collecting assessment data from CME participants in each individual clinical assessment, 5) collecting assessment data from a comparable group of non-participants in each of the assessments, and 6) analyzing the data to determine the amount of difference between the CME participant and non-participant groups by calculating effect size and the percentage of non-overlap between the two groups. The use of case vignette surveys was reviewed by the Western Institutional Review Board in 2004, prior to initiation of this study; voluntary completion of the survey questions by physicians was considered to constitute consent.

During 2005, a pilot was conducted on three internet CME activities to test a standardized evaluation procedure and the use of standard hypertext mark-up language (HTML) online forms for systematically gathering clinical case vignette assessment data from physicians following participation in internet CME activities posted on a large medical education site. The pilot was designed to determine the technical feasibility of gathering and transferring large data sets using a standardized evaluation approach; it was not designed to evaluate the effectiveness of the three activities. The standardized evaluation procedure included the following elements. A standard assessment template consisting of two clinical vignettes and five clinical questions was developed using a multiple-choice format; evidence-based responses to the case vignettes were identified from content and references developed by the faculty for each activity. Content for the activities was written and referenced to clinical evidence by the faculty member for each activity, and only content referenced to peer-reviewed publications or guidelines was considered eligible for the development of clinical vignette assessment questions. Case vignettes were written by physicians and were referenced to the content and learning objectives. Content validity of the case vignettes was established through review by medical editors of the online portal; editors represented the appropriate clinical area for each set of case vignettes.

Case vignette evaluations were developed for the three pilot activities according to this procedure. Over 5000 physicians participated in the pilot activities. Data collection and transfer were successful; no technical problems were identified with the HTML online forms or the data transfer. This feasibility pilot established the processes for developing and reviewing case vignette questions, as well as the technical platform for proceeding with the evaluation of the effectiveness of a series of 48 internet CME activities.

Internet CME activities were identified as eligible for assessment if they met the following criteria: 1) designed for physicians, 2) posted to a large medical education website during an 18-month period between January 2006 and June 2007, 3) certified for CME credit, 4) presented in an on-demand archived format (webcasts and other live activities were not included), and 5) designed either as text-based clinical updates or as interactive case-based activities.

Text-based clinical update activities were defined as original review articles on scientific advances related to a particular clinical topic, similar to a written article in an internet journal. Interactive cases were original CME activities presented in a case format with extensive questions and feedback within each activity. Typically, they began with a short explanatory introduction and then presented the content within the context of a patient care scenario, with discussion of diagnostic and therapeutic options and outcomes. Questions distributed throughout the activity allowed learners to test their knowledge of either the material just presented or upcoming content. After submitting a response, the learner was presented with an explanation of the optimal answer, as well as a summary of the responses of past participants. There was no direct learner-instructor or learner-learner interaction in either of these formats.
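The question-and-feedback sequence of these interactive cases can be illustrated with a short sketch. The Python below is purely hypothetical (the actual activities were delivered as HTML forms on the education website); it shows only the flow just described: the learner submits a response, then sees an explanation of the optimal answer and a summary of how past participants responded.

```python
# Hypothetical sketch of the interactive-case question flow; all names and data are illustrative.
from collections import Counter

def ask(question, options, optimal, explanation, past_responses):
    print(question)
    for key, text in options.items():
        print(f"  {key}. {text}")
    answer = input("Your answer: ").strip().upper()

    # Feedback step 1: explanation of the optimal answer.
    print(f"Optimal answer: {optimal}. {explanation}")

    # Feedback step 2: summary of past participants' responses.
    counts = Counter(past_responses)
    total = sum(counts.values()) or 1  # guard against an empty history
    for key in options:
        print(f"  {key}: {100 * counts.get(key, 0) / total:.0f}% of past participants")

    past_responses.append(answer)  # fold this learner into the running summary
    return answer == optimal
```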

The case vignette survey template consisted of a set of content-specific case vignette questions delivered to participants at the conclusion of each CME activity. The same questions were also distributed in a survey, by email or fax, to a similar non-participant group. This method adapted, for an online format with automated data transfer, the case vignette assessment method that has been recognized for its value in predicting physician practice patterns; recent research demonstrates that case vignettes, compared with other process-of-care measures such as chart review and standardized patients, are a valid and comprehensive method for measuring a physician's processes of care [20, 21].

A sample size of at least 4800 was chosen for the study, with at least 100 physicians (a desired minimum of 50 participants and 50 non-participants) for each CME activity, in order to establish consistency in data collection even though content varied across multiple clinical areas. Participants were eligible for inclusion in the study only if they represented the specialty target audience for the activity or were providing primary care. Eligible participants were identified for each activity, and a random sample of 50 was drawn from the group of eligible participants. Non-participating physicians were identified from a random sample drawn by specialty from the physician list of the American Medical Association. Participant and non-participant samples were matched on the following characteristics: physician specialty, degree, years in practice, whether or not direct patient care was their primary responsibility, and the average number of patients seen per week.
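As an illustration of the per-activity sampling step, the following sketch (hypothetical file and column names, not the authors' code) draws a random sample of up to 50 eligible participants for each activity:

```python
import pandas as pd

# Hypothetical export of post-activity vignette responses; column names are assumed.
responses = pd.read_csv("vignette_responses.csv")

# Keep only respondents who match the activity's specialty target audience
# or who provide primary care (the eligibility rule described above).
eligible = responses[responses["matches_target_audience"] | responses["is_primary_care"]]

# Draw a random sample of up to 50 eligible participants per activity.
participant_sample = (
    eligible.groupby("activity_id", group_keys=False)
            .apply(lambda g: g.sample(n=min(50, len(g)), random_state=0))
)
```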

A statistical analysis software package (SAS 9.1.3) was used for data extraction and transformation and for statistical analyses. Participant and non-participant case vignette responses were scored according to their concordance with the evidence-informed content presented within each activity. Overall mean scores and pooled standard deviations were calculated for the participant and non-participant groups for each activity. These were used to calculate the educational effect size using Cohen's d (i.e., the difference in means divided by the pooled standard deviation) in order to determine the average amount of difference between participants and non-participants [22]. Each effect size was then expressed as the percentage of non-overlap between participants and non-participants in the likelihood of making evidence-based clinical choices in response to the case vignettes, for each activity and for the overall group of activities.
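The effect-size computation described above can be sketched as follows. This is a minimal illustration, not the study's SAS code; it assumes per-physician vignette scores for the two groups and uses Cohen's U1 (under an assumption of normal distributions with equal variance) for the percentage of non-overlap:

```python
import numpy as np
from scipy.stats import norm

def cohens_d(participant_scores, control_scores):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    p = np.asarray(participant_scores, dtype=float)
    c = np.asarray(control_scores, dtype=float)
    n1, n2 = len(p), len(c)
    pooled_var = ((n1 - 1) * p.var(ddof=1) + (n2 - 1) * c.var(ddof=1)) / (n1 + n2 - 2)
    return (p.mean() - c.mean()) / np.sqrt(pooled_var)

def nonoverlap_percentage(d):
    """Cohen's U1: percentage of the two distributions that does not overlap."""
    phi = norm.cdf(abs(d) / 2)
    return 100 * (2 * phi - 1) / phi

# For example, d = 0.75 corresponds to roughly 45% non-overlap,
# and d = 0.89 to roughly 51%, consistent with the values reported in the Results.
print(nonoverlap_percentage(0.75))  # ~45
print(nonoverlap_percentage(0.89))  # ~51
```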

Results

Over 100,000 US physicians participated in the 48 selected activities over an 18-month period. A total of 5621 physician responses to assessment questions in the 48 activities were analyzed; of these, 2785 were from CME participants and 2836 were from the control group of non-participants. The CME participant sample comprised 1377 primary care physicians and 1241 physicians specializing in other areas; the non-participant sample comprised 1441 primary care physicians and 1270 physicians specializing in other areas of medicine.

Demographics of physicians specializing in primary care, of physicians specializing in other clinical areas, and of all respondents are presented in Table 1. Demographics of the participant group were consistent with those of the US physician population except with regard to patient care as a principal responsibility. Nationally, the average physician age is 51 years, 27.8% of physicians are female, and 6.5% hold DO degrees [23]. Nationally, 78.4% of US physicians are primarily involved in patient care; in the participant sample, this proportion was significantly higher, at 94% [23]. When primary care participants were compared with specialist participants, there were no significant differences except with regard to gender: primary care participants were more likely to be female (33%) than specialist participants (21%).

Table 1 Demographics of physician internet CME participants and non-participants

Of the 48 internet CME activities posted during the 18-month study period, 24 were interactive CME cases and 24 were text-based clinical updates. Effect sizes were highest for the cardiology and neurology activities. Effect sizes are presented by clinical area and activity type in Tables 2 and 3.

Table 2 Effect size of interactive CME cases
Table 3 Effect size of text-based clinical updates

Overall, the average effect size for the 48 internet CME activities was 0.75 (Table 4). The non-overlap percentage, representing the non-overlap between participants and non-participants in evidence-based responses, was 45.2%, exceeding the hypothesized non-overlap of 10% between the two groups. Interactive case-based internet CME activities demonstrated a significantly higher effect size than text-based programming (p = 0.001). The effect size for primary care participants was also significantly higher than that for specialists (p < 0.001).

Table 4 Effect size of 48 internet CME activities by format and specialty

Discussion

Physician participants in internet CME activities selected evidence-based choices for clinical care in response to case vignettes more frequently than non-participants. The difference between participants and non-participants in the likelihood of making clinical choices consistent with evidence greatly exceeded the hypothesized 10% non-overlap between the two groups, demonstrating instead a 45% non-overlap. This effect is stronger than that reported in a recent meta-analysis of the effectiveness of CME activities, in which CME interventions had a small to moderate effect on physician knowledge and performance [19]; that meta-analysis, however, included only two internet-based studies.

The somewhat higher effect size for primary care physicians may be a reflection of broader educational needs, due to the wide range of clinical problems they encounter. Physicians specializing in clinical areas other than primary care have a narrower focus for medical information seeking and may have higher levels of baseline knowledge than primary care physicians on specific topics, also contributing to differences in effect size. The higher effect size for interactive CME cases is consistent with previous studies that demonstrate that increases in active participation improve the effectiveness of CME [24].

Internet CME physician participants in this sample have extensive experience and are principally engaged in direct patient care, countering earlier perceptions that most physicians accessing internet CME would be recent medical school graduates. Compared with demographic data on the total population of US physicians, years in practice are similar, but more physicians in the online group are engaged principally in patient care [23]. A higher percentage of female physicians participated in the online CME activities studied than is represented in the overall US physician population. It is clear that internet CME activities are reaching a large audience of busy physicians; the ACCME data compilation for 2006 showed that physicians participated in internet enduring materials over 2 million times [25]. Data from this study demonstrate that, in addition to large increases in reach, internet CME activities show promise in influencing practice. The larger effect size for these activities may be associated with their searchability, as well as their availability at the moment physicians are prompted to address a clinical question or problem. More research is needed in this area.

One of the strengths of this study was the use of a consistent evaluation format applied to a large number of internet CME activities. A limitation, however, was the programmed format, which restricted each activity to five clinical vignette questions; thus, not all key points in the content of each activity could be evaluated. The format also limited questions to multiple choice and did not allow open-ended questions. While the use of a control group allowed a comparison of participants with non-participants, another limitation was the lack of baseline data on the practice patterns of CME participants prior to participation. It is possible that internet CME participants access the internet more frequently than non-participants, using not only CME activities but also various other forms of internet medical information; these information-seeking behaviors may influence the amount of difference between participants and non-participants reflected in the effect sizes reported here. In future studies, baseline data would help to address this issue.

While this study has demonstrated the promise of internet CME activities in influencing the diagnostic and therapeutic choices physicians make daily, many research questions have yet to be addressed. Future research studies should continue to apply consistent evaluation approaches to internet CME. Pre-tests or baseline measurements would contribute to a more robust understanding of physician practice patterns prior to participation; it will be important, however, not to create lengthy pre-tests that become barriers to accessing internet CME activities. Future studies are needed to determine not only which internet formats are most effective, but also how educational elements such as advance organizers, behavioral objectives, interactivity, and feedback should be incorporated into the design of activities to optimize effectiveness. In addition, studies will be needed to determine how activities can be tailored to various physician specialties and populations.

Conclusion

In summary, evaluation of internet CME activities lags far behind the development of these activities, and many research questions remain unaddressed. This study, however, has demonstrated that physicians who participated in selected internet CME activities were more likely following participation to make evidence-based clinical choices in response to case vignettes than were non-participants. Internet CME activities show promise in offering a searchable, credible, available on-demand, high-impact source of CME for physicians.

References

  1. Podichetty VK, Booher J, Whitfield M, Biscup RS: Assessment of internet use and effects among health professionals: a cross sectional survey. Postgrad Med J. 2006, 82: 274-279. 10.1136/pgmj.2005.040675.

  2. Bennett NL, Casebeer LL, Kristofco RE, Strasser SM: Physicians' Internet information-seeking behaviors. J Contin Educ Health Prof. 2004, 24: 31-38. 10.1002/chp.1340240106.

  3. Shine K: Healthcare quality and how to achieve it. Acad Med. 2002, 77: 91-99. 10.1097/00001888-200201000-00021.

  4. Holmboe ES, Meehan TP, Lynn L, Doyle P, Sherwin T, Duffy FD: Promoting physicians' self assessment and quality improvement: the ABIM diabetes practice improvement module. J Contin Educ Health Prof. 2006, 26: 109-119. 10.1002/chp.59.

  5. Holmboe ES, Lynn L, Duffy FD: Improving the quality of care via maintenance of certification and the web: An early status report. Perspect Biol Med. 2008, 51: 71-83. 10.1353/pbm.2008.0002.

  6. Sklar B: Continuing medical education list. [http://www.cmelist.com/list.htm]

  7. Harden RM: A new vision for distance learning and continuing medical education. J Contin Educ Health Prof. 2005, 25: 43-51. 10.1002/chp.8.

  8. Vollmar HC, Schurer-Maly CC, Frahne J, Lelgemann M, Butzlaff M: An e-learning platform for guideline implementation-evidence- and case-based knowledge translation via the Internet. Methods Inf Med. 2006, 45: 389-396.

  9. Boulos MN, Maramba I, Wheeler S: Wikis, blogs and podcasts: a new generation of Web-based tools for virtual collaborative clinical practice and education. BMC Med Educ. 2006, 6: 41-10.1186/1472-6920-6-41.

  10. Ruiz JG, Mintzer MJ, Issenberg SB: Learning objects in medical education. Med Teach. 2006, 28: 599-605. 10.1080/01421590601039893.

  11. Cobb SC: Internet continuing education for health care professionals: an integrative review. J Contin Educ Health Prof. 2004, 24: 171-180. 10.1002/chp.1340240308.

  12. Curran VR, Fleet L: A review of evaluation outcomes of web-based continuing medical education. Med Educ. 2005, 39: 561-567. 10.1111/j.1365-2929.2005.02173.x.

  13. Wutoh R, Boren SA, Balas EA: eLearning: a review of Internet-based continuing medical education. J Contin Educ Health Prof. 2004, 24: 20-30. 10.1002/chp.1340240105.

  14. Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, Greenberg SB, Greisinger AJ: Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005, 294: 1043-1051. 10.1001/jama.294.9.1043.

  15. Curtis JR, Westfall AO, Allison J, Becker A, Melton ME, Freeman A, Kiefe CI, MacArthur M, Ockershausen T, Stewart E, Weissman N, Saag KG: Challenges in improving the quality of osteoporosis care for long-term glucocorticoid users: a prospective randomized trial. Arch Intern Med. 2007, 167: 591-596. 10.1001/archinte.167.6.591.

  16. Stewart M, Marshall JN, Ostbye T, Feightner JW, Brown JB, Harris S, Galajda J: Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med. 2005, 37: 131-138.

  17. Allison JJ, Kiefe CI, Wall T, Casebeer L, Ray MN, Spettell CM, Hook EW, Oh MK, Person SD, Weissman NW: Multicomponent Internet continuing medical education to promote chlamydia screening. Am J Prev Med. 2005, 28: 285-290. 10.1016/j.amepre.2004.12.013.

  18. Short LM, Surprenant ZJ, Harris JM: A community-based trial of an internet intimate partner violence CME program. Am J Prev Med. 2006, 30: 181-185. 10.1016/j.amepre.2005.10.012.

  19. Mansouri M, Lockyer J: A meta-analysis of continuing medical education effectiveness. J Contin Educ Health Prof. 2007, 27: 6-15. 10.1002/chp.88.

  20. Peabody JW, Luck J, Glassman P, Dresselhaus TR, Lee M: Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA. 2000, 283: 1715-1722. 10.1001/jama.283.13.1715.

  21. Peabody JW, Luck J, Glassman P, Jain S, Hansen J, Spell M, Lee M: Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med. 2004, 141: 771-780.

  22. Cohen J: Statistical Power Analysis for the Behavioral Sciences. 2nd edition. 1988, Hillsdale, NJ: Lawrence Erlbaum Associates

  23. Smart DR, Sellers J: Physician Characteristics and Distribution in the US, 2008 edition. 2008, American Medical Association

  24. Davis D: Does CME work? An analysis of the effect of educational activities on physician performance or health care outcomes. Int J Psychiatry Med. 1998, 28: 21-23. 10.2190/UA3R-JX9W-MHR5-RC81.

  25. Accreditation Council for Continuing Medical Education. 2006 Annual Report Data. [http://www.accme.org]


Author information

Corresponding author

Correspondence to Linda Casebeer.

Additional information

Competing interests

Authors are employees of Outcomes, Inc., Medscape LLC, or the University of Alabama School of Public Health. They have no other potential conflicts of interest to declare.

Authors' contributions

All authors have participated in the design of the study, the review of the data, and the writing of the article. SZ was principally responsible for the data analysis.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Casebeer, L., Engler, S., Bennett, N. et al. A controlled trial of the effectiveness of internet continuing medical education. BMC Med 6, 37 (2008). https://doi.org/10.1186/1741-7015-6-37
