For Residents

Teaching Evidence-Based Dermatology Using a Web-Based Journal Club: A Pilot Study and Survey


Practice Points

  • A novel web-based application was beta tested in an academic dermatology setting to design and run a journal club for residents.
  • Goal-directed reading was emphasized by using guided questions to critically appraise literature based on reliability, significance, and applicability.
  • The combination of independent appraisal of an article using targeted questions and a group debrief led to better understanding of the evidence and its clinical applicability.


 


To the Editor:

With a steady increase in dermatology publications over recent decades, there is an expanding pool of evidence to address clinical questions.1 Residency training is when the skills of appraising the medical literature and practicing evidence-based medicine are most honed. Evidence-based medicine is an essential component of Practice-based Learning and Improvement, a required core competency of the Accreditation Council for Graduate Medical Education.2 Assimilation of new research evidence is traditionally taught through didactics and journal club discussions in residency.

However, at a time when the volume of new information overwhelms the safeguards that exist to evaluate its quality, it is more important than ever to be equipped with the proper tools to critically appraise novel literature. Rather than accepting a scientific article at face value, physicians must learn to ask targeted questions about its study design, results, and clinical relevance. These questions change based on the type of study, and organizations such as the Oxford Centre for Evidence-Based Medicine provide guidance through critical appraisal worksheets.3

To investigate the utility of using guided questions to evaluate the reliability, significance, and applicability of clinical evidence, we beta tested a novel web-based application in an academic dermatology setting to design and run a journal club for residents. Six dermatology residents participated in this institutional review board–approved study, which comprised 3 phases: (1) independent article appraisal through the web-based application, (2) group discussion, and (3) an anonymous postsurvey.

Using this platform, we uploaded a recent article into the interactive reader, which contained an integrated tool for appraisal based on specific questions. Because the article described the results of a randomized clinical trial, we drew our questions from the Centre for Evidence-Based Medicine’s Randomised Controlled Trials Critical Appraisal Worksheet, a series of questions evaluating internal validity, results, and external validity and applicability.3

Residents used the platform to independently read the article, highlight areas of the text that corresponded to 8 critical appraisal questions, and answer yes or no to these questions. Based on residents’ answers, a final appraisal score (on a scale of 1% to 100%) was generated. Simultaneously, the attending dermatologist leading the journal club (C.W.) also completed the assignment to establish an expert score.
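The platform’s scoring algorithm is not described here, but the reported scores are consistent with a simple proportion of affirmative answers across the 8 appraisal questions. The sketch below illustrates that assumed calculation only; the function name and example answers are hypothetical and are not the application’s actual code.

```python
# Hypothetical sketch of a proportion-based appraisal score; the web-based
# application's actual scoring method is not described in this report.

def appraisal_score(answers: list[bool]) -> float:
    """Return the percentage of critical appraisal questions answered yes."""
    if not answers:
        raise ValueError("at least one appraisal question is required")
    return 100.0 * sum(answers) / len(answers)

# Example: answering yes to 6 of the 8 questions yields 75.0, the value
# reported for both the group consensus and the expert score.
resident_answers = [True, True, True, True, True, True, False, False]
print(appraisal_score(resident_answers))  # 75.0
```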

Scores from the residents’ independent appraisal ranged from 75% to 100% (mean, 85.4%). Upon discussing the article in a group setting, the residents established a consensus score of 75%. This consensus score matched the expert score, which suggested to us that both independently reviewing the article using guided questions and conducting a group debriefing were necessary to match the expert level of critical appraisal.

Of note, the residents’ average independent appraisal score was higher than both the consensus and expert scores, indicating that the residents evaluated the article less critically on their own. With more practice using this method, it is possible that the precision and accuracy of the residents’ critical appraisal of scientific articles will improve.
