
Development and Validation of a New Tool for Evaluating Educational Videos Discussing Skin Surgical Procedure Techniques


Received 19 March 2024

Accepted for publication 29 May 2024

Published 5 June 2024, Volume 2024:17, Pages 1321–1328

DOI https://doi.org/10.2147/CCID.S469592


Editor who approved publication: Prof. Dr. Rungsima Wanitphakdeedecha



Fatimah J Almuqarrab,1 Nuha Alfurayh,2 Khalid AlGhamdi1,3

1Department of Dermatology, King Saud University Medical City, King Saud University, Riyadh, Saudi Arabia; 2Division of Dermatology, Imam Abdulrahman Alfaisal Hospital, Ministry of Health, Riyadh, Saudi Arabia; 3Vitiligo Research Chair, Dermatology Department, College of Medicine, King Saud University, Riyadh, Saudi Arabia

Correspondence: Khalid AlGhamdi, Vitiligo Research Chair, Department of Dermatology, College of Medicine, King Saud University, Riyadh, Saudi Arabia, Email [email protected]; Fatimah J Almuqarrab, Department of Dermatology, King Saud University Medical City, King Saud University, Riyadh, Saudi Arabia, Email [email protected]

Background: The available tools for evaluating scientific content target written scientific evidence and referencing without considering surgical, technical, or videographic aspects.
Objective: This study developed and validated a tool for qualitatively evaluating videos in the field of skin surgery. This is expected to increase the quality of recorded surgical materials published online and ultimately enhance the reliability of streaming platforms as educational resources.
Methodology: Tool development proceeded through several stages: draft generation, expert panel review, internal reliability testing, and a pilot study.
Results: After two rounds of expert panel evaluation of the developed tool, 23 relevant items were obtained that evaluate the educational value, scientific accuracy, and clarity of the surgical technical steps of the videos. We applied the tool to the top 25 YouTube videos discussing elliptical excision. The tool demonstrated internal consistency reliability and substantial agreement between the raters. We identified a strong positive correlation between our tool score and the global rating score (r = 0.55, P = 0.004).
Conclusion: It is critical to avoid relying indiscriminately on any video for educational purposes. The tool generated and validated in our study can determine a video’s educational value. A pilot study of 25 YouTube videos demonstrated that the available videos are of fair-to-good quality, underscoring the need for high-quality video production.

Keywords: YouTube videos, skin surgery, education, tool validation, video evaluation

Introduction

A well-known proverb in medical education is “See one, do one, teach one”, commonly invoked when mastering examination or procedural skills in medical school, residency, fellowship, and even in practice. Skin surgery in particular relies on visual education. Reading static text and images in a textbook is not comparable to being present and observing an examination or operation. In the same context, video is a medium that can reliably replicate a specific event. In fact, contrary to conventional face-to-face academic teaching approaches, the use of instructional videos for studying clinical skills has been shown to produce superior learning outcomes.1 For that purpose, visual aids that supplement current textbooks are now available to clinicians and students who want to learn how surgery is performed.2 Such videos can help standardize the educational experience and eliminate biases against certain learners.

According to research by Rapp et al, YouTube is the most popular website among residents preparing for surgery.3 It is the largest and most accessible free streaming platform, with a wide variety of content uploaded by individuals and societies.4 Koya et al demonstrated that YouTube hosts the most extensive collection of dermatologic surgical videos, covering various topics including Mohs surgery, shave biopsy, squamous cell/basal cell carcinoma (SCC/BCC) excisions, wart removal, cyst excision, suture techniques, tissue transfer simulations (flaps), and electrosurgery.4

Due to YouTube’s free access policy, any individual can post information about medical procedures without concern for the quality of the content. Additionally, users can post comments under each video and rate the content publicly. As an ad hoc technique of validation and quality control, the comments with the most votes are displayed at the top of the comment list. Furthermore, the community of YouTube contributors does, to some extent, regulate its own content; however, reporting bias is a significant challenge for such a qualification process. Azer et al emphasized that the parameters for the videos did not distinguish between beneficial and redundant content.5 Another recent study emphasized that surgical trainees need to be critically aware that the quality of accessible educational surgical video content varies greatly.6

It is worth mentioning that several studies have evaluated YouTube video content as a source of patient education for different dermatologic conditions.7–10 A further study evaluated the educational value and sentiment of comments on Mohs surgery videos.11 The vast majority of these studies failed to identify any relationship between videos’ popularity indices and the quality of their content.6–12 Several sources of bias, related to advertisements and a lack of citable sources of information, rendered the overall quality of content poor. Those studies essentially relied on the DISCERN or JAMA benchmark instruments13,14 for their evaluation. As Azer15 discussed, we believe these instruments were not intended to evaluate videos; rather, they were designed to evaluate written content, focusing mainly on scientific evidence and referencing without considering technical aspects. Therefore, a better-standardized tool is needed to evaluate videographic content.

The developed tool is expected to qualitatively appraise videos for their potential teaching value with respect to the scientific accuracy of their information, the clarity of the video’s purpose, the clarity and accuracy of the technical procedure steps, and the technical quality of the visual and audio supplements.

In this study, we aim to develop and validate a tool for qualitatively evaluating surgical educational videos, particularly those related to skin surgery. Additionally, we aim to create a standardized tool that teachers and medical educators can use to recommend appropriate videos for online study and preparation for surgeries. A secondary aim, achievable subsequently, is to improve the overall quality of educational video content through application of the developed tool, so that streaming platforms can become a reliable and integral part of the educational process.

Methodology

This study was approved by the King Saud University Institutional Review Board (IRB), project No. E-23-6737.

Tool development proceeded through several stages: drafting for item generation, tool validation, internal reliability testing, and a pilot study of the tool. Skin Surgery Videographic Contents Evaluation Tool (SSVC-ET) items were generated after reviewing the DISCERN, JAMA benchmark, and LAP-VEGaS guidelines.13,14,16

The tool measures two main domains covering 15 items that appraise the reliability and educational value of the video content. Scientific evidence was tested by examining proper, unbiased referencing of the provided information and consideration of areas of uncertainty or procedural variation among practitioners. Technical aspects were examined from two perspectives: one related to surgical technique and the other to audiovisual quality (Supplementary Index 1).

The generated draft was sent for validation to a three-member expert review panel: a professor of surgical dermatology with particular expertise in medical education, a senior fellow in cutaneous and laser surgery, and a consultant otologic surgeon with special expertise in medical education. Each expert was asked to evaluate the 15 items, assigning each a score from 1 to 4 according to its clarity and relevance to the measured domain. Items receiving a score of 3 or more were considered valid/clear.

After collecting the experts’ ratings, the item-level and scale-level content validity indices (I-CVI and S-CVI/Ave) were calculated, with critical values set at 0.78 and 0.80, respectively.17,18 To adjust for the possibility of chance agreement, the probability of chance agreement (Pc) was calculated, and an adjusted kappa was computed from the Pc and I-CVI of each instrument item to determine the degree of agreement beyond chance. Values above 0.74 are considered excellent; those below 0.4 are considered fair and were eliminated from the draft.19,20 Further redundant-item reduction was performed based on the researchers’ experience.
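
For readers who wish to reproduce these indices, the sketch below (our illustration, not the authors’ code) computes the I-CVI, S-CVI/Ave, and Pc-adjusted kappa following the formulas of Polit, Beck and Owen;20 the panel ratings shown are hypothetical.

```python
# Minimal sketch, assuming 3 experts rating each item 1-4 (hypothetical data);
# implements the I-CVI / Pc / adjusted-kappa formulas of Polit, Beck & Owen (2007).
from math import comb

def i_cvi(ratings, cutoff=3):
    """Item-level content validity index: proportion of experts scoring >= cutoff."""
    return sum(r >= cutoff for r in ratings) / len(ratings)

def adjusted_kappa(icvi, n_experts):
    """Kappa adjusted for the probability of chance agreement (Pc)."""
    n_agree = round(icvi * n_experts)
    pc = comb(n_experts, n_agree) * 0.5 ** n_experts   # binomial chance agreement
    return (icvi - pc) / (1 - pc)

panel = [[4, 4, 3], [4, 2, 3], [1, 2, 1]]        # rows = items, cols = experts
icvis = [i_cvi(item) for item in panel]          # e.g. [1.0, 0.67, 0.0]
s_cvi_ave = sum(icvis) / len(icvis)              # S-CVI/Ave: mean of the I-CVIs
kappas = [adjusted_kappa(v, 3) for v in icvis]   # agreement beyond chance
```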

We calculated the Fleiss kappa (κ) to determine the reliability of the raters’ agreement, and a further subgroup analysis of pairwise interrater agreement was conducted using Cohen’s κ.
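
A minimal sketch of this agreement analysis, assuming the ratings are stored as an items-by-raters matrix (the values below are placeholders, not the study data):

```python
# Hypothetical sketch: Fleiss' kappa across the three raters plus pairwise
# Cohen's kappa, using statsmodels and scikit-learn.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa
from sklearn.metrics import cohen_kappa_score

ratings = np.array([[4, 4, 3],       # rows = items, columns = raters
                    [3, 2, 4],
                    [4, 4, 4],
                    [2, 2, 3]])

counts, _ = aggregate_raters(ratings)             # per-item category counts
print("Fleiss kappa:", fleiss_kappa(counts))

# Pairwise agreement, e.g. between the two surgical dermatologists
print("Cohen kappa:", cohen_kappa_score(ratings[:, 0], ratings[:, 1]))
```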

To determine face validity, we interviewed two dermatologic surgeons face-to-face. We asked them to review the items for comprehensibility of wording, suitability and ease of the items, coverage of the objectives, and any possible misinterpretation or ambiguity.21

The generated SSVC-ET relies on a 5-point Likert scale for the major items: an item receives a score of 1 if the video strongly represents the negative aspect of the scale and a score of 5 if it strongly represents the positive aspect. The elements hinted at by an item must be clearly present in the video for it to receive a score of 5. Based on the summed total of all measured items, a 5-point score categorizes the video as poor, fair, good, very good, or excellent.
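
As an illustration of this scoring logic, the sketch below sums the Likert items and maps the total onto the five quality bands. The equal-width cut-offs are our assumption for illustration only; the published thresholds appear in Supplementary Index 1.

```python
# Hypothetical scoring sketch; band boundaries are illustrative equal-width
# cut-offs, not the published SSVC-ET thresholds.
def ssvc_et_band(item_scores):
    """Sum 5-point Likert items and map the total to a quality category."""
    n = len(item_scores)
    total = sum(item_scores)
    bands = ["poor", "fair", "good", "very good", "excellent"]
    fraction = (total - n) / (4 * n)          # 0 at all-1s, 1 at all-5s
    return total, bands[min(int(fraction * 5), 4)]

total, band = ssvc_et_band([4, 3, 5, 4, 2])   # -> (18, 'very good')
```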

The tool was subsequently piloted on YouTube videos discussing the elliptical excision procedure. The top 25 videos were included after excluding duplicates, videos unrelated to the topic, videos uploaded by patients, videos discussing personal experiences, cartoons, schematized videos, and videos in languages other than English. They were evaluated by two additional independent board-certified dermatologists using the SSVC-ET. Because most of the available videographic content does not include clear references, we needed to confirm that the SSVC-ET is appropriate for measuring the technical usefulness of videos regardless of whether their scientific reliability is clear. Therefore, the videos were further assessed for the clarity of their educational value regarding the technical procedure steps using the global rating scale and checklist adapted for elliptical excision evaluation.22 Furthermore, we calculated the Pearson correlation coefficient for normally distributed data and the Spearman rank coefficient for nonnormally distributed data to determine the correlation between the SSVC-ET and the global rating score, video provider, length, age, and/or popularity indices (video power index = like ratio × view ratio / 100, where like ratio = likes × 100 / [likes + dislikes] and view ratio = number of views per day). Moreover, a linear regression analysis was performed to examine the influence of the global rating on the SSVC-ET score.
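
For clarity, the popularity indices defined above reduce to the following arithmetic (a sketch with hypothetical numbers; the function and field names are ours):

```python
# Popularity indices as defined above; the input values are hypothetical.
def like_ratio(likes: int, dislikes: int) -> float:
    return likes * 100 / (likes + dislikes)

def view_ratio(views: int, days_online: int) -> float:
    return views / days_online

def video_power_index(likes, dislikes, views, days_online) -> float:
    return like_ratio(likes, dislikes) * view_ratio(views, days_online) / 100

# 900 likes, 100 dislikes, 50,000 views over 400 days:
vpi = video_power_index(900, 100, 50_000, 400)   # 90 * 125 / 100 = 112.5
```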

The reliability of the pilot results was evaluated by comparing the two raters’ scores, calculating the weighted interrater Cohen’s κ (with a value of 0.7 or greater indicating acceptable agreement), and measuring internal consistency reliability using Cronbach’s alpha.
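
A sketch of these two reliability checks, assuming the per-video scores of the two raters and the per-item score matrix are available as arrays (placeholder data shown):

```python
# Hypothetical reliability sketch: linear-weighted Cohen's kappa between the
# two pilot raters and Cronbach's alpha across SSVC-ET items.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater1 = np.array([3, 2, 4, 3, 5, 2, 4])
rater2 = np.array([3, 3, 4, 2, 5, 2, 4])
w_kappa = cohen_kappa_score(rater1, rater2, weights="linear")

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = videos, columns = SSVC-ET item scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)
```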

The data were imported into a computerized spreadsheet (Microsoft Excel 2016; Microsoft Corporation, Redmond, WA), and further analysis was performed using statistical software (DATAtab online statistical software, 2019 release; https://datatab.net/).

Continuous variables are described as the mean (range/standard deviation), whereas categorical variables are reported as frequencies (n) and percentages (%). A P value less than 0.05 was considered statistically significant.

Results

Designing the SSVC-ET

A comprehensive literature review identified the three most widely used tools13,14,16 for evaluating medical and surgical website content, which led to identifying the main content domains covering three essential dimensions: video reliability, information quality, and technical quality. In this step, 38 items were obtained from the related literature and instruments. Our research group evaluated these items, removing duplicates and modifying constructs to make them suitable for videographic content evaluation. Finally, a preliminary SSVC-ET composed of 15 items within two main domains was generated, including an additional 12 subitem elements focused mainly on the quality of the surgical educational technique.

Validating the SSVC-ET

The instrument developers created an expert panel of three content experts. The panel members were asked (either in person or via e-mail) to assess content comprehension, the content validity index, and face validity. In each round, a letter was sent describing the study objectives, the two domains and their underlying items, and the scoring method for relevance and clarity.

In the first round, universal agreement identified 15 relevant items out of 28; however, the S-CVI was lower than expected (0.79), indicating that the tool’s validity needed revision.23 One item and two subitem elements were eliminated from the draft, as they received a low I-CVI (0.3) and a fair Pc-adjusted κ (−1.6). Moreover, five items and four subitem elements were revised and reconstructed according to the experts’ opinions (I-CVI = 0.67).

The Fleiss kappa showed slight agreement between the three raters (κ = 0.04). Although there was poor agreement between the otologist and each of the surgical dermatologists (κ = 0.04 and −0.14; P = 0.8 and 0.4, respectively), there was moderate agreement (κ = 0.52, P = 0.04) between the two surgical dermatologists.

In the second round, the Fleiss κ showed almost perfect agreement between the three raters (κ = 1, P < 0.001). Universal agreement identified 23 relevant items out of 25. Apart from two eliminated items that received an I-CVI of 0, the S-CVI of this round was excellent (0.98). Additionally, the panel members were asked in this round to assess face validity, guiding modifications to the tool’s construction and wording.

Piloting the Tool

We applied the tool to YouTube videos discussing elliptical excision. The top 25 videos in the search results that fit the inclusion criteria were reviewed by two independent raters. The tool showed internal consistency reliability (Cronbach’s alpha = 0.8), optimal Pearson correlation (r = 0.65, P < 0.001), and substantial agreement between the two raters (weighted Cohen’s kappa = 0.74, standard error = 0.1, 95% confidence interval = 0.54–0.94, P < 0.001).

The videos were produced from 2010 to 2023; 28% were released in 2021. Private individuals released 56% of the videos, and academic/health institutions released the remaining 44%. Table 1 shows the videos’ characteristics, including length, age, popularity indices, SSVC-ET score, overall rating, and global rating. As noted in the table, the SSVC-ET total score ranged from 1 to 3, indicating that the available videos were of fair-to-good quality. In contrast, the technical rating of the videos had a mean of 19.6 out of 30 on the global rating adapted for elliptical excision evaluation, indicating that most of the video information was technically helpful for learning the proper technique. Figures 1 and 2 compare the videos’ quality and popularity indices between those uploaded by private individuals and those uploaded by academic or hospital institutions. Although the mean SSVC-ET score was similar between the two provider types, the global rating score was higher for privately uploaded videos (point-biserial correlation = 0.24, p = 0.2). Again, privately owned videos garnered greater popularity (as measured by viewer reactions) than those owned by institutions.

Table 1 Characteristics of the Elliptical Excision Videos (n=25)

Figure 1 Comparing videos’ educational values by provider. α SSVC-ET score: Skin Surgery Videographic Contents Evaluation Tool total score. β Global rating: checklist adapted for elliptical excision evaluation.

Figure 2 Comparing videos’ popularity indices by provider. α Video power index = like ratio × view ratio / 100. β Like ratio = likes × 100 / (likes + dislikes).

To examine the correlation between the independent variables (video length, age, popularity index, and global rating) and the dependent variable (SSVC-ET score), the Pearson correlation coefficient was calculated for normally distributed data, as assessed with quantile-quantile plots (length and age), and the Spearman correlation was used for nonnormally distributed data (video power index and global rating). The Pearson coefficients for length and age were r = −0.06 (p = 0.786) and r = −0.19 (p = 0.376), respectively, indicating no significant correlation between video length or age and the SSVC-ET score. Similarly, the Spearman correlation showed no significant correlation between the video power index and the SSVC-ET score (r = 0.27, p = 0.198).
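
The analysis above amounts to the following workflow (a sketch with simulated stand-in data, not the study dataset):

```python
# Hypothetical sketch: Pearson for approximately normal predictors,
# Spearman for skewed ones, mirroring the analysis described above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ssvc_et = rng.normal(60, 10, 25)      # stand-in for the 25 video scores
length = rng.normal(8, 3, 25)         # video length in minutes (roughly normal)
vpi = rng.lognormal(3, 1, 25)         # skewed popularity index

r, p = stats.pearsonr(length, ssvc_et)       # normally distributed predictor
rho, p_rho = stats.spearmanr(vpi, ssvc_et)   # nonnormally distributed predictor
```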

However, a strong positive correlation existed between the SSVC-ET score and the global rating (r = 0.55, P = 0.004). The regression model showed that the global rating explained 36.51% of the variance in the SSVC-ET score. ANOVA was used to test whether this value differed significantly from zero; in the present sample, the effect was significantly different from zero (F = 13.23, p = 0.001, R2 = 0.37). The following regression model was obtained: SSVC-ET score = 14.69 + 0.58 × global rating, indicating that when the independent variable is zero, the SSVC-ET score is 14.69, and that a one-unit change in the global rating changes the SSVC-ET score by 0.58. The standardized coefficient beta was 0.6, reflecting a significant positive contribution of the global rating to the SSVC-ET score.
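
The reported simple regression can be reproduced along these lines (a sketch using simulated data generated to match the published coefficients; the multiple regression below is analogous, with two binary predictors):

```python
# Hypothetical sketch of the simple linear regression: SSVC-ET score on
# global rating, using statsmodels OLS; the data are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
global_rating = rng.uniform(10, 30, 25)                       # scores out of 30
ssvc_et = 14.69 + 0.58 * global_rating + rng.normal(0, 5, 25)

X = sm.add_constant(global_rating)       # adds the intercept column
fit = sm.OLS(ssvc_et, X).fit()
print(fit.params)                        # [intercept, slope]
print(fit.rsquared, fit.f_pvalue)        # variance explained, overall F test
```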

Finally, a multiple linear regression analysis was performed to examine the influence of the absence of audio commentary and the absence of written commentary on the SSVC-ET score. In the resulting model, when both independent variables are zero, the SSVC-ET score is 2.41; the absence of audio commentary changes the score by −0.49 (p = 0.036), whereas the absence of written commentary changes it by −0.3 (p = 0.2).

Discussion

The exponential growth of online technologies has impacted our learning process. This is especially true for acquiring a new skill, such as surgical technique, wherein complex visual explanation is a fundamental part of teaching. With the growing popularity of nonregulated videos on YouTube, the largest freely accessible video streaming website, the lack of complete information and of reliable videos as a source of education has become a significant concern. Therefore, a need has arisen to develop a new instrument to qualitatively evaluate videographic skin surgery content, one that educators can use to select videos suitable for teaching students.

Most website evaluation tools were generated for examining written content and focus mainly on the quality and reliability of information rather than on the technical educational value of the procedure steps. An interesting consensus guideline for evaluating laparoscopic surgery educational videos has emerged as a helpful tool for filling this gap with respect to various technical aspects.16 However, its application to meticulous, small surgical fields, such as those encountered in skin surgery, is not always possible given the differences between surgeries. This hypothesis was reflected in the first round of our validation process, in which poor agreement was detected between our expert otologist and the two surgical dermatologists. As with the development of the LAP-VEGaS guidelines, the development of our SSVC-ET relied on a trainer-trainee committee investigating not only what experts expect from the video content but also what the recipients’ learning needs are.

The items of the tool were pooled from a variety of tools identified via a literature review. Such tools are directed at educational websites targeting patients, librarians, or medical personnel.13–15,24 Identifying a tool that can assess the educational value of video content across three dimensions is challenging. The SSVC-ET items consider authority; clarity of the aim and its achievement; definition of the target audience; the content’s validity, reliability, and navigability of cited resources; and the technical benefits/drawbacks of the procedure steps. The generated items fall under two main domains: the reliability of the video and the quality of the information presented. Each item was weighted on a 5-point Likert scoring system. Moreover, subitem elements were assigned less weight in the scoring system; their principal value is in determining the specific score the main item should receive.

Subsequently, a rigorous validation process was performed. The second validation stage was conducted by three surgical dermatologists, who showed excellent interobserver agreement, as indicated statistically by the considerable kappa value. The high Cronbach’s alpha coefficient and S-CVI indicated the high reliability and content validity of our tool, respectively.

Our tool was then applied to videos discussing the elliptical excision procedure, for which a standardized global rating tool addressing the accuracy of the different surgical steps already exists. Our tool’s reliability, accuracy, and internal consistency were examined and compared against the elliptical excision rating score. The scores of the videos recommended/not recommended by the tool showed a normal distribution pattern. Although most of the available videos tested by the tool were of fair-to-good quality, their technical usefulness rated higher when examined by the global rating adapted for elliptical excision evaluation. Interestingly, videos uploaded by institutions received lower quality scores. We failed to identify any correlation between the quality of the available videos and their length, age, or popularity. In contrast, although there was a significant positive correlation between the SSVC-ET score and the global rating score, a substantial negative correlation was detected between the SSVC-ET score and the absence of audio or written commentary.

By enhancing the application of the agreed-upon tool to surgical video evaluation, the overall quality of the video educational content is expected to improve. High-quality, reliable videos can subsequently be integrated into surgical education.

This is the first validated tool targeting the evaluation of educational skin surgery videos. It evaluates the reliability of information sources and focuses on different aspects of the learning process, including surgical and technical aspects. One limitation of our study is that the surgical field in skin surgery differs from that in other specialties, making this tool suitable for skin surgery but not generalizable to other specialties.

Conclusion

Numerous videos are available online for teaching surgical procedures; however, a validated tool for evaluating surgical videographic resources has been lacking. It is critical to avoid relying indiscriminately on any online video for learning. We developed and validated a tool for qualitative evaluation of the educational value of videos with respect to scientific accuracy and the clarity of the technical steps of the described procedure. In a pilot study, 25 YouTube videos discussing elliptical excision were evaluated with the developed tool; the available videos were of fair-to-good quality, underscoring the need for high-quality video production by academic/health institutions.

The eventual objective is to increase the quality of recorded surgical materials published online, ultimately enhancing the reliability of streaming platforms as educational resources. The tool developed in this study could help achieve these outcomes.

Acknowledgments

The authors extend their appreciation to the Deanship of Scientific Research, King Saud University, for funding through Vice Deanship of Scientific Research Chairs.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Lee JC, Boyd R, Stuart P. Randomized controlled trial of an instructional DVD for clinical skills teaching. Emerg Med Australas. 2007;19(3):241–245. doi:10.1111/j.1742-6723.2007.00976.x

2. Farag M, Bolton D, Lawrentschuk N. Use of YouTube as a resource for surgical education—clarity or confusion. Eur Urol Focus. 2020;6(3):445–449. doi:10.1016/j.euf.2019.09.017

3. Rapp AK, Healy MG, Charlton ME, Keith JN, Rosenbaum ME, Kapadia MR. YouTube is the most frequently used educational video source for surgical preparation. J Surg Educ. 2016;73(6):1072–1076. doi:10.1016/j.jsurg.2016.04.024

4. Koya KD, Bhatia KR, Hsu JTS, Bhatia AC. YouTube and the expanding role of videos in dermatologic surgery education. Semin Cutan Med Surg. 2012;31(3):163–167. doi:10.1016/j.sder.2012.06.006

5. Azer SA, Bokhari RA, AlSaleh GS, et al. Experience of parents of children with autism on YouTube: are there educationally useful videos? Inform Health Soc Care. 2018;43(3):219–233. doi:10.1080/17538157.2018.1431238

6. Besmens IS, Uyulmaz S, Giovanoli P, Lindenblatt N. YouTube as a resource for surgical education with a focus on plastic surgery – a systematic review. J Plast Surg Hand Surg. 2021;55(6):323–329. doi:10.1080/2000656X.2021.1884084

7. Guzman AK, Wang RH, Nazarian RS, Barbieri JS. Evaluation of YouTube as an educational resource for treatment options of common dermatologic conditions. Int J Dermatol. 2020;59(3). doi:10.1111/ijd.14693

8. Xiang L, Ravichandran S, Tamashunas N, Wan A, Mazmudar RS, Scott JF. YouTube as a source of dermatologic information on isotretinoin. J Am Acad Dermatol. 2020;83(2):653–655. doi:10.1016/j.jaad.2019.12.014

9. Gorrepati PL, Smith GP. DISCERN scores of YouTube information on eczema treatments. J Am Acad Dermatol. 2021;85(5):1354–1355. doi:10.1016/j.jaad.2020.11.007

10. Hossler EW, Conroy MP. YouTube as a source of information on tanning bed use. Arch Dermatol. 2008;144(10). doi:10.1001/archderm.144.10.1395

11. Iglesias-Puzas Á, Conde-Taboada A, López-Bran E. A cross-sectional study of YouTube videos on Mohs surgery: quality of content and sentiment analysis. J Am Acad Dermatol. 2022;86(3):649–651. doi:10.1016/j.jaad.2021.02.016

12. Erdem MN, Karaca S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine. 2018;43(22):E1334–E1339. doi:10.1097/BRS.0000000000002691

13. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105–111. doi:10.1136/jech.53.2.105

14. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: caveant lector et viewor--Let the reader and viewer beware. JAMA. 1997;277(15):1244–1245. doi:10.1001/jama.1997.03540390074039

15. Azer SA. Are DISCERN and JAMA suitable instruments for assessing YouTube videos on thyroid cancer? Methodological concerns. J Cancer Educ. 2020;35(6):1267–1277. doi:10.1007/s13187-020-01763-9

16. Celentano V, Smart N, McGrath J, et al. LAP-VEGaS Practice Guidelines for Reporting of Educational Videos in Laparoscopic Surgery. Ann Surg. 2018;268(6):920–926. doi:10.1097/SLA.0000000000002725

17. Shi J, Mo X, Sun Z. Content validity index in scale development. Zhong Nan Da Xue Xue Bao Yi Xue Ban. 2012;37(2):152–155. doi:10.3969/j.issn.1672-7347.2012.02.007

18. Yusoff MSB. ABC of content validation and content validity index calculation. Educat Med J. 2019;11(2):49–54. doi:10.21315/eimj2019.11.2.6

19. Wynd CA, Schmidt B, Schaefer MA. Two quantitative approaches for estimating content validity. West J Nurs Res. 2003;25(5):508–518. doi:10.1177/0193945903252998

20. Polit DF, Beck CT, Owen SV. Is the CVI an acceptable indicator of content validity? Appraisal and recommendations. Res Nurs Health. 2007;30(4):459–467. doi:10.1002/nur.20199

21. Banna JC, Vera Becerra LE, Kaiser LL, Townsend MS. Using qualitative methods to improve questionnaires for Spanish speakers: assessing face validity of a food behavior checklist. J Am Diet Assoc. 2010;110(1):80–90. doi:10.1016/j.jada.2009.10.002

22. Garcia C, Neuburg M, Carlson-Sweet K. A model to teach elliptical excision and basic suturing techniques. Arch Dermatol. 2006;142(4):526. doi:10.1001/archderm.142.4.526

23. Abdollahpour E, Nejat S, Nourozian M, et al. The process of content validity in instrument development. Iranian Epidemiology. 2010;6(4):66–74.

24. Cook DA, Dupras DM. A practical guide to developing effective web-based learning. J Gen Intern Med. 2004;19(6):698–707. doi:10.1111/j.1525-1497.2004.30029.x
