Consistency of video-based operation ability rating scale for manual small incision cataract surgery
Cite this article: LIU Bin, HUANG Wen-yong, WANG Lan-hua, HUANG Sheng-song. Consistency of video-based operation ability rating scale for manual small incision cataract surgery[J]. International Medicine & Health Guidance News, 2013, 19(11): 1558-1561
Authors: LIU Bin  HUANG Wen-yong  WANG Lan-hua  HUANG Sheng-song
Affiliation: State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou 510060, China
Funding: Guangdong Province Appropriate Health Technology Promotion Project (Grant No. [粤卫2010]152号)
Abstract: Objective To evaluate the feasibility and consistency of having trainers score surgical videos captured with an external video recording system, using the Ophthalmology Surgical Competency Assessment Rubric for manual small incision cataract surgery (MSICS) approved by the International Council of Ophthalmology (ICO), i.e. the ICO-OSCAR:SICS. Methods Trained technicians used a standardized external camera system to record, on site at primary-level hospitals, the complete operations of 10 primary-level doctors who had completed MSICS training at our hospital. Two trained surgical trainers from our hospital then watched the recordings and scored them on a 5-point scale based on the ICO-OSCAR:SICS, with 2 = very unskilled, 3 = unskilled, 4 = skilled and 5 = very skilled; any step performed by the trainer on the trainee's behalf was scored 0. Two weeks after the first scoring session, the recordings were scored again in the same way. Weighted kappa was used to calculate inter-rater reliability and test-retest reliability. Results Recordings were collected from 10 doctors with a median age of 40 years (range 29-48). The mean kappa for inter-rater reliability was 0.866 (range 0.734-0.982). Test-retest agreement exceeded 0.800 for every scored item for both trainers, with a mean kappa of 0.921 (range 0.843-0.981) for trainer 1 and 0.926 (range 0.854-0.978) for trainer 2. Conclusions Based on the ICO-OSCAR:SICS rubric, an external video recording system allows the quality of each MSICS step to be assessed effectively and consistently, providing a new approach to remote monitoring and self-assessment of surgical quality for doctors in primary-level hospitals.
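For reference, the abstract's reliability statistic is the weighted kappa for two raters scoring the same items on ordered categories. The sketch below gives the standard definition; the paper does not state which weighting scheme the authors used, so the linear and quadratic weights shown are the usual choices rather than confirmed settings.

```latex
% Weighted kappa for two raters over k ordered categories.
% o_{ij}: observed proportion of items rated i by rater 1 and j by rater 2;
% e_{ij}: chance-expected proportion (product of the row and column marginals);
% w_{ij}: disagreement weight (linear or quadratic shown; assumed, not confirmed by the paper).
\[
  \kappa_w \;=\; 1 \;-\;
  \frac{\sum_{i=1}^{k}\sum_{j=1}^{k} w_{ij}\, o_{ij}}
       {\sum_{i=1}^{k}\sum_{j=1}^{k} w_{ij}\, e_{ij}},
  \qquad
  w_{ij}^{\mathrm{lin}} = \frac{|i-j|}{k-1},
  \quad
  w_{ij}^{\mathrm{quad}} = \frac{(i-j)^2}{(k-1)^2}.
\]
```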

Keywords: Small incision cataract surgery  Video  Ophthalmology surgical competency assessment rubric  Consistency

Consistency of video-based operation ability rating scale for manual small incision cataract surgery
LIU Bin, HUANG Wen-yong, WANG Lan-hua, HUANG Sheng-song. Consistency of video-based operation ability rating scale for manual small incision cataract surgery[J]. International Medicine & Health Guidance News, 2013, 19(11): 1558-1561
Authors:LIU Bin    HUANG Wen-yong    WANG Lan-hua    HUANG Sheng-song
Affiliation: State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou 510060, China
Abstract: Objective To evaluate the feasibility and consistency of scoring surgical video recordings with the manual small incision cataract surgery (MSICS) Ophthalmology Surgical Competency Assessment Rubric (OSCAR) proposed by the International Council of Ophthalmology, i.e. the ICO-OSCAR:SICS. Methods Ten MSICS procedures performed by ten trainees who had completed our manual small incision cataract surgery training program were recorded with an external digital video recorder by trained technicians in 10 county-level hospitals. The videos were then graded by 2 professional trainers using a 5-point scale based on the ICO-OSCAR:SICS, ranging from 2 (very unskilled) to 5 (very skilled); a step finished by the trainer was scored 0. Two weeks later, the recordings were graded again by the same method. Weighted kappa was used to assess inter-rater consistency and intra-rater test-retest reliability. Results Complete data and videos were collected from 10 surgeons aged 29 to 48 years (median 40). The mean kappa for inter-rater reproducibility across surgical steps was 0.866 (range 0.754-0.982). Test-retest reliability exceeded 0.800 for both trainers, with a mean kappa of 0.921 (range 0.843-0.981) for trainer A and 0.926 (range 0.854-0.978) for trainer B. Conclusions Based on the ICO-OSCAR:SICS rubric, an external video system provides an effective and consistent assessment of the quality of each step of MSICS, offering a means of remote monitoring and self-assessment of trainees' surgical quality.
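As an illustration of the agreement analysis described above, the sketch below compares per-step scores from two raters with a weighted kappa. The rating values are hypothetical and quadratic weights are assumed, since the abstract specifies neither the software nor the weighting scheme.

```python
# Illustrative sketch only: hypothetical per-step ICO-OSCAR:SICS scores
# (0 = step done by trainer, 2-5 = very unskilled to very skilled).
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores given by two trainers to the same recorded steps.
trainer_1 = [5, 4, 4, 3, 5, 2, 4, 5, 3, 0]
trainer_2 = [5, 4, 3, 3, 5, 2, 4, 4, 3, 0]

# Inter-rater agreement: weighted kappa between the two trainers
# (quadratic weights assumed; the paper does not state the scheme).
inter_rater = cohen_kappa_score(trainer_1, trainer_2, weights="quadratic")

# Intra-rater (test-retest) agreement: the same trainer two weeks later.
trainer_1_retest = [5, 4, 4, 3, 5, 2, 5, 5, 3, 0]
intra_rater = cohen_kappa_score(trainer_1, trainer_1_retest, weights="quadratic")

print(f"inter-rater weighted kappa: {inter_rater:.3f}")
print(f"intra-rater weighted kappa: {intra_rater:.3f}")
```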
Keywords: Small incision cataract surgery  Video  Ophthalmology surgical competency assessment rubric  Consistency