Clinically relevant pretraining is all you need
Authors: Oliver J Bear Don't Walk IV, Tony Sun, Adler Perotte, Noémie Elhadad
Institution: Department of Biomedical Informatics, Columbia University, New York, New York, USA
Abstract: Clinical notes present a wealth of information for applications in the clinical domain, but heterogeneity across clinical institutions and settings presents challenges for their processing. The clinical natural language processing field has made strides in overcoming domain heterogeneity, and pretrained deep learning models present opportunities to transfer knowledge from one task to another. Pretrained models have performed well when transferred to new tasks; however, it is not well understood whether these models generalize across differences in institutions and settings within the clinical domain. We explore whether institution- or setting-specific pretraining is necessary for pretrained models to perform well when transferred to new tasks. We find no significant performance difference between models pretrained across institutions and settings, indicating that clinically pretrained models transfer well across such boundaries. Given a clinically pretrained model, clinical natural language processing researchers may forgo the time-consuming pretraining step without a significant performance drop.
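The practical workflow the abstract points to is to skip institution-specific pretraining and fine-tune an existing clinically pretrained model directly on the downstream task. Below is a minimal sketch of that workflow, assuming the HuggingFace transformers and datasets libraries; the checkpoint name, the toy examples, and all hyperparameters are illustrative assumptions and are not taken from the paper.

```python
# Sketch: fine-tune a publicly available clinically pretrained model on a
# downstream task, with no additional institution-specific pretraining step.
# Checkpoint, data, and hyperparameters are placeholders for illustration.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)
from datasets import Dataset

# Hypothetical downstream task: binary classification of clinical note snippets.
train_data = Dataset.from_dict({
    "text": ["patient reports chest pain", "no acute distress noted"],
    "label": [1, 0],
})

checkpoint = "emilyalsentzer/Bio_ClinicalBERT"  # any clinically pretrained model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def tokenize(batch):
    # Tokenize note text into fixed-length inputs for the encoder.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

train_data = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="clinical-finetune",
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=train_data,
)
trainer.train()  # fine-tuning only; the expensive pretraining step is skipped
```

In practice, the toy dataset above would be replaced by the target institution's labeled data for the task of interest (e.g., ICD code assignment or extraction of social determinants of health), while the pretrained encoder is reused as-is.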
Keywords:deep learning  natural language processing  transfer learning  social determinants of health  international classification of disease