Registration for 3D model of human atlas based on vertex information
Cite this article: JIAO Pei-feng, CHANG Jian, YANG Xiao-song, JIAO Ying, LIU Fang-de, GUO Shi-hui, BAI Rui, OUYANG Jun, ZHANG Jian-jun. Registration for 3D model of human atlas based on vertex information[J]. Journal of Medical Biomechanics, 2012, 27(5): 567-571.
Authors: JIAO Pei-feng  CHANG Jian  YANG Xiao-song  JIAO Ying  LIU Fang-de  GUO Shi-hui  BAI Rui  OUYANG Jun  ZHANG Jian-jun
Institution: Guangdong Provincial Medical Biomechanical Key Laboratory, Department of Human Anatomy, School of Basic Medical Science, Southern Medical University; National Centre for Computer Animation, The Media School, Bournemouth University, UK; National Centre for Computer Animation, The Media School, Bournemouth University, UK; National Centre for Computer Animation, The Media School, Bournemouth University, UK; National Centre for Computer Animation, The Media School, Bournemouth University, UK; The Media School, Bournemouth University, UK; Department of Radiology, Guangzhou General Hospital of Guangzhou Military Command; Guangdong Provincial Medical Biomechanical Key Laboratory, Department of Human Anatomy, School of Basic Medical Science, Southern Medical University; The Media School, Bournemouth University, UK
Funding: National Natural Science Foundation of China (31170903, 31200708)
Abstract: Objective To establish a vertex information-based method for local point registration of 3D models of the human atlas (the first cervical vertebra, C1), laying a foundation for statistical modeling of 3D data. Methods Thirty 3D atlas models were generated from CT image series of normal human subjects, and corresponding landmarks were manually annotated on every model; the set was divided into 1 template model, 20 training models and 9 validation models. The training models were first registered to the template model in two steps: comparison and calculation of vertex information, and machine training of the weight coefficients. Using the sum of Euclidean distances between the automatically registered points and the manually selected points as the measure, the point registration formula and the corresponding optimal coefficients were obtained. The validation models were then registered to the template model, the Euclidean distances between the automatically registered points and the manually selected points were computed, and the results were compared with those of the training group to evaluate the stability of the method. Results The registration function and the corresponding optimal weight coefficients were obtained; the registration errors of the training and validation groups were 1.983 and 2.045 mm, respectively, and statistical analysis showed no significant difference between the two groups. Conclusions The accuracy and stability of the method reach the expected goals, and it can be used for automatic registration of points of interest between atlas models and for element classification in statistical modeling.
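The registration described above matches each landmark on the template to a vertex on a target model by comparing per-vertex geometric information under trainable weights. The abstract does not specify which geometric quantities are used or the exact form of the registration formula, so the following is a minimal sketch in Python, assuming the per-vertex information consists of position, surface normal and a curvature value; the names vertex_features and match_landmark are illustrative and not the authors' code.

```python
import numpy as np

def vertex_features(vertices, normals, curvatures):
    """Stack assumed per-vertex descriptors into an (N, 7) feature matrix:
    3 position components, 3 normal components, 1 curvature value."""
    return np.hstack([vertices, normals, curvatures[:, None]])

def match_landmark(template_feat, target_feats, weights):
    """Return the index of the target vertex whose weighted feature distance
    to the template landmark's feature vector is smallest."""
    diff = target_feats - template_feat                 # (N, 7)
    dist = np.sqrt((weights * diff ** 2).sum(axis=1))   # weighted Euclidean distance
    return int(np.argmin(dist))
```

Under this reading, vertex_features would be computed once for each of the 30 CT-derived meshes, and match_landmark applied to every manually chosen template landmark on a given target model.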

Keywords: atlas  3D model  registration  vertex information  CT scan
Received: 2011-12-16
Revised: 2012-03-16

Registration for 3D model of human atlas based on vertex information
JIAO Pei-feng, CHANG Jian, YANG Xiao-song, JIAO Ying, LIU Fang-de, GUO Shi-hui, BAI Rui, OUYANG Jun and ZHANG Jian-jun. Registration for 3D model of human atlas based on vertex information[J]. Journal of Medical Biomechanics, 2012, 27(5): 567-571.
Authors:JIAO Pei-feng  CHANG Jian  YANG Xiao-song  JIAO Ying  LIU Fang-de  GUO Shi-hui  BAI Rui  OUYANG Jun and ZHANG Jian-jun
Institution:Guangdong Provincial Medical Biomechanical Key Laboratory, Department of Human Anatomy, School of Basic Medical Science, Southern Medical University;National Centre for Computer Animation, The Media School, Bournemouth University;National Centre for Computer Animation, The Media School, Bournemouth University;National Centre for Computer Animation, The Media School, Bournemouth University;National Centre for Computer Animation, The Media School, Bournemouth University;National Centre for Computer Animation, The Media School, Bournemouth University;Department of Radiology, Guangzhou General Hospital of Guangzhou Military Command;Guangdong Provincial Medical Biomechanical Key Laboratory, Department of Human Anatomy, School of Basic Medical Science, Southern Medical University;National Centre for Computer Animation, The Media School, Bournemouth University
Abstract: Objective To develop a registration method for 3D human atlas (C1) models using geometric information of the vertices, so as to lay a foundation for statistical modeling of the atlas. Methods Thirty 3D atlas models were created from CT images of normal human subjects and annotated with manually selected corresponding points; the set comprised 1 template model, 20 training models and 9 validation models. The training models were first registered to the template model in two steps: calculation of the geometric information of individual vertices, and optimization of the weight coefficients in the registration model. By minimizing an energy function defined as the sum of Euclidean distances between the automatically registered points and the manually selected points on the training models, the optimal weight coefficients were obtained. The validation models were then registered to the template model, and the Euclidean distances between the automatically registered points and the manually selected points were calculated and compared with those of the training models to evaluate the stability of the registration method. Results The registration function and the corresponding optimal weight coefficients were obtained, and the average errors for the training and validation models were 1.983 mm and 2.045 mm, respectively. Statistical analysis showed no significant difference in the error distributions between the training and validation models. Conclusions The accuracy and stability of the proposed registration method meet the expected requirements; it can provide automatic registration of points of interest on human atlas models and be used for element classification in statistical modeling.
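The weight coefficients are obtained by the "machine training" step, which minimises the sum of Euclidean distances between the automatically registered points and the manually selected points over the 20 training models. The abstract does not name the optimiser, so the sketch below uses a generic derivative-free search (scipy.optimize, Nelder-Mead) as a stand-in and reuses match_landmark from the earlier sketch; train_weights and the (features, vertices, manual_points) layout of the model data are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def registration_error(weights, models, template_landmark_feats):
    """Sum of Euclidean distances between automatically matched points and
    manually selected landmarks over a set of models (the error measure
    described in the abstract)."""
    total = 0.0
    for target_feats, target_vertices, manual_points in models:
        for lm_feat, manual_pt in zip(template_landmark_feats, manual_points):
            idx = match_landmark(lm_feat, target_feats, weights)  # see sketch above
            total += np.linalg.norm(target_vertices[idx] - manual_pt)
    return total

def train_weights(training_models, template_landmark_feats, n_features=7):
    """Fit the weight coefficients by minimising the total training error.
    Nelder-Mead is an assumption; the abstract does not name the optimiser."""
    w0 = np.ones(n_features)  # arbitrary uniform starting point
    res = minimize(registration_error, w0,
                   args=(training_models, template_landmark_feats),
                   method="Nelder-Mead")
    return res.x

# Usage, with data prepared as in the earlier sketch:
#   best_w = train_weights(training_models, template_landmark_feats)
#   validation_error = registration_error(best_w, validation_models,
#                                         template_landmark_feats)
```

Evaluating registration_error separately on the training and validation models, as the authors do, checks whether the fitted weights generalise rather than overfit the 20 training models.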
Keywords:Atlas  3D model  Registration  Vertex information  CT scans