Similar Documents
2 similar documents found
1.
Effective communication with the hearing and speech impaired often requires at least a basic working knowledge of sign language gestures; without it, a memo pad and pen or a mobile phone's notepad is indispensable. The aim of this study was to build a neural network that could recognize static finger-hand gestures of the yubimoji, the Japanese sign language syllabary. To build the network, signal inputs from a data glove interface were recorded for each static yubimoji gesture. The network, a multilayer perceptron, was trained and tested 10 times. Overall, only 18 of the 41 static gestures were successfully recognized. One reason was the data glove's inability to measure gesture direction, particularly for yubimoji gestures with similar finger configurations. Future work will address these problems as well as the inclusion of dynamic yubimoji gestures.
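A minimal sketch of the setup this abstract describes: a multilayer perceptron classifying static yubimoji gestures from data-glove readings, trained and tested over 10 runs. The number of glove sensor channels, the hidden-layer size, and the synthetic placeholder data are assumptions for illustration, not details from the paper.

```python
# Sketch only (not the authors' implementation): MLP classification of
# static yubimoji gestures from data-glove signals.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

N_GESTURES = 41          # static yubimoji gestures (from the abstract)
N_SENSORS = 10           # hypothetical number of glove bend-sensor channels
SAMPLES_PER_CLASS = 50   # hypothetical sample count per gesture

rng = np.random.default_rng(0)
# Placeholder data: each gesture class is drawn around a random sensor "pose".
centers = rng.uniform(0.0, 1.0, size=(N_GESTURES, N_SENSORS))
X = np.vstack([c + 0.05 * rng.standard_normal((SAMPLES_PER_CLASS, N_SENSORS))
               for c in centers])
y = np.repeat(np.arange(N_GESTURES), SAMPLES_PER_CLASS)

# Train and test the MLP over 10 random splits, mirroring the study's
# "trained and tested 10 times" protocol.
accuracies = []
for seed in range(10):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=seed)
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000,
                        random_state=seed)
    clf.fit(X_tr, y_tr)
    accuracies.append(clf.score(X_te, y_te))

print(f"mean accuracy over 10 runs: {np.mean(accuracies):.3f}")
```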

2.
In a previous paper, the authors built a neural network model to recognize the Japanese sign language syllabary, or yubimoji. One of the problems encountered in that study was the accurate digital representation and distinction of similar yubimoji gestures, i.e. gestures with the same finger flexure positions but different hand/finger orientations. This study focuses on those yubimoji gestures. Using data from a glove interface with bend sensors and accelerometers, a neural network was built, trained and tested. The network performed well and produced good recognition results.
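A minimal sketch of the feature-augmentation idea in this abstract: concatenating accelerometer readings with bend-sensor readings so that gestures with identical finger flexure but different hand orientation become separable. The sensor counts, orientation vectors, and synthetic data are assumptions for illustration, not the authors' configuration.

```python
# Sketch only: bend sensors alone cannot distinguish two gestures that differ
# only in hand orientation; adding accelerometer features makes them separable.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
N_BEND, N_ACC, N_PER_CLASS = 10, 3, 100   # hypothetical sensor counts

# Two hypothetical yubimoji with the same finger flexure ...
flexure = rng.uniform(0.0, 1.0, size=N_BEND)
# ... but different palm orientations (gravity vector seen by the accelerometer).
orientations = np.array([[0.0, 0.0, 1.0],    # palm facing forward (assumed)
                         [1.0, 0.0, 0.0]])   # palm rotated 90 degrees (assumed)

def make_samples(orientation):
    bend = flexure + 0.05 * rng.standard_normal((N_PER_CLASS, N_BEND))
    acc = orientation + 0.05 * rng.standard_normal((N_PER_CLASS, N_ACC))
    return np.hstack([bend, acc])

X = np.vstack([make_samples(o) for o in orientations])
y = np.repeat([0, 1], N_PER_CLASS)

for name, features in [("bend sensors only", X[:, :N_BEND]),
                       ("bend + accelerometer", X)]:
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, y, test_size=0.3, stratify=y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    print(f"{name}: test accuracy {clf.score(X_te, y_te):.2f}")
```

On this toy data the bend-only classifier stays near chance, while the augmented features separate the two orientations, which is the effect the study's glove-plus-accelerometer design targets.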
