January 17, 2017
Tactile brain-computer interface using classification of P300 responses evoked by full body spatial vibrotactile stimuli
2016 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2016
- Language
- Publication type
- Research paper (international conference proceedings)
- DOI
- 10.1109/APSIPA.2016.7820734
© 2016 Asia Pacific Signal and Information Processing Association. In this study we propose a novel stimulus-driven brain-computer interface (BCI) paradigm that generates control commands based on classification of somatosensory-modality P300 responses. Six spatial vibrotactile stimulus patterns are applied to the entire back and limbs of a user. The aim of the current project is to validate the effectiveness of the vibrotactile stimulus patterns for BCI purposes and to establish a novel tactile-modality communication link intended to help locked-in syndrome (LIS) patients who have lost their sight and hearing due to sensory disabilities. We define this approach as a full-body BCI (fbBCI), and we conduct psychophysical stimulus evaluation and real-time EEG response classification experiments with ten healthy, able-bodied users. The grand mean psychophysical stimulus pattern recognition accuracy was 98.18%, whereas the real-time EEG classification accuracy was 53.67%. Information transfer rate (ITR) scores of the tested users ranged from 0.042 to 4.154 bit/minute.
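ITR figures like those in the abstract are conventionally computed with Wolpaw's formula, which converts the number of selectable commands and the classification accuracy into bits per selection, scaled by the selection rate. A minimal sketch follows; the paper does not state its selection rate, so the selections-per-minute value below is a hypothetical placeholder, not a figure from the study.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_minute: float) -> float:
    """Wolpaw information transfer rate (bits/minute) for an N-class BCI."""
    p = accuracy
    # Below chance level the formula is not meaningful; report 0.
    if p <= 1.0 / n_classes:
        return 0.0
    bits = math.log2(n_classes) + p * math.log2(p)
    if p < 1.0:  # the (1-p) term vanishes at perfect accuracy
        bits += (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * selections_per_minute

# Example: 6 vibrotactile stimulus patterns at the reported 53.67% mean
# real-time EEG accuracy, assuming (hypothetically) 2 selections per minute.
print(wolpaw_itr(6, 0.5367, 2.0))
```

At chance level (1/6 for six commands) the formula yields 0 bits/minute, and at 100% accuracy it reduces to log2(6) ≈ 2.58 bits per selection, which is why the per-user ITRs in the abstract span such a wide range.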
- Links
- ID information
- DOI : 10.1109/APSIPA.2016.7820734
- SCOPUS ID : 85013810729