Open Access

A face-machine interface utilizing EEG artifacts from a neuroheadset for simulated wheelchair control

Many people live with movement disabilities and would benefit from an assistive mobility device with practical control. This paper demonstrates a face-machine interface system that uses motion artifacts in electroencephalogram (EEG) signals for mobility enhancement in people with quadriplegia. We employed an Emotiv EPOC X neuroheadset to acquire the EEG signals. With the proposed system, we verified the preprocessing approach, feature extraction algorithms, and control modalities. Using eye winks and jaw movements, the system achieved an average accuracy of 96.9% across four commands. Moreover, online control of a simulated power wheelchair showed high efficiency in terms of task completion time. The combination of winking and jaw chewing yields a steering time on the same order of magnitude as joystick-based control, but still about twice as long. We will further improve the efficiency and implement the proposed face-machine interface system on a real power wheelchair.
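
The abstract does not detail the preprocessing or classification pipeline, so the sketch below is only a minimal illustration of one plausible approach: thresholding band power on the artifact-heavy frontal and temporal channels of a 14-channel EPOC X montage to map eye winks and jaw clenches to four wheelchair commands. The channel indices, frequency bands, thresholds, and command mapping are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: channel positions, bands, thresholds, and the
# command mapping below are assumptions, not the published method.
FS = 128                          # EPOC X sampling rate (Hz)
AF3, AF4, T7, T8 = 0, 13, 3, 10   # assumed indices in the channel array

def band_power(x, lo, hi, fs=FS):
    """Mean power of signal x restricted to [lo, hi] Hz via an FFT mask."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spec[mask].mean()

def classify_window(eeg, wink_thr=50.0, jaw_thr=20.0):
    """Map a (channels, samples) EEG window to one of four commands.

    Eye winks appear as large low-frequency deflections on the frontal
    channels; jaw clenching appears as broadband EMG on the temporal
    channels. Thresholds are placeholders to be calibrated per user.
    """
    left_wink  = band_power(eeg[AF3], 0.5, 4) > wink_thr
    right_wink = band_power(eeg[AF4], 0.5, 4) > wink_thr
    jaw_clench = (band_power(eeg[T7], 20, 45) +
                  band_power(eeg[T8], 20, 45)) / 2 > jaw_thr

    if jaw_clench:
        return "forward"
    if left_wink and not right_wink:
        return "turn_left"
    if right_wink and not left_wink:
        return "turn_right"
    return "stop"

# Usage with a synthetic 1-second window (replace with live headset data).
rng = np.random.default_rng(0)
window = rng.normal(0, 1, size=(14, FS))
print(classify_window(window))
```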

eISSN:
1178-5608
Language:
English
Publication frequency:
Volume Open
Journal subjects:
Engineering, Introductions and Overviews, other