Open Access

A face-machine interface utilizing EEG artifacts from a neuroheadset for simulated wheelchair control

Figure 1:

The proposed face-machine interface system for simulated wheelchair control using artifacts from an EEG neuroheadset.

Figure 2:

Components of the EEG signal acquisition process using an Emotiv EPOC X neuroheadset.

Figure 3:

Examples of EEG signals from the Emotiv Neuroheadset: (a) left eye winking, (b) right eye winking, and (c) both eyes winking.

Figure 4:

Examples of EEG signals from the Emotiv Neuroheadset during: (a) left side jaw chewing, (b) right side jaw chewing, and (c) jaw chewing on both sides.

Figure 5:

Flowchart of the proposed classification decisions.

Figure 6:

The proposed modalities for simulated wheelchair control.

Figure 7:

(a) The testing route (total distance: 30 m). (b) A sample scenario encountered by the simulated power wheelchair during testing.

Figure 8:

The average times taken by all participants to complete route 1.

Figure 9:

The average times taken by all participants to complete route 2.

Results of the proposed control modalities.

| Participant | Proposed modality #1 (avg. accuracy, %) | Proposed modality #2 (avg. accuracy, %) |
|---|---|---|
| 1 | 95.8 | 100 |
| 2 | 95.8 | 100 |
| 3 | 91.7 | 95.8 |
| 4 | 87.5 | 95.8 |
| 5 | 91.7 | 100 |
| 6 | 87.5 | 91.7 |
| 7 | 95.8 | 95.8 |
| 8 | 91.7 | 95.8 |
| Mean ± SD | 92.2 ± 3.46 | 96.9 ± 2.94 |
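The mean and sample standard deviation in the bottom row can be reproduced directly from the per-participant accuracies; a minimal check using Python's `statistics` module is sketched below (variable names are illustrative).

```python
import statistics

# Per-participant average accuracies (%) from the results table.
modality_1 = [95.8, 95.8, 91.7, 87.5, 91.7, 87.5, 95.8, 91.7]
modality_2 = [100, 100, 95.8, 95.8, 100, 91.7, 95.8, 95.8]

for name, scores in [("Modality #1", modality_1), ("Modality #2", modality_2)]:
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample SD (n - 1), matching the table
    print(f"{name}: {mean:.1f} ± {sd:.2f}")
# Modality #1: 92.2 ± 3.46
# Modality #2: 96.9 ± 2.94
```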

The commands for simulated wheelchair control.

| Command No. | Action | Output command |
|---|---|---|
| 1 | Jaw chewing on both sides | Forward |
| 2 | Jaw chewing on the left side | Turn Left |
| 3 | Jaw chewing on the right side | Turn Right |
| 4 | Winking both eyes | Backward |
| 5 | Winking the left eye | Turn Left |
| 6 | Winking the right eye | Turn Right |
| Optional | Winking the left eye and then the right eye within 3 s | Enable/Disable System |


| Condition | Decision |
|---|---|
| JL > TJL & JR > TJR | Com#1 |
| JL > JR & JL > TJL | Com#2 |
| JR > JL & JR > TJR | Com#3 |
| WL > TWL & WR > TWR | Com#4 |
| WL > WR & WL > TWL | Com#5 |
| WR > WL & WR > TWR | Com#6 |
| Otherwise | No Decision |
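The decision rules above can be sketched as a small rule-based classifier. The feature names (JL, JR for left/right jaw-chewing strength; WL, WR for left/right wink strength) and thresholds (TJL, TJR, TWL, TWR) follow the table; the function name and the example threshold values are illustrative assumptions, not part of the original system.

```python
def classify(jl, jr, wl, wr, tjl, tjr, twl, twr):
    """Map artifact feature strengths to a command, per the decision table."""
    if jl > tjl and jr > tjr:   # jaw chewing on both sides
        return "Com#1"          # Forward
    if jl > jr and jl > tjl:    # left-side jaw chewing
        return "Com#2"          # Turn Left
    if jr > jl and jr > tjr:    # right-side jaw chewing
        return "Com#3"          # Turn Right
    if wl > twl and wr > twr:   # winking both eyes
        return "Com#4"          # Backward
    if wl > wr and wl > twl:    # left-eye wink
        return "Com#5"          # Turn Left
    if wr > wl and wr > twr:    # right-eye wink
        return "Com#6"          # Turn Right
    return "No Decision"

# Illustrative thresholds only; real values would be calibrated per user.
T = dict(tjl=0.5, tjr=0.5, twl=0.5, twr=0.5)
print(classify(jl=0.9, jr=0.8, wl=0.1, wr=0.1, **T))  # -> Com#1
print(classify(jl=0.9, jr=0.2, wl=0.1, wr=0.1, **T))  # -> Com#2
print(classify(jl=0.1, jr=0.1, wl=0.2, wr=0.9, **T))  # -> Com#6
```

Note that the rules are checked in table order, so the "both sides" conditions (Com#1, Com#4) take precedence over the single-side conditions.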

The command sequence for testing the performance of the proposed system.

| Sequence No. | Command | Sequence No. | Command |
|---|---|---|---|
| 1 | Turn Left | 7 | Turn Right |
| 2 | Turn Right | 8 | Turn Left |
| 3 | Turn Right | 9 | Backward |
| 4 | Turn Left | 10 | Turn Left |
| 5 | Forward | 11 | Turn Right |
| 6 | Backward | 12 | Forward |
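The accuracies in the results table are consistent with scoring each command in this 12-command sequence as correct or incorrect. Assuming, purely for illustration, 24 trials per participant (e.g. the sequence run once on each of the two routes), accuracy would be computed as:

```python
# Hypothetical scoring sketch: accuracy = correct trials / total trials * 100.
# The 24-trial assumption (12 commands x 2 routes) is illustrative; it matches
# the granularity of the reported values (e.g. 23/24 = 95.8%).
def accuracy(correct, total=24):
    return round(100 * correct / total, 1)

print(accuracy(23))  # 95.8
print(accuracy(22))  # 91.7
print(accuracy(21))  # 87.5
```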
eISSN:
1178-5608
Language:
English
Frequency:
Volume Open
Journal subjects:
Engineering, Introductions and Overviews, other