Brain-computer interface based on generation of visual images

From Brede Wiki
Authors: Pavel Bobrov, Alexander Frolov, Charles Cantor, Irina Fedulova, Mikhail Bakhnyan, Alexander Zhavoronkov
Citation: PLOS ONE 6(6): e20674, June 2011
DOI: 10.1371/journal.pone.0020674
Link(s): http://www.plosone.org/article/fetchObject.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0020674&representation=PDF

Brain-computer interface based on generation of visual images describes a brain-computer interface.


Abstract from paper (CC-BY)

This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and imagining of two types of pictures, faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset is becoming widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using the ActiCap. The control experiment showed that utilization of high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields classification accuracy in this problem similar to that of a more sophisticated Multi-class Common Spatial Patterns (MCSP) classifier.

Subjects

Subject group #1
Male subjects
Subjects / males / females: 7 / 7 / 0
Age: 23–30
Handedness: Right
Nationality: Russian
Approval: Institute for Higher Nervous Activity and Neurophysiology of the Russian Academy of Sciences

Group 1 consisted of 7 right-handed male Russian subjects aged 23 to 30 years. The study on the human subjects was approved by the Institute for Higher Nervous Activity and Neurophysiology of the Russian Academy of Sciences.

Experiment

Image experiment

  • Pictures from Yale Face Database B
  • Pictures from Microsoft Research Cambridge Object Recognition Data Base, version 1

These images were converted to black-and-white.

Each stimulation-and-imagination cycle consisted of:

  1. 3 seconds "marker turned blue"
  2. 7 seconds relaxation
  3. 3 seconds "marker turned blue"
  4. 15 seconds command to imagine
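The four-step cycle above can be sketched as an event timeline. This is a minimal illustrative sketch, not code from the paper; the function and label names are invented, while the step order and durations follow the list:

```python
# One stimulation-and-imagination cycle, as (label, duration_seconds) pairs.
# Labels and structure are illustrative; durations are from the protocol above.
CYCLE = [
    ("marker turned blue", 3),
    ("relaxation", 7),
    ("marker turned blue", 3),
    ("imagine picture", 15),
]

def cycle_onsets(start=0.0):
    """Return ((onset_seconds, label), ...) events for one cycle and its end time."""
    events, t = [], start
    for label, duration in CYCLE:
        events.append((t, label))
        t += duration
    return events, t
```

Under these assumptions one full cycle lasts 28 seconds, with the imagination phase starting 13 seconds in.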

"EOG" experiment

Five EOG events: blinking and upward, rightward, downward, and leftward eye movements.

Methods

  • Classifier
    • "Bayesian approach"
    • "MCSP method"[1]
  • EOG artifact removal with EEGLAB's runica implementation of independent component analysis (ICA)
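The abstract describes the Bayesian approach as a classifier "based on covariance matrix analysis". One common reading of such a classifier is a zero-mean Gaussian model per mental state, where a segment is assigned to the state whose spatial covariance matrix gives the highest log-likelihood. The sketch below assumes that reading; the function names are invented and this is not the authors' code:

```python
import numpy as np

def fit_class_covariances(trials_by_class):
    """Estimate one spatial covariance matrix per mental state.

    trials_by_class: dict mapping class label -> list of
    (channels, samples) EEG trial arrays.
    """
    covs = {}
    for label, trials in trials_by_class.items():
        # Average the per-trial sample covariance estimates for this class.
        covs[label] = np.mean([X @ X.T / X.shape[1] for X in trials], axis=0)
    return covs

def classify_segment(X, covs):
    """Assign segment X (channels, samples) to the class whose zero-mean
    Gaussian model gives the highest log-likelihood."""
    n = X.shape[1]
    best_label, best_ll = None, -np.inf
    for label, C in covs.items():
        _, logdet = np.linalg.slogdet(C)
        # Log-likelihood (up to a constant) of n zero-mean Gaussian
        # samples with covariance C.
        ll = -0.5 * (n * logdet + np.trace(np.linalg.solve(C, X @ X.T)))
        if ll > best_ll:
            best_label, best_ll = label, ll
    return best_label
```

Such a classifier is computationally cheap because training reduces to averaging covariance matrices, which is consistent with the abstract's contrast against the more elaborate MCSP method.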

Results

  • 0.52–0.58 classification accuracy for the three mental states (relaxation, imagining a house, imagining a face)
  • 0.70–0.81 classification accuracy for the five EOG states
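These accuracies can be put in context against chance level: 1/3 ≈ 0.33 for the three mental states and 1/5 = 0.20 for the five EOG states. A standard way to check that an accuracy "significantly exceeds" chance is an exact binomial test; a minimal sketch follows (the trial counts used below are illustrative, not taken from the paper):

```python
from math import comb

def binomial_p_value(n_trials, n_correct, p_chance):
    """P(X >= n_correct) for X ~ Binomial(n_trials, p_chance):
    the probability of scoring at least this well by random guessing."""
    return sum(
        comb(n_trials, k) * p_chance**k * (1 - p_chance) ** (n_trials - k)
        for k in range(n_correct, n_trials + 1)
    )
```

For example, 52 correct out of a hypothetical 100 trials at chance level 1/3 would be roughly four standard deviations above the chance mean, giving a very small p-value.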

Related papers

  1. A P300-based quantitative comparison between the Emotiv Epoc headset and a medical EEG device
  2. Consumer neuroscience: Assessing the brain response to marketing stimuli using electroencephalogram (EEG) and eye tracking
  3. Interface to convert mental states and facial expressions to application input
  4. Motor imagery and action observation: modulation of sensorimotor brain rhythms during mental control of a brain-computer interface

References

  1. Multilinear generalization of common spatial pattern