    Rebsamen, B., Burdet, E., Guan, C., Zhang, H., Teo, C.L., Zeng, Q., Ang, M. & Laugier, C. A Brain-Controlled Wheelchair Based on P300 and Path Guidance 2006 Biomedical Robotics and Biomechatronics, 2006. BioRob 2006. The First IEEE/RAS-EMBS International Conference on, pp. 1101-1106  article DOI  
    Abstract: This paper presents the first working prototype of a brain controlled wheelchair able to navigate inside a typical office or hospital environment. This brain controlled wheelchair (BCW) is based on a slow but safe P300 interface. To circumvent the problem caused by the low information rate of the EEG signal, we propose a motion guidance strategy providing safe and efficient control without complex sensors or sensor processing. Experiments demonstrated that healthy subjects could safely control the wheelchair in an office like environment, without any training
    Review: The paper “A Brain-Controlled Wheelchair Based on P300 and Path Guidance” presented the first wheelchair to be controlled in an office-like environment via a BMI/BCI. The wheelchair was driven through a P300 interface, which is slow but reliable for selecting among discrete choices. The wheelchair follows predefined paths through the office, so the user simply picks a destination with the BMI/BCI and the chair navigates there on its own. This solution shows how autonomous robots can be combined with BMI/BCI user input, although the selection step remains slow.
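    Illustrative sketch (Python): a minimal version of the select-then-guide loop described above. The destination names, waypoints, and the p300_select_destination routine are hypothetical placeholders, not the authors' implementation.

    import time

    # Predefined destinations mapped to waypoint lists (x, y) in the office map.
    # Names and coordinates are made up for illustration.
    PATHS = {
        "desk":    [(0.0, 0.0), (1.5, 0.0), (1.5, 2.0)],
        "door":    [(0.0, 0.0), (3.0, 0.0)],
        "printer": [(0.0, 0.0), (0.0, 2.5), (2.0, 2.5)],
    }

    def p300_select_destination(options):
        """Placeholder for the P300 interface: flash each option repeatedly,
        accumulate evidence from the EEG, and return the attended option.
        Here we simply ask on the console so the sketch runs end to end."""
        return input(f"Select destination {sorted(options)}: ").strip()

    def follow_path(waypoints):
        """Placeholder for the motion-guidance controller that tracks the
        predefined path; a real system would close the loop on odometry."""
        for x, y in waypoints:
            print(f"driving to waypoint ({x:.1f}, {y:.1f})")
            time.sleep(0.1)

    if __name__ == "__main__":
        goal = p300_select_destination(PATHS.keys())
        if goal in PATHS:
            follow_path(PATHS[goal])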
    BibTeX:
    @article{Rebsamen2006,
      author = {Rebsamen, B. and Burdet, E. and Guan, C. and Zhang, H. and Teo, C.L. and Zeng, Q. and Ang, M. and Laugier, C.},
      title = {A Brain-Controlled Wheelchair Based on P300 and Path Guidance},
      journal = {Biomedical Robotics and Biomechatronics, 2006. BioRob 2006. The First IEEE/RAS-EMBS International Conference on},
      year = {2006},
      pages = {1101-1106},
      doi = {http://dx.doi.org/10.1109/BIOROB.2006.1639239}
    }
    
    Wang, R., Shen, E. & Gu, F. Advances in Cognitive Neurodynamics ICCN 2007: Proceedings of the International Conference on Cognitive Neurodynamics. ICCN 2007 Proceedings: Multi-Command Real-Time Brain Machine Interface Using SSVEP: Feasibility Study for Occipital and Forehead Sensor Locations 2008, pp. 783-786  inbook DOI  
    Abstract: We propose a new multi-stage procedure for a real time brain machine/computer interface (BMI) based on the Steady State Visual Evoked Potentials (SSVEP) paradigm. The developed system allows a BMI user to navigate a small car (or any other object) on the computer screen in near real time, in any of the four directions. Extensive experiments with 4 young healthy subjects for two different electrode configurations (Occipital/Forehead), confirmed the high performance of the proposed on-line BMI system.
    Review: The paper “Multi-Command Real-Time Brain Machine Interface Using SSVEP: Feasibility Study for Occipital and Forehead Sensor Locations” reported results from a visual-stimulation method of BMI decision making. Four small checkerboard images were placed around a car on a CRT monitor running at 120 Hz: one above the car, one below it, one to its left, and one to its right, each flickering at a different frequency. One set of tests used low frequencies (5 to 8 Hz) and another used mid-range frequencies (12 to 17 Hz). Each test was also run with two electrode placements, an occipital configuration and electrodes on the forehead. All subjects were in good health. The results showed that the occipital electrodes outperformed the forehead-mounted electrodes, reaching 96.5% accuracy for the mid-range frequencies compared with 78% for the forehead.
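    Illustrative sketch (Python/NumPy): the core detection step, picking whichever checkerboard flicker frequency carries the most EEG spectral power. The sampling rate, epoch length, and frequency set are assumptions, not the paper's exact parameters.

    import numpy as np

    FS = 256  # sampling rate in Hz (illustrative)
    FREQS = {"up": 12.0, "down": 13.5, "left": 15.0, "right": 17.0}  # flicker frequencies (Hz)

    def ssvep_command(eeg_epoch, fs=FS, freqs=FREQS, bandwidth=0.5):
        """Return the command whose flicker frequency carries the most spectral power.

        eeg_epoch: 1-D array of occipital EEG samples (single channel or channel average).
        """
        spectrum = np.abs(np.fft.rfft(eeg_epoch)) ** 2
        bins = np.fft.rfftfreq(len(eeg_epoch), d=1.0 / fs)
        scores = {}
        for cmd, f in freqs.items():
            mask = np.abs(bins - f) <= bandwidth  # narrow band around the flicker frequency
            scores[cmd] = spectrum[mask].sum()
        return max(scores, key=scores.get)

    # Example with synthetic data: a 15 Hz "SSVEP" buried in noise should yield "left".
    t = np.arange(2 * FS) / FS
    fake_eeg = np.sin(2 * np.pi * 15.0 * t) + 0.5 * np.random.randn(t.size)
    print(ssvep_command(fake_eeg))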
    BibTeX:
    @inbook{RubinWang2008,
      author = {Wang, Rubin and Shen, Enhua and Gu, Fanji},
      title = {Advances in Cognitive Neurodynamics ICCN 2007: Proceedings of the International Conference on Cognitive Neurodynamics. ICCN 2007 Proceedings: Multi-Command Real-Time Brain Machine Interface Using SSVEP: Feasibility Study for Occipital and Forehead Sensor Locations},
      publisher = {Springer Netherlands},
      year = {2008},
      pages = {783-786},
      doi = {http://dx.doi.org/10.1007/978-1-4020-8387-7_135}
    }
    
    Trejo, L., Rosipal, R. & Matthews, B. Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials 2006 Neural Systems and Rehabilitation Engineering, IEEE Transactions on, Vol. 14(2), pp. 225-229  article DOI  
    Abstract: We have developed and tested two electroencephalogram (EEG)-based brain-computer interfaces (BCI) for users to control a cursor on a computer display. Our system uses an adaptive algorithm, based on kernel partial least squares classification (KPLS), to associate patterns in multichannel EEG frequency spectra with cursor controls. Our first BCI, Target Practice, is a system for one-dimensional device control, in which participants use biofeedback to learn voluntary control of their EEG spectra. Target Practice uses a KPLS classifier to map power spectra of 62-electrode EEG signals to rightward or leftward position of a moving cursor on a computer display. Three subjects learned to control motion of a cursor on a video display in multiple blocks of 60 trials over periods of up to six weeks. The best subject's average skill in correct selection of the cursor direction grew from 58% to 88% after 13 training sessions. Target Practice also implements online control of two artifact sources: 1) removal of ocular artifact by linear subtraction of wavelet-smoothed vertical and horizontal electrooculograms (EOG) signals, 2) control of muscle artifact by inhibition of BCI training during periods of relatively high power in the 40-64 Hz band. The second BCI, Think Pointer, is a system for two-dimensional cursor control. Steady-state visual evoked potentials (SSVEP) are triggered by four flickering checkerboard stimuli located in narrow strips at each edge of the display. The user attends to one of the four beacons to initiate motion in the desired direction. The SSVEP signals are recorded from 12 electrodes located over the occipital region. A KPLS classifier is individually calibrated to map multichannel frequency bands of the SSVEP signals to right-left or up-down motion of a cursor on a computer display. The display stops moving when the user attends to a central fixation point. As for Target Practice, Think Pointer also implements wavelet-based online removal of ocular artifact; however, in Think Pointer muscle artifact is controlled via adaptive normalization of the SSVEP. Training of the classifier requires about 3 min. We have tested our system in real-time operation in three human subjects. Across subjects and sessions, control accuracy ranged from 80% to 100% correct with lags of 1-5 s for movement initiation and turning. We have also developed a realistic demonstration of our system for control of a moving map display (http://ti.arc.nasa.gov/).
    Review: The paper “Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials” showed how a cursor could be moved using visual stimuli decoded from EEG data. The stimuli consisted of four checkerboard bars, one at the top, one at the bottom, one on the left, and one on the right side of the screen, each blinking at a different frequency. Electrodes over the occipital region then pick up the frequency of the attended stimulus, which tells the cursor where to move. This method is simple and appears to have worked in most cases. It is also being developed at NASA as a computer interface for use inside a space suit.
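    Illustrative sketch (Python): mapping classified SSVEP epochs to 2-D cursor motion, with attention to the central fixation point acting as a stop command. The step size and the classify_epoch stub are assumptions standing in for the paper's KPLS classifier.

    # Move a 2-D cursor from a stream of classified SSVEP epochs.
    STEP = 5  # pixels moved per classified epoch (illustrative)
    MOVES = {"up": (0, -STEP), "down": (0, STEP), "left": (-STEP, 0), "right": (STEP, 0)}

    def classify_epoch(epoch):
        """Stub: a real system would return 'up'/'down'/'left'/'right' when the
        user attends a flickering edge, or 'fixation' when they look at the
        central fixation point. Here labels are passed through unchanged so
        the sketch stays self-contained."""
        return epoch

    def update_cursor(position, epochs):
        x, y = position
        for epoch in epochs:
            label = classify_epoch(epoch)
            if label == "fixation":  # attending the center stops the cursor
                continue
            dx, dy = MOVES.get(label, (0, 0))
            x, y = x + dx, y + dy
        return x, y

    print(update_cursor((100, 100), ["right", "right", "fixation", "up"]))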
    BibTeX:
    @article{Trejo2006,
      author = {Trejo, L.J. and Rosipal, R. and Matthews, B.},
      title = {Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials},
      journal = {Neural Systems and Rehabilitation Engineering, IEEE Transactions on},
      year = {2006},
      volume = {14},
      number = {2},
      pages = {225-229},
      doi = {http://dx.doi.org/10.1109/TNSRE.2006.875578}
    }
    
    Zhao, M., Rattanatamrong, P., DiGiovanna, J., Mahmoudi, B., Figueiredo, R.J., Sanchez, J.C., Principe, J.C. & Fortes, J.A.B. BMI cyberworkstation: Enabling dynamic data-driven brain-machine interface research through cyberinfrastructure 2008 Engineering in Medicine and Biology Society, 2008. EMBS 2008. 30th Annual International Conference of the IEEE, pp. 646-649  article DOI  
    Abstract: Dynamic data-driven brain-machine interfaces (DDDBMI) have great potential to advance the understanding of neural systems and improve the design of brain-inspired rehabilitative systems. This paper presents a novel cyberinfrastructure that couples in vivo neurophysiology experimentation with massive computational resources to provide seamless and efficient support of DDDBMI research. Closed-loop experiments can be conducted with in vivo data acquisition, reliable network transfer, parallel model computation, and real-time robot control. Behavioral experiments with live animals are supported with real-time guarantees. Offline studies can be performed with various configurations for extensive analysis and training. A Web-based portal is also provided to allow users to conveniently interact with the cyberinfrastructure, conducting both experimentation and analysis. New motor control models are developed based on this approach, which include recursive least square based (RLS) and reinforcement learning based (RLBMI) algorithms. The results from an online RLBMI experiment shows that the cyberinfrastructure can successfully support DDDBMI experiments and meet the desired real-time requirements.
    Review: The paper “BMI Cyberworkstation: Enabling Dynamic Data-Driven Brain-Machine Interface Research through Cyberinfrastructure” outlines how the Neuroprosthetic Research Group (NRG) and the Advanced Computing Information Systems Lab (ACIS) at the University of Florida set up a system in which BMI data are acquired from a live experiment and then analyzed in parallel on an array of powerful computers. The data are transmitted over a computer network via UDP, with customized error checking and buffering to increase speed. Their results showed a response time of about 10 ms. Beyond the network setup, the paper gives few details of what the models themselves do or how, but two of the group's earlier papers are cited that should explain that work.
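    Illustrative sketch (Python): the general transport idea of UDP for speed with reliability added back through application-level sequence numbers and checksums. The packet layout, address, and field sizes are assumptions, since the paper does not specify its exact protocol.

    import socket, struct, zlib

    # Hypothetical packet layout: 4-byte sequence number, 4-byte CRC32 of the
    # payload, then the raw neural-data payload.
    HEADER = struct.Struct("!II")
    ADDR = ("127.0.0.1", 9999)  # illustrative address/port

    def send_sample(sock, seq, payload):
        crc = zlib.crc32(payload) & 0xFFFFFFFF
        sock.sendto(HEADER.pack(seq, crc) + payload, ADDR)

    def recv_sample(sock):
        data, _ = sock.recvfrom(65535)
        seq, crc = HEADER.unpack_from(data)
        payload = data[HEADER.size:]
        ok = (zlib.crc32(payload) & 0xFFFFFFFF) == crc  # drop corrupted payloads
        return seq, payload if ok else None

    if __name__ == "__main__":
        rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        rx.bind(ADDR)
        tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_sample(tx, 1, b"spike counts for one 10 ms bin")
        print(recv_sample(rx))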
    BibTeX:
    @article{Zhao2008,
      author = {Zhao, Ming and Rattanatamrong, Prapaporn and DiGiovanna, Jack and Mahmoudi, Babak and Figueiredo, Renato J. and Sanchez, Justin C. and Principe, Jose C. and Fortes, Jose A. B.},
      title = {BMI cyberworkstation: Enabling dynamic data-driven brain-machine interface research through cyberinfrastructure},
      journal = {Engineering in Medicine and Biology Society, 2008. EMBS 2008. 30th Annual International Conference of the IEEE},
      year = {2008},
      pages = {646-649},
      doi = {http://dx.doi.org/10.1109/IEMBS.2008.4649235}
    }
    

    Created by JabRef on 10/03/2009.