In this project the objective is to probe the perception mechanisms of the visual system with 'noisy' but temporally precise EEG recordings. We are now interested in optimizing human visual-search capacity in a free-eye-movement paradigm.


Previous Work

In the area of Brain-Computer Interfaces (BCI), Professor Lucas Parra's group has focused on search tasks and on using the EEG signal recorded from a human subject as input to the computer, helping it detect particular targets; this is the application of BCI systems to search tasks.

The group's previous work was based on Rapid Serial Visual Presentation (RSVP): the subjects were shown 100 images at 10 Hz and asked to detect two targets within those 100 images. The EEG signal was classified based on the P300 [1] signature to find the images containing the detected stimuli [2].
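As a rough illustration of this kind of single-trial P300 detection, the sketch below scores stimulus-locked epochs by their mean amplitude in a fixed post-stimulus window and picks the top-scoring images. This is a simplification of the linear classification used in [2], on synthetic data; the sampling rate, window bounds, and target indices are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 256            # sampling rate in Hz (assumed for illustration)
N_TRIALS = 100      # one epoch per RSVP image
N_SAMPLES = FS      # 1 s epoch locked to image onset

# Synthetic epochs: target images carry a positive deflection ~300 ms
# after stimulus onset, mimicking a P300 on top of background noise.
epochs = rng.normal(0.0, 1.0, (N_TRIALS, N_SAMPLES))
is_target = np.zeros(N_TRIALS, dtype=bool)
is_target[[30, 70]] = True                        # two targets, as in the RSVP task
p300_win = slice(int(0.25 * FS), int(0.45 * FS))  # 250-450 ms window
epochs[is_target, p300_win] += 2.0

# Score each epoch by its mean amplitude in the P300 window
# (a crude stand-in for a trained linear discriminator).
scores = epochs[:, p300_win].mean(axis=1)
detected = np.argsort(scores)[-2:]                # the two highest-scoring images

print(sorted(int(i) for i in detected))           # indices of the detected images
```

In practice one would train the spatial and temporal weights on labeled data rather than use a fixed window, but the pipeline shape (epoch, score, rank) is the same.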

In this experiment the subject was asked to avoid eye movements (macro-saccades), as they would add uncorrelated signal to the EEG data and decrease the subject's detection rate (each saccade increases the probability of missing the target, due to saccade blindness).

In a follow-up study we characterized the subject's detection rate as a function of several image features, in particular the distance of the target from the center of the image. Not surprisingly, we found that in a search for helipads the subject could detect the target outside of the foveal area. Based on these results, which point to the periphery as a detection area of the visual system, we would like to understand whether there is a difference in the EEG signal that can be correlated with saccades onto the target.

The first step, then, is to design an experiment that allows for eye movements in a search task.


Present Hypothesis

For now the objective is to study the synchronization between saccade decisions and target visual perception in the EEG signal. The idea is that there are informed and non-informed saccades: informed saccades are based on previously acquired information, while non-informed saccades are 'exploratory' saccades, with little information bias.

In order to test this hypothesis we are developing a set of experiments whose main objective is to let us look for signatures in the saccade-locked EEG data. In the search task, two conditions will be studied through EEG and eye tracking: looking without seeing (inattentional blindness [3]) and seeing before looking.

In this project we use EEG recording and eye tracking to capture both the electrical activity of the brain and the eye movements.

In post-processing we synchronize each saccade to the corresponding stretch of EEG signal. We then classify each selected saccade as a saccade to the target, to a distractor, or a miss.
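A minimal sketch of what this post-processing step might look like, assuming the eye tracker and EEG share a common clock: saccade onset times are converted to EEG sample indices to cut saccade-locked epochs, and each saccade is labeled by where the gaze lands relative to the target. The sampling rate, epoch window, and the `saccade_epochs` / `label_saccade` helpers are all hypothetical, not the lab's actual pipeline.

```python
import numpy as np

FS = 512  # EEG sampling rate in Hz (assumed)

def saccade_epochs(eeg, saccade_onsets_s, pre_s=0.2, post_s=0.6):
    """Cut saccade-locked epochs out of a continuous (channels x samples) record.

    Saccade onset times (seconds, from the eye tracker) are converted to
    sample indices on the shared clock; epochs that run off either end of
    the recording are dropped.
    """
    pre, post = int(pre_s * FS), int(post_s * FS)
    epochs, kept = [], []
    for t in saccade_onsets_s:
        i = int(round(t * FS))
        if i - pre >= 0 and i + post <= eeg.shape[1]:
            epochs.append(eeg[:, i - pre:i + post])
            kept.append(t)
    return np.stack(epochs), kept

def label_saccade(landing_xy, target_xy, distractor_xy, radius=2.0):
    """Classify a saccade by its landing point (degrees of visual angle)."""
    if np.hypot(*(np.asarray(landing_xy) - target_xy)) <= radius:
        return "target"
    if np.hypot(*(np.asarray(landing_xy) - distractor_xy)) <= radius:
        return "distractor"
    return "miss"

# Toy usage: 4 s of 32-channel noise and three saccades; the last saccade
# is too close to the end of the recording and gets dropped.
rng = np.random.default_rng(1)
eeg = rng.normal(size=(32, 4 * FS))
epochs, kept = saccade_epochs(eeg, [0.5, 1.7, 3.95])
print(epochs.shape)
print(label_saccade((0.1, 0.2), target_xy=(0.0, 0.0), distractor_xy=(5.0, 5.0)))
```

A real pipeline would also need drift correction between the two clocks and rejection of epochs contaminated by blinks, but the alignment logic is the same.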

On the signal-processing side, we are trying to uncover the time and frequency profiles of visual perception in the EEG signal that correlate with the classification of the saccades (target / distractor / miss).
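One simple way to look at such time-frequency profiles is to compute short-time Fourier power for every saccade-locked epoch and average it within each saccade class. The sketch below is a hand-rolled illustration on synthetic single-channel epochs; the window and step sizes, and the `stft_power` / `class_average` helpers, are assumptions for the example, not the lab's analysis code.

```python
import numpy as np

def stft_power(x, fs, win_s=0.25, step_s=0.05):
    """Short-time Fourier power of a 1-D signal -> (freqs, times, power)."""
    win, step = int(win_s * fs), int(step_s * fs)
    taper = np.hanning(win)
    starts = range(0, len(x) - win + 1, step)
    spec = np.array([np.abs(np.fft.rfft(taper * x[s:s + win])) ** 2
                     for s in starts])          # (n_times, n_freqs)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    times = np.array([s + win / 2 for s in starts]) / fs
    return freqs, times, spec.T                 # power as (n_freqs, n_times)

def class_average(epochs, labels, cls, fs):
    """Average time-frequency power over all epochs of one saccade class."""
    specs = [stft_power(ep, fs)[2] for ep, lab in zip(epochs, labels) if lab == cls]
    return np.mean(specs, axis=0)

# Toy usage: six one-channel epochs of 0.8 s at 512 Hz, with made-up labels.
rng = np.random.default_rng(2)
fs = 512
epochs = rng.normal(size=(6, int(0.8 * fs)))
labels = ["target", "miss", "target", "distractor", "miss", "target"]
avg_target = class_average(epochs, labels, "target", fs)
print(avg_target.shape)   # one (n_freqs, n_times) map per class
```

Contrasting the class-averaged maps (e.g. target minus miss) is then one way to hunt for a perception-related signature; wavelet or multitaper estimates would be the natural refinements.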

We use MATLAB and Python.



1 - Sutton, S., Braren, M., Zubin, J., & John, E. R. (1965). Evoked-potential correlates of stimulus uncertainty. Science, 150(3700), 1187-1188.

2 - Gerson, A. D., Parra, L. C., & Sajda, P. (2006). Cortically-coupled computer vision for rapid image search. IEEE Transactions on Neural Systems & Rehabilitation Engineering, 14(2), 174-179.

3 - Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: sustained inattentional blindness for dynamic events. Perception, 28, 1059-1074.
