Diff for "EyeTracking" - Meg Wiki
location: Diff for "EyeTracking"
Differences between revisions 8 and 35 (spanning 27 versions)
Revision 8 as of 2009-04-23 13:30:17
Size: 5624
Comment:
Revision 35 as of 2009-11-16 11:44:23
Size: 8584
Editor: JohanCarlin
Comment:
Deletions are marked like this. Additions are marked like this.
Line 2: Line 2:
The CBU currently has 4 eye trackers. One is in the MEG lab, one in the MRI (from May 2009), and we have two separate eye trackers for use in other locations. All eye trackers are made by SMI and use the same software for controlling the eye tracking hardware, for stimulus presentation and for analysis of the eye tracking data. In addition, all trackers can also be used with E-Prime, if needed. The CBU currently has 4 eye trackers. One is in the MEG lab, one in the MRI scanner, and we have two separate eye trackers for use in other locations. All eye trackers are manufactured by SMI and use the same SMI software for controlling the eye tracking hardware, for stimulus presentation and for analysis of the eye tracking data. In addition, all trackers can also be used with E-Prime, if needed.
Line 4: Line 4:
All 4 eye trackers use 'dark pupil' technology, where the gaze of the eye is tracked by identifying the pupil and the reflection of an infra-red light source on the cornea. All 4 eye trackers use 'dark pupil' technology, where the gaze of the eye is tracked with an infra-red camera by identifying the pupil and the reflection of an infra-red light source on the cornea.
Line 6: Line 6:
Eye trackers come in two variaties: 'contact' and 'remote'. With contact eye trackers the camera and light source are fixed to the head, or the head is on a chin rest to limit movements relative to the camera. In remote systems the camera and light source are in a fixed location, at some distance from the subject, enabling some head movements. Contact systems are more precise and reliable, in general. Eye trackers come in two varieties: 'contact' and 'remote'. With contact eye trackers the camera and light source are fixed to the head, or the head is on a chin rest to limit movements relative to the camera. In remote systems the camera and light source are in a fixed location, at some distance from the subject, enabling some head movements. Contact systems are more precise and reliable, in general.
Line 8: Line 8:
Another important feature of an eye tracker is the frequency. To be able to follow the eye during saccades the minimum frequency is about 200 Hz. Another important feature of an eye tracker is the sampling frequency. To be able to follow the eye during saccades the minimum frequency necessary is about 200 Hz. For 'heat maps', which map summed gaze duration by location, or for AOI dwell time analyses lower frequencies are sufficient.
Line 10: Line 10:
All eye trackers are able to output the x and y coordinates of the screen location the subject is looking at, and the diameter of the pupil. That last value will be 0 during a blink. All our eye trackers are able to output the x and y coordinates of the screen location the subject is looking at, and the diameter of the pupil. That last value will be 0 during a blink.
Line 12: Line 12:
=== MEG eye tracker ===
This is a 50 Hz system that should theoretically be called a remote eye tracker, but actually behaves more like a contact system as head movement is extremely limited in the MEG helmet. For that reason the camera can actually be zoomed in quite closely and the accuracy of the MEG eye tracker should be reasonably high.
A very good online resource for anything related to eye tracking is [http://www.eyemovementresearch.com eyemovementresearch.com].
Line 15: Line 14:
The eye tracker sits on a tripod under the screen, about 1.2 meter from the subject. It is controlled from a separate PC in the control room. = Eye trackers at the CBU =
[:MegEyeTracker:MEG eye tracker]
Line 17: Line 17:
The eye tracking data will be recorded with the MEG/EEG data in the .fif file produced by the MEG acquisition software. To enable this you will have to select the three first MISC channels in your setup. [:MRIEyeTracker:MRI eye tracker]
Line 19: Line 19:
In addition, the eye tracking data will also be saved on the PC controlling the tracker. This PC is connected to the Elekta trigger box, so will also record the triggers coming from the stimulus presentation machine and all responses from the subject. [:REDEyeTracker:RED eye tracker]
Line 21: Line 21:
=== MRI eye tracker ===
The MRI eye tracker will arrive in May. It will be a system comparable to that in the MEG.
[:HiSpeedEyeTracker:Hi-speed eye tracker]
Line 24: Line 23:
=== RED eye tracker ===
RED stands for 'Remote Eye-tracking Device', and this is a 50 Hz remote system. For that reason the precision is limited, but this is a very easy system to use and very comfortable for the subject. The system is also relatively portable and consists of a monitor with added camera unit, a power supply box and a laptop.
Line 27: Line 24:
=== Hi-speed eye tracker ===
This is a contact system that will allow frequencies of up to 1250 Hz for monocular eye tracking and 500 Hz for binocular tracking. The Hi-speed system is extremely precise and reliable, but requires the subject to put their head on a chin rest.

== Eye tracking in general ==
= Eye tracking in general =
Line 33: Line 27:
=== Setup ===
When the subject is in position you first need to make sure that you are getting a clear picture from the camera. The picture should be in focus and the eye should be in the center of the image and should be well lid without shadows of the eyelids on the eye. Make sure to verify that the picture is good while the subject is looking at all four corners of the screen, not just the middle. If an area of the picture causes problems the part of the picture that is used for eye tracking can be reduced. This works just like resizing a window.

It is important to always have a good look at the actual eye image that the camera sees to verify that the pupil is clearly visible and that the eye tracker is working properly. The RED eye tracker will only show a very crude representation of the eyes, just two ellipses, which is only good to see if the subject is positioned properly. Another window, showing the actual eye image, can be opened, and this is highly recommended, even if you think everything is fine. The SMI eye image will show an outline around the area that is interpreted as the pupil, and the area that is seen as the corneal reflection. Both these areas will also have cross-lines through them. This makes it very easy to see if the eye is being tracked properly. Ask your participant to look at all 4 corners of the screen while you inspect the eye image, as problems will ususally only show up with extreme eye angles.
Line 34: Line 33:
The first thing that needs to be done is a calibration. This is an automatic procedure that will adjust the main parameters to the specific subject. Calibration shouldn't take more that 20-30 seconds. If the subject leaves the setup, during a break, you will have to calibrate again at the beginning of the next block. Before an experiment can be run you need to do a calibration. This is an automatic procedure that will adjust the main parameters to the specific subject. It requires the subject to fixate on a point that is being moved to a number of locations all over the screen. Calibration shouldn't take more that 20-30 seconds. If the subject leaves the setup, during a break, you will have to calibrate again at the beginning of the next block. It is also possible to do a quick 'drift correction' during the experiment, where the subject only fixates a central point.
Line 37: Line 36:
Mascara can make eye tracking impossible, as the software will interpret the black regions in the picture as the pupil, and mascare is very black too. Mascare will alwasy have to be removed, and it is best to ask people not to wear any mascara when they participate in an experiment involving eye tracking. Glasses are normally not a problem. Bifocal or varifocal glasses can be more problematic, but normal glasses are no obstacle for eye tracking. The only problem can be with reflections. If the orientation of the glasses is such that a clear reflection of the infra-red light source is visible eye tracking can be impossible. This can usually be solved by changing the angle of the glasses. Dirty or scratched glasses can be a problem too. Contact lenses are usually no problem at all.
Line 39: Line 38:
Drooping eye lids can be a problem, as they can partly obscure the pupil. This is more common with older people. The solution is to move the camera to a lower position, so that the eye is filmed from below. Mascara can make eye tracking impossible, as the software will interpret the black regions in the picture as the pupil, and mascare is very black too. Mascare will always have to be removed, and it is best to ask people not to wear any mascara when they participate in an experiment involving eye tracking.
Line 41: Line 40:
Left-right difference. Eye tracking data will always be less reliable and distorted towards the edges of the screen, and the problems are more serious for the eye on the opposite side. Try to present your stimuli is the center section of the screen, if possible. Drooping eye lids can be a problem, as they can partly obscure the pupil. This is more common with older people. The solution is to move the camera to a lower position, if possible, so that the eye is filmed from below. If that doesn't help an eyelash curler can be used.

Left-right difference. Eye tracking data will always be less reliable and distorted towards the edges of the screen, and the problems are more serious for the eye on the opposite side. Try to present your stimuli in the center section of the screen, if possible.

=== E-Prime ===

The SMI eye trackers come with their own stimulus presentation software, Experiment Center, and their own analyses tool, BeGaze. Experiments designed and executed in Experiment Center are very easy to analyse in BeGaze, as everything is recognised automatically. When using E-Prime things are a bit more complicated.
Read more about this in the section EyeTrackingWithEprime.
Line 44: Line 51:
Some of our eye trackers offer the choice between monocular and binocular eye tracking. There is no real consensus about which one is better. The RED system will always record binocular data, and the MEG and MRI systems will usually record monocular data. The hi-speed system can be switched between the two modes very quickly. Some of our eye trackers offer the choice between monocular and binocular eye tracking. There is no real consensus about which is better. All systems only have a single camera, so it will have to be zoomed out to cover both eyes. This will lower the resolution, and with that the precision. It is unclear if the advantage of tracking both eyes will compensate for that. The RED system will always record binocular data, and the MEG and MRI systems will usually record monocular data. The hi-speed system can be switched between the two modes very quickly.
Line 46: Line 53:
When using an eye tracker in monocular mode, the next question is which eye to use. There are two possibilities: using the same eye for all subjects, or using the dominant eye for each subject. the dominant eye can be found by asking the subject to look through a small hole in a card. People do this with their dominant eye. In binocular mode the locations of the left and right eye can be averaged. This can give enhanced precision when both eyes are correctly tracked, but can reduce reliability when one of the eyes has problems. The only way to be sure which option is best is to inspect the raw output for both eyes.
Line 48: Line 55:
The advantage of using the same eye for all subjects is that your data will be more consistent. This is because of the left-right distortion mentioned in the previous section. When using an eye tracker in monocular mode, the next question is which eye to use. There are two possibilities: using the same eye for all subjects, or using the dominant eye for each subject. The dominant eye can be found by asking the subject to look through a small hole in a card. People do this with their dominant eye.

The
advantage of using the same eye for all subjects is that your data will be more consistent over subjects. This is because of the left-right distortion mentioned in the previous section.
Line 51: Line 60:

= Data Analysis =
The eye tracking data can be analysed very easily with the SMI software package BeGaze. This will plot raw data, gaze paths, dwell times, heatmaps etc. at the click of a mouse. It cannot do statistical analyses, though, and you will have to export the data from BeGaze for that. BeGaze will allow you to create areas of interest, even moving ones in videos, and calculate total dwell time for all your AOI's.

The raw data coming from the eye tracker will in many cases be converted to fixations and saccades. The algorithms used for this do vary quite a bit in the results they will produce, and all of them are also very sensitive to parameter settings. It is highly recommended to double check the results against the raw data (BeGaze has a tool for this) to verify that the conversion is correct, or at least as expected. Direct comparisons between different studies are impossible for this reason, at least as far as fixation data is concerned.

Eye tracking

The CBU currently has 4 eye trackers. One is in the MEG lab, one in the MRI scanner, and we have two separate eye trackers for use in other locations. All eye trackers are manufactured by SMI and use the same SMI software for controlling the eye tracking hardware, for stimulus presentation and for analysis of the eye tracking data. All trackers can also be used with E-Prime if needed.

All 4 eye trackers use 'dark pupil' technology, where the gaze of the eye is tracked with an infra-red camera by identifying the pupil and the reflection of an infra-red light source on the cornea.

Eye trackers come in two varieties: 'contact' and 'remote'. With contact eye trackers the camera and light source are fixed to the head, or the head is on a chin rest to limit movements relative to the camera. In remote systems the camera and light source are in a fixed location, at some distance from the subject, enabling some head movements. Contact systems are more precise and reliable, in general.

Another important feature of an eye tracker is the sampling frequency. To be able to follow the eye during saccades, the minimum frequency necessary is about 200 Hz. For 'heat maps', which map summed gaze duration by location, or for AOI (area of interest) dwell time analyses, lower frequencies are sufficient.
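As a rough back-of-the-envelope illustration of why the sampling rate matters, the Python sketch below counts how many samples fall inside a saccade; the 30 ms saccade duration is a typical textbook value, not a measurement from our systems.

{{{
# Back-of-the-envelope check of how many samples fall inside a saccade.
# The 30 ms saccade duration is a typical textbook value, not a measurement
# from our systems.
for rate_hz in (50, 200, 500, 1250):
    interval_ms = 1000.0 / rate_hz
    samples_per_saccade = 30.0 / interval_ms
    print(f"{rate_hz:>5} Hz: one sample every {interval_ms:.1f} ms, "
          f"~{samples_per_saccade:.1f} samples in a 30 ms saccade")
}}}

At 50 Hz a 30 ms saccade falls between one and two samples, so it can be missed entirely, whereas at 200 Hz and above it is covered by several samples.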

All our eye trackers are able to output the x and y coordinates of the screen location the subject is looking at, and the diameter of the pupil. That last value will be 0 during a blink.
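A minimal sketch of how such an export could be read and blinks flagged, assuming a hypothetical tab-separated file with columns named 'x', 'y' and 'pupil_diameter'; the actual export format and column names depend on the SMI software settings.

{{{
import csv

def mark_blinks(sample_file):
    """Read exported gaze samples and flag blink samples.

    Assumes a tab-separated export with hypothetical column names
    'x', 'y' and 'pupil_diameter'; adjust to the actual export format.
    A pupil diameter of 0 indicates a blink (or lost tracking).
    """
    samples = []
    with open(sample_file, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            pupil = float(row["pupil_diameter"])
            samples.append({
                "x": float(row["x"]),
                "y": float(row["y"]),
                "pupil": pupil,
                "blink": pupil == 0.0,   # diameter is 0 while the eye is closed
            })
    return samples
}}}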

A very good online resource for anything related to eye tracking is [http://www.eyemovementresearch.com eyemovementresearch.com].

Eye trackers at the CBU

[:MegEyeTracker:MEG eye tracker]

[:MRIEyeTracker:MRI eye tracker]

[:REDEyeTracker:RED eye tracker]

[:HiSpeedEyeTracker:Hi-speed eye tracker]

Eye tracking in general

Eye tracking is not as easy as it might look, and there's also quite a bit of variability between subjects. Most problems can be solved, and with the vast majority of people you should be able to acquire decent-quality eye tracking data.

Setup

When the subject is in position you first need to make sure that you are getting a clear picture from the camera. The picture should be in focus, and the eye should be in the center of the image and well lit, without shadows of the eyelids on the eye. Make sure to verify that the picture is good while the subject is looking at all four corners of the screen, not just the middle. If an area of the picture causes problems, the part of the picture that is used for eye tracking can be reduced. This works just like resizing a window.

It is important to always have a good look at the actual eye image that the camera sees, to verify that the pupil is clearly visible and that the eye tracker is working properly. The RED eye tracker will only show a very crude representation of the eyes, just two ellipses, which is only good for checking that the subject is positioned properly. Another window, showing the actual eye image, can be opened, and this is highly recommended, even if you think everything is fine. The SMI eye image will show an outline around the area that is interpreted as the pupil, and around the area that is seen as the corneal reflection. Both these areas will also have cross-lines through them. This makes it very easy to see if the eye is being tracked properly. Ask your participant to look at all 4 corners of the screen while you inspect the eye image, as problems will usually only show up with extreme eye angles.

Calibration

Before an experiment can be run you need to do a calibration. This is an automatic procedure that will adjust the main parameters to the specific subject. It requires the subject to fixate on a point that is moved to a number of locations all over the screen. Calibration shouldn't take more than 20-30 seconds. If the subject leaves the setup during a break, you will have to calibrate again at the beginning of the next block. It is also possible to do a quick 'drift correction' during the experiment, where the subject only fixates a central point.
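Purely as an illustration of what 'a number of locations all over the screen' might look like: the sketch below generates a hypothetical 9-point layout on a 1024x768 display, plus the single central point used for drift correction. The actual number and layout of calibration points is configured in the SMI software, not hand-coded.

{{{
# Illustration only: the calibration point layout is configured in the SMI
# software, not hand-coded. This just shows a hypothetical 9-point layout
# on a 1024x768 display, plus the central point used for drift correction.
def calibration_points(width=1024, height=768, margin=0.1):
    positions = [margin, 0.5, 1.0 - margin]          # relative x/y positions
    return [(round(x * width), round(y * height))
            for y in positions for x in positions]

drift_correction_point = (512, 384)                  # center of the screen
print(calibration_points())
}}}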

Problems

Glasses are normally not a problem. Bifocal or varifocal glasses can be more problematic, but normal glasses are no obstacle for eye tracking. The only problem can be with reflections: if the orientation of the glasses is such that a clear reflection of the infra-red light source is visible, eye tracking can be impossible. This can usually be solved by changing the angle of the glasses. Dirty or scratched glasses can be a problem too. Contact lenses are usually no problem at all.

Mascara can make eye tracking impossible, as the software will interpret the black regions in the picture as the pupil, and mascara is very black too. Mascara will always have to be removed, and it is best to ask people not to wear any mascara when they participate in an experiment involving eye tracking.

Drooping eyelids can be a problem, as they can partly obscure the pupil. This is more common with older people. The solution is to move the camera to a lower position, if possible, so that the eye is filmed from below. If that doesn't help, an eyelash curler can be used.

Left-right difference: eye tracking data will always be less reliable and more distorted towards the edges of the screen, and the problems are more serious for the eye on the opposite side. Try to present your stimuli in the center section of the screen, if possible.

E-Prime

The SMI eye trackers come with their own stimulus presentation software, Experiment Center, and their own analysis tool, BeGaze. Experiments designed and executed in Experiment Center are very easy to analyse in BeGaze, as everything is recognised automatically. When using E-Prime things are a bit more complicated. Read more about this in the section EyeTrackingWithEprime.

Monocular versus binocular

Some of our eye trackers offer the choice between monocular and binocular eye tracking. There is no real consensus about which is better. All systems only have a single camera, so it will have to be zoomed out to cover both eyes. This will lower the resolution, and with that the precision. It is unclear if the advantage of tracking both eyes will compensate for that. The RED system will always record binocular data, and the MEG and MRI systems will usually record monocular data. The hi-speed system can be switched between the two modes very quickly.

In binocular mode the locations of the left and right eye can be averaged. This can give enhanced precision when both eyes are correctly tracked, but can reduce reliability when one of the eyes has problems. The only way to be sure which option is best is to inspect the raw output for both eyes.
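A minimal sketch of that averaging-with-fallback logic, assuming each sample is a simple (x, y, pupil_diameter) tuple with a diameter of 0 when the eye is not tracked; this is an illustration, not the method the SMI software uses.

{{{
def combine_binocular(left, right):
    """Combine left- and right-eye gaze samples (a sketch, not SMI's method).

    Each argument is an (x, y, pupil_diameter) tuple; pupil_diameter == 0
    means the eye was not tracked (e.g. during a blink). Averaging both eyes
    can improve precision, but only while both are tracked correctly, so we
    fall back to whichever single eye is valid.
    """
    left_ok = left[2] > 0
    right_ok = right[2] > 0
    if left_ok and right_ok:
        return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)
    if left_ok:
        return (left[0], left[1])
    if right_ok:
        return (right[0], right[1])
    return None   # neither eye tracked: treat as missing data
}}}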

When using an eye tracker in monocular mode, the next question is which eye to use. There are two possibilities: using the same eye for all subjects, or using the dominant eye for each subject. The dominant eye can be found by asking the subject to look through a small hole in a card. People do this with their dominant eye.

The advantage of using the same eye for all subjects is that your data will be more consistent over subjects. This is because of the left-right distortion mentioned in the previous section.

The advantage of using the dominant eye is that the data will be less noisy. The non-dominant eye can have the tendency to make 'glissades', small eye movements to align itself with the dominant eye at the end of a saccade. These will not be present to the same extent in the dominant eye.

Data Analysis

The eye tracking data can be analysed very easily with the SMI software package BeGaze. This will plot raw data, gaze paths, dwell times, heat maps etc. at the click of a mouse. It cannot do statistical analyses, though, and you will have to export the data from BeGaze for that. BeGaze will allow you to create areas of interest (AOIs), even moving ones in videos, and calculate total dwell time for all your AOIs.
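For exported raw data, a dwell time calculation of this kind is easy to reproduce yourself. Below is a minimal sketch for rectangular AOIs, assuming the samples have already been exported as (x, y) screen coordinates; BeGaze's own AOI handling (including moving AOIs) is of course more sophisticated.

{{{
def aoi_dwell_times(samples, aois, sample_interval_ms):
    """Total dwell time per rectangular AOI, from exported raw samples.

    'samples' is a list of (x, y) gaze positions, 'aois' a dict mapping an AOI
    name to (left, top, right, bottom) in screen pixels, and
    'sample_interval_ms' is 1000 / sampling rate. This is a minimal sketch of
    what BeGaze computes for you, for use on exported data only.
    """
    dwell = {name: 0.0 for name in aois}
    for x, y in samples:
        for name, (l, t, r, b) in aois.items():
            if l <= x <= r and t <= y <= b:
                dwell[name] += sample_interval_ms
    return dwell
}}}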

The raw data coming from the eye tracker will in many cases be converted to fixations and saccades. The algorithms used for this do vary quite a bit in the results they will produce, and all of them are also very sensitive to parameter settings. It is highly recommended to double check the results against the raw data (BeGaze has a tool for this) to verify that the conversion is correct, or at least as expected. Direct comparisons between different studies are impossible for this reason, at least as far as fixation data is concerned.
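To see why the parameter settings matter so much, here is a very simple dispersion-based fixation detection sketch (in the style of an I-DT algorithm; it is not the algorithm BeGaze uses). Changing the dispersion threshold or the minimum duration will merge, split or drop fixations, which is exactly why results from different studies are hard to compare.

{{{
def detect_fixations(samples, max_dispersion_px=30, min_duration_samples=20):
    """Very simple dispersion-based fixation detection (an I-DT-style sketch).

    Not the algorithm BeGaze uses; it only illustrates the parameter
    sensitivity: changing max_dispersion_px or min_duration_samples will
    merge, split or drop fixations. 'samples' is a list of (x, y) positions;
    returns a list of (start, end) sample indices, inclusive.
    """
    fixations = []
    start = 0
    while start < len(samples) - min_duration_samples:
        end = start + min_duration_samples
        xs, ys = zip(*samples[start:end])
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion_px:
            # Grow the window while the dispersion stays under the threshold.
            while end < len(samples):
                xs, ys = zip(*samples[start:end + 1])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion_px:
                    break
                end += 1
            fixations.append((start, end - 1))
            start = end
        else:
            start += 1
    return fixations
}}}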
