Here's the text for the eye physiology analysis section. Where can I send the scripts?
Eye tracking using the Eyelink software in the scanner: introduction
Here is some hopefully useful information about using eye-tracking data as it comes off the scanner. Please note that I did this analysis in MATLAB, but some of the procedural points may be useful even if you are using Python. In my study, I use the X/Y coordinates of the eye, pupil dilation data, blinks, and saccades. There are some challenges on the analysis end, which I will cover. There are other functionalities that I am not using, such as using the real-time position of the eye to control stimulus presentation, which I know Abby Novick has done. There are also ways to read in and save the data within MATLAB itself rather than saving on the eye-tracking computer and converting later.
For experiments, the most relevant scripts to check out are ‘init_ss.m’ and ‘playMovEZ.m’.
For analysis, the most relevant script to check out is ‘XSubHR.m’.
Calling the Eyelink using MATLAB functions
I like to create a flag variable, for instance ‘et’, that, if set to 1, runs all of the eye-tracking calls. The code below initializes the functions for the current Psychtoolbox window and then launches the eye tracker setup.
if et
    EyelinkInit();                      % open the connection to the tracker
    el = EyelinkInitDefaults(window);   % tie defaults to the current Psychtoolbox window
    EyelinkDoTrackerSetup(el,'c');      % launch tracker setup ('c' starts calibration)
    mode = 0;
end
You will now have to do setup on the eye-tracking computer before the Psychtoolbox script will continue. See the section ‘Running the study’ below for more information on this.
If you want to start saving a new file, you might use a command like this, which opens a file with the given name on the eye-tracking computer. Here I build the name from the subject number (as a string) and the trial number.
if et
    edfFile = [sub '_' num2str(gtrial) '.edf'];
    Eyelink('Openfile', edfFile);       % create the file on the eye-tracking computer
    Eyelink('StartRecording');
    Eyelink('Message', 'SYNCTIME');     % timestamp message for later synchronization
end
To stop recording, use a command like this.
if et
    Eyelink('StopRecording');
    Eyelink('CloseFile');
end
When you are finished with a session, you can shut things down as follows. Use ListenChar(1) to restore keyboard input if you had previously turned it off.
if et
    Eyelink('Shutdown');
    ListenChar(1);    % restore keyboard input to MATLAB
end
Running the study
Before the subject arrives: on the eye tracker, set the options to the 5-point test rather than the 9-point test. The 9-point test is preferable in principle, but subjects often cannot see all 9 points on the screen.
Once the subject is in the scanner: on the first run, the script containing the code above will first bring up a screen to calibrate the eye tracker. Go to the eye-tracking computer and press C so the calibration circles appear. I usually un-check and then re-check ‘auto’. After the subject follows the circles with their eyes, the software asks whether you approve; approve if the calibration looks good. You then have to click the start/open recording button for control to return to MATLAB and the script to continue.
On the second and third runs, I typically only validate the eye tracker rather than re-calibrating. Only if the validation is really off do I re-calibrate. Validate by pressing V on the eye-tracking computer instead of C. You must again approve and click start recording to continue.
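If you only need a quick between-run check of a single fixation point rather than a full validation, Psychtoolbox also provides a drift-correction routine. A minimal sketch, assuming the ‘el’ structure and ‘et’ flag from the setup code above:

```matlab
% Sketch: single-point drift check between runs (assumes 'el' and 'et' exist).
if et
    EyelinkDoDriftCorrection(el);  % subject fixates one target; accept on the host computer
end
```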
After the scan is over: collect the EDF data from the eye-tracking computer by quitting Eyelink and going to browser mode. I believe the files are on drive ‘I’. Drag them to a thumb drive.
Converting Eyelink data to something you can use
The data are saved as .EDF files, and the easiest way I have found to get them into a usable format is to convert them to plain-text .ASC files. Download SR Research’s EDF2ASC converter and run it on the files (you can convert more than one at a time).
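The converter can also be driven from MATLAB via a system call, which is convenient for many files at once. This is a sketch, assuming the command-line ‘edf2asc’ executable is installed and on your system path (the exact executable name can differ by platform):

```matlab
% Sketch: batch-convert every EDF file in the current directory to ASC.
% Assumes the SR Research 'edf2asc' command-line tool is on the path.
edfs = dir(fullfile(pwd, '*.edf'));
for i = 1:length(edfs)
    system(['edf2asc ' edfs(i).name]);  % writes a matching .asc next to each .edf
end
```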
Analyzing the data
First, you want to be in the working directory where all the files are. Then, a command like this will grab all of the file names.
etfiles = dir(fullfile(pwd, '*.asc'));
Then, you will want to read in the data. In my experience, the large majority of files have 27 header lines before the real data start streaming in. In some rare cases, there is an extra line, which you will probably want to just fix manually.

content = fileread(etfiles(gtrialt).name);   % read the whole ASCII file as text
clear et;
et = textscan(content,'%s%s%s%n%n%s','HeaderLines',27,'EmptyValue',-Inf);
Now you have the data in MATLAB as a set of cell columns. From there, I extract where blinks and saccades occur, and store the continuous eye data in etmat with column 1 being the time point, column 2 the X-coordinate, column 3 the Y-coordinate, and column 4 the pupil area. Note some further transformations at the bottom.
for iiii = 1:length(et{1,1})-1
    a = cell2str(et{1,1}(iiii));
    endnum = length(cell2str(et{1,1}(iiii+1)))-3;
    if length(a) > 6
        % if a blink starts, stop processing samples; when it ends, resume
        if strcmpi(a(3:8),'SBLINK')
            proc = 0; jj = jj+1;
            plug = cell2str(et{1,3}(iiii));      % find previous point
            bmat(jj) = str2double(plug(3:endnum));
        end
        if strcmpi(a(3:8),'EBLINK'); proc = 1; end
        % same deal with saccades (ESACC lines need no special handling)
        if strcmpi(a(3:7),'SSACC')
            jjj = jjj+1;
            plug = cell2str(et{1,1}(iiii+1));
            smat(jjj) = str2double(plug(3:endnum));
        end
        if proc
            endnum = length(a)-3;
            aa = str2double(a(3:endnum));
            if length(a) > 8 && ~isnan(aa)
                j = j+1;
                etmat(j,1) = aa;                 % time point
                a = cell2str(et{1,2}(iiii));
                if length(a) > 6
                    endnum = length(a)-3;
                    aa = str2double(a(3:endnum));
                    if ~isnan(aa)
                        etmat(j,2) = aa;         % x-coordinate
                        a = cell2str(et{1,3}(iiii));
                        endnum = length(a)-3;
                        etmat(j,3) = str2double(a(3:endnum));  % y-coordinate
                        etmat(j,4) = et{1,4}(iiii);            % pupil area
                    end
                end
            end
        end
    end
end
etmat = etmat(~isnan(etmat(:,1)),:);               % drop unused (NaN) rows
etmat = etmat(1:end-1,:);                          % drop last row, which holds a strange number
bmat2 = (bmat-etmat(1,1))/etfs;                    % blink onsets in seconds
smat2 = (smat-etmat(1,1))/etfs;                    % saccade onsets in seconds
etmat(:,1) = (etmat(:,1)-etmat(1,1))/etfs;         % normalize time to the first point (etfs = sampling rate, set elsewhere)
etmat2 = etmat(and(etmat(:,2)>xminvid, etmat(:,2)<xmaxvid),:);    % keep on-screen x
etmat2 = etmat2(and(etmat2(:,3)>yminvid, etmat2(:,3)<ymaxvid),:); % keep on-screen y
etmat2 = etmat2(etmat2(:,4)>1000,:);               % drop samples without a plausible pupil area
etmat2 = etmat2(abs(zscore(etmat2(:,4)))<5,:);     % drop extreme pupil outliers
fn = [dr 'etb-' num2str(gtrialt) '.mat'];
save(fn,'etmat','etmat2','secspace_norm','smat2','bmat2');   % save results
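Before moving on, it can be worth eyeballing each run. A sketch of a quick sanity check using the variables defined above, plotting the on-screen gaze positions and the pupil time course:

```matlab
% Sketch: quick visual check of one run's eye data (assumes etmat/etmat2 exist).
figure;
subplot(2,1,1);
plot(etmat2(:,2), etmat2(:,3), '.');   % on-screen gaze positions
xlabel('x (px)'); ylabel('y (px)'); title('Gaze positions');
subplot(2,1,2);
plot(etmat(:,1), etmat(:,4));          % pupil area over time
xlabel('time (s)'); ylabel('pupil area'); title('Pupil time course');
```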
Depending on your goal, you may or may not want to normalize for the position of the eye on the screen; if you are interested in this, please refer to my analysis script. I also have a way of controlling for global luminance, which involves finding luminance values averaged over each TR of video, and for local luminance, which does the same thing within a circle centered on the current eye location; both enter various analyses as regressors. To find the luminance for a video, see the code below.
v = VideoReader([root 'exp/vids/720res/' gf 'cut.mov']);    % open the video file
lum = zeros(floor(v.Duration*v.FrameRate),1);               % global luminance mean for each frame
lum_temp = zeros(ceil(v.FrameRate),vdims(1),vdims(2));      % per-frame luminance for the current second
lum_f = zeros(floor(v.Duration),vdims(1),vdims(2));         % stores the mean of each second at every x/y
j = 1; jj = 1; jjj = 1;
while hasFrame(v)
    video = readFrame(v);
    lum(j) = mean2(rgb2gray(video));       % global mean luminance of this frame
    lum_temp(jj,:,:) = rgb2gray(video);
    j = j+1; jj = jj+1;
    if jj > v.FrameRate                    % a full second of frames has accumulated
        lum_f(jjj,:,:) = mean(lum_temp,1);
        lum_temp = zeros(ceil(v.FrameRate),vdims(1),vdims(2));   % reset for the next second
        jj = 1; jjj = jjj+1;
    end
end
if jj > 1                                  % final (partial) second
    lum_f(jjj,:,:) = mean(lum_temp(1:jj-1,:,:),1);   % average only the frames actually filled
end
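For the local-luminance version mentioned above, one way to restrict the average to a circle around the current gaze point is a logical mask. This is a sketch, where ‘frame’ is one grayscale frame, and the gaze coordinates ‘gx’, ‘gy’ and radius ‘r’ are placeholder values you would take from etmat and your own choice of window size:

```matlab
% Sketch: mean luminance within a circle of radius r (pixels) around
% gaze point (gx, gy), for one grayscale frame 'frame' (all hypothetical names).
[X, Y] = meshgrid(1:size(frame,2), 1:size(frame,1));
mask = (X - gx).^2 + (Y - gy).^2 <= r^2;   % logical circular mask
localLum = mean(frame(mask));              % average luminance around fixation
```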