
This demos SPM5 (with cbu_updates) for distributed source
localisation of CBU Neuromag MEG data
=========================================================

Rik Henson, Sep 07
==================

There are two demo localisations, on the same raw data.

Demo 1 describes how to examine Left-hand vs Right-hand evoked responses (~0ms 
relative to keypress), using the GUI.

Demo 2 describes how to examine Faces vs Scrambled-faces evoked responses (or 
evoked and induced responses across all trials), using the command line (ie 
batch scripts).


FIRST
=====

You will need to copy the structural to a subdirectory, eg "sMRI", including a 
mat file containing at least three reference points (fiducials) in MRI space (in 
this example, the nasion and the left and right periauricular points). 
Alternatively, you can create this file yourself by displaying the MRI in SPM, 
creating a matrix of 3 rows (nasion, left ear, right ear) and 3 columns (their 
x,y,z, in mm), and saving it as a mat file.
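If you do create the fiducial file yourself, a minimal sketch follows. The coordinates below are placeholders (read the real values off the structural displayed in SPM), and the variable name is an assumption - check what the co-registration step actually expects:

```matlab
% Placeholder MRI-space fiducial coordinates (mm) -- replace with values
% read off the structural displayed in SPM.
% Rows: nasion; left periauricular; right periauricular. Columns: x, y, z.
fids = [  0   85  -40;    % nasion (made-up values)
        -72  -10  -45;    % left periauricular (made-up values)
         72  -10  -45 ];  % right periauricular (made-up values)

save smri_fids.mat fids   % variable name assumed; check what Co-register expects
```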

You will also need the text files that recode the events of interest (there were 
actually at least 8 different codes in the original FIF file, with rather odd 
codings, corresponding to four different types of stimuli and four different 
types of keypress). The website should provide these as "hands.txt" (for Demo 1) 
and "faces.txt" (for Demo 2).

Both demos require that the RAW FIF data is first run through MaxFilter 
(including movement estimation every 200ms and compensation, HPI signal removal, 
conversion to short, and downsampling by a factor of 4):

/neuro/bin/util/maxfilter -gui -f 
/megdata/cbu/henson/henson_rik/070412/faces_raw.fif -o raw.fif -ctc 
/neuro/databases/ctc/ct_sparse.fif -cal /neuro/databases/sss/sss_cal.dat 
-format short -movecomp -hpistep 200 -hpisubt amp -hp raw_movement.pos -ds 4


1 LEFT vs RIGHT HAND MEAN EVOKED RESPONSES (GUI)
================================================


1.1 Type "spm 5" on a linux box, and toggle SPM to EEG mode (or type "spm 5 
eeg").


1.2 Press "Convert", select "FIF", select "raw.fif", select "FIF306_setup.mat" 
(a layout file for display)...

...the Graphics window should now show the position and three (orthogonal) 
orientations of each sensor (magnetometer and planar gradiometers) in 3D MEG 
space, including sensor name (use cursor to rotate). The red dots are the 
Polhemus digitised points...

...press return to select the default channel for the trigger codes ("STI101"), 
press return to convert the whole raw data (ie start and end of timeseries)...

... then select the channel "EEG062" (second one) as the vertical EOG channel 
(the other one will be made horizontal EOG; this depends how the electrodes were 
attached)...

...this will now produce "raw.mat" and "raw.dat" which are the raw data in SPM 
format. The mat file is a header file (you can load and peruse the structure "D" 
inside); the dat file contains all the data (which will be memory-mapped later; 
see spm_eeg_ldata). 
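From the command line, the converted data can be inspected via spm_eeg_ldata. A sketch (field names as I recall them from SPM5's D structure; check against your own header):

```matlab
% Load the SPM header and memory-map the accompanying .dat file.
D = spm_eeg_ldata('raw.mat');

D.Nchannels      % number of channels (308 here: 306 MEG + 2 EOG)
size(D.data)     % memory-mapped data: channels x samples (x trials once epoched)
```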

There will also be some output in the Matlab window describing what was read, 
including the rescaling of the data. The gradiometers are downscaled relative to 
the magnetometers (this can be overridden if you call spm_eeg_rdata_FIF from the 
command line). This scaling is because the gradiometers measure a gradient (T/m) 
and the magnetometers measure total flux (T). They are therefore not 
commensurate. One can scale by the baseline SNR, which NeuroMag report is 
typically 1:20 for grads:mags. I decided instead to scale the gradiometers by 
0.0595 (rather than 0.05), which is a function of the baseline between the 
centres of the two gradiometer coils, 16.8mm (if you want to change this 
scaling, it is a hidden function call in spm_eeg_rdata_FIF; see the batch script 
that accompanies this README for an example). This scaling seems to make the 
magnetometer and gradiometer data have a similar amplitude range, at least. 
Alternative scalings are described in the NeuroMag manuals.
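For what it's worth, the 0.0595 factor appears to be simply the reciprocal of that baseline figure (this is my reading of the text; check spm_eeg_rdata_FIF itself for the definitive computation):

```matlab
% Gradiometer scale factor as a function of the baseline between coil centres.
grad_scale = 1/16.8;   % = 0.0595..., the value quoted above
```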


1.3 Press "Epoching", select "raw.mat" file, enter "-300" for start of epoch and 
"300" for end...

...then type "-2304 -1280 -768 -512" for the event-types to epoch. These strange 
numbers simply correspond to the trigger codes for the four types of finger 
press (left middle, left index, right index and right middle) - as distinct from 
the codes for the visual stimuli (see Demo 2). (They are strange simply because 
of a negative offset that was not set properly in the acquisition software for 
this particular dataset)...

...then type "yes" to read a new event-list. If you press "no", it will epoch 
the four trigger codes above and keep those codes. For this Demo, however, we 
want to collapse across fingers and simply compare left vs right hand. This has 
been done already in the text file "hands.txt". This file simply contains 1s and 
2s, one for each of the key-press triggers (172 in total), and will recode each 
event as 1 (left hand) or 2 (right hand). This illustrates the type of procedure 
you will need if you want to further conditionalise your original trigger codes 
on, for example, subsequent behavioural analysis (eg correct vs incorrect)...

...It will prompt you to enter 172 new codes, but because the SPM input is 
evaluated, you can type "spm_load" and then select the file "hands.txt" 
described above. 
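At that prompt you are effectively evaluating something like the following (a sketch; spm_load reads plain ASCII numeric files):

```matlab
% hands.txt is a single column of 172 values (1 = left hand, 2 = right hand),
% one per keypress trigger, in trial order.
codes = spm_load('hands.txt');

length(codes)    % should be 172, one code per epoch
unique(codes)'   % should be [1 2]
```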

This will produce two new files (e_raw.mat and e_raw.dat), which are the epoched 
data.


1.4 Press "Display", choose the "M/EEG" option, and select the "e_raw.mat" 
file...

...the graphics window will now show all the channels. The MEG channels are 
grouped into triplets with the magnetometer on top and the gradiometers below. 
The two EOG channels are in the top left and right. If you left-click on the top 
left channel, a new blow-up window will appear with the VEOG data (for the first 
epoch). (Make sure the Matlab "Rotate3D" cursor is not still set, from the 
Matlab "Tools" menu). You should see a bump around -200ms which is probably a 
blink. This trial will be removed by artifact rejection next... 

(You can explore the data further, eg other trials, scaling, topographies, etc, 
but this is slow, and probably best done after the next step.)

(Note also that if you select the "M/EEG fast" Display option instead, the 
plotting will be slightly faster, and you can now use the left-mouse to "zoom 
in" on channels, and the right-mouse to plot them).


1.5 Press "Artefacts", select the e_raw.mat file, press "no" to read own list 
(eg a list of rejected trials from another package; note you can also indicate 
rejects manually through visual inspection of each trial via the Display 
option)...

...press "no" for robust averaging (this is essentially an average weighted by 
the typicality of each epoch, see Kilner et al, HBM06, but is very slow), then 
press "yes" to Thresholding...

You can now enter a single number to threshold all channels by that amount (in 
absolute terms), or as many numbers as there are channels (306+2=308) if you 
want a channel-specific threshold. There is no right or wrong answer here. Since 
one rarely has a good idea what a max or min magnetic deflection is (in fT), you 
can turn off thresholding of the MEG channels by giving a threshold of infinity. 
Past experience for EOG suggests that blinks often exceed a vertical deflection 
of 200uV. To implement this thresholding via EOG channels only, type

    "[ones(1,306)*Inf 200 200]"

which simply says the above in Matlab.

The Matlab window will now report that there were no "Bad" channels (channels 
with more than 20% of trials above threshold), but 19 rejected trials (ie with 
EOG deflections >200uV). Note we are currently working on more sophisticated 
methods, eg for blink correction using ICA (called from EEGLAB).
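You can check the outcome from the command line. A sketch (the reject vector living in D.events is my recollection of the SPM5 header layout; verify against your own D structure):

```matlab
% Load the artefact-flagged epoched data.
D = spm_eeg_ldata('ae_raw.mat');

sum(D.events.reject)    % number of rejected trials (19 here)
find(D.events.reject)   % indices of the rejected epochs
```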


1.6 If you now select "Display: M/EEG" and select the "ae_raw.mat" file, you can 
see which trials were rejected by the menu on the bottom right of the graphics 
window. Inspection of these will show that they are all blinks picked up in the 
VEOG channel around the time that the button was pressed (quite a common 
occurrence: there were very few blinks in the epoch around the visual triggers 
in Demo 2 below because the subject was instructed not to blink when the visual 
stimulus was displayed).

1.7 Now we want to create an average (excluding those trials marked as rejects 
in 1.6). Press "Averaging" and select the "ae_raw.mat" file. The Matlab window 
will report that 96 trials were of type 1 (left hand) and 57 of type 2 (right 
hand). The new file created is called "mae_raw.mat" (m for mean).


1.8 To aid visual inspection, you can now filter the data (though this is not 
always advisable for analysis, depending on the frequency cutoff relative to 
epoch length; filtering raw data is better to avoid end-effects). You could 
press "Filter", select "low-pass", enter "40" as the cutoff, and you will get a 
file "fmae_raw.mat" containing smoothed data. (This uses a high-order, optimised 
Butterworth digital filter.)


1.9 Now select "Display: M/EEG" and choose the "fmae_raw.mat" file to see the 
smoothed averages. You can speed things up by selecting only a subset of 
channels to display. Press "channels" in the Graphics window, and then "Deselect 
all". Then select only the first 102 channels from the menu on the right using 
Shift-Left-mouse. The first 102 channels are the magnetometers (also indicated 
by the fact that their names end in "1", rather than "2" or "3" for the 
gradiometers). You can also select the two EOG channels from the bottom of the 
menu using Ctrl-Left-mouse.

If you now press shift and left click trial-type 2, you will see a red line 
added to the channels. If you click on one of the more central (magnetometer) 
channels, you can see the left vs right hand differences around 0ms (key press).

If you press "Topography", enter 0 (rather than the default 100) for the time, 
and select "2D", you will see the evoked responses for the first trial-type. You 
should see red/blue maxima on the right, corresponding to the left hand. If you 
first select trial-type 2 and then press Topography, you will see two larger 
maxima, more on the left (for the right hand; the subject was right-handed).

If you move the slider left and right, the topography will change over time 
(precise timepoint shown in Matlab window).


1.10 LOCALISATION. We are now going to localise the left vs right hand responses 
in 3D. Press "3D reconstruction"...

(The button "2D interpolation" would allow you to write out 2D images in sensor 
space, if you want to do statistics in sensor space using the usual machinery of 
SPM, including RFT correction for multiple comparisons. However, topographies 
are difficult to interpret for our MEG data with its three different sensor-
types (unlike for EEG data, for example), since they give a vector rather than a 
scalar field. So any "topographies" displayed as above using all the sensors are 
fairly meaningless.)

...in the new window, press "Load" and select the "mae_raw.mat" file (or fmae 
one, if you prefer). Enter any comment you want...

...then start the first of four stages:

Stage 1: Segment and normalise the MRI for this subject. Segmentation allows 
creation of boundaries of scalp, inner skull and cortex. Normalisation allows 
the canonical cortical mesh to be inverse-normalised to match this specific 
subject. (See papers on WIKI)...

(If you have no MRI, you can use a canonical MRI by pressing the "template" 
button, with an associated greater potential for error.)

...select the "smri.img" structural MRI (available from WIKI), then select 
"7200" for the number of vertices in the mesh (ie maximum resolution)

This calls SPM's normal segmentation, which will take some time. Put your feet 
up; you deserve it, having got this far (you can watch some output in the Matlab 
window).

When it has finished, an image will appear in the Graphics window showing the 
cortical, inner skull and scalp meshes.

Stage 2: Coregister the MEG data to the MRI space. This is done in two steps: a 
coarse co-registration by fitting the fiducials in MEG and MRI space (the latter 
demarcated by hand; see above), and a finer co-registration by fitting the 
digitised headshape in MEG space to the MRI scalp mesh created during Stage 1.

Press "Co-register", select the file "smri_fids.mat" (winzipped with the 
structural on the WIKI), and watch the Graphics window. When it has finished, 
you will see in the top panel the MRI meshes surrounded by the sensors (in 
green; magnetometers and gradiometers collapsed per chip), plus the fiducials 
according to MRI and MEG (purple and blue markers). The fiducial fit is not 
excellent because the fiducials were only approximated by hand on the MRI, but 
this does not matter because the precise fit is provided by the many more points 
digitised on the headshape (and matched to the MRI scalp). You can rotate the 
image to see more. The bottom panel just shows the sensor names.

Stage 3: Construct a forward model. This vital step creates the leadfield matrix 
that expresses how each of the (7200) possible sources contributes to the (306) 
sensors. Note that the sources are constrained to be normal to the cortical 
surface; only their amplitude is estimated. This requires Maxwell's equations 
and, for magnetic fields, can be approximated reasonably well by a single-sphere 
model (Brainstorm code for a Boundary Element Model (BEM) is available, but not 
yet integrated). This sphere is fit to the inner skull (shown dynamically in the 
Graphics window). All that matters from this fit is the centre of the sphere 
(plus the position and orientation of the sensors, which is already encoded in 
the *.mat file as derived from the *.fif file).

Stage 4: We now want to invert the forward model, given the MEG data. Press 
"Invert", select "Classical" ("DCM" is for new methods for ECD), select "yes" to 
localise both trial-types (left and right hand), and select "MSP"...

...MSP stands for "Multiple Sparse Priors", which is a novel approach unique to 
SPM. It resembles a weighted minimum L2-norm, but with multiple priors rather 
than one. More details can be found in the paper on the WIKI. It has been shown 
to be superior to the other two options: "MNM" (basic minimum norm, minimising 
the "energy" of the solution) and "COH" ("coherent", which maximises the spatial 
smoothness of the solution, like LORETA but in the context of geodesic distance 
within a mesh)...

...select "Standard" to use standard parameter values (or "Custom" to hand-
craft them).

The Matlab window will show some details of the fitting (inversion via ReML). 
The Graphics window will then show the results for condition 1 (left hand). By 
default it shows the maximum in 3D space and time. The lines are the estimated 
source amplitudes at the coordinate described (43, -36, 45; somewhere in right 
motor or somatosensory cortex?). The red line is the condition currently 
selected in the reconstruction window (ie condition 1; left hand); the grey 
lines are any other conditions. The peak is at 28ms, and 87% of the total 
variance has been explained (the log-evidence is actually a more useful, though 
relative, measure, since one can explain all the variance in such 
underdetermined problems unless one explicitly models the noise. And yes, it 
should be negative). The bottom panel shows the MIP of the posterior 
probabilities of reliable sources.

If you press "movie", and enter times from -300 to 300 (whole epoch), you can 
see the estimated source amplitudes evolve over time. (Note right-lateralisation 
around 0ms, but evidence of subsequent left and bilateral activity. Ask a motor 
person.)

If you enter 0 (or 200) into the time box and press MIP, you can see the MIP for 
that time.

If you press the "Condition 1" button, it will toggle to the other condition (ie 
Condition 2; right hand). If you then press MIP again (for 0ms), you will see 
more left-lateralisation for condition 2 (right hand), though somewhat less 
focal, and possibly with several generators.

If you press "Render" then you can view the data in lots more fancy ways. Mostly 
self-explanatory, though as above, ignore the scalar sensor topographies which 
are incorrect for our vector-like sensor data. Fancy, but perhaps not as 
relevant as...

...To get an estimate of the sources over a particular timewindow and frequency 
range, press "Window". This creates a time-freq contrast, eg by entering [0 50] 
for the time and 0 for the frequency band, you will get a (Hanning windowed) 
average source map for all frequencies....

...This map can now be written as a 3D Nifti image in MNI space by pressing the 
"Image" button (and selecting a spatial smoothing of, eg, 12mm). The blobs in 
this image will be displayed together with the structural MRI. This image can be 
entered into (2nd-level) analyses just like for fMRI data, eg, by repeating this 
process for multiple conditions/subjects to make population-level inferences. 
Such 3D group-analyses are less easy in other packages.


1.11 You can obviously explore other options, eg MNM or COH localisations (you 
can avoid repeating Stages 1-3 above simply by pressing "Save" in the 
reconstruction window, and then "New" for a new localisation of these data). You 
will notice, for example, that the MNM solution is more dispersed and 
superficial (particularly for the deeper, fusiform sources in Demo 2 below). Or 
you can explore the relative importance of magnetometers vs gradiometers by 
artificially switching off one or t'other by selecting them to be "Bad" (this 
has to be done via the command line, as illustrated in the batch script that 
accompanies this demo).
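A sketch of switching off the gradiometers this way. The channel-structure field names (D.channels.name, D.channels.Bad) are assumptions from memory of the SPM5 header; the accompanying batch script has the definitive version:

```matlab
D = spm_eeg_ldata('mae_raw.mat');

% Neuromag channel names end in 1 (magnetometers) or 2/3 (gradiometers);
% mark all gradiometers as Bad so only magnetometers drive the inversion.
names = D.channels.name;                           % assumed field name
grads = find(~cellfun(@isempty, regexp(names, '[23]$')));
D.channels.Bad = grads;                            % assumed field name

save(fullfile(D.path, D.fname), 'D');              % re-save header before inverting
```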


1.12 Note that we have localised evoked responses (from trial-average). SPM can 
also localise all responses (evoked and induced) simply by localising the 
average covariance across trials (see paper on WIKI), rather than covariance of 
the average. To do this, simply repeat the above localisation process, but 
select the "ae_raw.mat" file (ie all epochs, pre-averaging) rather than the 
"mae_raw.mat" file.


2 FACES vs SCRAMBLED EVOKED RESPONSES (via batch script)
========================================================

For this Demo, you can use the batch script provided "spm5_cbu_meg_batch.m" 
(batching for Demo 1 is also included). Just cut and paste relevant sections. 

The batch does 7 localisations (note: spm_eeg_weight_epochs is used to create 
contrasts of MEG data, eg differential ERFs): 
1) differential ERF between faces and scrambled faces (all channels, whole 
epoch)
2) common (average) ERF (all channels, whole epoch)
3) faces and scrambled faces separately (all channels, whole epoch)
4) faces and scrambled faces separately (mags only, whole epoch)
5) faces and scrambled faces separately (grads only, whole epoch)
6) differential ERF between faces and scrambled faces (all channels, only 150-
200ms (M170))
7) faces and scrambled faces separately (all channels, only 150-200ms (M170))
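For example, the differential ERF in localisation 1 is created with spm_eeg_weight_epochs, roughly as follows. The filename and field names are illustrative (the averaged faces file is not named above); see the batch script for the exact call:

```matlab
% Contrast the two averaged trial-types: faces minus scrambled faces.
S.D = 'mae_faces.mat';   % hypothetical filename for the averaged faces data
S.c = [1 -1];            % one row per new trial-type: faces - scrambled
D = spm_eeg_weight_epochs(S);   % writes a new file containing the contrast
```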

Things to note include: 
a) in localisation 3 (all channels, faces and scrambled separately), there is 
greater fusiform and temporal-pole activity for faces than scrambled around 
160ms 
b) in localisations 2 and 3 there is similar, more posterior activity around 
100ms 
c) localisation 1 of the differential ERF is not as clear (in this case), with 
the greatest differences in early visual regions 
d) the gradiometers pick up the posterior fusiform ok, but not the temporal 
poles, and the magnetometers vice versa (cf localisations 4 and 5) 
e) reducing the timewindow localised from the whole epoch to 150:200ms is 
reasonable, but introduces possible ghost sources in superior regions, 
particularly for the differential ERF. This is likely to reflect the fewer data. 
If the same reduced timewindow is used together with all trials (ie using the 
e_faces.mat file), the localisations are much better - largely restricted to 
fusiform - reflecting the better estimate of the data covariance. Now you are an 
expert, you can attempt the latter (via a script)....

;-)


