Don't Assume that fMRI and MEG Will Give You Comparable Results

Thursday, January 27, 2011

Accessibility: Intermediate/Advanced

There are three common methods of studying brain function in normal human populations: fMRI, MEG, and EEG. There is surprisingly little crosstalk between the techniques, mostly due to practical issues. For better or worse, labs tend to specialize in one technology.

It's often assumed that the relationship between techniques is straightforward, and that it's simple to map results from one technique onto another. However, a recent study by Johanna Vartiainen and colleagues suggests otherwise.

The group wanted to study reading using all three techniques. Participants performed the same experimental paradigm twice: once with simultaneous EEG and fMRI, and once with simultaneous EEG and MEG. Participants saw words, pseudowords, consonant strings, symbol strings, and words embedded in noise. Their task was to detect immediate repetitions. The EEG results from the two sessions were comparable, so the researchers went on to compare the fMRI and MEG activation patterns for the experiment.

To summarize, activation patterns in MEG and fMRI did not show a straightforward relationship. In some regions, the two techniques showed the same pattern. For example, in the occipital lobe, both MEG and fMRI measures showed more activation to noisy words than to other types of stimuli.

In the occipitotemporal lobe, however, the two techniques had opposite results. MEG showed more activation to real letters than to symbols, while fMRI showed more activation to symbols than to letters.

In the left frontal cortex, the two techniques had completely different patterns. fMRI activation was higher for words and pseudowords than for symbols and noisy words. The MEG results showed no difference at all between stimulus types.

I guess this is one of those results that you don't see coming going in, but in hindsight makes you hit yourself over the head. fMRI and MEG measure very different things, so it's entirely possible that results would come out differently. fMRI measures cerebral blood flow on a timescale of several seconds, while MEG measures synchronous electrical activation with millisecond resolution. So (as the authors suggest) non-synchronous activity may be lost in MEG. Meanwhile, fMRI picks up average activity over a longer time period and may miss short-lived activity.

Interestingly, the authors mentioned that previous MEG results for the visual word form area were fairly robust to task differences, while fMRI results do seem to vary with task. Now, I don't know the MEG literature well, but they're certainly right about the fMRI literature. In that case, I wonder what it is about MEG that makes its results relatively task-independent. Is it the better temporal resolution? Perhaps MEG analyses focus on early, bottom-up processing, which may be relatively task-independent?

Vartiainen J, Liljeström M, Koskinen M, Renvall H, & Salmelin R (2011). Functional magnetic resonance imaging blood oxygenation level-dependent signal and magnetoencephalography evoked responses yield different neural functionality in reading. The Journal of Neuroscience, 31 (3), 1048-58. PMID: 21248130


Recycling Neurons for Reading

Monday, January 24, 2011

Accessibility: Intermediate/Advanced

Our brains have evolved to be good at certain things: seeing, hearing, learning language, and interacting with other similar brains, to name a few examples. But say you want your brain to do something new: look at symbols on a page and map them to language. In other words, you want to teach your brain to read. How would you go about doing this? What parts of the brain would you use?

Unless you plan on developing a completely new region, it makes sense to repurpose the brain regions you already have -- a process that neuroscientist Stanislas Dehaene refers to as “neuronal recycling.” This raises the question -- what regions are recycled? And do the regions that get co-opted become worse at their original function?

Dehaene and colleagues explored this question by scanning adults at different levels of literacy: literates, ex-literates (adults who used to be illiterate but learned to read in adulthood), and illiterate adults. They had several interesting findings:

1. They first looked at whether learning to read changes brain activation when looking at words. Not surprisingly, it does. Reading performance was correlated with increased brain activation in much of the left hemisphere language network, including the visual word form area. And this increased activation appeared to be specific to word-like stimuli.

2. During reading, ex-literates showed more bilateral activation and also recruited more posterior brain regions. This is similar to what we find in children, who also show more spread-out activation while reading. This suggests that unskilled readers recruit a wider set of brain regions as they are learning to read. As readers become more skilled, their brains become more efficient and recruit fewer regions.

3. In literate adults, the response to checkerboards and faces in the visual word form area was lower than in non-readers. This suggests that learning to process words may actually take resources away from processing other stimuli.

4. The researchers looked more closely at responses to faces and houses to see how exactly learning to read competed with other visual functions. They found that activation in the peak voxels for faces and houses did not change with literacy. However, activation in the surrounding voxels did decrease.

5. And here's an interesting result. Since reading is a horizontal process (at least in the languages they were testing), the researchers checked to see whether the visual system became more attuned to horizontal stimuli. They found that literacy enhanced the response to horizontal but not vertical checkerboards in some primary visual areas.

Dehaene S, Pegado F, Braga LW, Ventura P, Nunes Filho G, Jobert A, Dehaene-Lambertz G, Kolinsky R, Morais J, & Cohen L (2010). How learning to read changes the cortical networks for vision and language. Science, 330 (6009), 1359-64. PMID: 21071632

