Open Access
Journal of Speech, Language, and Hearing Research
Forum: Advances in Neuroplasticity Research on Language Recovery in Aphasia
22 Nov 2019

Language Mapping in Aphasia



    Recovery from aphasia is thought to depend on neural plasticity, that is, functional reorganization of surviving brain regions such that they take on new or expanded roles in language processing. To make progress in characterizing the nature of this process, we need feasible, reliable, and valid methods for identifying language regions of the brain in individuals with aphasia. This article reviews 3 recent studies from our lab in which we have developed and validated several novel functional magnetic resonance imaging paradigms for language mapping in aphasia.


    In the 1st study, we investigated the reliability and validity of 4 language mapping paradigms in neurologically normal older adults. In the 2nd study, we developed a novel adaptive semantic matching paradigm and assessed its feasibility, reliability, and validity in individuals with and without aphasia. In the 3rd study, we developed and evaluated 2 additional adaptive paradigms—rhyme judgment and syllable counting—for mapping phonological encoding regions.


    We found that the adaptive semantic matching paradigm could be performed by most individuals with aphasia and yielded reliable and valid maps of core perisylvian language regions in each individual participant. The psychometric properties of this paradigm were superior to those of other commonly used paradigms such as narrative comprehension and picture naming. The adaptive rhyme judgment paradigm was capable of identifying fronto-parietal phonological encoding regions in individual participants.


    Adaptive language mapping paradigms offer a promising approach for future research on the neural basis of recovery from aphasia.


    Acquired aphasia results from damage to left-hemisphere temporal, frontal, and parietal brain regions that are critical for language, most commonly as a consequence of stroke (Kertesz, Harlock, & Coates, 1979). Fortunately, most individuals with poststroke aphasia experience some degree of recovery over time (Holland, Fromm, Forbes, & MacWhinney, 2017; Kertesz & McCabe, 1977; Swinburn, Porter, & Howard, 2004; Wilson et al., 2019). Although early recovery can be attributed to the resolution of hypoperfusion and edema that often accompany acute stroke (Hillis et al., 2002), longer-term recovery is thought to reflect neuroplasticity, that is, the functional reorganization of surviving brain regions such that they take on new or expanded roles in language processing (Heiss & Thiel, 2006; Saur et al., 2006; Turkeltaub, Messing, Norise, & Hamilton, 2011).

    In principle, language function can be sustained by regions other than the canonical left-hemisphere perisylvian language network. This has been clearly established in studies of perinatal stroke survivors (Lenneberg, 1967; Newport et al., 2017; Staudt et al., 2002). These studies have shown that, when early damage to the typical left-hemisphere network is sufficiently extensive, language develops instead in homotopic regions in the right hemisphere, and language function is often normal or near normal. Is such dramatic macroscopic reorganization possible in adults recovering from stroke? The answer to this question is far from clear.

    Many studies have argued that recovery from poststroke aphasia depends on the recruitment of right hemisphere regions (Turkeltaub et al., 2011; Weiller et al., 1995). Other patterns of reorganization have also been reported, including recruitment of left hemisphere regions beyond the typical language network (Fridriksson, Bonilha, Baker, Moser, & Rorden, 2010; Fridriksson, Richardson, Fillmore, & Cai, 2012; Robson et al., 2014) and recruitment of domain-general cognitive or “multiple demand” regions, which are often bilateral (Brownsett et al., 2014; DeMarco, Wilson, Rising, Rapcsak, & Beeson, 2018; Geranmayeh, Brownsett, & Wise, 2014; Geranmayeh, Chau, Wise, Leech, & Hampshire, 2017). However, the reported patterns of language reorganization have been rather diverse, and few have been replicated.

    The only consistently replicated finding is that recovery from aphasia is associated with return to function of core left-hemisphere language regions (Hartwigsen & Saur, 2019; Heiss, Kessler, Thiel, Ghaemi, & Karbe, 1999; Heiss & Thiel, 2006; Saur et al., 2006). Presumably, regions that show increased function associated with recovery were not destroyed by stroke but were rendered partially dysfunctional by diaschisis and/or persisting hypoperfusion. However, although the reintegration of these regions is an important substrate of recovery, it does not necessarily constitute functional reorganization per se. Moreover, it seems clear that mechanisms other than reintegration of left hemisphere language regions must be critically important, for the simple reason that many patients with significant damage to left hemisphere regions nevertheless do make remarkable recoveries. In these patients, restoration of core left-hemisphere language regions is simply not an option.

    Consider the gentleman whose brain is illustrated in Figure 1, whom we will call Mr. L. Although he had severe expressive aphasia for many years, he has made major gains over the last few years, and now, 8.5 years after his stroke, he communicates with ease about all topics. In the connected speech section of our language evaluation, he stated:


    Figure 1. Structural magnetic resonance imaging of the brain of Mr. L. The left inferior frontal gyrus, much of the middle frontal gyrus, the ventral precentral gyrus, the insula, and underlying white matter were completely destroyed, yet Mr. L experienced an excellent long-term recovery from his aphasia.

    I'm və- very en-thu-si-astically working on a novel. Um (…) that's the second novel. I'm um (…) really wanna have it done this year. So <in the> in the mid late fall. So… Uh it's uh taking up lots of my time. I'm uh (…) trying to uh… I'm adjusting my schedule /ti/ be earli-er in the də- day. Actually at times starting before the sun is up.

    Although Mr. L is still clearly aphasic, his speech and language capacity is truly remarkable considering that his left inferior frontal cortex was completely destroyed. How is this possible?

    Efforts to understand the nature of functional reorganization supporting recovery from aphasia face many challenges, including the difficulty of recruiting and following sufficient numbers of patients (Price, Seghier, & Leff, 2010), variability between individuals in stroke size and location (Crinion & Price, 2005), and the complexity of analyzing functional changes in the context of different patterns of structural damage (Fridriksson et al., 2012; Griffis, Nenert, Allendorfer, & Szaflarski, 2017; Sims et al., 2016; Skipper-Kallal, Lacey, Xing, & Turkeltaub, 2017; Tyler, Wright, Randall, Marslen-Wilson, & Stamatakis, 2010). However, perhaps the most fundamental challenge has been the difficulty of developing appropriate functional imaging paradigms for identifying language regions in individuals with aphasia. To support research on functional reorganization of language regions in recovery from aphasia, language mapping paradigms need to meet at least three criteria.

    First, they must be feasible for individuals with aphasia. Language areas are generally identified with functional magnetic resonance imaging (MRI) by comparing blood oxygen level–dependent signal in conditions that do involve language processing (i.e., performing a language task) to conditions that do not (i.e., performing a task that does not involve language; Binder et al., 1997). This is a well-developed field because of its widespread clinical application in presurgical planning (Binder, Swanson, Hammeke, & Sabsevitz, 2008). However, by their very nature, individuals with aphasia find language tasks difficult, if not impossible, to perform. If a patient cannot perform a task, then it is hard to interpret activation maps associated with failure to perform the task (Price, Crinion, & Friston, 2006). If they can perform the task, but doing so requires much more effort than normal, then domain-general networks may be differentially recruited and misidentified as language regions (Geranmayeh et al., 2014). Therefore, it is important to use language mapping paradigms that individuals with aphasia can perform.

    Second, language mapping paradigms need to be reliable, exhibiting good test–retest reproducibility. In our work, we evaluate reliability in terms of overlap between activation maps obtained on different occasions, in situations where no change is expected. We quantify reliability using the Dice coefficient of similarity (Rombouts et al., 1997). Dice coefficients range from 0 (no overlap) to 1 (perfect overlap), with intermediate values reflecting partial overlap. Dice coefficients are conceptually related to the kappa statistic and can be interpreted as poor (< .40), fair (.40–.60), good (.60–.75), or excellent (≥ .75), following Cicchetti (1994).
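    To make this concrete, the Dice coefficient over two binarized activation maps is simply twice the number of overlapping suprathreshold voxels divided by the total number of suprathreshold voxels across the two maps. A minimal sketch in Python (the toy one-dimensional "maps" are our own illustration, not data from these studies):

```python
import numpy as np

def dice_coefficient(map1, map2):
    """Dice similarity between two binarized activation maps.

    Dice = 2 * |A intersect B| / (|A| + |B|), ranging from 0
    (no overlap) to 1 (perfect overlap).
    """
    a = np.asarray(map1, dtype=bool)
    b = np.asarray(map2, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return float("nan")  # no suprathreshold voxels in either map
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy 1-D "maps" of suprathreshold voxels from two sessions
session1 = np.array([1, 1, 1, 0, 0, 0])
session2 = np.array([0, 1, 1, 1, 0, 0])
print(dice_coefficient(session1, session2))  # 2*2 / (3+3), i.e. about 0.667
```

    In practice the two inputs would be whole-brain maps thresholded identically in both sessions; the coefficient is insensitive to which session is listed first.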

    Third, language mapping paradigms need to be valid, identifying language regions and not other regions. It is difficult to evaluate validity directly in individuals with aphasia, because the nature of their language organization is unknown. That is, there is no ground truth against which to assess validity. Therefore, we generally assess validity in neurologically normal individuals, in whom we have a good understanding of typical language organization. In particular, it is established that the great majority of neurologically normal individuals demonstrate three features of language organization: (a) lateralization to the left hemisphere, (b) activation of left inferior frontal cortex, and (c) activation of left posterior temporal cortex (Bradshaw, Thompson, Wilson, Bishop, & Woodhead, 2017; Knecht et al., 2003; Seghier, Kherif, Josse, & Price, 2011; Springer et al., 1999; Tzourio-Mazoyer et al., 2010). We can assess the validity of language mapping paradigms in terms of their ability to reveal these known features of normal language organization in the majority of neurologically normal participants. If a language mapping paradigm is capable of reliably identifying typical language areas in cases where their likely organization is known, then we can be more confident in whatever it may reveal when applied in individuals with potentially atypical language organization.
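    Feature (a), hemispheric lateralization, is commonly summarized with a lateralization index (LI). A standard formulation, sketched here in Python, is our own illustration; the exact LI computation used in these studies may differ:

```python
def lateralization_index(left, right):
    """LI = (L - R) / (L + R): +1 is fully left-lateralized,
    -1 is fully right-lateralized, 0 is perfectly bilateral.

    L and R can be suprathreshold voxel counts (or summed t
    statistics) within left- and right-hemisphere regions.
    """
    total = left + right
    if total == 0:
        raise ValueError("no suprathreshold activation in either hemisphere")
    return (left - right) / total

print(lateralization_index(300, 100))  # 0.5, moderately left-lateralized
```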

    The psychometric properties of various language mapping paradigms have been extensively investigated in the clinical context of presurgical language mapping (e.g., Fernández et al., 2003; Janecek et al., 2013; see Bradshaw et al., 2017, and Wilson, Bautista, Yen, Lauderdale, & Eriksson, 2017, for reviews). However, most presurgical patients do not have significant language deficits. Only a few studies prior to our own have addressed psychometric aspects of language mapping in individuals with aphasia (Eaton et al., 2008; Kurland et al., 2004; Meltzer, Postman-Caucheteux, McArdle, & Braun, 2009). This has been, therefore, a critical gap in the literature.

    In this article, we will review three recent publications in which we have investigated the feasibility, reliability, and validity of various approaches to language mapping in aphasia. We propose that adaptive paradigms that dynamically adjust to individual performance offer the most promising approach at present, and we will conclude by briefly describing how we are currently using adaptive language mapping paradigms to investigate functional neuroplasticity underlying recovery from poststroke aphasia.

    Reliability and Validity of Four Language Mapping Paradigms

    The goal of our first study (Wilson et al., 2017) was to identify one or more appropriate language mapping paradigms for a long-term longitudinal study of the neural correlates of recovery from aphasia. To that end, we investigated the reliability and validity of four potential language mapping paradigms in neurologically normal older adults. Five older adults were each scanned with functional MRI on four occasions, with sessions typically a few weeks apart.

    Four candidate language mapping paradigms were selected that were likely to be feasible for individuals with aphasia to perform. Two of them—narrative comprehension and picture naming—had been widely used in the previous literature on neuroplasticity in aphasia (e.g., Crinion & Price, 2005; Fridriksson et al., 2010). A sentence completion (cloze) task was designed, which was intended to recruit both comprehension and production mechanisms, as well as word and sentence level processing. Finally, a naturalistic paradigm was constructed in which participants viewed an edited television program, from which language and nonlanguage segments could be contrasted. Four equivalent forms of each paradigm were constructed, and each paradigm was exactly 7 min long.

    The results of the study clearly demonstrated that all four paradigms were markedly lacking in both reliability and validity (see Figure 2). This figure shows a single representative participant; the data from all five participants are presented in the publication (Wilson et al., 2017).


    Figure 2. Reliability and validity of four language mapping paradigms in a representative neurologically normal individual. Activations within potential language regions or their homotopic counterparts are depicted in the hot color scale, whereas activations elsewhere are depicted in yellow. Adapted from “Validity and reliability of four language mapping paradigms,” by S. M. Wilson, A. Bautista, M. Yen, S. Lauderdale, & D. K. Eriksson, 2017, NeuroImage: Clinical, 16, p. 404. Copyright © 2017 Elsevier. Adapted with permission. ROI = region of interest.

    Generally, test–retest reproducibility was poor to fair. The language maps differed strikingly from session to session, even though presumably, no actual changes had taken place in these neurologically normal older adults. Such variability would be a serious obstacle to any longitudinal study in which actual change was anticipated because it would be difficult to distinguish between changes due to functional reorganization and changes reflecting limited test–retest reproducibility.

    Validity was also limited. The narrative comprehension paradigm yielded the most lateralized language maps, although there was substantial activation of the right temporal lobe in all five individuals. This is not to imply that the right temporal lobe is not genuinely involved in language comprehension—it surely is (Binder et al., 2011)—but a language mapping paradigm that highlights bilateral aspects of language organization may be less well suited to documenting functional reorganization after stroke, which entails finding new locations for functions previously supported by damaged left-hemisphere regions. The sentence completion paradigm produced the next most lateralized activations, whereas the picture naming and naturalistic paradigms resulted in largely bilateral activations. None of the four paradigms consistently activated the left frontal and temporal regions known to be critical for language in most neurologically normal individuals.

    In sum, none of the four paradigms investigated in this initial study appeared to be well suited for studying the functional reorganization of language areas in aphasia.

    An Adaptive Semantic Matching Paradigm

    Based on the results of the previous study, we concluded that it would be necessary to develop new, psychometrically sound paradigms for language mapping in aphasia. We undertook this task in our next study (Wilson, Yen, & Eriksson, 2018). We decided that a semantic decision paradigm would constitute the best starting point for developing a new paradigm because semantic decision paradigms are strongly lateralizing and robustly activate frontal and temporal language regions in neurologically normal individuals and presurgical patients with normal language function (Binder et al., 1997, 2008; Fesl et al., 2010; Janecek et al., 2013; Szaflarski et al., 2008). The highest Dice coefficients that have been reported in the literature have been derived from semantic decision paradigms (Fernández et al., 2003; Fesl et al., 2010).

    The problem was that semantic decision tasks had proven to be difficult, if not impossible, for many individuals with aphasia to perform. For instance, in one study (Szaflarski, Allendorfer, Banks, Vannest, & Holland, 2013), individuals with aphasia were asked to perform a variant of the classic Binder task in which participants decide if auditorily presented animal names are native to the United States and commonly used by humans. The control task involved listening to tone sequences and deciding whether they contained exactly two tones of a specific frequency. These tasks are challenging for individuals with normal language function, and the contrast between them yields robust left-lateralized language maps (Binder et al., 1997, 2008). However, the behavioral data showed that individuals with aphasia were unable to perform the tasks, performing at chance not only on the language condition (47.6% correct) but also on the control condition (52.2% correct; Szaflarski et al., 2013).

    We wanted to design a simpler semantic decision task. But if the task was too easy, then it might not yield robust activations in patients with milder aphasias or in control participants. Therefore, we created a task that would be adaptive to each individual's performance. Each item consists of a pair of words, which are presented one above the other in the center of the screen (see Figure 3A). Half of the pairs are semantically related, whereas the other half are not related. The participant presses a button with a finger of their left hand if they decide that the words are semantically related. If the words are not related, they do nothing. We use an adaptive staircase procedure and control difficulty by simultaneously manipulating a number of linguistic factors—frequency, concreteness, age of acquisition, length, and phonological complexity—and presentation rate. When participants make correct responses, more difficult items are presented, and when they make errors, easier items are presented. This means that each individual is presented with stimuli that are challenging yet within their competence so that language processing can be fully engaged in people with and without language impairments, minimizing performance confounds.
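    The adaptive logic can be sketched as a generic weighted up-down staircase in the spirit of García-Pérez (1998). The step sizes, difficulty scale, and psychometric function below are hypothetical illustrations, not the parameters of the actual paradigm; with a 1-up/4-down weighting, accuracy converges near step_down / (step_up + step_down) = 80%:

```python
import math
import random

def run_staircase(p_correct_at, n_trials=100, level=5,
                  step_up=1, step_down=4, lo=0, hi=10):
    """Generic weighted up-down staircase (hypothetical parameters).

    `p_correct_at(level)` gives the probability of a correct response
    at a given difficulty level. Correct responses raise difficulty by
    `step_up`; errors lower it by `step_down`, so accuracy settles
    near step_down / (step_up + step_down), here 0.8.
    """
    history = []
    for _ in range(n_trials):
        correct = random.random() < p_correct_at(level)
        history.append((level, correct))
        if correct:
            level = min(hi, level + step_up)    # present a harder item
        else:
            level = max(lo, level - step_down)  # present an easier item
    return history

# Hypothetical psychometric function: accuracy falls as level rises
p_correct = lambda lvl: 1 / (1 + math.exp(0.8 * (lvl - 7)))
trials = run_staircase(p_correct, n_trials=200)
```

    In the actual paradigm, "level" is not a single dial but a bundle of simultaneously manipulated stimulus properties and presentation rate, as described above.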


    Figure 3. Adaptive semantic matching paradigm. (A) Example of a semantic item. This item is a match and is shown surrounded by a box that appears when the “match” button is pressed. (B) Example of a perceptual item. This item is a mismatch, so the button should not be pressed. Adapted from “An adaptive semantic matching paradigm for reliable and valid language mapping in individuals with aphasia,” by S. M. Wilson, M. Yen, & D. K. Eriksson, 2018, Human Brain Mapping, 39, p. 3288. Copyright © 2018 John Wiley and Sons. Adapted with permission.

    The control condition is a perceptual decision task in which participants press the button if two symbol strings are identical (see Figure 3B). This task is also adaptive to participant performance, which is achieved by varying the similarity of the mismatching strings and the presentation rate. Note that the task is fundamentally similar across the language and control conditions: Both tasks involve pressing a button to matching pairs. This minimizes task switching demands, making the paradigm more accessible for individuals with aphasia. The use of a single button is also important because many patients find it difficult to learn an arbitrary mapping between choices and response buttons.

    We assessed the feasibility, reliability, and validity of the adaptive semantic matching paradigm in 16 individuals with chronic poststroke aphasia and 14 neurologically normal participants. The participants with aphasia were mostly recruited from an outpatient aphasia center and spanned a range of severities, approximately evenly distributed across mild, moderate, and severe aphasias, characterized with the Quick Aphasia Battery (Wilson, Eriksson, Schneck, & Lucanie, 2018). The patients were scanned with functional MRI on two separate occasions, typically a few weeks apart, to quantify test–retest reliability. Validity was assessed in the neurologically normal participants, since only in these individuals did we have clear expectations about the lateralization and localization of language regions. All participants also completed narrative comprehension and picture naming paradigms for the sake of comparison.

    We found that all 16 individuals with aphasia were able to learn the tasks and performed above chance on both the language and control conditions. The adaptive staircase procedure used should ideally converge at just over 80% accuracy (García-Pérez, 1998). We found that 15 of 16 patients' accuracies were in this range on the semantic task. The remaining patient, who was the most severely impaired patient in the group, performed with 59% accuracy, reflecting a floor effect (i.e., the staircase procedure called for easier trials than were available to present). However, 59% accuracy was still significantly better than chance, indicating that this patient was performing the task like the other participants, albeit less successfully. All 16 patients performed in the expected range on the perceptual control condition.

    Language activation maps derived from the adaptive semantic matching paradigm are shown for all 16 individuals with aphasia on the two separate occasions in Figure 4. It can be appreciated that test–retest reproducibility was good, and interestingly, in most patients (with the notable exception of A12), language processing continued to be left lateralized. Left frontal and left temporal language areas were generally activated, except in cases where they had been destroyed.


    Figure 4. Language activation maps derived from the adaptive semantic matching paradigm. (A) Group analysis in 14 neurologically normal participants. (B) Activation maps in 16 individuals with aphasia at two time points each. Voxels with the highest 5% of t statistics were plotted, subject to a minimum cluster volume of 2,000 mm3, in a region of interest comprising known language regions or plausible candidate regions for functional reorganization; note that the cerebellum was not included (unlike in Panel A). Inset axial slices show lesion reconstructions. T1 = first imaging session; T2 = second imaging session; Dice = Dice coefficient of similarity; LI = lateralization index. Adapted from “An adaptive semantic matching paradigm for reliable and valid language mapping in individuals with aphasia,” by S. M. Wilson, M. Yen, & D. K. Eriksson, 2018, Human Brain Mapping, 39, p. 3296. Copyright © 2018 John Wiley and Sons. Adapted with permission.
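    The voxel selection described in the Figure 4 caption (top 5% of t statistics, 2,000-mm³ minimum cluster volume) can be sketched in pure NumPy as follows. The voxel size (a hypothetical 2-mm isotropic grid) and the breadth-first connected-components implementation are our own assumptions for illustration:

```python
import numpy as np
from collections import deque

def threshold_top_percent(t_map, top_fraction=0.05,
                          min_cluster_mm3=2000, voxel_mm3=8.0):
    """Keep the top `top_fraction` of voxels by t statistic, then
    discard clusters smaller than `min_cluster_mm3`.

    `t_map` is a 3-D array; `voxel_mm3` assumes a hypothetical
    2 mm isotropic grid (8 mm^3 per voxel).
    """
    cutoff = np.quantile(t_map, 1.0 - top_fraction)
    mask = t_map > cutoff
    min_voxels = int(np.ceil(min_cluster_mm3 / voxel_mm3))

    # Connected components (6-connectivity) via breadth-first search
    out = np.zeros_like(mask)
    seen = np.zeros_like(mask)
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for start in zip(*np.nonzero(mask)):
        if seen[start]:
            continue
        seen[start] = True
        queue, cluster = deque([start]), [start]
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in offsets:
                nb = (x + dx, y + dy, z + dz)
                if (all(0 <= c < s for c, s in zip(nb, mask.shape))
                        and mask[nb] and not seen[nb]):
                    seen[nb] = True
                    queue.append(nb)
                    cluster.append(nb)
        if len(cluster) >= min_voxels:  # keep only large-enough clusters
            for voxel in cluster:
                out[voxel] = True
    return out
```

    A percentile threshold of this kind guarantees the same suprathreshold volume in every map, which keeps Dice comparisons across sessions from being driven by overall activation strength.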

    The mean Dice coefficient of similarity, quantifying test–retest reproducibility of the semantic paradigm across the two scanning sessions, was .66 ± .15 (range: .40–.82), which significantly exceeded the reliability of the narrative comprehension (Dice coefficient = .47) and picture naming (Dice coefficient = .43) paradigms. To the best of our knowledge, the highest Dice coefficient previously reported for a valid language mapping task was .61 (Fesl et al., 2010). We believe that the good test–retest reproducibility of the adaptive semantic matching paradigm is most likely a consequence of its adaptive design, which ensures that linguistic and other processing are highly constrained, such that similar cognitive states are induced each time a participant performs the paradigm. In contrast, other paradigms such as narrative comprehension are less constrained. For instance, a participant might be interested in the backward speech control condition on one occasion and so might attend to it but might then ignore it on another occasion.

    Validity was assessed primarily in the neurologically normal participants, in whom we have strong expectations about how language is likely to be organized, as described earlier (i.e., left lateralized, with left inferior frontal and left posterior temporal regions activated). The adaptive semantic matching paradigm robustly demonstrated these three known features of normal language organization in the control group. The narrative comprehension paradigm yielded less lateralized maps, and the picture naming paradigm produced essentially bilateral activations, as in our previous study (Wilson et al., 2017). Neither of these paradigms activated the left frontal region consistently, and only the narrative paradigm activated the left temporal region.

    In sum, this study showed the adaptive semantic matching paradigm is a feasible, reliable, and valid method for mapping language regions in people with aphasia. In particular, the core language regions in the left inferior frontal cortex and left posterior temporal cortex were consistently and reliably identified. However, these are not the only brain regions that are important for language.

    Identification of Phonological Encoding Regions

    In the third study, our goal was to design additional adaptive paradigms in order to identify another important component of the language network (Yen, DeMarco, & Wilson, 2019): the left-lateralized fronto-parietal regions that are important for phonological encoding in speech production. Phonological encoding involves processes such as selection and sequencing of phonemes, syllabification, and the application of (morpho)phonological rules (Levelt, 1989). The brain regions specifically implicated in phonological encoding include the supramarginal gyrus and the ventral precentral gyrus (McDermott, Petersen, Watson, & Ojemann, 2003; Pillay, Stengel, Humphries, Book, & Binder, 2014; Price, Moore, Humphreys, & Wise, 1997).

    We designed two tasks that would require participants to engage in phonological encoding. In the first task—rhyme judgment—participants see two pseudowords and are asked to press a button if they rhyme. This task requires phonological encoding because, in order to decide whether the pairs of pseudowords rhyme, the participant has to assemble speech sounds into novel sequences, syllabify them, and determine a stress pattern, applying phonological rules and being sensitive to phonotactic constraints. The stimuli were carefully constructed to ensure that the task could not be performed based on orthographic strings alone; for instance, pairs such as mulky–tulkie rhyme even though the second syllables are spelled differently. In the second task—syllable counting—participants again see two pseudowords, but now they have to press the button if the words have the same number of syllables. Again, this task requires phonological encoding, because syllabification is not inherent in an orthographic string, but must be computed based on the phonological rules of the language.

    The rhyme judgment and syllable counting tasks were both compared to the same perceptual control condition that was used in the semantic matching paradigm. Critically, both tasks were adaptive to participant performance. Task difficulty was controlled by simultaneously manipulating a number of factors including pseudoword length, orthographic transparency, stress patterns, and presentation rate.

    We assessed the reliability and validity of the two phonological paradigms in 16 neurologically normal individuals who also performed the semantic paradigm described above. Each participant completed the three paradigms in a single functional MRI session. Because there was only one session per participant, we could only calculate split-half reliability. This is not an optimal measure because it does not address between-session variability, but it still allows for objective and unbiased comparisons between paradigms.

    We found that the rhyme judgment paradigm was comparable in reliability to the semantic matching paradigm, whereas the syllable counting paradigm was somewhat less reliable. Both phonological paradigms activated the two key phonological encoding regions that have been motivated by the prior literature: the left supramarginal gyrus and the left ventral precentral gyrus, in contrast to the semantic paradigm (see Figure 5). Importantly, these regions were activated not only at the group level but also in the great majority of individual participants. These supramarginal and ventral precentral activations were more strongly left-lateralized for the rhyme judgment paradigm than the syllable counting paradigm. For that reason, along with its greater reliability, we concluded that the adaptive rhyme judgment paradigm would be preferable for future applications. We also suspect that the rhyme judgment task will be more feasible for individuals with aphasia. We had previously trained eight individuals with aphasia on an adaptive syllable counting paradigm and found that only four of the eight performed above chance (DeMarco, 2016). As described in the next section, our work in progress suggests that a greater proportion of patients can perform the rhyme judgment paradigm.


    Figure 5. Activation maps derived from group analyses of neurologically normal participants for (A) the rhyming judgment paradigm, (B) the syllable counting paradigm, and (C) the semantic matching paradigm. Note that the left supramarginal gyrus and the left ventral precentral gyrus were activated only by the phonological paradigms. Adapted from “Adaptive paradigms for mapping phonological regions in individual participants,” by M. Yen, A. T. DeMarco, & S. M. Wilson, 2019, NeuroImage, 189, p. 374, Copyright © 2019 Elsevier. Adapted with permission.

    By using the adaptive rhyme judgment and adaptive semantic paradigms in conjunction, it should be possible to construct differentiated maps of domain-specific language regions in individual participants. This will allow us to go beyond a simple concept of “language areas” so that we can investigate the potential reorganization of a network of regions with distinct functions.

    Current and Future Directions

    Returning to our example from the beginning of the article, we can now use the adaptive semantic matching and rhyme judgment paradigms to approach the question of how Mr. L made such an impressive recovery from aphasia over the past 8.5 years. Mr. L performed both tasks, and the brain regions that he recruited for each task are shown in Figure 6. Not surprisingly, the extensive left frontal activations that are normally characteristic of both paradigms are entirely lacking since Mr. L's left frontal language areas were destroyed in their entirety. In his surviving brain regions, several atypical activations are evident that might reflect functional reorganization: The semantic paradigm activated the right superior temporal sulcus, left angular gyrus, and left supramarginal gyrus, in addition to the expected activation in posterior temporal cortex, and the rhyme paradigm activated a number of right hemisphere regions, in addition to the expected activation in the left supramarginal gyrus. Which of these potentially compensatory activations are responsible for Mr. L's impressive long-term recovery? We cannot answer this question on the basis of a single patient, but we hope that, by studying many patients with this same approach, we will be able to identify relationships between patterns of functional reorganization and language outcomes.


    Figure 6. Functional magnetic resonance imaging of the brain of Mr. L. (A) Core language regions identified by the adaptive semantic matching paradigm. (B) Phonological encoding regions identified by the adaptive rhyming judgment paradigm. Activations were thresholded at t > 3.5, with an arbitrary minimum cluster size of 1,000 mm3.

    We are currently carrying out a longitudinal study of functional reorganization of language in the first year after stroke (R01 DC013270). We recruit patients with left-hemisphere stroke at the hospital bedside, typically 2–3 days after stroke, at which time we assess their speech and language function using our Quick Aphasia Battery (Wilson, Eriksson, et al., 2018). We also obtain their acute clinical MRI or computed tomography scans, so that each patient's lesion can be identified and traced. For all patients with aphasia, we then attempt to obtain follow-up data points at 1 month, 3 months, and 12 months, at which we repeat our speech/language evaluation and, whenever possible, also acquire structural and functional MRI. Over the past 2 years, we have attempted to scan 35 patients. Of those, 19 were scanned on two or more occasions, yielding a total of 61 attempted scans. This research was approved by the institutional review board at Vanderbilt University Medical Center, and all participants were compensated for their time and travel expenses.

    Of the 35 patients, 31 (89%) were able to learn and perform the adaptive semantic matching task, and these 31 scored comfortably above chance in the scanner (mean accuracy = 80.5%, minimum accuracy = 68.4%). Of the remaining four patients, two were severely impaired and had great difficulty learning the task. As it happened, neither of these two patients could be scanned anyway, because they turned out to be slightly too large to fit in the scanner; it is therefore possible that they would have performed the task above chance had they been scanned. The other two patients could not perform the task due to severe reading deficits, secondary to a medial occipital stroke in one case and a posterior parietal stroke in the other. The patient with the medial occipital stroke successfully performed an auditory version of the semantic matching task with a melody-matching control condition. The patient with the posterior parietal stroke also had impaired hearing, so no functional scan was feasible for him. In total, then, the adaptive semantic matching paradigm was successfully implemented in 58 of 61 attempted scans.

    We also administered the adaptive rhyme judgment paradigm whenever possible. Of the 35 patients, 27 (77%) were able to learn and perform the adaptive rhyme judgment task at one or more time points (mean accuracy = 77.7%, minimum accuracy = 64.4%). The rhyme judgment paradigm is more difficult than the semantic matching paradigm, and it is noteworthy that five of these 27 patients were unable to learn it at one or more earlier time points before succeeding at a later time point, presumably reflecting recovery from aphasia, since the paradigm was not trained in the interim. Eight patients were unable to perform the rhyme paradigm at any time point (including the four patients described above who did not successfully perform the semantic paradigm), although it is possible that some of them will be able to perform it at future time points.
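    As a sanity check, the feasibility percentages reported above follow directly from the counts in the text. The helper function below is ours, purely illustrative; only the counts (31 and 27 of 35 patients) come from the study.

    ```python
    def feasibility_pct(n_succeeded, n_attempted):
        """Percentage of patients able to learn and perform a paradigm,
        rounded to the nearest whole percent."""
        return round(100 * n_succeeded / n_attempted)

    # Counts reported in the text: 35 patients attempted overall.
    semantic_pct = feasibility_pct(31, 35)  # adaptive semantic matching -> 89
    rhyme_pct = feasibility_pct(27, 35)     # adaptive rhyme judgment -> 77
    ```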

    These experiences to date suggest that almost all individuals with aphasia can perform the adaptive semantic matching paradigm 1 month after stroke and that the majority of patients can also perform the rhyme judgment paradigm. This supports the feasibility of using these paradigms as a foundation for investigating the neural substrates of recovery from aphasia after stroke. Our adaptive language mapping paradigms are freely available. Whether researchers use our paradigms or other paradigms, we hope to have demonstrated the importance of investigating the feasibility, reliability, and validity of any potential approach for identifying language regions in individuals with aphasia.


    This article stems from the 2018 Research Symposium at the American Speech-Language-Hearing Association Convention, which was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under Award R13 DC003383. Research reported in this publication was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under Awards R01 DC013270 and R21 DC016080 to Stephen M. Wilson. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.


    • Binder, J. R., Frost, J. A., Hammeke, T. A., Cox, R. W., Rao, S. M., & Prieto, T. (1997). Human brain language areas identified by functional magnetic resonance imaging. Journal of Neuroscience, 17, 353–362.
    • Binder, J. R., Gross, W. L., Allendorfer, J. B., Bonilha, L., Chapin, J., Edwards, J. C., … Weaver, K. E. (2011). Mapping anterior temporal lobe language areas with fMRI: A multicenter normative study. NeuroImage, 54, 1465–1475.
    • Binder, J. R., Swanson, S. J., Hammeke, T. A., & Sabsevitz, D. S. (2008). A comparison of five fMRI protocols for mapping speech comprehension systems. Epilepsia, 49, 1980–1997.
    • Bradshaw, A. R., Thompson, P. A., Wilson, A. C., Bishop, D. V. M., & Woodhead, Z. V. J. (2017). Measuring language lateralisation with different language tasks: A systematic review. PeerJ, 5, e3929.
    • Brownsett, S. L. E., Warren, J. E., Geranmayeh, F., Woodhead, Z., Leech, R., & Wise, R. J. S. (2014). Cognitive control and its impact on recovery from aphasic stroke. Brain, 137, 242–254.
    • Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6, 284–290.
    • Crinion, J., & Price, C. J. (2005). Right anterior superior temporal activation predicts auditory sentence comprehension following aphasic stroke. Brain, 128, 2858–2871.
    • DeMarco, A. T. (2016). Neural substrates of phonological processing in chronic aphasia from stroke (Unpublished doctoral dissertation). University of Arizona, Tucson, AZ.
    • DeMarco, A. T., Wilson, S. M., Rising, K., Rapcsak, S. Z., & Beeson, P. M. (2018). The neural substrates of improved phonological processing following successful treatment in a case of phonological alexia and agraphia. NeuroCase, 24, 31–40.
    • Eaton, K. P., Szaflarski, J. P., Altaye, M., Ball, A. L., Kissela, B. M., Banks, C., & Holland, S. K. (2008). Reliability of fMRI for studies of language in post-stroke aphasia subjects. NeuroImage, 41, 311–322.
    • Fernández, G., Specht, K., Weis, S., Tendolkar, I., Reuber, M., Fell, J., … Elger, C. E. (2003). Intrasubject reproducibility of presurgical language lateralization and mapping using fMRI. Neurology, 60, 969–975.
    • Fesl, G., Bruhns, P., Rau, S., Wiesmann, M., Ilmberger, J., Kegel, G., & Brueckmann, H. (2010). Sensitivity and reliability of language laterality assessment with a free reversed association task—A fMRI study. European Radiology, 20, 683–695.
    • Fridriksson, J., Bonilha, L., Baker, J. M., Moser, D., & Rorden, C. (2010). Activity in preserved left hemisphere regions predicts anomia severity in aphasia. Cerebral Cortex, 20, 1013–1019.
    • Fridriksson, J., Richardson, J. D., Fillmore, P., & Cai, B. (2012). Left hemisphere plasticity and aphasia recovery. NeuroImage, 60, 854–863.
    • Garcia-Pérez, M. A. (1998). Forced-choice staircases with fixed step sizes: Asymptotic and small-sample properties. Vision Research, 38, 1861–1881.
    • Geranmayeh, F., Brownsett, S. L. E., & Wise, R. J. S. (2014). Task-induced brain activity in aphasic stroke patients: What is driving recovery? Brain, 137, 2632–2648.
    • Geranmayeh, F., Chau, T. W., Wise, R. J. S., Leech, R., & Hampshire, A. (2017). Domain-general subregions of the medial prefrontal cortex contribute to recovery of language after stroke. Brain, 140, 1947–1958.
    • Griffis, J. C., Nenert, R., Allendorfer, J. B., & Szaflarski, J. P. (2017). Linking left hemispheric tissue preservation to fMRI language task activation in chronic stroke patients. Cortex, 96, 1–18.
    • Hartwigsen, G., & Saur, D. (2019). Neuroimaging of stroke recovery from aphasia—Insights into plasticity of the human language network. NeuroImage, 190, 14–31.
    • Heiss, W.-D., Kessler, J., Thiel, A., Ghaemi, M., & Karbe, H. (1999). Differential capacity of left and right hemispheric areas for compensation of poststroke aphasia. Annals of Neurology, 45, 430–438.
    • Heiss, W.-D., & Thiel, A. (2006). A proposed regional hierarchy in recovery of post-stroke aphasia. Brain and Language, 98, 118–123.
    • Hillis, A. E., Wityk, R. J., Barker, P. B., Beauchamp, N. J., Gailloud, P., Murphy, K., … Metter, E. J. (2002). Subcortical aphasia and neglect in acute stroke: The role of cortical hypoperfusion. Brain, 125, 1094–1104.
    • Holland, A., Fromm, D., Forbes, M., & MacWhinney, B. (2017). Long-term recovery in stroke accompanied by aphasia: A reconsideration. Aphasiology, 31, 152–165.
    • Janecek, J. K., Swanson, S. J., Sabsevitz, D. S., Hammeke, T. A., Raghavan, M. E., Rozman, M., & Binder, J. R. (2013). Language lateralization by fMRI and Wada testing in 229 patients with epilepsy: Rates and predictors of discordance. Epilepsia, 54, 314–322.
    • Kertesz, A., Harlock, W., & Coates, R. (1979). Computer tomographic localization, lesion size, and prognosis in aphasia and nonverbal impairment. Brain and Language, 8, 34–50.
    • Kertesz, A., & McCabe, P. (1977). Recovery patterns and prognosis in aphasia. Brain, 100, 1–18.
    • Knecht, S., Jansen, A., Frank, A., van Randenborgh, J., Sommer, J., Kanowski, M., & Heinze, H. J. (2003). How atypical is atypical language dominance? NeuroImage, 18, 917–927.
    • Kurland, J., Naeser, M. A., Baker, E. H., Doron, K., Martin, P. I., Seekins, H. E., … Yurgelun-Todd, D. (2004). Test–retest reliability of fMRI during nonverbal semantic decisions in moderate-severe nonfluent aphasia patients. Behavioural Neurology, 15, 87–97.
    • Lenneberg, E. H. (1967). Biological foundations of language. New York, NY: Wiley.
    • Levelt, W. J. M. (1989). Speaking: From intention to articulation. Cambridge, MA: MIT Press.
    • McDermott, K. B., Petersen, S. E., Watson, J. M., & Ojemann, J. G. (2003). A procedure for identifying regions preferentially activated by attention to semantic and phonological relations using functional magnetic resonance imaging. Neuropsychologia, 41, 293–303.
    • Meltzer, J. A., Postman-Caucheteux, W. A., McArdle, J. J., & Braun, A. R. (2009). Strategies for longitudinal neuroimaging studies of overt language production. NeuroImage, 47, 745–755.
    • Newport, E. L., Landau, B., Seydell-Greenwald, A., Turkeltaub, P. E., Chambers, C. E., Dromerick, A. W., … Gaillard, W. D. (2017). Revisiting Lenneberg's hypotheses about early developmental plasticity: Language organization after left-hemisphere perinatal stroke. Biolinguistics, 11, 407–422.
    • Pillay, S. B., Stengel, B. C., Humphries, C., Book, D. S., & Binder, J. R. (2014). Cerebral localization of impaired phonological retrieval during rhyme judgment. Annals of Neurology, 76, 738–746.
    • Price, C. J., Crinion, J., & Friston, K. J. (2006). Design and analysis of fMRI studies with neurologically impaired patients. Journal of Magnetic Resonance Imaging, 23, 816–826.
    • Price, C. J., Moore, C. J., Humphreys, G. W., & Wise, R. J. S. (1997). Segregating semantic from phonological processes during reading. Journal of Cognitive Neuroscience, 9, 727–733.
    • Price, C. J., Seghier, M. L., & Leff, A. P. (2010). Predicting language outcome and recovery after stroke: The PLORAS system. Nature Reviews Neurology, 6, 202–210.
    • Robson, H., Zahn, R., Keidel, J. L., Binney, R. J., Sage, K., & Lambon Ralph, M. A. (2014). The anterior temporal lobes support residual comprehension in Wernicke's aphasia. Brain, 137, 931–943.
    • Rombouts, S. A., Barkhof, F., Hoogenraad, F. G., Sprenger, M., Valk, J., & Scheltens, P. (1997). Test–retest analysis with functional MR of the activated area in the human visual cortex. American Journal of Neuroradiology, 18, 1317–1322.
    • Saur, D., Lange, R., Baumgaertner, A., Schraknepper, V., Willmes, K., Rijntjes, M., & Weiller, C. (2006). Dynamics of language reorganization after stroke. Brain, 129, 1371–1384.
    • Seghier, M. L., Kherif, F., Josse, G., & Price, C. J. (2011). Regional and hemispheric determinants of language laterality: Implications for preoperative fMRI. Human Brain Mapping, 32, 1602–1614.
    • Sims, J. A., Kapse, K., Glynn, P., Sandberg, C., Tripodis, Y., & Kiran, S. (2016). The relationships between the amount of spared tissue, percent signal change, and accuracy in semantic processing in aphasia. Neuropsychologia, 84, 113–126.
    • Skipper-Kallal, L. M., Lacey, E. H., Xing, S., & Turkeltaub, P. E. (2017). Right hemisphere remapping of naming functions depends on lesion size and location in poststroke aphasia. Neural Plasticity, 2017, 8740353.
    • Springer, J. A., Binder, J. R., Hammeke, T. A., Swanson, S. J., Frost, J. A., Bellgowan, P. S., … Mueller, W. M. (1999). Language dominance in neurologically normal and epilepsy subjects: A functional MRI study. Brain, 122, 2033–2046.
    • Staudt, M., Lidzba, K., Grodd, W., Wildgruber, D., Erb, M., & Krägeloh-Mann, I. (2002). Right-hemispheric organization of language following early left-sided brain lesions: Functional MRI topography. NeuroImage, 16, 954–967.
    • Swinburn, K., Porter, G., & Howard, D. (2004). Comprehensive Aphasia Test (CAT). Hove, United Kingdom: Psychology Press.
    • Szaflarski, J. P., Allendorfer, J. B., Banks, C., Vannest, J., & Holland, S. K. (2013). Recovered vs. not-recovered from post-stroke aphasia: The contributions from the dominant and non-dominant hemispheres. Restorative Neurology and Neuroscience, 31, 347–360.
    • Szaflarski, J. P., Holland, S. K., Jacola, L. M., Lindsell, C., Privitera, M. D., & Szaflarski, M. (2008). Comprehensive presurgical functional MRI language evaluation in adult patients with epilepsy. Epilepsy and Behavior, 12, 74–83.
    • Turkeltaub, P. E., Messing, S., Norise, C., & Hamilton, R. H. (2011). Are networks for residual language function and recovery consistent across aphasic patients? Neurology, 76, 1726–1734.
    • Tyler, L. K., Wright, P., Randall, B., Marslen-Wilson, W. D., & Stamatakis, E. A. (2010). Reorganization of syntactic processing following left-hemisphere brain damage: Does right-hemisphere activity preserve function? Brain, 133, 3396–3408.
    • Tzourio-Mazoyer, N., Petit, L., Razafimandimby, A., Crivello, F., Zago, L., Jobard, G., … Mazoyer, B. (2010). Left hemisphere lateralization for language in right-handers is controlled in part by familial sinistrality, manual preference strength, and head size. Journal of Neuroscience, 30, 13314–13318.
    • Weiller, C., Isensee, C., Rijntjes, M., Huber, W., Müller, S., Bier, D., … Diener, H. C. (1995). Recovery from Wernicke's aphasia: A positron emission tomographic study. Annals of Neurology, 37, 723–732.
    • Wilson, S. M., Bautista, A., Yen, M., Lauderdale, S., & Eriksson, D. K. (2017). Validity and reliability of four language mapping paradigms. NeuroImage: Clinical, 16, 399–408.
    • Wilson, S. M., Eriksson, D. K., Brandt, T. H., Schneck, S. M., Lucanie, J. M., Burchfield, A. S., … Kidwell, C. S. (2019). Patterns of recovery from aphasia in the first two weeks after stroke. Journal of Speech, Language, and Hearing Research, 62, 723–732.
    • Wilson, S. M., Eriksson, D. K., Schneck, S. M., & Lucanie, J. M. (2018). A quick aphasia battery for efficient, reliable, and multidimensional assessment of language function. PLOS ONE, 13(2), e0192773.
    • Wilson, S. M., Yen, M., & Eriksson, D. K. (2018). An adaptive semantic matching paradigm for reliable and valid language mapping in individuals with aphasia. Human Brain Mapping, 39, 3285–3307.
    • Yen, M., DeMarco, A. T., & Wilson, S. M. (2019). Adaptive paradigms for mapping phonological regions in individual participants. NeuroImage, 189, 368–379.

    Author Notes

    Disclosure: The authors have declared that no competing interests existed at the time of publication.

    Correspondence to Stephen M. Wilson:

    Editor-in-Chief: Sean Redmond

    Editor: Swathi Kiran

    Publisher Note: This article is part of the Forum: Advances in Neuroplasticity Research on Language Recovery in Aphasia.
