Sound waves received by the ear are turned into neural activity by a complex mechanism involving the eardrum, the bones in the middle ear, and the hair cells within the cochlea. The auditory nerve carries the signal from the ears to the brainstem, from where it passes via the thalamus to the auditory areas of the cerebral cortex. In the cortex, speech sounds are extracted from the incoming signal. There are neural circuits in the auditory cortex that are specialized for speech and language as opposed to other types of sound.
Speech production and speech perception take place predominantly in one hemisphere of the brain, usually the left. Several areas within the left hemisphere are involved. Broca’s area, in the frontal lobe, seems to be crucial for syntactic operations in both the production and the perception of speech. Wernicke’s area, in the temporal lobe, seems to be crucial for accessing the concrete meanings of words. The evidence for this distinction in function between the anterior and posterior language areas comes from the study of aphasia, that is, difficulties with language resulting from brain injury.
Further evidence has come from brain scanning. Positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) have proved useful in testing whether the view derived from aphasia is correct; both are widely used in research to establish which parts of the brain are active during particular tasks.
One difficulty is that a standard linguistic task, such as understanding a sentence’s meaning, involves phonology, syntax and semantics all at once, making it hard to tease apart which of these subtasks happens in which area.
Many studies have looked at the pattern of activation produced in the brain by single words. The areas especially active are widespread and somewhat variable, but generally include the auditory cortex on both sides, other parts of the left temporal lobe, and Wernicke’s area. An early study by Karin Stromswold from the Massachusetts Institute of Technology aimed to identify the areas specialized for the processing of syntax (Stromswold et al., 1996). Her team set up two different conditions of sentence processing. In one condition the participants heard sentences like:
The child spilled the juice that stained the rug
In the other condition they heard sentences like:
The juice that the child spilled stained the rug
Both of these contain the same words. The first is syntactically quite simple because the order of the nouns in the sentence mirrors their logical relations (child spilled juice, juice stained rug). The second is more complex as the order of its elements does not reflect the logical relations. We can hypothesize, therefore, that the areas of the brain specialized for syntax should be more active in the second condition than the first.
The most important difference between the two conditions was indeed in Broca’s area (Figure 1), which was much more active in the second. This finding is consistent with several other studies.
Figure 1. The area of greatest additional brain activity in a series of syntactic as opposed to non-syntactic language tasks was found to be in Broca’s area.
Processing more than syntax
Thus the view from aphasia seems confirmed; the anterior language areas are specialized for syntax (and verbs and sentence construction), whereas the posterior and temporal ones are more specialized for individual word meanings (and nouns and concreteness). This is doubtless a simplification. There is evidence of substantial variability between individuals, and the distinction between areas and subtasks is not watertight. Moreover, we usually do all the subparts of linguistic processing interactively and almost simultaneously, so something that affects any one part will probably affect them all to a greater or lesser extent.
Indeed, recent research by Sahin et al. (2009) has shown that the role of Broca’s area is much more generalized than earlier models suggested: it is not solely dedicated to processing a single kind of linguistic information. Studying pre-surgical patients with epilepsy using intracranial electrophysiology (ICE), Sahin and colleagues demonstrated that the contribution of Broca’s area to language processing involves semantics, syntax and phonology. They established that Broca’s area is differentiated into distinct circuits that process lexical, grammatical and phonological information sequentially (i.e. in a step-wise fashion) over a period of about 450 milliseconds.
Research such as this is allowing us to understand the anatomy of the language faculty in greater and greater detail.
Sahin, N. T., Pinker, S., Cash, S. S., Schomer, D., & Halgren, E. (2009). ‘Sequential processing of lexical, grammatical, and phonological information within Broca’s area’. Science, 326(5951), 445–449.
Stromswold, K., Caplan, D., Alpert, N., & Rauch, S. (1996). ‘Localization of syntactic comprehension by positron emission tomography’. Brain and Language, 52, 452–473.
This article is adapted from ‘From sound to meaning: hearing, speech and language’. An OpenLearn (http://www.open.edu/openlearn/) chunk reworked by permission of The Open University copyright © 2016 – made available under the terms of the Creative Commons Licence v4.0 http://creativecommons.org/licenses/by-nc-sa/4.0/deed.en_GB. As such, it is also made available under the same licence agreement.