sensory integration

Many of our perceptual and cognitive experiences involve the combination and interaction of different sensations. Our ability to recognize and localize events can be improved in numerous ways if we sample information from different sensory modalities at the same time. One of the most compelling examples is the marked improvement in our ability to understand speech if we can watch the speaker's lips moving. On the other hand, if the cues from different senses are discordant, perception can be distorted. For example, if you listen to a recording of a particular speech sound (say ‘ba’) while watching a silent video recording of someone's lips uttering a slightly different sound (‘ga’), the sound that you actually perceive will shift to something closer to that being produced by the lips (the McGurk effect).
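As an aside, one simple way such interactions are often modelled (this is an illustrative sketch of a standard reliability-weighted cue-combination scheme, not a model given in this article) is to let each sense contribute an estimate of the same event, weighted by how reliable that sense is. A sharp visual cue then biases the combined percept, loosely analogous to lip movements shifting the heard speech sound.

```python
# Illustrative sketch (assumed model, not from this article): two senses
# each estimate the same event; the combined estimate weights each cue by
# its inverse variance, so the more reliable cue dominates the percept.

def combine_cues(estimate_a, var_a, estimate_b, var_b):
    """Inverse-variance weighted combination of two sensory estimates."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    combined = w_a * estimate_a + w_b * estimate_b
    combined_var = 1 / (1 / var_a + 1 / var_b)
    return combined, combined_var

# A precise visual cue (low variance) pulls the percept towards itself:
percept, var = combine_cues(estimate_a=0.0, var_a=1.0,   # auditory cue
                            estimate_b=4.0, var_b=0.25)  # visual cue
print(round(percept, 2), round(var, 2))  # → 3.2 0.2 (visual dominates)
```

Note that the combined variance is smaller than either single-cue variance, which captures why sampling two modalities at once improves recognition and localization.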

Many well-defined areas of the brain (such as the sensory areas of the cerebral cortex) are each devoted to the analysis of signals from a particular sensory system. But if information from different sensory systems is to be integrated and co-ordinated, these signals must be combined within the central nervous system. Indeed, nerve cells that receive converging sensory inputs are quite widespread in the brain. ‘Multisensory convergence’ is occasionally found right at the level of the peripheral receptors: for instance, the smallest, naked nerve endings in the mammalian skin respond to damaging stimuli that may be mechanical, thermal, or chemical in nature, and many of them also respond to certain categories of non-damaging stimuli, such as gentle touch. However, neurons that combine different sensory inputs become more prevalent at higher levels of the central nervous system.

A simple example of interaction between sensory modalities is the way in which the sensation of pain from a cut or a blow to the skin can be reduced by rubbing the affected or surrounding area. This is thought to be due to interactions in the spinal cord: the large sensory fibres that respond to the rubbing exert an inhibitory influence on neurons that receive input from the smaller fibres that detect the noxious stimulus.
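The gate-like effect described above can be caricatured in a few lines (a toy sketch under my own assumptions; the gain value and the linear form are illustrative, not the article's model): touch-driven activity in large fibres subtracts from the noxious drive that the spinal relay neuron passes on.

```python
# Toy illustration (assumed, not from this article) of the gate-control
# idea: large-fibre touch activity inhibits the relay of small-fibre
# noxious signals in the spinal cord, so rubbing reduces the pain signal.

def pain_relay(noxious_input, touch_input, inhibition_gain=0.8):
    """Output of a spinal relay neuron: noxious drive minus
    touch-driven inhibition, floored at zero."""
    return max(0.0, noxious_input - inhibition_gain * touch_input)

print(pain_relay(5.0, touch_input=0.0))  # no rubbing: full signal, 5.0
print(pain_relay(5.0, touch_input=4.0))  # rubbing suppresses the signal
```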

Nerve cells that have input from more than one sensory system are particularly abundant in specialized ‘multisensory’ regions or ‘association fields’ in the temporal, parietal, and frontal lobes of the mammalian cerebral cortex. But even areas of the cortex that appear to be devoted to a single sense may be influenced by other sensory modalities. Thus, recent studies involving the imaging of activity in the human brain have found that crossmodal influences on perception and attention can be associated with increased activity in the visual and auditory cortices.

Sensory integration plays an important role in a variety of neural functions. Neurons in the so-called ‘reticular activating system’, which extends through much of the brain stem, receive signals from more than one sense organ. They in turn modulate the activity of neurons in the midbrain, thalamus, and cortex, depending on the overall levels of incoming sensory stimulation, hence influencing the level of alertness and arousal.

The integration of gustatory, olfactory, and other sensory stimuli in areas of the limbic system plays an important role in emotion. Even neurons in those parts of the brain that are involved in the control of bodily posture can be modulated by different sensory cues. This allows motor commands to be modified to take account of the current position or motion of the eyes, head, or limbs.

Detecting the positions of objects in the environment (for instance, predators or prey) is of crucial importance to animals, and several of the sensory systems, especially vision and hearing, are particularly specialized for this task. It is obvious that the various messages about the positions of stimuli must be co-ordinated for effective and accurate control of behaviour. One region of the brain that seems to be particularly important in this respect is the superior colliculus (colliculus means little hill) in the roof of the midbrain, which is concerned with the control of ‘orienting’ movements of the eyes, the head, and the rest of the body towards objects of interest. In mammals, this nucleus receives nerve fibres conveying signals from the eyes, the ears, and the body. Each sensory input is distributed across the superior colliculus to form a neural ‘map’ in a particular layer of the nucleus. The maps of the visual and auditory worlds and of the body surface are superimposed in such a way that, say, a visual stimulus and a sound at a particular point in space will excite cells at the same position in the superior colliculus. This registration of different sensory maps provides an efficient arrangement by which any stimulus, irrespective of its modality, can activate the pathways that control appropriate movements of the eyes or head towards the position of the stimulus in space. The specialization of this part of the brain for combining sensory information about spatial location is very widespread among vertebrates, even those with quite different sense organs from our own. For example, rattlesnakes have heat-detecting organs in their cheeks, which form an image of the infra-red radiation from warm objects (including potential warm-blooded prey). These infra-red ‘eyes’, like the real eyes, send signals to the superior colliculus and the two maps of space are superimposed.
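The essence of this registered-map arrangement can be sketched as follows (a deliberately simplified toy, with the grid size, bin width, and winner-take-all readout all my own illustrative assumptions): each modality maps stimulus direction onto the same grid of ‘collicular’ positions, so co-located stimuli from different senses converge on the same cell and drive the same orienting movement.

```python
# Toy sketch (assumed, not from this article) of registered sensory maps
# in the superior colliculus: every modality projects stimulus azimuth
# onto one shared grid, and the most active grid cell sets the direction
# of the orienting movement.

N_POSITIONS = 18  # bins covering -90..+90 degrees of azimuth, 10 deg each

def map_index(azimuth_deg):
    """Shared spatial map: azimuth -> position in the collicular grid."""
    azimuth_deg = max(-90, min(90, azimuth_deg))
    return min(N_POSITIONS - 1, int((azimuth_deg + 90) // 10))

def orienting_command(stimuli):
    """stimuli: list of (modality, azimuth_deg). Because the visual,
    auditory, and somatosensory maps are in register, co-located stimuli
    from different modalities excite the same grid cell."""
    activity = [0] * N_POSITIONS
    for modality, azimuth in stimuli:
        activity[map_index(azimuth)] += 1
    peak = max(range(N_POSITIONS), key=lambda i: activity[i])
    return peak * 10 - 90 + 5  # centre of the winning bin, in degrees

# A light and a sound at roughly the same place reinforce one another
# and yield a single orienting direction:
print(orienting_command([("visual", 22), ("auditory", 25)]))  # → 25
```

The point of the sketch is only that, once the maps are superimposed, the readout need not care which modality supplied the signal, which is exactly the efficiency the registered maps provide.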

Multisensory integration also provides a way in which one sensory system, most often visual, can ‘calibrate’ the neural representations of other modalities during development. There is evidence, for instance, that the projection from the eyes to form a visual map in the superior colliculus, which is itself probably largely genetically programmed, can, after birth, ‘teach’ the synapses formed by incoming fibres from the auditory pathway, enabling them to form a map of auditory space that is matched to the visual map.
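This ‘teaching’ role can be caricatured with a simple error-correcting update (an illustrative assumption on my part; the article does not specify a learning rule, and the rate and step count here are arbitrary): the visual map acts as a reference that gradually pulls each auditory map position into register.

```python
# Hedged sketch (assumed, not from this article): the genetically
# specified visual map serves as a "teacher" signal that incrementally
# shifts auditory spatial estimates until the two maps are aligned.

def calibrate(auditory_map, visual_map, rate=0.2, steps=50):
    """Nudge each auditory position towards the visual reference."""
    a = list(auditory_map)
    for _ in range(steps):
        a = [ai + rate * (vi - ai) for ai, vi in zip(a, visual_map)]
    return a

misaligned = [10.0, 25.0, 40.0]   # auditory estimates of three positions
teacher    = [0.0, 20.0, 40.0]    # visually derived positions
aligned = calibrate(misaligned, teacher)
print([round(x, 2) for x in aligned])  # converges onto the visual map
```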

Vision generally dominates human perception. When conflicts between the senses occur, vision tends to bias both auditory and tactile perception. For example, in ventriloquism, the visual cues (the movement of the puppet's mouth) can readily ‘capture’ the sound of the ventriloquist's voice. On the other hand, early blindness leads to substantial reorganization in the remaining senses. These changes include heightened auditory and tactile sensitivity and improved auditory localization in blind people. Recent studies of activity in the living human brain, using imaging techniques, have shown that the regions of the cortex responding to auditory and especially tactile stimulation can expand enormously in people who have been blind from birth or childhood, even involving areas that would normally respond only to visual stimulation. This increase in the area of cortex occupied by the remaining senses may play a part in the improvement of sensitivity and discrimination.

Andrew J. King

Bibliography

King, A. J. and Hartline, P. H. (1999). Multisensory convergence. In Encyclopedia of neuroscience (ed. G. Adelman and B. H. Smith), 2nd edn, pp. 1236–40. Elsevier. Also available on CD-ROM.
Stein, B. E. and Meredith, M. A. (1993). The merging of the senses. MIT Press, Cambridge, MA.


See also hearing; sense organs; somatic sensation; synaesthesia; vision.