Sensory systems are the peripheral parts of the nervous system responsible for the transformation of physical stimuli into a neural code. The receptors of each sensory system are sensitive to a distinct kind of energy: the hair cells of the inner ear to sound energy, the mechanoreceptors of the tactile system to mechanical energy, and the visual receptors to electromagnetic energy. Each system encodes four essential features of a stimulus: besides stimulus modality, these are intensity, duration, and spatial location. Vision, hearing, kinesthesia, and touch serve the initial function of generating information within the central nervous system (CNS), which in turn contributes sensations and bottom-up input to perception and, further up, to higher cognition. Vision is discussed in more depth in a separate entry, whereas the roles of hearing, kinesthesia, and touch are discussed more fully here.
Nearly all kinds of goal-directed actions in sports rely on continuously updated information about the environment as well as about one's own posture and movement. Though vision is often understood as the most important sensory modality for the regulation of actions in sports, it is crucial to realize that vision can only work in relation to the perceived self. Information about the physical self from different modalities (tactile, proprioceptive, and vestibular) is integrated into a coherent, supramodal multisensory representation, the body scheme. This body representation is broadly involved in the spatial sensorimotor processing of skilled action and is continuously updated with ongoing movements. Patrick Haggard and Daniel Wolpert distinguished the body scheme from the body image, which is much more dependent on conscious visual representations.
The organization of the somatosensory representation, a main element of the body scheme, is related to the localization of distinct areas on the body surface. The region of skin within which a tactile stimulus alters the firing rate of a receptor neuron is called its receptive field. Receptive fields have been found for neurons of the somatosensory, visual, and auditory systems. Neighboring cutaneous receptive fields on the human skin are also represented in adjacent cortical fields, leading to a somatotopic organization of the somatosensory cortex in the form of a homunculus. A topographic organization of the cortical projection pattern can also be found in the primary visual cortex, as well as a tonotopic pattern in the primary auditory cortex; such patterns are collectively known as sensory maps. Visual input is integrated into the body scheme in a mostly unconscious manner, as shown by Michael Graziano and coauthors at the end of the 20th and beginning of the 21st century, and is further modulated by auditory input. Multisensory integration is particularly crucial for the emergence of a coherent body scheme, and several empirical findings illustrate the continuous impact of intersensory adjustments. Examples include the rubber hand illusion, reported by Matthew Botvinick and Jonathan Cohen, and the alteration of the perceived size of a single body segment evoked by vibration of the related tendons, reported by Jim Lackner.
Sensation Versus Perception
In contrast to perception, which explicitly includes the level of awareness, sensations do not usually reach conscious awareness. Sensation is the detection of basic features of a stimulus. It is understood as the first stage of the activation of a sensory system, a peripheral internal representation of distal stimuli, clearly dominated by bottom-up processes. Nevertheless, sensory information can directly influence motor behavior, and ascending sensory pathways have several linkages to descending motor functions. Besides such physiological indications of sensorimotor circuits, there is supplementary evidence from the field of applied kinesiology. Here, special training procedures as well as scientific approaches in fields such as sensorimotor control, sensorimotor training, or sensorimotor adaptation can be found; they involve subdomains of motor behavior with a particularly low impact of conscious attention, such as the regulation of balance or posture, as well as speech articulation. Perception, on the other hand, is based on higher brain functions, and the bottom-up contributions are supplemented much more by top-down contributions, such as experiences and expectations, when interpreting objects and events in the world. However, as the title of James Jerome Gibson's book, The Senses Considered as Perceptual Systems, suggests, it is ultimately impossible to separate the levels and functions of the sensory system definitively from those of the perceptual system.
Proprioception: Kinesthesia, Touch, and the Vestibular System
Proprioception can be subdivided into two or three subsystems, depending on whether the tactile system is described as part of proprioception.
- Kinesthesia (literally, sense of movement) generates information about the positions and movements of the limbs and other body parts, including the sensation of force. Based on kinesthesia, humans possess information about the rate and direction of limb movements. This information can be self-generated or externally imposed and is encoded by different kinds of mechanoreceptors in joints and capsules (slowly adapting Ruffini endings), muscles (muscle spindle receptors), tendons (Golgi tendon organs), and ligaments (slowly adapting Golgi receptors). Muscle spindles signal limb position and movement by encoding muscle length; Golgi tendon organs encode muscle tension. The subtlety of kinesthesia can be determined by measuring the just noticeable difference or detection threshold for a joint, that is, the smallest angular displacement or rotation of the joint, in degrees, that can be detected. Reported values vary between joints and depend on the initial joint angle, muscle loading, active versus passive movement, and the age of the participants.
- The tactile system, as well as parts of kinesthesia, is based on mechanoreceptors in the skin. Besides detecting and specifying mechanical deformation of the skin (force, pressure), about 12 different kinds of cutaneous mechanoreceptors play a significant role in kinesthesia. Cutaneous mechanoreceptors can be subdivided into rapidly adapting (A) versus slowly adapting (B) receptors.
- A. Receptors of the first subcategory are well suited to the specification of rapid movements and fine motor skills. The loss of skin sensation in the hand, as an example of a body part with a high degree of tactile resolution and a large representation in the cerebral cortex, causes severe deficits in fine motor hand skills.
- B. Slowly adapting receptors signal static position information as well as information about slow displacements. Several functions of the tactile system rely on such slowly adapting receptors, as shown by Type I and Type II receptors, which are associated with Merkel cell complexes and Ruffini endings, respectively, and continue to discharge under maintained deformation of the skin.
Kinesthetic and tactile information are essential parts of the somatosensory system and are closely entangled, as demonstrated by the bending of any joint, which stretches a related region of skin around the joint and proportionately relaxes another area.
- The vestibular system is located in the inner ear as three converging semicircular canals (the superior, the posterior, and the horizontal canal), which indicate rotational accelerations; they are supplemented by the otolith organs in the saccule and utricle, which indicate linear accelerations. Besides balance control, the vestibular system contributes to spatial orientation and, via mediating structures, to posture regulation as well as the control of eye movements. Balance regulation, a well-known example of sensorimotor regulation, is not governed solely by the vestibular system; kinesthetic and visual information are seamlessly integrated as well.
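The seamless integration of vestibular, kinesthetic, and visual cues in balance regulation can be illustrated by a reliability-weighted (minimum-variance) fusion model. This is a minimal sketch of that general idea, not a model from this entry; all numerical values are hypothetical assumptions.

```python
# Illustrative sketch: balance-relevant cues from vision, kinesthesia,
# and the vestibular system modeled as noisy estimates of body sway,
# fused by inverse-variance (reliability) weighting. All numbers are
# hypothetical.

def fuse_cues(estimates, variances):
    """Fuse sway estimates (degrees) weighted by inverse variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    weights = [w / total for w in weights]          # normalize to 1
    fused = sum(w * e for w, e in zip(weights, estimates))
    return fused, weights

# Hypothetical sway estimates from vision, kinesthesia, vestibular input:
estimates = [2.0, 1.6, 2.4]      # degrees of forward sway
variances = [0.25, 0.5, 1.0]     # less reliable cue -> larger variance
fused, weights = fuse_cues(estimates, variances)
print(round(fused, 2))           # reliability-weighted sway estimate
```

The more reliable a cue, the more it dominates the fused estimate, which is one common way of describing how no single modality governs balance on its own.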
Hearing and the Emergence of Acoustic Events
Although it is obvious that vision is the dominant sense in many sports as well as in motor learning, which is frequently based on visual models, it might be surprising that hearing is the one sense that is exclusively sensitive to motion. Sound is an acoustic consequence of a kinetic event; the existence of a kinetic event is essential for generating a sound event. If there is no movement in the surroundings, there is nothing to hear. The air, or other media such as water and ground, must be set into vibration by motion to generate and transmit sounds within the hearing range of the human ear. Only sound events within the frequency range of about 20 hertz (Hz) up to 20,000 Hz can be transduced by the hair cells located in the cochlea. Additionally, a minimal sound pressure, itself dependent on frequency, is necessary to elicit an auditory sensation. As noted by Bertram Scharf and Søren Buus, a sound of 2 kilohertz (kHz) is audible at a sound pressure level (SPL) of 0 decibels (dB), and a sound of 3.8 kHz at about –5 dB, as measured in young adults; both values indicate the extreme sensitivity of the human ear. Compared with the optic nerve, which consists of about 1,000,000 nerve fibers, the auditory nerve comprises only about 30,000 fibers; though the numbers and their interpretation are arguable, this may indicate that the ear is designed much more to analyze temporal features, whereas vision is more sophisticated at analyzing spatial features.
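The two constraints just described, the 20 Hz to 20,000 Hz frequency range and the frequency-dependent pressure threshold, can be sketched as a simple audibility check. Only the range limits and the two threshold values cited above come from the text; the nearest-neighbor lookup is a crude simplifying assumption, not a full equal-loudness model.

```python
# Minimal audibility check for a pure tone, using the hearing range and
# the two example thresholds cited in the text (Scharf & Buus values for
# young adults). The tiny threshold table and nearest-neighbor lookup
# are simplifying assumptions.

HEARING_RANGE_HZ = (20.0, 20_000.0)

# (frequency in Hz, approximate threshold in dB SPL)
THRESHOLDS = [(2000.0, 0.0), (3800.0, -5.0)]

def audible(freq_hz, level_db_spl):
    lo, hi = HEARING_RANGE_HZ
    if not (lo <= freq_hz <= hi):
        return False                 # outside the human hearing range
    # nearest tabulated threshold (crude stand-in for the curve)
    _, threshold = min(THRESHOLDS, key=lambda t: abs(t[0] - freq_hz))
    return level_db_spl >= threshold

print(audible(2000.0, 0.0))   # True: exactly at threshold
print(audible(16.0, 60.0))    # False: below 20 Hz, however loud
```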
When setting the frequency range of human movements in relation to the hearing range, it is evident that humans are unable to hear their own movements directly. The highest movement frequencies have been observed not in athletes but in world-class pianists, who perform trills at about 16 Hz, still below the 20 Hz lower limit of the hearing range. Only the impact of human movements on ambient surfaces, such as hitting a tennis ball with the racket or stemming the skis into icy snow, induces audible sounds. But once a kinetic event generates an audible sound, much kinetic information is coded acoustically. On a wooden surface, a bouncing Ping-Pong ball sounds different from a tennis ball. Kinetic features as well as the properties of the involved materials (balls, wood) specify the sound parameters systematically; these can be subdivided into two essential categories: material-induced versus kinetically induced sound features.
The material category specifies sound features such as spectral composition and the sound envelope, which includes the components attack, sustain, and decay. These parameters are determined by the physical properties of the related materials and the transmitting medium, the air. When hitting a tennis ball, the density of the racket as well as the tension of its strings specify the sound in interaction with the particular features of the ball (material, pressure). Other features of the sound, such as its amplitude and duration, belong to the kinetic category and are specified by kinematic and dynamic parameters. The kinetic energy of the approaching tennis ball as well as its direction and spin specify loudness, hardness, timbre, and sound duration; the selected technique also influences these characteristics: a slice has a longer sound duration and less hardness, so it sounds smoother than a flat backhand. Much information about the kinetics is thus encoded in sound, and motor control and motor learning are indeed supported by auditory information. Only a few experts are aware of this auditory impact on motor behavior, because auditory information processing takes place largely in the background of conscious awareness.
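The division into material-induced and kinetically induced sound features can be made concrete with a toy impact-sound model: a decaying sinusoid in which the material sets the spectral content and decay, while the kinetics set amplitude and effective duration. All parameter values below are illustrative assumptions, not measured data.

```python
# Hedged sketch: a minimal impact-sound model separating material-induced
# features (frequency, decay rate) from kinetically induced features
# (peak amplitude, effective duration). Parameter values are illustrative.
import math

def impact_sound(freq_hz, decay_rate, impact_energy, sample_rate=8000):
    """Exponentially decaying sinusoid: material sets freq_hz and
    decay_rate; impact_energy (kinetics) sets peak amplitude, and the
    sample list simply ends once the envelope becomes negligible."""
    amplitude = min(1.0, 0.2 * impact_energy)   # harder hit -> louder
    samples = []
    t = 0.0
    while amplitude * math.exp(-decay_rate * t) > 0.001:
        envelope = amplitude * math.exp(-decay_rate * t)
        samples.append(envelope * math.sin(2 * math.pi * freq_hz * t))
        t += 1.0 / sample_rate
    return samples

# Hypothetical "Ping-Pong ball on wood" vs. "tennis ball on wood":
ping = impact_sound(freq_hz=1200.0, decay_rate=40.0, impact_energy=1.0)
tennis = impact_sound(freq_hz=300.0, decay_rate=15.0, impact_energy=3.0)
print(len(ping) < len(tennis))   # True: the tennis impact rings longer
```

Even this toy model reproduces the qualitative point: the same surface (wood) yields systematically different sounds for different balls and impact energies.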
Sensory Auditory Information in Sports
The impact of music and sound on motor behavior is multifaceted. Football players adjust, together with their team members, to the fans' acclamation; they respond to the acoustic atmosphere within the arena, and they react to the rhythm and accents of the referee's whistle. Gymnasts and dancers synchronize their movements to accompanying music or arrange certain elements in relation to the melody, individually, with a partner, or within an ensemble. Runners synchronize their step cadence to perceived music and even pace themselves over a race by arranging a special sequence of music pieces with varying tempi to tune their running speed over the course. Music can be used to modulate heart rate (HR), perceived exertion, and general arousal, as well as the basic activation of distinct brain regions. An overview of the effects of music in sports has been published by Costas Karageorghis and David-Lee Priest.
In most of these interactions between music and motor behavior, perception and higher cognitive functions are broadly involved: Music is audible only after auditory stimuli have been processed by a widespread cortical network. However, a more direct, sensory-based impact of sound on motor behavior occurs when acoustic information from earlier stages of auditory processing is integrated into motor control. Natural movement-attendant sounds are used for controlling and optimizing motor behavior: The performance of a tennis player decreases if auditory perception is occluded, and similar findings have been reported for table tennis. Synchronization in team rowing is partially achieved by analyzing the sounds of the boat and of the swivels. In a diversity of sports, the all-around sense of hearing directs spatially restricted visual attention. For instance, in team sports the orientation of gaze is partially affected by surrounding sounds induced by the steps of teammates or opponents. Auditory input affects gaze behavior as well as orientation behavior directly at the sensory level, where it is integrated with visual and proprioceptive input in the superior colliculi. These structures, located in the brain stem, in turn emit motor efferences that are directly specified by the particular configuration of the current multisensory input. The multisensory motor characteristics of the superior colliculi have been studied intensively by Barry E. Stein and M. Alex Meredith. This region can be understood as an outstanding example of a direct yet flexibly responding sensorimotor interface. It indicates how seamlessly multisensory information is merged at the subcortical level, even down to the level of a single neuron: A multisensory convergence neuron integrates auditory, visual, and proprioceptive input, and the same neuron responds with descending motor output.
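A hallmark of such collicular convergence neurons, described by Stein and Meredith, is multisensory enhancement that is proportionally strongest for weak stimuli ("inverse effectiveness"). The following is an illustrative toy model of that principle using a saturating response function; the equations and numbers are assumptions for demonstration, not Stein and Meredith's actual measurements.

```python
# Toy model of a multisensory convergence neuron: a saturating
# stimulus-response function makes the combined (auditory + visual)
# response enhancement proportionally largest for weak inputs,
# illustrating "inverse effectiveness". All values are hypothetical.

def neuron_response(drive, r_max=100.0, half_sat=0.5):
    """Saturating firing rate (spikes/s) for a given input drive."""
    return r_max * drive / (drive + half_sat)

def enhancement_percent(aud, vis):
    """Percent gain of the bimodal response over the best unimodal one."""
    multi = neuron_response(aud + vis)          # combined drive
    best_uni = max(neuron_response(aud), neuron_response(vis))
    return 100.0 * (multi - best_uni) / best_uni

weak = enhancement_percent(0.05, 0.05)    # faint sound + dim flash
strong = enhancement_percent(2.0, 2.0)    # intense sound + bright flash
print(weak > strong)   # True: weak stimuli profit most from combination
```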
Further Perspectives for Sports and Exercise
There is no doubt that skilled action in sports relies on supramodal representations. It has become evident that early multisensory integration of information from different sensory modalities is crucial for the emergence of the body scheme as well as for spatial representations, in which visual, auditory, and body-segment-related sensory maps are continuously integrated into supramodal representations. These representations act as a reference framework for the regulation of skilled action. For developing new methods of training and exercise, the mechanisms of multisensory integration can be targeted. Information from different modalities is integrated according to certain rules (spatial convergence, temporal coherence). If one of the engaged sensory streams can be shaped in a certain form, or an additional perceptual stream can be created in a way that is integrated into the supramodal representation, a diversity of applications can be addressed in the future, including the enhancement of mental training, the recalibration of the body scheme, or the directed adaptation of sensations of a distinct modality. The approach of real-time movement sonification, developed by Alfred Effenberg and supported neurophysiologically by Gerd Schmitz and colleagues, could prospectively be used in many different applications in sports and motor rehabilitation.
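The core idea of movement sonification, turning kinematic parameters into sound parameters in real time, can be sketched as a simple mapping. The mapping rules below (angular velocity to pitch and loudness) and all names and numbers are illustrative assumptions, not Effenberg's actual implementation.

```python
# Sketch of a movement-sonification mapping: a stream of joint angles is
# converted into (pitch, loudness) control events. The specific mapping
# and parameter values are hypothetical illustrations.

def sonify(angle_samples, sample_rate_hz=100.0,
           base_pitch_hz=220.0, pitch_gain=2.0):
    """Map joint angles (degrees) to (pitch_hz, loudness) pairs."""
    dt = 1.0 / sample_rate_hz
    events = []
    for prev, curr in zip(angle_samples, angle_samples[1:]):
        velocity = abs(curr - prev) / dt           # deg/s
        pitch = base_pitch_hz + pitch_gain * velocity
        loudness = min(1.0, velocity / 500.0)      # normalized 0..1
        events.append((pitch, loudness))
    return events

# Hypothetical elbow-angle trace of an accelerating movement:
trace = [0.0, 1.0, 3.0, 6.0, 10.0]
events = sonify(trace)
print(events[0][0] < events[-1][0])   # True: faster movement, higher pitch
```

Mappings of this kind make kinematic information audible, which is the prerequisite for feeding it back into the multisensory integration processes described above.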
- Boff, K. R., Kaufman, L., & Thomas, J. P. (Eds.). (1986). Handbook of perception and human performance. Vol. I. Sensory processes and perception. New York: Wiley.
- Botvinick, M., & Cohen, J. (1998). Rubber hands “feel” touch that eyes see. Nature, 391(6669), 756.
- Bregman, A. S. (1990). Auditory scene analysis. Cambridge: MIT Press.
- Calvert, G. A., Spence, C., & Stein, B. E. (Eds.). (2004). The handbook of multisensory processes. Cambridge: MIT Press.
- Effenberg, A. O. (2005). Movement sonification: Effects on perception and action. IEEE Multimedia, 12(2), 53–59.
- Gibson, J. J. (1966). The senses considered as perceptual systems. Boston: Houghton Mifflin.
- Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
- Goldstein, B. E. (2010). Sensation and perception (8th ed.). Belmont, CA: Wadsworth.
- Haggard, P., & Wolpert, D. M. (2005). Disorders of body schema. In H.-J. Freund, M. Jeannerod, M. Hallett, & R. Leiguarda (Eds.), Higher-order motor disorders (pp. 261–272). Oxford, UK: Oxford University Press.
- Karageorghis, C. I., & Priest, D.-L. (2012). Music in the exercise domain: A review and synthesis (Part I & Part II). International Review of Sport and Exercise Psychology, 5(1), 44–84.
- Scharf, B., & Buus, S. (1986). Audition I—Stimulus, physiology, thresholds. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human performance. Vol. I. Sensory processes and perception (pp. 14-1–14-71). New York: Wiley.
- Schmitz, G., Mohammadi, B., Hammer, A., Heldmann, M., Samii, A., Münte, T. F., et al. (2013). Observation of sonified movements engages a basal ganglia frontocortical network. BMC Neuroscience, 14(32), 1–11. doi: 10.1186/1471-2202-14-32
- Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. Cambridge: MIT Press.
- Stoffregen, T. A., & Bardy, B. G. (2001). On specification and the senses. Behavioral and Brain Sciences, 24(2), 195–261.
- Thompson, R. F. (1993). The brain: A neuroscience primer (2nd ed.). New York: W. H. Freeman & Co.