From active touch to tactile communication: What’s tactile cognition got to do with it?
Jude Nicholas
Although visual and auditory cognition are well researched and relatively well understood, little is known about tactile cognition in general. It is perhaps unsurprising, then, that the majority of theoretical insights concerning the mechanisms and principles governing cognition have been developed on the basis of research on the visual and auditory systems. One should expect a number of similarities, as well as important differences, between tactile cognition and cognition based on sight or hearing. Tactile cognition refers to the higher-order processing and integration of tactile information through active touch. Until recently, few studies had attempted to investigate tactile cognition. However, recent developments in cognitive neuroscience (neuroimaging and neuropsychology) mean that we now know far more about the mechanisms underlying tactile cognition than ever before.
Making sense of our touch
Touch provides a rich variety of information about the world around us. The sense of touch is the first sense to develop, and it continues to function even after seeing and hearing begin to fade. Just before the eighth week of gestation, an embryo may already develop sensitivity to tactile stimulation. Touch is not a more “primitive” sensory modality than vision or audition. The human sense of touch is an active, informative, and useful perceptual system (Klatzky & Lederman, 2002). Touch is our most social sense, and it provides us with our most fundamental means of contact with the external world. Interpersonal touch plays an important role in governing our emotional wellbeing, and touching typically implies an interaction with another person. The sense of touch thus provides us with an often-overlooked channel of communication. The notion “to touch with fingertips” is closely related to communication, as portrayed by Michelangelo Buonarroti on the Sistine Chapel ceiling and by today’s “touch generation” of software, games, iPods and mobile phones that let people connect with each other through interactive experiences.
Active touch, also described as haptics, occurs when an individual deliberately chooses his or her actions in the exploration and manipulation of an object. Active touch plays a regular and frequent role in everyday life. Whenever we retrieve keys or lipstick from the bottom of a pocket or purse, or wake at night to switch on a lamp or answer a phone, we must identify the desired objects by active touch and distinguish them from the other objects on which our hands might alight. It is only the sense of touch that enables us to modify and manipulate the world around us (McLaughlin, Hespanha, & Sukhatme, 2002).

Understanding the tactile brain: cognitive mechanisms and brain representations
It is through the sense of touch that we process tactile information from our environment. Touch messages are the first link in the “chain” of information required for the processing of tactile information. The tactile processing system reflects a continuum spanning tactile sensation, tactile perception and tactile cognition. The foundational assumption of this approach is to view the human brain as an information processor that registers, encodes, stores and manipulates various types of symbolic representations received through the tactile modality. The attributes of tactile representations in the human information processing system consist of the following: (1) low-level tactile sensory processing, which includes sensations on the body’s surface, proprioceptive sensations, and kinesthetic senses of bodily movement and balance, as well as the detection of vibration and spatial position; (2) tactile motor functioning, which includes manual exploration and manipulatory skills; (3) tactile perceptual processing, which includes the discrimination of the tactile features of objects (texture, substance, size, or shape), tactile-spatial perception, tactile part-whole relationships and tactile figure-ground perception; and (4) high-level tactile cognitive processing, which involves tactile attention, tactile short-term memory, tactile working memory, tactile learning, tactile memory and tactile language.
Research on tactile processing has found evidence for a bidirectional exchange of information between tactile sensation, perception and cognition; that is, streams of processing run in both directions (Spence & Gallace, 2007). These are referred to as bottom-up processing (from peripheral tactile sensation to central tactile cognition) and top-down processing (from central tactile cognition to peripheral tactile sensation). Top-down processing occurs any time a higher-level concept influences our interpretation of lower-level sensory data. Such top-down processing capacity permits our brains to analyze complex tactile information in one-tenth of a second, allowing us to experience the richness of the world.
Top-down and bottom-up processing are two distinct yet highly interactive modes of neuronal activity underlying normal and abnormal human cognition. Pathological changes in either top-down or bottom-up processing may cause different clinical disorders, such as tactile agnosia (a neurological disorder, caused by brain lesions, that affects the ability to recognise an object by touch alone despite relatively preserved primary and discriminative somaesthetic perception), finger agnosia (a neurological disorder affecting the ability to identify which finger has been touched) or tactile defensiveness (hypersensitivity to touch due to a distortion in the central nervous system’s ability to process tactile sensory input).

The tactile processing system involves the peripheral mechanisms of the somatosensory pathways and is divided into different central regions and distinct streams of information processing. Tactile sensations reach the brain through a complex interplay between the somatosensory pathways (receptors and nerve endings in the skin of the hand, foot and body) and the somatosensory cortex (the brain regions involved in the processing of tactile information).


The somatosensory cortex is located in the parietal lobe of the human brain and receives tactile information from the hand, foot and body. It is well known that a relatively larger proportion of the somatosensory cortex is given over to the representation of the hands than to other parts of the body, given their relative surface area. The mapping of the body surfaces in the brain is called a homunculus. The homunculus shows that, in determining how much cortical space a body part is allotted, the size of the body part is less important than the density of its nerves. We often think that it is our hands that give us most touch information because we use them to manipulate objects, but everything we do, including sitting, walking, and feeling pain, depends on touch.
The somatosensory cortex is involved in processing information related to touch. Tactile processing involves two distinct areas of the somatosensory cortex: the primary (S1) and the secondary (S2) somatosensory cortex. Neuroimaging studies have shown primary and secondary somatosensory cortex involvement during tactile sensory and perceptual processing.

The human brain is separated into two distinct cerebral hemispheres, connected by the corpus callosum, and the functions of each cortical hemisphere differ. One study has shown that hemispheric dominance appears to be an organizing principle for cortical processing of tactile form and location: a left-hemispheric dominance for tactile form recognition (what one is touching) and a right-hemispheric dominance for tactile localization (where one is being touched) (van Boven et al., 2005). Furthermore, this study found left-lateralized processing of fine spatial details (local elements) and right-lateralized processing of holistic spatial details (global elements) in the somatosensory system. Notably, a left-hemisphere advantage for processing local spatial details and a right-hemisphere advantage for processing global spatial details has also been described in the visual system.


A study by Reed and colleagues (2005) suggests a segregation of information processing into different pathways in the cortical tactile system: a ventral (lower) stream for tactile object recognition (a “what” system) and a dorsal (upper) stream for tactile object localisation (a “where” system). This study also suggested that in active touch the “how” (grasping the target) and the “where” (reporting its location) are intimately connected. Notably, spatial information may be even more relevant in the tactile modality, given the close link between the awareness of tactile information and the spatial processing of that information. Tactile neglect, which is associated with a lesion to the right hemisphere of the brain and causes the patient to behave as if the left side of sensory space were nonexistent, can also be found in combination with visual neglect (Schindler et al., 2006). This study supports the view that there is a common, or unitary, spatial map accessible through either the tactile or the visual sensory modality.

"Touch to emotion": neural correlates of the emotional aspects of tactile processing
The sense of touch provides a very powerful means of eliciting and modulating human emotion. We use touch to share our feelings with others, and to enhance the meaning of other forms of verbal and non-verbal communication. Despite its importance for our emotional well-being, the study of the emotional aspects of touch has been somewhat neglected by scientists over the years.

In the case of emotions, it is not our hands but the body that is crucial to emotional experience. It is difficult to imagine emotions in the absence of their bodily expressions. Different emotions are induced in the brain and are played out in the theatre of the body (Damasio, 1999). Given the apparent relationship between bodily-tactile information processing and emotion, it is not surprising that recent neuroscientific research has found evidence for strong neural connections between the somatosensory cortex and the brain regions involved in the processing of emotions (the insular cortex). The insular cortex has increasingly become the focus of attention for its role in body representation and subjective emotional experience. Subjective emotional experience (i.e., feelings) arises from our brain's interpretation of bodily states that are elicited by emotional events. This is an example of the concept known as embodied cognition. Views of embodied cognition discuss how our neural and developmental embodiment shapes both our mental and linguistic categorizations, and argue that all aspects of cognition are shaped by aspects of the body.

Furthermore, the strong link between the processing of tactile information and emotions has been demonstrated in the clinical condition of touch-emotion synesthesia. Synesthesia is a condition in which a sensory stimulus presented in one modality evokes a sensation in a different modality. Ramachandran and Brang (2008) have shown that in individuals with touch-emotion synesthesia, specific textures (e.g., denim, wax, sandpaper, silk) evoke equally distinct emotions (e.g., depression, embarrassment, relief, and contentment, respectively), suggesting increased cross-activation between the somatosensory cortex and the emotion-processing regions of the brain.
The neurodevelopmental mechanisms underlying tactile-emotional connections are still unclear. However, a study using an integrative psychophysiological approach to investigate the development of cortico-subcortical and limbic-reticular mechanisms (which regulate the level of wakefulness) found that these features correlated with the emotional and cognitive development of six- to seven-year-old children in an unfamiliar situation: tactile interaction with dolphins (Ilyukhina et al., 2008).

These studies provide evidence for the influence of emotions on tactile cognition. It is therefore important to consider emotions as a powerful motivator for tactile learning.


Understanding tactile cognitions: tactile representations in the human information processing system

In the last decade there has been a dramatic increase in the number of studies addressing the different concepts of tactile cognition. A brief overview of the research on these tactile cognitive concepts is given below.

Tactile short-term memory can be described as the capacity for holding a small amount of tactile information in mind in an active, readily available state. Visual and auditory short-term memory is said to hold a small amount of information, from about 3 or 4 elements (i.e., words, digits, or letters) to about 9 elements; a commonly cited capacity is 7±2 elements, referred to as the magic number. Is the span for serially presented tactile stimuli very limited compared to vision or audition? Research has shown that the span for serially presented tactile stimuli is similar to that in vision (Heller, 1989). Millar (1999) has claimed that “there seems to be no reason why memory spans for tactual patterns should be any worse than for the same patterns in vision, if the tactual patterns are coded spatially as global shapes” (p. 753).

Moreover, within the short-term memory system, a sensory memory of very short duration (a few hundred milliseconds) has been reported. This “ultra” short-term memory has been explored most intensively for visual stimuli (iconic memory) and auditory stimuli (echoic memory), and more recently for tactile stimuli: a tactile sensory memory (Gallace et al., 2008).

Tactile working memory refers to the ability to hold and manipulate tactile information for short periods (the transformation of information while it is in short-term storage). For instance, when deciding which apple or avocado needs eating first, or which drink has the right temperature to be consumed on a warm day, we are likely to explore and compare hardness or temperature using our hands. The process that enables us to keep the relevant tactile information active for task performance over a short period of time is tactile working memory.

Working memory allows us to hold the characteristics of a tactile stimulus on-line to guide behaviour in the absence of external cues or prompts. Without active working memory, initial tactual percepts may decay quickly. Although working memory is an outstanding mental resource with limited capacity and temporary storage, research suggests that mental exercises and working memory training may lead to effects that go beyond the specific training effect (Jaeggi et al., 2008).

Studies investigating the neural basis of working memory have shown that the prefrontal cortex (the frontal system of the brain) becomes active while subjects perform working memory tasks in either the visual or the auditory modality. Likewise, a brain activity study during tactile spatial working memory tasks has identified the involvement of prefrontal cortical areas (Kostopoulos et al., 2007). Common brain regions may thus subserve the generation of the higher-order representations involved in working memory for visual, auditory and tactile information.

Recently, there has been great interest in studying working memory for tactile information. A study comparing visual and tactile working memory capacities found that tactile working memory was generally more limited and showed more variability than visual working memory in normally sighted participants (Bliss & Hamalainen, 2005). However, the better performance in visual working memory compared to tactile working memory could be explained by basic differences in exploration between the visual and tactile modalities: visual exploration of complex forms is considerably faster than haptic exploration (Butter & Bjorklund, 1976).

A similar study on the different types of interference in visual and tactile working memory found that spatial interference selectively impaired both visual and tactile working memory, but more strongly the latter (Mayas et al., 2008). This strengthens the claim that the processing of tactile stimuli is highly connected to the processing of information regarding the spatial attributes of the stimuli. Finally, it should be noted that the mental rotation of tactile layouts seems to be related to certain aspects of tactile working memory (Ungar et al., 1995).

Working memory capacity may also reflect the efficiency of attention functions. Tactile attention allows us to select particular elements of tactile sensory input for more detailed cognition. Tactile attention can be described as a multidimensional cognitive capacity, following the theoretical model advanced by Sohlberg and Mateer (1987): the ability to focus attention (attending to the tactile stimuli), the ability to sustain attention (maintaining tactile attention over time) and, most notably, the ability for selective attention (maintaining tactile attention in the face of distracting, irrelevant information).



A neuroimaging study has shown attention-related changes in activated somatosensory brain areas. This means that the more specific attention is given to tactile stimuli, the more active the tactile brain becomes and the better the ability to focus and maintain attention, particularly when other events are competing to capture attention. Studies on how active touch may improve learning have suggested that involving students in consciously choosing to investigate the properties of an object increases attention to learning (Sathian, 1998).
An important aspect of tactile attention is tactile information processing speed, or mental speed. Mental speed reflects how efficiently the attention system is functioning; slowed processing speed often underlies attentional deficits (Lezak et al., 2004).

Tactile learning is the process of acquiring new information through tactile exploration. Studies of tactile information processing in humans have shown that people can be trained to perceive a large amount of information by means of their sense of touch. A neuroimaging study of tactile learning in normally sighted participants who had undergone long-term tactile training showed that the training modified the tactile-to-visual cross-modal responses in the primary visual cortex (Saito et al., 2006). This study suggests that the involvement of visual areas when participants perform tactile tasks might be related to the strengthening of crossmodal connections as a function of intensive practice; practice makes perfect.

Tactile memory refers to the persistence of learning in a state that can be revealed at a later occasion (from the long-term memory). The knowledge we store in the long-term memory affects our perceptions of the world, and influences what information in the environment we attend to. Modern cognitive theories often distinguish between two forms of knowledge stored in memory: procedural and declarative.
Procedural memory refers to the ability to remember how to perform a task or to employ a strategy. Declarative memory is our fund of factual information about the world. Declarative memory, in turn, takes two basic forms: semantic and episodic. Semantic memory stores facts and generalized information in networks or schemata, whereas episodic memory refers to our ability to recall events and personal experiences from our past and stores information as images. An important aspect of episodic memory is autobiographical memory. Autobiographical memory is a personal representation of general or specific events and personal facts; memory of a person’s history.
Relatively few studies have attempted to address the characteristics and functioning of our memory for manipulated objects (tactile memory), compared with the large number of studies that have addressed people’s memory for visually presented objects. At least under specific conditions of stimulus presentation, the tactile modality is a reliable system for the processing and storage of haptically explored stimuli (Klatzky et al., 1985). People’s memory for haptically explored stimuli seems to be strictly related to the nature of the material presented (i.e., two-dimensional vs. three-dimensional, size, location) and to the time available for stimulus encoding (i.e., the amount of time provided to participants to “haptically scan” the stimuli). Longer exploration time is required by the (“serial”) tactile modality than by the (“parallel”) visual modality (Newell et al., 2005). Specifically, optimal performance is obtained when people are allowed to scan three-dimensional everyday objects in their own time, while impaired performance is typically obtained with the brief presentation of two-dimensional raised-line stimuli (Gallace & Spence, 2009). This observation suggests that the integration of movement and tactile information might play an important role in the storage of tactile information in the brain.
The studies discussed thus far suggest that active touch, understood from an information-processing standpoint, is a fully functional cognitive system.
While various studies have investigated the nature of people’s tactile representations of real objects, virtually no research has addressed questions related to more ‘autobiographical’ forms of tactile memory, such as studies in which participants are explicitly asked to recollect information regarding their previous life experiences with tactile stimuli or sensations. There is also no published research on how tactile memories deteriorate over time, or on whether the pattern of deterioration observed for this form of memory is correlated with the pattern observed when visual and auditory materials are presented. These important issues need to be extensively addressed by researchers if we want to close the gap between our knowledge of the visual and auditory aspects of memory and our knowledge of its tactile aspects (Gallace & Spence, 2009).
Tactile cognitions in the deafblind: From active touch to tactile communication
Deafblind people use active touch in ways that no one else does: to explore objects and the environment, to perceive feelings, and to act and communicate. There are various tactile communication and tactile language interventions used within the deafblind field, such as haptic communication, full co-active signs, one-hand co-active signs and hand-over-hand signing.
Recently there has been an interest in understanding the involvement of tactile cognitions in these various tactile communication methods. A theoretical and clinical understanding of tactile cognitions is necessary in the deafblind field.
Deafblind individuals are generally more experienced in recognizing stimuli by active touch. What is the impact of combined vision and hearing impairment on tactile cognitions? Can studies with persons with deafblindness help us understand aspects of tactile cognition such as tactile information processing speed, tactile working memory or tactile memory?
A study investigating the tactile working memory ability of an adventitiously deafblind woman found a higher than average performance level on a tactile memory span test (Nicholas & Christensen, in press). The tactile memory span subtests (tactile forward and tactile reverse memory) measure sequential memory span; tactile forward memory is thought to be related to the efficiency of attention, whereas tactile reverse memory is thought to be associated with working memory. Similarly, results from a neuropsychological investigation showed that an adventitiously deafblind person took significantly less time to feel and remember objects on a Tactile Form Recognition test. This increased tactile processing speed may reflect how efficiently the person’s attention system was functioning. Furthermore, the results also showed superior performance in tactile memory for the location of objects on the Tactual Performance Test (Nicholas & Koppen, 2007).
Taken as a whole, the results of these two studies indicate that deafblind individuals perform more effectively than sighted-hearing people on tasks of tactile working memory and tactile memory. A possible explanation for the better performance is that deafblind individuals are expected to have more tactile experience since this is the sensory system that they must rely on for information about their environment. The deafblind can recognize an object by feeling a portion of it, which then acts as a signal for the whole image; a brief touch of the object would be enough to prompt full recognition (Meshcheryakov, 1974).
Likewise, when the performance of ten deafblind and ten sighted-hearing participants on four tactile memory tasks was investigated, the results showed that deafblind people's encoding of tactile spatial information is more efficient than that of sighted-hearing people, and that their short-term storage and retrieval were probably normal (Arnold & Heiron, 2002). The explanation given for the superior tactual performance of the deafblind participants was that it was a product of more tactual experience. This view appears to be consistent with Rönnberg’s (1995) claim that compensation for a deficit by means of unrelated cognitive functions (cognitive neuroplasticity), rather than perceptual compensation, accounts for the improvement in performance seen in deafblind individuals on different tasks.
Which neural networks are involved in tactile language processing when hearing and vision are lost simultaneously? A study comparing neural activation during tactile presentation of words and non-words in a postlingually deafblind subject and six normal volunteers found that tactile language activated the language systems as well as many higher-level systems of the postlingually deafblind subject. Thus, the deafblind subject was heavily engaged in interpreting tactile language through enhanced cortical activation of cognitive and semantic processing (Yasuhiro et al., 2004). This suggests that tactile languages are equipped with the same expressive power that is inherent in spoken languages.
On the basis of these studies, it will be important for future research to address the following questions: What is the relationship between tactile working memory abilities and the use of linguistic constructions in tactile communication or tactile language? What is the neurological relationship between dual sensory impairment and tactile defensiveness? How are ‘autobiographical’ forms of tactile memory established, and how do tactile memories deteriorate over time compared with visual and auditory memories?
Finally, it should be noted that an understanding of tactile cognitions is needed in the functional assessment (of tactile strengths and weaknesses) of deafblind individuals. The outcomes of an assessment of tactile cognition, in addition to “embodied experiences” and “bodily-tactile emotions”, can be used as a basis for intervention or intervention planning. The opportunities for bodily feedback during emotional reactions to influence tactile information processing are enormous.
The tactile demands the deafblind person has to meet in his or her environment can serve as a starting point for assessment. Furthermore, assessment should take into account which type of deafblindness is being investigated. When assessing adventitiously deafblind individuals, modified psychometric instruments or checklists measuring the tactile prerequisites of everyday activities can be applied. However, when assessing the tactile processing abilities of congenitally deafblind individuals, an interdisciplinary integrated assessment is necessary. The assessment should also be considered in a dynamic and broader context, taking communicational situations into account. Tactile cognition must be understood in relation to sensation and perception, but also in relation to emotion and communication.

 
REFERENCES

Arnold, P., & Heiron, K. (2002). Tactile memory of deaf-blind adults on four tasks. Scandinavian Journal of Psychology, 43, 73-79.
Bliss, I., & Hamalainen, H. (2005). Different working memory capacity in normal young adults for visual and tactile letter recognition task. Scandinavian Journal of Psychology, 46, 247-251.
Butter, E. J., & Bjorklund, D. F. (1976). Are two hands better than one? Assessing information acquired from one- and two-handed haptic exploration of random forms. Perceptual and Motor Skills, 43, 115-120.
Damasio, A. (1999). The Feeling of What Happens: Body, Emotion and the Making of Consciousness. London: Heinemann.
Gallace, A., Tan, H. Z., Haggard, P., & Spence, C. (2008). Short term memory for tactile stimuli. Brain Research, 1190, 132-142.
Gallace, A., & Spence, C. (2009). The cognitive and neural correlates of tactile memory. Psychological Bulletin, 135(3), 380-406.
Heller, M. A. (1989). Tactile memory in sighted and blind observers: The influence of orientation and rate of presentation. Perception, 18, 121-133.


Ilyukhina, V. A., Krivoshchapova, M. N., & Manzhosova, G. V. (2008). Psychophysiological study of the effects of adaptation to tactile interaction with dolphins in six- to seven-year-old children. Human Physiology, 34(4), 421-430.
Jaeggi, S. M., Buschkuehl, M., Jonides, J., & Perrig, W. J. (2008). Improving fluid intelligence with training on working memory. PNAS, 105(19), 6829-6833.
Klatzky, R. L., & Lederman, S. J. (2002). Touch. In A. F. Healy & R. W. Proctor (Eds.), Experimental psychology (pp. 147-176). Volume 4 in I. B. Weiner (Editor-in-Chief), Handbook of psychology. New York: Wiley.
Klatzky, R. L., Lederman, S. J., & Metzger, V. A. (1985). Identifying objects by touch: An “expert system”. Perception & Psychophysics, 37, 299-302.


Kostopoulos, P., Albanese, M. C., & Petrides, M. (2007). Ventrolateral prefrontal cortex and tactile memory disambiguation in the human brain. Proceedings of the National Academy of Sciences USA, 104, 10223-10228.
Lezak, M.D., Howieson, D.B., & Loring, D.W. (2004). Neuropsychological Assessment (4th ed.). New York: Oxford University Press.
Mayas, M. S. J., Manso, A. J., & Ballesteros, S. (2008). Working memory for visual and haptic targets: A study using the interference paradigm. In M. Ferre (Ed.), EuroHaptics (pp. 395-399). Berlin Heidelberg: Springer-Verlag.
McLaughlin, M., Hespanha, J., & Sukhatme, G. (2002). Touch in virtual environments: Haptics and the design of interactive systems. Upper Saddle River, NJ: Prentice Hall.
Meshcheryakov, A. (1974). Awakening to life. Moscow: Progress Publishers.
Millar, S. (1999). Memory in touch. Psicothema, 11, 747-767.
Newell, F., Woods, A. T., Mernagh, M., & Bülthoff, H. H. (2005). Visual, haptic and crossmodal recognition of scenes. Experimental Brain Research, 161, 233-242.

Nicholas, J., & Koppen, A. (2007). Understanding the tactile brain. Conference proceedings, 14th Deafblind International (DbI) World Conference, Perth, Western Australia.
Nicholas, J., & Christensen, M. (in press). Tactile working memory and deafblindness: A case study.
Reed, C. L., Klatzky, R. L., & Halgren, E. (2005). What vs. where in touch: An fMRI study. NeuroImage, 25, 718-726.

Rönnberg, J. (1995). Perceptual compensation in the deaf and the blind: Myth or reality? In R. A. Dixon & L. Backman (Eds.), Compensating for psychological deficits and declines: Managing losses and promoting gains (pp. 251-274). Mahwah, NJ: Lawrence Erlbaum Associates.


Sathian, K. (1998). Perceptual learning. Current Science, 75, 451–456.

Schindler, I., Clavagnier, S., Karnath, H. O., Derex, L., & Perenin, M. T. (2006). A common basis for visual and tactile exploration deficits in spatial neglect? Neuropsychologia, 44(8), 1444-1451.
Sohlberg, M. M., & Mateer, C. A. (1987). Effectiveness of an attention-training program. Journal of Clinical and Experimental Neuropsychology, 9, 117-130.
Spence, C., & Gallace, A. (2007). Recent developments in the study of tactile attention. Canadian Journal of Experimental Psychology, 61, 196-207.
Suga, N., Gao, E., Zhang, Y., Ma, X., & Olsen, J. F. (2000). The corticofugal system for hearing: Recent progress. Proceedings of the National Academy of Sciences USA, 97(22), 11807-11814.
Ungar, S., Blades, M., & Spencer, C. (1995). Mental rotation of a tactile layout by young visually impaired children. Perception, 24, 891-900.

Van Boven, R. W., Ingeholm, J. E., Beauchamp, M. S., Bikle, P. C., & Ungerleider, L. G. (2005). Tactile form and location processing in the human brain. Proceedings of the National Academy of Sciences USA, 102, 12601-12605.
Yasuhiro, O., Katsumi, D., Masashi, T., Kazuhiro, N., Hiroshi, N., Aya, I., Takako, I., Masao, I., Takuya, Y., Naohiko, O., Jun, H., & Takeshi, K. (2004). Cortical processing of tactile language in a postlingually deaf-blind subject. NeuroReport, 15(2), 287-291.



