Jaime Sánchez - Multimodal Interfaces for the Cognition of Learners Who Are Blind

Thursday, 8 Oct 2015, 14:00
Organized by: 
The LIG "Keynotes" team
Speaker: 
Jaime Sánchez, University of Chile, Santiago, Chile

Detailed information: 

 

Jaime Sánchez received the M.A. (1983), M.Sc. (1984), and Ph.D. (1985) degrees from Columbia University, New York, USA. He has also been a postdoctoral research fellow at the MIT Media Lab and Cornell University (1987). He is currently Professor of Human-Computer Interaction at the Department of Computer Science, University of Chile. He has developed several software applications and videogames for learning and cognition, and has directed several projects on 3D sound-based virtual environments for enhancing learning and cognition in users who are blind. He spent a sabbatical leave in 2006-2007 as Visiting Professor at Columbia University, New York, USA and at the Center for Non-Invasive Brain Stimulation, Harvard University, Boston, USA.

Currently, he is the director of C5, the Center of Computing and Communication for the Construction of Knowledge at the Department of Computer Science, University of Chile. Lately, he has been working on audio- and haptics-based multimodal mobile interfaces that help learners develop and rehearse problem-solving and navigation skills in real settings. He is also involved in long-term studies that help users who are blind navigate and find their way in simple and complex indoor and outdoor settings, using video gaming for the construction and use of mental maps. His research interests include multimodal interaction, usability and impact evaluation methods, game-based learning, mobile learning, and the impact of 3D audio on cognitive development in users who are blind.
 
He has published extensively on these topics in renowned international journals and at ACM and IEEE conferences, and has also authored several books and book chapters on interfaces for learning and multimodal interaction for the enhancement of the intellect of users who are blind.
 
He is currently involved in a joint research project with Harvard University researchers to study changes and adaptations at the level of the brain related to navigation skills, by incorporating multimodal videogame interfaces within a neuroimaging environment. This neuroplasticity, or re-wiring of the brain, may explain the compensatory, and in some cases even enhanced, behavioral and cognitive skills of learners who are blind. The research compares brain activation patterns related to navigation in sighted and blind users, examining brain activation over time as subjects improve their overall navigation skills by using multimodal interfaces. Learning how the brain and related sensory mechanisms provide structure for navigation in blind users, and which areas are associated with the skills of navigating through space and learning, will provide clues for potential use in training, learning, and rehabilitation. 
 
 
Technical production: Antoine Orlandi | All rights reserved - Mi2S
Abstract: 
This presentation will introduce and discuss long-term research on the design, development, and evaluation of multimodal interfaces for the cognition of learners who are blind.
 
Learners who are blind tend to learn and construct a world view through haptic perception. They understand the surrounding world through sensing and manipulation with the sense of touch, for instance by reading Braille text. The haptic sensory system comprises tactile (cutaneous) and kinesthetic (proprioceptive) sensing. 
 
In 1994, a group of pioneering Chilean researchers began to explore the use of computer-delivered audio to enhance learning and cognition. The idea was to exploit audition, through 3D spatial sound and synthesized voice, as a substitute for the loss of vision, helping children construct learning, cognition, and understanding. This was later extended to haptic interfaces, in order to explore their impact on the development of the intellect of learners who are blind.
 
Since then, a significant number of audio- and haptic-based virtual environments, together with cognitive tasks as a companion methodology, have been designed, applied, and evaluated with learners who are blind. The role of sound- and haptic-based interfaces in designing, developing, and rehearsing cognition in children who are blind has been thoroughly studied. Thus, the impact of audio and haptics on temporo-spatial cognitive structures, haptic perception, abstract, short-term, and spatial memory, orientation and mobility, navigation, wayfinding, mental maps, problem solving, communication and multimedia composing, collaboration, and science and mathematics learning skills has been studied, in order to draw a picture of the role that the auditory and tactile senses can play in the development of the intellect of learners who are blind. 
 
All of this work has been underpinned by the hypothesis that computer-based audio and haptics can play a significant role in the intellectual development of the child who is blind. The approach has been to tailor the sensory modality components of the virtual environment applications to the needs arising from the persons' disabilities, simulating structures for people with visual disabilities through enhanced 3D, stereo, and haptic interfaces. Children using these interfaces (who have never actually seen the physical visual world) are able to use sound cues to create a spatial-cognitive map of the space and then accurately represent this space with physical objects. While tested mainly with children, it is possible to conceive of such audio- and haptic-based interfaces as platforms for the assessment and rehabilitation of persons with visual disabilities at any age.
 
We hypothesized that audio and haptics can play a relevant role in substituting for the loss of vision for learning and cognition purposes. As a result of several full experimental studies using audio-based virtual environments to enhance learning and cognition in children who are blind, carried out in real school settings, we have arrived at the research-supported idea of discovering and learning worlds with sound: knowing and perceiving through the sense of hearing.
 
The application of this work started in Chilean schools for blind children, but it has since spread to schools in Argentina, Brazil, Colombia, Costa Rica, France, Mexico, Paraguay, Peru, the United States of America, Uruguay, and Venezuela, among others. All of these audio-based products, and the accompanying methodology for using them, are delivered free of charge to schools attended by blind children.
 
Finally, new research questions have arisen during the last decade: how the brain rewires itself to process increasing input from audio cues delivered through computer interfaces following visual deprivation; how interactive audio- and haptic-based virtual interfaces can trigger organizational changes in the brain to represent the world to those without the benefit of sight; how brain plasticity, through changes and adaptations, can be related to navigation skills acquired through multimodal videogame interfaces; and how these changes at the level of the brain may represent the neurophysiological basis for learning and cognitive skills in the blind when interacting with multimodal interfaces.