The ability to perceive the local, immediate, and even global environment is needed by all people for daily living activities. People with visual impairments require technology that identifies obstacles in their immediate path, both inside and outside of buildings, and assists in navigating around them. For people with cognitive impairments (e.g., Alzheimer's), it is not the local or immediate environment that is at issue but rather global positioning: where is home, where am I going, and where am I now?
Sensory substitution can be found naturally in the phenomenon of synesthesia. Synesthesia means "joined sensation" and describes the rare capacity to "hear colors, taste shapes, or experience other equally strange sensory fusions whose quality seems difficult for the rest of us to imagine". This natural phenomenon serves as a figurative inspiration for the possibility of artificially inducing spatial perception through the tactile senses.
Sensory substitution is the process of transferring stimulus characteristics of one sensory modality (e.g., vision) into stimuli of another sensory modality (e.g., touch or audition). Sensory substitution in its purest form would capture the richness of the replaced sense. None of the current systems, such as those that substitute tactile stimulation on the tongue for vision or substitute audition for vision, comes even close to replicating the richness of the visual sense.
What these systems (as well as ours) actually do is sensory addition or sensory supplementation. There are arguments that some substitution may actually be occurring, as visual cortical activity was observed to be elevated when substituting audition for vision. An experiment into whether a tactile belt providing magnetic compass directions can lead to the development and integration of a sixth sense showed that the new sensor was well integrated into the general behavior of the subjects. Maze navigation with blindfolded subjects wearing visuo-tactile devices showed the evolution of the spatialization process toward distal attribution. Other unique applications of sensory substitution include spatially mapping one area of the body to another.
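As an illustration of such a cross-modal mapping, the sketch below converts a grayscale image into sound in the spirit of vision-to-audition substitution systems: each image column becomes a set of tones in which pixel height sets pitch and brightness sets loudness. The frequency range and scan order here are assumptions chosen for illustration, not the parameters of any particular system.

```python
def column_to_tones(column, f_low=500.0, f_high=5000.0):
    """Map one grayscale column (top to bottom, values 0-255)
    to a list of (frequency_hz, amplitude) pairs."""
    n = len(column)
    tones = []
    for row, brightness in enumerate(column):
        # Pixels nearer the top of the image get a higher pitch.
        frac = 1.0 - row / (n - 1) if n > 1 else 0.0
        freq = f_low + frac * (f_high - f_low)
        amp = brightness / 255.0  # brighter pixel -> louder tone
        tones.append((freq, amp))
    return tones

def scan_image(image):
    """Scan a 2D grayscale image (list of rows) column by column,
    yielding one chord per column, left to right over time."""
    n_rows, n_cols = len(image), len(image[0])
    columns = [[image[r][c] for r in range(n_rows)] for c in range(n_cols)]
    return [column_to_tones(col) for col in columns]
```

Played back at a fixed column rate, such a stream lets a trained listener recover coarse spatial layout from sound alone.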
There are approximately 11.4 million people with visual impairments in the US. Blindness prevalence increases with age, and the average age of the population is gradually increasing. There are a variety of types of visual impairment, originating in the retina, the cortex, or even the optic nerve. In addition, a person's perceptual and cognitive capabilities vary depending on when sight was lost (i.e., as a child, young adult, or older adult) and on their general disposition. It is estimated that approximately 180 million people worldwide have visual impairments, of which 40 to 45% are blind (WHO).
Alzheimer's is a form of dementia and a progressive disease, which means that over time more parts of the brain are damaged and symptoms become worse. Symptoms include confusion; forgetting people, places, and events; mood swings; and becoming withdrawn. There are 750,000 people with dementia in the UK, of whom 18,000 are under 65 (UK Alzheimer's Society). There are 5 million people with dementia in Europe and nearly 18 million worldwide. By 2025, this number is expected to grow to 34 million, of which 71% will live in developing countries.
We have developed and experimented with two different sensory substitution devices for people who are blind: (1) a visual-to-tactile glove device for local-environment obstacle avoidance; and (2) a GPS, compass, and inertial sensing system coupled to a tactile belt for global positioning and wayfinding.
Our focus has been on providing wayfinding capabilities to people who
are blind. The goal has not been to replace existing aids such as the long
cane or guide dog but rather to augment them and provide either medium
distance information (1 m to 10 m) or global information.
Our initial investigations into a device that converted a depth image obtained from a stereo camera into tactile obstacle-avoidance patterns on the hand indicated that cost must be addressed for an effective system. The stereo vision camera was the expensive component, and we are currently exploring other ways of obtaining depth information using a monocular camera. An economic upper limit for a system in this marketplace is $1,000.
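Whatever the depth source, a glove device of this kind must reduce a dense depth map to a handful of actuator intensities. The sketch below shows one plausible reduction, assuming a coarse actuator grid where nearer obstacles produce stronger vibration; the grid size and distance thresholds are illustrative assumptions, not the actual device parameters.

```python
def depth_to_tactile(depth, grid_rows=3, grid_cols=3, near=1.0, far=10.0):
    """Pool a dense depth map (metres, list of rows) into a
    grid_rows x grid_cols grid of vibration intensities in [0, 1]:
    1.0 for obstacles at or inside `near`, 0.0 at or beyond `far`."""
    n_rows, n_cols = len(depth), len(depth[0])
    rh, cw = n_rows // grid_rows, n_cols // grid_cols
    grid = []
    for gr in range(grid_rows):
        row = []
        for gc in range(grid_cols):
            # The nearest depth in the cell dominates: safety-first pooling.
            cell = [depth[r][c]
                    for r in range(gr * rh, (gr + 1) * rh)
                    for c in range(gc * cw, (gc + 1) * cw)]
            d = min(cell)
            intensity = max(0.0, min(1.0, (far - d) / (far - near)))
            row.append(intensity)
        grid.append(row)
    return grid
```

Min-pooling each cell (rather than averaging) ensures a small nearby obstacle is never washed out by distant background.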
Our next system uses a tactile belt to convey orientation and wayfinding information. The unit is interesting because it is simple, low cost, and capable of being networked. The unit interfaces via Bluetooth to a cell phone (or PDA), which provides the bridge to the Internet and thus access to global maps, other units, base stations, etc.
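At its core, a belt like this must map a desired bearing onto one of a small ring of vibration motors. A minimal sketch of that mapping is below, assuming eight motors worn with motor 0 at the front and indices increasing clockwise; the motor count and layout are assumptions for illustration, not the actual belt design.

```python
def bearing_to_motor(target_bearing, wearer_heading, n_motors=8):
    """Return the index of the motor (0 = front, increasing clockwise)
    that points closest to the target bearing, given the wearer's
    current compass heading. All angles in degrees."""
    # Bearing relative to the wearer's facing direction, wrapped to [0, 360).
    relative = (target_bearing - wearer_heading) % 360.0
    sector = 360.0 / n_motors
    # Round to the nearest motor sector.
    return int((relative + sector / 2) // sector) % n_motors
```

The GPS supplies the target bearing, the compass supplies the wearer's heading, and only the selected motor vibrates, so the wearer simply turns until the front motor is active.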
The camera is essential for obtaining local environmental obstacle-avoidance information. Our current effort in visual routines has focused on computing depth (stereo, monocular), detecting obstacles, detecting objects (e.g., faces, signs, and other objects), and detecting context. Our haptic research has focused on how to communicate as much information as possible through inexpensive wearable actuators. Engineering issues when integrating all components into a system include weight, power consumption, interoperability of components, etc. User feedback indicates that we have made progress in the right direction with respect to our objective of haptic perception of the spatial environment as well as fulfilling the market need.
We are now exploring acquiring depth from monocular cameras, which are ubiquitously available as inexpensive features in many cell phones. In the future, we plan to integrate a camera with our inexpensive belt platform.
The initial reception to both devices has been highly positive among the associations, especially the associations for people who are blind. The first system was tested with 13 volunteers who are blind in an indoor setting. One of the ways we intend to test the belt is to provide it to a group of 20 to 50 volunteers who are blind for a week of use and then gather their feedback.
We have also engaged a Rehabilitation Research Institute to test our device with about 30 Alzheimer's patients. Preliminary results from this experimentation indicate that navigation performance with the belt is far superior to auditory directions given by another person.
John Zelek is Associate Professor in Systems Design Engineering at the University of Waterloo, with expertise in intelligent mechatronic control systems that interface with humans; specifically: (1) wearable sensory substitution and assistive devices; (2) probabilistic visual and tactile perception; (3) wearable haptic devices, including their design, synthesis, and analysis; and (4) human-robot interaction. Zelek has co-founded two start-up companies. He was awarded the best paper award at the international IEEE/IAPRS Computer and Robot Vision conference. He received a 2006 Distinguished Performance Award from the Faculty of Engineering at the University of Waterloo. He was also awarded the 2004 Young Investigator Award by the Canadian Image Processing & Pattern Recognition Society for his work in robotic vision.