Wayfinding Technologies: Odds and Ends

Every day when I open my email, open the latest tech magazine, or surf the TV or net, I run into the "latest great invention for the blind." The media overhypes and misunderstands pretty much everything, so most of the "stories" have to be disregarded. Still, kernels of hope, good research, and potential are present in most of these stories; it is just difficult to separate the hype from anything useful. There are good reporters in the media, and good people throughout the system, of course, so this is simply an institutional reality we need to look beyond. The inventors of these "next great blind technologies" are also often bright and dedicated individuals, but they, too, often overhype and misunderstand the big picture. That caveat aside, what follows is a list of items found but not evaluated; we just don't want to lose track of who's doing what and where. Also, be aware that many of these are projects that have come and gone.

bar

Kyoto Institute of Technology: Road-crossing aid for the blind. Researchers in Japan are working on a computerised system that measures the width of a street crossing and can tell what colour the lights are. The system uses a tiny camera that employs projective geometry to measure the width of crossings, sees the colour of the lights, and communicates the information to the blind person. The research is published in the journal "Measurement Science and Technology", from the UK's Institute of Physics. Kyoto Institute of Technology, Japan; tel: +81 75 724 7014; www.kit.ac.jp.

The Seeing Eye Glove: Two University of Guelph engineers are developing a seeing-eye glove to communicate surroundings to the visually impaired through touch.

Prof. John Zelek and PhD candidate Sam Bromley wanted to develop technology to aid the visually impaired with a system that is almost as intuitive as seeing. The system they've developed consists of cameras that process information to a computer that sends an array of vibrations to a form-fitting glove worn by the user.

Two small cameras that act as eyes are worn at chest height and communicate upcoming obstacles through the glove's vibrating motors. Images from the cameras are processed in a computer the size of a Palm Pilot, which provides tactile feedback about obstacles up to 30 feet away.

The glove is worn on the non-dominant hand and has motors strategically placed on the fingers and hand. For example, if the glove is worn on the left hand, an obstruction lying straight ahead would trigger a vibration - similar to a pager or cell phone vibration - on the middle finger. If the obstacle is to the right of centre, the index finger's motor would vibrate.
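As a rough illustration of the mapping described above - not the Guelph team's actual algorithm, and with the 15-degree thresholds invented for the example - the bearing-to-finger scheme for a left-hand glove might be sketched like this:

```python
def finger_for_bearing(bearing_deg):
    """Map an obstacle bearing to a finger motor on a left-hand glove.

    bearing_deg: angle of the obstacle relative to straight ahead
    (negative = left of centre, positive = right of centre).
    The 15-degree boundaries are illustrative guesses, not
    published parameters of the Guelph system.
    """
    if bearing_deg < -15:
        return "ring"    # obstacle left of centre
    elif bearing_deg <= 15:
        return "middle"  # obstacle straight ahead
    else:
        return "index"   # obstacle right of centre
```

In a real device each returned name would drive the corresponding pager-style vibrating motor rather than print a label.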

bar

For The Blind, A Computer Navigation System With Its Own "Map"

GAINESVILLE, Fla. -- University of Florida researchers have wedded speech recognition software, wearable computers, satellite positioning technology and other emerging technologies in a 21st-century navigational aid for the blind. Composed of a waist-worn computer and headset connected remotely to a map database server, the prototype delivers and responds to instructions verbally. It keeps track of the user's location while giving directions to a destination - and may even warn the user against veering off a sidewalk or stepping into a road.

"When we started this project, we were looking for a compelling mobile application of wearable computing that would be not just for fun from a research perspective, but also useful to society," said Steve Moore, who designed the system for his master's degree in computer science and engineering.

Computer engineering Professor Sumi Helal and civil and coastal engineering doctoral student Balaji Ramachandran also helped with the project, which the researchers named DRISHTI, after the Sanskrit word for vision. While in the early stages, the system is a promising attempt to address the difficult problem of helping the blind get around in a world designed for sighted people.

Nationwide, about 1.1 million people suffer from blindness. Speaking into the microphone, the user tells the system his or her location and destination - for example, from the UF student union to the computer science building. The system responds with directions based on the user's starting point, saying, for example, to turn 15 degrees and walk along a sidewalk for 230 feet. If the user veers off the sidewalk or travels too far, the system provides a verbal correction. It also may warn against impediments or hazards, such as picnic tables or streets.
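A minimal sketch of how such "turn X degrees, walk Y feet" directions could be computed - assuming a simple flat-earth approximation over campus-scale distances, not the actual DRISHTI code - is:

```python
import math

def direction_to(lat, lon, dest_lat, dest_lon, heading_deg):
    """Return (turn_deg, distance_ft) guiding a walker at (lat, lon)
    with the given compass heading toward a destination waypoint.

    Uses a flat-earth approximation that is adequate over a few
    hundred metres; a real system would use proper geodesics.
    """
    # Approximate metres per degree at mid-latitudes.
    dy = (dest_lat - lat) * 111_320.0
    dx = (dest_lon - lon) * 111_320.0 * math.cos(math.radians(lat))
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    # Signed turn in (-180, 180]: negative means turn left.
    turn = (bearing - heading_deg + 180) % 360 - 180
    return turn, math.hypot(dx, dy) * 3.28084  # metres -> feet
```

Given the user's heading from a compass or successive GPS fixes, the same arithmetic can flag veering: a persistently nonzero turn angle while the user walks forward means a spoken correction is due.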

To achieve such contextual real-time directions, the system relies on numerous hardware and software components, both mobile and fixed. In addition to the headset, a blind person using the system carries a commercially available wearable personal computer about half the size of an egg carton. Workers building jet aircraft use such systems to access wiring schematics, as do workers conducting large-scale inventories. In this case, however, the wearable computer contains voice-recognition and other software. The user also carries a cell phone for wireless communication, an antenna, and a backpack containing a Global Positioning System receiver, batteries and other equipment.

Housed in a lab in UF's computer engineering building, the database server holds a Geographic Information System, or GIS, database of the UF campus. Far too immense to fit onto the wearable computer, the database contains the latitudes and longitudes of thousands of points of reference on campus, from sidewalks to buildings to streets. It also can be easily updated to include construction activities or other temporary landscape changes on campus.

The system matches the user's location - obtained using Global Positioning System technology - with the information provided by the database server in real time. The voice-recognition/wearable computer provides the user interface for the data.
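The location-matching step can be pictured as a nearest-neighbour query against the GIS points. A toy version, using the standard haversine distance and invented campus coordinates rather than UF's real database:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_feature(lat, lon, features):
    """features: list of (name, lat, lon) tuples; return the closest."""
    return min(features, key=lambda f: haversine_m(lat, lon, f[1], f[2]))
```

A production GIS would use a spatial index instead of a linear scan, but the principle - compare the GPS fix against stored reference points - is the same.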

A demonstration revealed the promise of the system as well as some of its challenges. On one hand, the system provided specific directions as requested, and communicating with the computer by voice was surprisingly easy despite its limited vocabulary. On the other, its style - more command and response than conversational - took some practice. Additionally, because the GIS database covers only the UF campus, the current system cannot be used outside the university. But in the future, Moore says, similar GIS databases could be made accessible for many other locations.

"What you would like is to be able to offer this as a service," Moore says. "You go to a city, and say, 'OK, I need to be navigated,' and it taps into the GIS database for that city."

Moore embarked on the project in part because his father, a UF math professor, is blind. Theral Moore, who tested the system, said he found it very helpful as an orientation tool. "If you're out there all alone with only a cane, you can make a little turn here and there, and first thing you know you have no idea which way is west," he said. "But if you get directions from the voice telling you what direction you're going, how far off course you are, and/or if you are leaving the middle of the sidewalk, you just feel much more comfortable."

Pat Maurer, director of community relations for the National Federation of the Blind, said there are no similarly comprehensive electronic navigation systems currently on the market. UF's Moore hopes to develop the system into a commercial product in the next two years. As they become more practical, such systems could be helpful to some blind people, Maurer said.

Mike May's comments concerning the GPS project at the University of Florida:

"It is always interesting to hear about research projects to improve navigation and access to location information. Having worked on these technologies now for 8 years, I hear about various efforts about once a quarter. It is ironic to hear in this article that "there are no similarly comprehensive electronic navigation systems currently on the market." In fact, the BrailleNote GPS from Pulse Data HumanWare is the only accessible GPS system on the market. The Sendero Group laptop system in 2000 was the first commercially available one, based upon an earlier prototype from Arkenstone. Jack Loomis at UC Santa Barbara has had a campus-based GPS system since 1993. There have been a couple of "comprehensive" systems developed in Europe, and at least one is still underway in the UK; it will be presented at the upcoming GPS conference in Portland (2002).

"There is a huge gap between product feasibility and reality. One can get centimeter accuracy with GPS at a given facility, and this is cost-effective, for example, at a large agricultural concern where every crop, row, fence and water line can be mapped. This does not, however, translate into 100% mapped points with centimeter accuracy nationwide.

"The cost of the super-accuracy hardware is high, not to mention the cost of creating the data. Good general map data has a resolution of 12 meters, and GPS accuracy without augmentation is 10 meters on average. This will remain the reality for at least a couple of years.

"As a blind person who has been hungry for location information since I could ask for it, I am thrilled by the points of interest and map data now available commercially. I am the first person to be working on improved data and better accuracy. I do hope that research efforts pointing out the possibilities of technology do not discourage blind users from learning to use and appreciate the location information we can already get with the BrailleNote GPS and other systems coming on the market, rather than waiting for the ultimate solution. After being on a very lonely soapbox for many years, I am thrilled to see several R&D efforts underway worldwide on GPS and related navigation devices. My role in all of this is to get the devices out of the research labs and test environments and into the hands of blind people."

bar

MEMS helps the blind to see again.

By Paul Marsh, EE Times UK. September 17, 2002.

A recent arrival in the race to give sight to the blind is a system that will use microelectromechanical systems (MEMS)-based electrodes to perform the functions normally carried out by cells in the eye.

The idea, funded by a $9m grant from the US Department of Energy, is being developed by a multidisciplinary team led by researchers from Oak Ridge National Laboratory.

A matrix of 1,000 tiny electrodes will be placed on the retinas of people whose photo-transducing rod and cone cells have been damaged by age-related macular degeneration or by diseases such as retinitis pigmentosa. These conditions leave intact the neural pathways that carry electrical impulses from the eye to the brain.

The MEMS elements will be implanted inside the eye's vitreous humour, directly stimulating the nerve endings to produce images of sufficient quality to read large print and distinguish between objects in a room.

At first, the system will use a "crude, shotgun approach" that fires groups of nerves. But the ultimate aim is to stimulate individual nerves. The project will increase the sensor resolution from a 10 x 10 array to a 33 x 33 array in 2004.

The principal challenge with implantable devices is packaging and biocompatibility, says Mike Daily, one of the researchers. Other issues being investigated include finding which electronic waveforms best stimulate the nerves.

Source URL: http://www.eetuk.com/story/OEG20020917S0005

bar

Scientist develops method for sound navigation

Electronic system mimics acoustic navigation abilities of blind people.

By Nicolle Wahl.

Sept. 16, 2002 - Drawing on the expertise of the blind, a University of Toronto professor is "teaching" electronic devices how to navigate using surrounding sounds.

"The goal was to build a system that mimics the acoustic navigation abilities of blind people," says Professor Parham Aarabi of the Edward S. Rogers Sr. Department of Electrical and Computer Engineering. He has developed a method by which a device fitted with as few as two microphones can combine the information from sounds around it to locate and orient itself, in the same way that an animal uses its two ears. This method achieves the same result as radar but is more adaptable to different technologies, he adds.
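Aarabi's published method is not reproduced here, but the core two-microphone idea - estimate the inter-microphone time delay, then convert it to a bearing - can be sketched as follows (brute-force cross-correlation; the sampling rate and microphone spacing are illustrative assumptions):

```python
import math

def estimate_delay(a, b, max_lag):
    """Lag (in samples) at which signal b best aligns with a,
    found by brute-force cross-correlation: b[i] ~ a[i - lag]."""
    n = len(a)
    def corr(lag):
        return sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
    return max(range(-max_lag, max_lag + 1), key=corr)

def bearing_deg(delay_samples, fs=44100, spacing_m=0.2, c=343.0):
    """Far-field bearing from an inter-microphone delay, using the
    standard approximation sin(theta) = c * dt / spacing."""
    s = max(-1.0, min(1.0, c * delay_samples / fs / spacing_m))
    return math.degrees(math.asin(s))
```

Real systems replace the brute-force loop with FFT-based correlation and handle noise and reverberation, but the geometry is the same.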

Eventually, the technology could be used in robotics or personal communication devices, such as cell phones or hand-held computers. For example, says Aarabi, cell phones that combine the signals from many microphones could filter out background noise and transmit only the clear voice of the cell phone user.

Aarabi says that communications devices using this technology could become available to consumers within five to 10 years. The study, funded by the Canada Research Chairs Program and the Canada Foundation for Innovation, will appear in an upcoming issue of IEEE Transactions on Systems, Man and Cybernetics, Part B.

Source URL: http://www.newsandevents.utoronto.ca/bin3/020916c.asp

bar

Blind Audio Tactile Mapping System (BATS)

From Wired News, about an audio tactile mapping system that is being developed at the University of North Carolina.

A New Way to Read, Not See, Maps. By Mark Tosczak.

September 25, 2002

Jason Morris uses a trackball to move a cursor across a map of ancient Britain dotted with Roman forts and cities. As he passes over a location, a speech synthesizer pronounces the name - and will spell it, too, as sometimes the computer's Latin pronunciation isn't up to snuff.

When the cursor passes over land, the sound of horses galloping comes from the computer's speakers. Move it over water and the sound of waves breaking on a beach emanates.

If he's wearing stereo headphones, Morris will even hear the sound in the correct "location" relative to the cursor - to the left or right, for instance.
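Placing a map sound to the left or right of the cursor is a standard stereo-panning calculation. A minimal constant-power sketch (the 100-pixel pan width is an invented parameter, not a BATS value):

```python
import math

def stereo_gains(source_x, cursor_x, half_width=100.0):
    """Left/right channel gains placing a sound relative to the cursor.

    pan runs from -1 (fully left) to +1 (fully right); constant-power
    panning keeps left^2 + right^2 == 1 so loudness stays steady as
    the cursor moves.
    """
    pan = max(-1.0, min(1.0, (source_x - cursor_x) / half_width))
    angle = (pan + 1.0) * math.pi / 4.0   # maps pan to 0 .. pi/2
    return math.cos(angle), math.sin(angle)  # (left, right)
```

With headphones, the gain pair alone gives a convincing left/right impression; full 3D placement would add head-related transfer functions on top.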

The software, developed as part of an undergraduate computer science class project, could give Morris, a graduate student in the classics department at the University of North Carolina at Chapel Hill, access to maps that sighted students take for granted.

"Up until this time, the blind have been more or less shut out of geographic research," he said.

The map-navigation software, dubbed Blind Audio Tactile Mapping System (BATS), takes digital map information and provides nonvisual feedback as a user moves a cursor across the map. BATS began as a software engineering class project last spring. Computer science professor Gary Bishop had been looking for a blind student to help with accessibility projects when he met Morris on a street.

Morris, who uses a guide dog to help him navigate, asked Bishop what street he was on; Bishop told him he was on a sidewalk and the two began chatting.

As it turns out, Morris had been developing classical world maps accessible to blind people at UNC-Chapel Hill's Ancient World Mapping Center. He had been working with a technology that allows raised bumps to be printed on paper, which means Braille symbols, for instance, could be printed on a map, along with some simple features like coastlines, rivers and cities.

But such a map doesn't offer as much information as a similarly sized traditional map, because Braille letters take up more space, and it shares the same ultimate page-size limitation of any printed map.

Bishop said he could do better. Last spring, he presented the problem as a choice for a required class project in his software engineering class.

Of the projects the student teams could choose from, "This one actually seemed somewhat interesting and useful," said Chad Haynes, a student on the project who has since graduated. "It was definitely something that hadn't been done before."

The team chose Python as their programming language because they could write cleaner, faster code more easily in it. But Haynes was the only one in the group who had coded in that language before, so the other four students had to learn Python as well as solve the various technical and interface problems.

The students - Haynes, plus Thomas Logan, Shawn Hunter, Elan Dassani and Anthony Perkins - were so excited about the project that toward the end of the class they asked Bishop if they could continue to work on it during the summer. Bishop made some phone calls and got funding from Microsoft to pay the students while they added improvements and refinements this summer.

Having Morris available to discuss solutions and test ideas was invaluable to the group. An early prototype used a stylus and touch screen, but Morris found holding the pen up to the screen tiring. A trackball turned out to be simpler and cheaper.

A new group of students, under Bishop's supervision, is working to add tactile feedback, using vibrating and force-feedback mice and trackballs.

Bishop envisions the software as an open-source project, and executable code and an installer can be downloaded from the project site.

Even while the software was in a fairly primitive stage last spring, Morris used it to help write a paper. "Without that map I don't think I would have been able to do any of the things I did," he said. "I drool over the possibilities of what we could have done with what we have now."

bar

The opportunity for blind individuals to drive vehicles of one kind or another will increase as technology moves forward. Some blind individuals use wheelchairs for travel, and it makes sense to put motorized systems on these vehicles and equip them with high technology. Below is an article from Canada about applying high technology to a tricycle.

National Post (f/k/a The Financial Post); Thursday, November 07, 2002

A trike built with vision: Tricycle uses sensors from Jaguar cars to guide visually impaired riders

By King Lee

VICTORIA

A research team of volunteers at the University of Victoria has adapted a feature found in Jaguar cars to build a tricycle for blind children. Dr. Nigel Livingston, a biology professor, said the idea came from a suggestion from the Queen Alexandra Centre for Children's Health -- a care centre for children with mental and physical disabilities.

"It had never occurred to me that there would be such a need," he said. "It's critical [that visually impaired children] get exercise, and exercise they enjoy," Dr. Livingston said. The challenge involved developing a sensor system to warn the rider of impending danger. The bike also needed to be very stable with a low seat so the child could touch the ground and feel reassured. The trike had to be adjustable so it could accommodate children of different ages and sizes. Another requirement: Costs had to be kept as low as possible.

The team used parts from two old tricycles to make one prototype and, through donations, were able to buy two sensors -- the same type used in the reversing systems of Ford-made Jaguars -- as well as rechargeable batteries and a recharger.

Dr. Livingston estimated the prototype trike's parts cost $400. "My dream, actually, is to be able to give these away for nothing."

Dr. Livingston had been involved with the Queen Alexandra centre, where his daughter, who is mentally disabled, has received care. "I realized that there are a lot of things that could be done for the disabled community. I thought, 'Why not utilize all the resources on campus?'"

He began to assemble his University of Victoria Assistive Technology Team in 1999. At about the same time, he began working on a project that used brain waves as a communicative device. That project earned him national and international attention.

Dr. Livingston's team has grown to about 40, including faculty, staff and students at the university. They volunteer time and expertise to develop and test devices for the disabled. On the team are machinists, computer scientists, electrical engineers, biologists, physiologists, psychologists, neuroscientists and technicians.

When Dr. Livingston proposed the trike project, he had no shortage of interested volunteers. "I had students chasing after me, saying that they would really like to do the project," he said.

His project team started with the five who came up with the original design for the trike and grew to include a professor in mechanical engineering, another student to carry out further design work and construction of the prototype, a machinist and an electronics technician.

The sensor system is able to warn children of an impending obstacle, such as a wall or person, within two to three metres. The prototype has been tested at the Queen Alexandra centre and after final adjustments are made, a second one will be built.

They are destined for homes in Langford, in Victoria's West Shore communities, and Port Alberni. Dr. Livingston said they will not apply for a patent. "I don't want to. We'll give it away. We just don't want to make money out of this."

bar

Article on GPS, published by the European Space Agency:

More autonomy for blind people thanks to satellite navigation.

Published June 4, 2003

"When blind people take a taxi, they will be able to give directions to the taxi driver!" says Jose Luis Fernandez Coya. The man speaking really knows what he is talking about: he is blind but also heads the R&D department of ONCE, the National Organization of Spanish Blind people.

This association has always been looking for helpful innovations and has just developed a GPS-based system to guide blind people. The system, called "Tormes" after a famous 16th-century Spanish story, is a computer with a Braille keyboard and satellite navigation technology that gives verbal directions.

This personal navigator was presented to the press in Madrid recently. The European Space Agency (ESA) was involved in this event because ONCE and ESA are already working on how to improve "Tormes".

The accuracy given by GPS alone is neither precise enough nor guaranteed. A new tool developed by ESA could be the breakthrough: EGNOS (European Geostationary Navigation Overlay Service). EGNOS corrects the GPS signals and gives an accuracy of 2 m, while GPS alone provides an accuracy of only 15 to 20 m. It also warns users of any problem with the signal, thus providing integrity information.

EGNOS signals are transmitted to the ground via geostationary satellites, so they are sometimes blocked by buildings - the so-called canyon effect. To solve this problem, ESA engineers had the idea of delivering the data through the Internet via a GSM connection, a project called SISNeT (Signal In Space through Internet). This makes EGNOS available anywhere downtown. For a blind person walking in a town, the improved accuracy makes all the difference: with 15 m accuracy (GPS) you may already have crossed the street, while with 2 m or better (EGNOS) you know which pavement you are on.
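The practical difference between the two accuracy figures comes down to whether the error radius is smaller than half a street's width. As a back-of-the-envelope check (the 10 m street width is an assumed figure):

```python
def can_resolve_pavement(accuracy_m, street_width_m=10.0):
    """True if a position fix with the given error radius can
    distinguish which side of the street the user is standing on,
    i.e. the error is under half the street width."""
    return accuracy_m < street_width_m / 2.0
```

By this rough test, a 15 m GPS fix cannot tell one pavement from the other, while a 2 m EGNOS-corrected fix can.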

Tormes, the hand-held device developed by ONCE along with the Spanish company GMV Sistemas, speaks to the user like any GPS device in a car, but weighing less than one kilo it can be carried over the shoulder. It can be used in two ways: to guide the user to their destination or to tell them where they are as they walk around.

Ruben Dominguez, a blind mathematician who has tried out the device, says: 'This complements what already exists for assisting blind people, the dog or the white cane, but furthermore it will really improve the life of the blind community by giving a lot more autonomy when moving around town, especially in unknown places.'

Testing of the prototype is ongoing, but already the results indicate that a revolution is under way for blind people. When EGNOS becomes operational in spring 2004, blind people can expect unprecedented assistance that gives them more autonomy. EGNOS, an initiative of ESA, the European Commission and Eurocontrol for use in civil aviation and other new services, paves the way for Galileo, the first civil global satellite navigation system.

Source URL: http://www.esa.int/export/esaCP/SEMVQOS1VED_index_0.html