
GB2440583A - A portable route planning and object identification device for the visually impaired - Google Patents


Info

Publication number
GB2440583A
Authority
GB
United Kingdom
Prior art keywords
data
portable device
user
voice
scanning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0615559A
Other versions
GB0615559D0 (en)
Inventor
Tom Pey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guide Dogs for Blind Association
Original Assignee
Guide Dogs for Blind Association
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guide Dogs for Blind Association filed Critical Guide Dogs for Blind Association
Priority to GB0615559A priority Critical patent/GB2440583A/en
Publication of GB0615559D0 publication Critical patent/GB0615559D0/en
Priority to PCT/GB2007/002649 priority patent/WO2008015375A1/en
Publication of GB2440583A publication Critical patent/GB2440583A/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/006 Teaching or communicating with blind persons using audible presentation of the information
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G09B29/006 Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
    • G09B29/008 Touring maps or guides to public transport networks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Educational Technology (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Automation & Control Theory (AREA)
  • Rehabilitation Therapy (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Ecology (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

A voice-activated portable device to assist visually impaired people comprises scanning means for scanning an object selected by a user and a memory for storing information regarding a scanned object. A sensor and identification means locate an object previously scanned. The scanning means may be able to read electronic tags attached to objects and the portable device may store a voice signal together with the identity of the scanned object. Additionally, means for planning a route to a destination selected by the user are provided, which includes identifying the correct transport for travel to the selected destination. The sensor identification and journey means are able to communicate with a wireless system through a communicating means. A monitoring means may also be provided for monitoring and storing medical information related to the user.

Description

<p>ASSISTANCE DEVICE FOR BLIND AND PARTIALLY SIGHTED</p>
<p>PEOPLE</p>
<p>The present invention relates to a voice activated device suitable for assisting blind and partially sighted people.</p>
<p>In physical space, visually impaired users still depend mainly on basic low-technology aids such as canes, guide dogs or spoken directions, and can reach only known destinations along familiar, learned routes. Tactile maps exist, but they are hard to use and often ineffective, and so are not widely used. In general, low-technology aids are not intuitive enough for travel to new or unknown destinations, and access to unfamiliar social environments such as a hospital or shopping mall is restricted.</p>
<p>Visually impaired people have great difficulties participating in necessary daily activities such as shopping, going to hospital, a bank or just visiting a friend.</p>
<p>More recently, there have been attempts to use technology to detect nearby objects. For example, some high-tech devices detect potential hazards at head level. Several more advanced navigation systems, such as GPS-Talk, utilise the Global Positioning System (GPS) and offer potential to develop outdoor way-finding solutions. However, to realise this potential, devices must be integrated with real-time sensory information about users' dynamic environment. Such a device is the front end of an integration of complex technologies, and must be affordable, wearable and intuitive.</p>
<p>In virtual space, access to the Web is very difficult, and often impossible.</p>
<p>Sensor-based technology applications, such as smart homes and telecare, offer users some limited alternatives. However, in order to fully address the needs of blind and partially sighted people, Internet-mediated services have to be dedicated to users' needs and living environments.</p>
<p>The global incidence of blindness has increased in the last twelve years but the incidence of blindness in the population over 50 years of age has increased even more. Therefore, there is the need, not only to prevent blindness and to find effective strategies to combat its progression, but also there is an urgent requirement to find appropriate, effective rehabilitative and assistance strategies in order to promote independent functioning in the growing visually impaired population.</p>
<p>In the United Kingdom, services to people with significant sight loss are well established and comprise, in the main, mobility and rehabilitation.</p>
<p>However, these services, which tend to be fragmented and/or geographically dependent, are limited in their potential and are not intuitive in function.</p>
<p>People with sight loss experience significant difficulties in undertaking every day tasks and achieving daily goals. Various aids, ranging from tactile maps to touch pad orientation systems exist. However the inherent limitations of such systems prevent their widespread use by people with sight loss.</p>
<p>Considerable advances in both science and technology have, however, resulted in a variety of low and high tech mobility devices ranging from sonic canes, guides and pathfinders to sophisticated sensory substitutions such as echo-locators, and to more complex navigation, orientation and location aware systems.</p>
<p>For example, there are products such as sonar canes, the Sonic Guide and the Sonic Pathfinder, which detect obstacles and hazards. The vOICe, on the other hand, translates video images from a regular PC camera into sounds so that users can "see with their ears". GPS-Talk, together with MoBIC and the VisuAide Trekker, are navigation systems that utilise GPS (Global Positioning System) and wireless communication networks for outdoor way-finding. Orientation and navigation are achieved by the use of sophisticated systems such as Talking Signs technology; the system consists of short audio signals sent by invisible infrared light beams from permanently installed transmitters, which are placed at sign locations or on key environmental features, to a hand-held receiver that decodes the signal and delivers the voice message through its speaker or headset.</p>
<p>Other systems such as indoor location systems are all encompassing and are, in the main, sensor-based technologies providing location information, space identifiers, and position coordinates. Cricket, for example, piggybacks onto applications running on handheld computers, laptops, and sensors and is designed for use indoors and in urban areas where outdoor systems such as Global Positioning Systems (GPS) cannot be used.</p>
<p>Other technologies relevant to the present invention are RFID (Radio Frequency Identification) tags. The usage of RFID tags is increasing strongly and they are already being used for the following applications:
* cards for transport
* access control/corporate ID
* libraries/archives
* raw material inventory in industry
There are also some original applications, such as RFID tags embedded in plates at sushi bars to calculate the customer's bill. New applications are expected in baggage delivery and in inventory management, as well as in medical supervision. RFID technology is also being studied to develop a system to help visually impaired people navigate.</p>
<p>Card-shaped products for personal usage, such as telephone cards or commuter passes, have traditionally accounted for the majority of RFID usage, but future growth is especially expected in retail- and logistics-related applications. The international standardisation of RFID in the high-frequency band (ISO/IEC 14443 proximity applications, ISO/IEC 15693 vicinity applications) is expected to facilitate wider usage of RFID.</p>
<p>It is evident that there is an abundance of assistive technologies; however, they exist in isolation and are, as a consequence, of limited use to the visually impaired person. Echo locators, for example, offer users little or no information about an object, only its location. Access to transport and to unfamiliar destinations remains problematic, time-consuming and challenging, and individual preferences and personal objectives when travelling are frequently realised only with difficulty or, indeed, not at all.</p>
<p>As mentioned above, there are a number of relevant technologies that provide information for blind and partially sighted people. They are generally, however, single-service providers and thus are severely limited in use. A device which suits a person who is partially sighted can be completely useless for a totally blind person.</p>
<p>There is thus an urgent requirement to address the diverse and changing needs of the blind and partially sighted population by integrating existing technologies such as sensor and data networks, mobile and wearable smart devices and remote healthcare monitoring into one simple device i.e. a multi-modal integrated technology aid. Such technology is not available at the present time.</p>
<p>SUMMARY OF THE INVENTION</p>
<p>The objective of the present invention is to solve the above problems and to pull together diverse technologies, which would otherwise evolve independently.</p>
<p>In general terms the invention provides a device comprising a laser scanning technology, mobile service facility, voice recognition facilities, sensory identification, memory, data acquisition and storage.</p>
<p>A first specific object of the present invention is to use the device in "smart homes", by virtual travel communities and by emergency response agencies. Moreover, the device could be used in shops and supermarkets for automated check-out.</p>
<p>To attain the above objective in the invention's primary mode, it is configured as a voice-activated, portable device to assist visually impaired people comprising: communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination; wherein the sensor identification and journey means are able to communicate with the network through the communicating means.</p>
<p>In its secondary mode, the invention is configured with a wireless system for assisting visually impaired people comprising: a number of portable devices according to the first aspect of the invention and connected to a sensor network; receiving means adapted to receive data from the portable devices and to transmit said data to a server which is adapted to store the data; transmitting means adapted to transmit data from the server to the portable devices; selecting means adapted to select the data transmitted by the transmitting means according to each portable device; wherein each portable device is adapted to receive and store the selected data, and to update its existing data.</p>
<p>In addressing a third aspect of its use, the present invention is configured as a method for assisting visually impaired people using a wireless system comprising a number of portable devices according to its primary mode and connected to a sensor network comprising: receiving data from the portable devices and transmitting said data to a server for storage; transmitting data from the server to the portable devices; selecting the data to be transmitted by the transmitting means according to each portable device. In this way, each portable device receives and stores the selected data, and updates its own existing data.</p>
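The per-device data selection step in the wireless system described above can be sketched in Python. This is an illustrative assumption only: the field names, the "seq" sequence-number scheme and the `select_updates` function are all invented here, not taken from the patent.

```python
# Hypothetical sketch of the server-side selection step: the server
# stores all incoming records and, when updating a given portable
# device, selects only records addressed to it (or broadcast to all)
# that the device has not yet seen.

def select_updates(store, device_id, since):
    """Return records newer than `since` that this device should receive."""
    return [r for r in store
            if r["device"] in (device_id, "all") and r["seq"] > since]

# A toy data store as the server might hold it:
store = [
    {"device": "all",   "seq": 1, "payload": "map update"},
    {"device": "dev-7", "seq": 2, "payload": "route change"},
    {"device": "dev-9", "seq": 3, "payload": "tag list"},
]

# Device dev-7, which has seen nothing yet, receives the broadcast
# record and its own route change, but not dev-9's tag list.
updates = select_updates(store, "dev-7", since=0)
```

Each device would then store the received records and advance its own `since` counter, corresponding to the "receive and store the selected data, and update its existing data" step.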
<p>BRIEF DESCRIPTION OF THE DRAWINGS</p>
<p>FIG. 1 is a schematic view of a device according to the invention; and FIG. 2 is a block diagram of functionality according to an embodiment of the present invention.</p>
<p>DESCRIPTION OF THE PREFERRED EMBODIMENTS</p>
<p>Below, preferred configurations of the present invention are explained by way of example with reference to the accompanying drawings.</p>
<p>Figure 1 is a schematic view showing the first configuration of the present invention. Figure 2 is a table setting out a range of desired functionality for the device and uses and advantages of the different functionalities.</p>
<p>The device of the present invention comprises laser scanning means, voice recognition means, sensory identification means and storage means.</p>
<p>The device is portable, suitable to be hand or pocket held, wireless and lightweight, weighing less than 250 grams. It can be upgraded on demand, in a similar way to the upgrading of a computer and can be configured for either home base services, i.e. indoor use only, using a form of "wall-mounted control board" or for indoor and outdoor use. If the latter, subscription to the following TalkingGadget service will usually be preferred.</p>
<p>The device is able to provide and support three main functionalities: talking@object, talking@travel and talking@care.</p>
<p>The selection of the mode of operation of the device is by push buttons generally indicated in Figure 1. Textured or tactile buttons of various shapes will select specific functions and can be easily identified by touch by a visually impaired user.</p>
<p>For example, a square button can be provided to select the talking@object function, a round button to select the talking@travel function and a triangular button to select the talking@care function.</p>
<p>The user is provided with a supply of thin electronic tags that the device can read using the scanning facility. Each tag is attached to an object selected by the user if the object is not already provided with a tag or tag equivalent, which can be read by the scanning facility already inbuilt.</p>
<p>The user can select the talking@object mode using the square button.</p>
<p>Then the user uses the device in the talking@object mode, also referred to as "objectFinder" mode, to scan the object once and assigns a preferred name to the object via the device; this name is recorded and associated with the tag identity.</p>
<p>Subsequently, when the user wants to know the location of the tagged object, he or she speaks the assigned name; the device recognises it and, using the sensor and identification means, tells the user where the object is.</p>
<p>Similarly, if a user wishes to identify an object, the device in the talking@object mode can scan the object and use the scanned tag identity to inform the user what the object is.</p>
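The scan-once, name-once objectFinder workflow described above can be illustrated with a small sketch. All class and method names here are invented for illustration and do not come from the patent:

```python
# Hypothetical sketch of the objectFinder workflow: a tag is scanned
# once, the user assigns a spoken name, and either the name or the tag
# identity can later be used to look the other up.

class ObjectRegistry:
    """Maps tag identities to user-assigned names, and back."""

    def __init__(self):
        self._name_to_tag = {}
        self._tag_to_name = {}

    def register(self, tag_id: str, spoken_name: str) -> None:
        # Called once, after the first scan of a newly tagged object.
        key = spoken_name.strip().lower()
        self._name_to_tag[key] = tag_id
        self._tag_to_name[tag_id] = spoken_name

    def tag_for(self, spoken_name: str):
        # "Where are my keys?" -> the tag identity to search for.
        return self._name_to_tag.get(spoken_name.strip().lower())

    def name_for(self, tag_id: str):
        # A scanned but unrecognised object -> its stored name, if any.
        return self._tag_to_name.get(tag_id)


registry = ObjectRegistry()
registry.register("TAG-0042", "house keys")
found_tag = registry.tag_for("House Keys")   # lookup is case-insensitive
found_name = registry.name_for("TAG-0042")
```

In a real device the name would arrive via the voice recognition means and the tag identity via the scanning means; here both are plain strings.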
<p>Further information about the tagged object can also be stored in the device or an associated storage device able to communicate with the device, such as a home computer so that, on scanning the tag associated with the object, the device will inform the user of this further information.</p>
<p>For example, in addition to what the object is, its colour or other attributes can be described.</p>
<p>This further information allows the device in the talking@object or objectFinder mode to be used as a "mirror": the device reads tagged clothes and gives a description of the clothes, e.g. their colour, to the user.</p>
<p>The talking@travel or "travelFinder" mode can be selected using the round button. In this mode, before travelling, the destination is fed into the device, preferably using a voice recognition system. The device then either plans the route or updates previous information, and tells the individual how to get to the desired destination and how long it will take.</p>
<p>The device will identify the correct bus, for example, and announce the stop at which the user has to alight.</p>
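The travelFinder flow, in which a spoken destination selects a route and the device announces each leg and the stop at which to alight, can be sketched as follows. The route data, place names and function name are invented examples, not part of the patent:

```python
# Illustrative sketch of the travelFinder flow: the spoken destination
# selects a stored (or freshly planned) route, and the device announces
# each leg, including which bus to take and where to alight.

ROUTES = {
    "hospital": [
        ("walk", "to the bus stop on Elm Road"),
        ("bus 27", "alight at City Hospital, 4 stops"),
    ],
}

def announce_route(destination: str) -> list[str]:
    """Return the spoken announcements for a destination, leg by leg."""
    legs = ROUTES.get(destination.strip().lower())
    if legs is None:
        return ["Destination not known; please repeat."]
    return [f"Take {mode}: {detail}" for mode, detail in legs]

steps = announce_route("Hospital")
```

In practice the route would be planned or updated from network data rather than a fixed table; the table stands in for that step.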
<p>The talking@care or "careFinder" mode is selected using the triangular button. In this mode the device, when attached to a specific add-on medical sensor, will wirelessly collect and transmit data, such as blood pressure, weight and blood sugar levels, to a home computer to be analysed. Alternatively, the device could either transmit the data to a doctor's surgery immediately or store it temporarily for later transmission or periodic retrieval.</p>
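The careFinder data handling, in which a reading is either transmitted immediately or stored for later transmission, can be sketched like this. The class, field names and record format are illustrative assumptions only:

```python
import json
import time
from collections import deque


class CareBuffer:
    """Buffers medical readings for immediate or deferred transmission,
    as in the careFinder mode described above (names are invented)."""

    def __init__(self, send):
        self.send = send          # callable that transmits one record
        self.pending = deque()    # readings held for later retrieval

    def record(self, kind: str, value: float, transmit_now: bool = False):
        sample = {"kind": kind, "value": value, "t": time.time()}
        if transmit_now:
            self.send(json.dumps(sample))   # e.g. straight to the surgery
        else:
            self.pending.append(sample)     # stored for later transmission

    def flush(self):
        # Periodic retrieval: transmit everything held back so far.
        while self.pending:
            self.send(json.dumps(self.pending.popleft()))


sent = []
buf = CareBuffer(sent.append)
buf.record("blood_pressure", 120.0)            # stored locally
buf.record("weight", 70.0, transmit_now=True)  # transmitted at once
buf.flush()                                    # stored reading follows
```

The `send` callable stands in for whatever wireless transport the device uses; here it simply appends to a list so the behaviour can be inspected.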
<p>In Fig. 2, the benefits to the user achieved by the careFinder mode are listed in a table.</p>
<p>The laser scanning means is based on a principle similar to bar code recognition and thus will be multi-purpose, allowing for identification of tagged objects of all kinds. Foodstuffs, for example, will usually be bar coded, at source, i.e. by the manufacturer. The bar code would ideally identify the product, its cost and contents. It may also be possible to extend the bar code so that cooking instructions and recipes can also be included on the packet, tin or bag. At the present time, some clothing manufacturers include "Braille information" on clothing tags. However, there are relatively few who provide this facility and indeed relatively few visually impaired people who use Braille as a communication method. Consequently, bar coding clothing to include cost, size, colour, material and washing instructions would be, without doubt, hugely helpful to the visually impaired person.</p>
<p>Sensor identification means will allow the visually impaired person to locate specific objects. Each object, when bought, may have an identification bar code or alternatively could be "tagged", ideally by the manufacturer and/or by the visually impaired person. The object information, when scanned, would then be stored either in the device itself or sent to a home-based computer and/or a central server. The latter could be provided by an Internet agency, phone company or an independent provider. The device would behave like a personal server and data distributor (data hub). Thus data storage and management would be possible both locally and remotely. If stored locally, i.e. in the device or on a home computer, vital information could still also be sent to the central server as a back-up facility.</p>
<p>Another aspect of the present invention relates to a system for public transport comprising a service provider, a computer and detecting means.</p>
<p>Bus and train companies could subscribe to a service provider, and each mode of transport would have inbuilt sensors capable of communicating with a computer into which is programmed route information, timetables and travel times, in much the same way as GPS and navigation systems currently operate. Even if bus numbers were altered mid-route, this would be recognised by the computer and the visually impaired person would be informed accordingly. The system would also be capable of identifying the correct bus and where it was in relation to the individual and/or bus stop.</p>
<p>Another functional aspect of the invention is provided by a med-alert, which is an added health and safety feature and can be tailored to each user. The user monitors various aspects of body function by attaching body sensors. The information is picked up by the device and then sent to a remote server, which is then accessed by the medical practice responsible for the individual's care. The practice would, however, need to subscribe to such a service. Thus significant changes in a visually impaired person's health status would be recorded, and immediate action by the health agencies involved would then be possible.</p>
<p>Preferably, the device will have a face recognition and fingerprint security facility requiring biometric verification of user identity to enable use of the device.</p>
<p>A further extension of the invention is to use the device to support the TalkingGadget service, which tailors the device to the specific needs of each user: for example, personalised voice commands, unique privacy protection and the creation of home-based services as alternative mobility solutions are all possible.</p>
<p>The service uses the voice-commanded device to enable users to identify, find and manage tagged objects and object information indoors and/or outdoors. The device, as previously indicated, can be designed to access a wireless sensor network, which will then send the information to either a home-based server or a central (remote) service centre. The device will act like a personal service gateway once services are created, subscribed to and activated.</p>
<p>The service may have particular appeal and could ultimately be extended to other ad hoc mobile services, and provide a platform on which various services could be developed.</p>
<p>The embodiments described above are exemplary, not exhaustive; the skilled person will be able to envisage other alternatives within the scope of the invention, as set out in the claims.</p>

Claims (1)

  1. <p>WHAT IS CLAIMED IS: A voice-activated portable device to assist
    visually impaired people comprising: communicating means for transmitting and receiving data to and/or from a network; scanning means for scanning an object selected by a user; memory means for storing information regarding a scanned object; sensor and identification means for locating an object previously scanned; journey means for planning a route to a destination selected by the user and for identifying the correct transport for travel to a selected destination, wherein the sensor identification and journey means are able to communicate with the network through the communicating means.</p>
    <p>2. The voice-activated portable device according to claim 1, further comprising monitoring means for monitoring medical information related to the user and storing said information in the memory means.</p>
    <p>3. The voice activated portable device according to claim 1 or claim 2, further comprising textured and shaped tactile control buttons.</p>
    <p>4. The voice activated portable device according to any preceding claim, further comprising biometric means adapted to identify the user and enable operation of the device.</p>
    <p>5. The voice activated portable device according to claim 4, in which the biometric means is adapted to identify the user's face or fingerprints.</p>
    <p>6. The voice activated portable device according to any preceding claim, wherein the communication to the network is wireless and the device is adapted to be used both indoors and outdoors.</p>
    <p>7. The voice activated portable device according to any preceding claim, in which the scanning means is adapted to read electronic tags attached to an object selected by the user when scanning the object.</p>
    <p>8. The voice activated portable device according to claim 7 and adapted to store a voice signal together with the identity of said object after scanning the object and to subsequently recognise said stored voice signal.</p>
    <p>9. The voice activated portable device according to claim 8, adapted to respond to a voice signal recognised as said stored voice signal by informing the user of the location of said object by using the sensor and identification means.</p>
    <p>10. The voice activated portable device according to any preceding claim and adapted to be upgradeable on demand by the user.</p>
    <p>11. The voice activated portable device according to any preceding claim and having a weight of less than 250 grams.</p>
    <p>12. A wireless system for assisting visually impaired people comprising: a plurality of portable devices as defined in any preceding claim connected to a sensor network; receiving means adapted to receive data from the portable devices and to transmit said data to a server which is adapted to store the data; transmitting means adapted to transmit data from the server to the portable devices; selecting means adapted to select the data transmitted by the transmitting means according to each portable device; wherein each portable device is adapted to receive and store the selected data, and to update its existing data.</p>
    <p>13. A wireless system according to claim 12, wherein the transmitting means is adapted to transmit medical data to a service centre to be analysed.</p>
    <p>14. A method for assisting visually impaired people using a wireless system comprising a plurality of portable devices as defined in any one of claims 1 to 13 connected to a sensor network; and comprising the steps of: receiving data from the portable devices and transmitting said data to a server which stores the data; transmitting data from the server to the portable devices; selecting the data to be transmitted by the transmitting means according to each portable device; whereby each portable device receives and stores the selected data, and updates its existing data.</p>
    <p>15. A device substantially as shown in or as described with reference to Figure 1 of the accompanying drawings.</p>
GB0615559A 2006-08-04 2006-08-04 A portable route planning and object identification device for the visually impaired Withdrawn GB2440583A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0615559A GB2440583A (en) 2006-08-04 2006-08-04 A portable route planning and object identification device for the visually impaired
PCT/GB2007/002649 WO2008015375A1 (en) 2006-08-04 2007-07-13 Assistance device for blind and partially sighted people

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0615559A GB2440583A (en) 2006-08-04 2006-08-04 A portable route planning and object identification device for the visually impaired

Publications (2)

Publication Number Publication Date
GB0615559D0 GB0615559D0 (en) 2006-09-13
GB2440583A true GB2440583A (en) 2008-02-06

Family

ID=37027279

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0615559A Withdrawn GB2440583A (en) 2006-08-04 2006-08-04 A portable route planning and object identification device for the visually impaired

Country Status (2)

Country Link
GB (1) GB2440583A (en)
WO (1) WO2008015375A1 (en)

US10912281B2 (en) 2016-02-24 2021-02-09 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for communicating with a guide animal
US9829322B2 (en) 2016-03-03 2017-11-28 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for directing a vision-impaired user to a vehicle
US10024680B2 (en) 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
US9996730B2 (en) 2016-03-18 2018-06-12 Toyota Motor Engineering & Manufacturing North America, Inc. Vision-assist systems adapted for inter-device communication session
US9958275B2 (en) 2016-05-31 2018-05-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for wearable smart device communications
US10561519B2 (en) 2016-07-20 2020-02-18 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device having a curved back to reduce pressure on vertebrae
US10432851B2 (en) 2016-10-28 2019-10-01 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable computing device for detecting photography
US10012505B2 (en) 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10521669B2 (en) 2016-11-14 2019-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for providing guidance or feedback to a user
US10172760B2 (en) 2017-01-19 2019-01-08 Jennifer Hendrix Responsive route guidance and identification system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5470233A (en) * 1994-03-17 1995-11-28 Arkenstone, Inc. System and method for tracking a pedestrian
US20020017992A1 (en) * 2000-06-05 2002-02-14 Hitoshi Hidaka Article identifying system
NL1016812C2 (en) * 2000-12-06 2002-06-07 Sjirk Van Der Zee Route navigation method, especially for blind or visually impaired people, detects user position and sends information from central database server to user
US20020121986A1 (en) * 2001-02-07 2002-09-05 William Krukowski Method and system for identifying an object and announcing a voice message
FR2839805A1 (en) * 2002-05-17 2003-11-21 Florence Daumas Speech synthesising transport information unit includes microphone and voice recognition unit linked to algorithm providing itinerary information
JP2004117094A (en) * 2002-09-25 2004-04-15 Nec Fielding Ltd Personal navigation system
US20050099318A1 (en) * 2003-11-06 2005-05-12 International Business Machines Corporation Radio frequency identification aiding the visually impaired with synchronous sound skins

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2388194A (en) * 2002-05-02 2003-11-05 Nec Technologies Remote medical monitor utilising a mobile telephone
WO2005008914A1 (en) * 2003-07-10 2005-01-27 University Of Florida Research Foundation, Inc. Mobile care-giving and intelligent assistance device
EP1685794B1 (en) * 2003-11-18 2013-02-27 Sony Corporation Input device and input method

Also Published As

Publication number Publication date
WO2008015375A1 (en) 2008-02-07
GB0615559D0 (en) 2006-09-13

Similar Documents

Publication Publication Date Title
WO2008015375A1 (en) Assistance device for blind and partially sighted people
US6977579B2 (en) Radio frequency identification aiding the visually impaired
KR101835832B1 (en) Calling system for using local area wireless communication
US7688211B2 (en) Apparatus and method for enhancing face-to-face communication
US20060109083A1 (en) Method and apparatus for accessing electronic data about at least one person of interest
US8094012B1 (en) Active composite RFID tag for object localization and instruction
USRE41171E1 (en) System for monitoring a person's location in a defined area
WO2004032019A3 (en) Universal communications, monitoring, tracking, and control system for a healthcare facility
KR101253337B1 (en) System for prrotection missing using location-aware
US9626697B2 (en) Method and apparatus for accessing electronic data via a plurality of electronic tags
US9977938B2 (en) Method and apparatus for accessing electronic data via a plurality of electronic tags
MX2015002352A (en) Guiding users in an area.
KR20150067417A (en) Handle in popular traffic with NFC tag and advertisement system and method thereof
KR100754548B1 (en) Electronic tag positioning mobile communication terminal and location information providing system and service method
Murad et al. RFAIDE—An RFID based navigation and object recognition assistant for visually impaired people
US9429446B1 (en) Navigation device for the visually-impaired
US20170269799A1 (en) Method and apparatus for accessing electronic data via a plurality of electronic tags
US7375641B2 (en) Centralized implementation of portal announcing method and system
EP3907985B1 (en) Systems and methods for providing a shoulder speaker microphone device with an integrated thermal imaging device
US20180017393A1 (en) Handicap assist apparatus
Liu et al. On smart-care services: Studies of visually impaired users in living contexts
US11687754B1 (en) Automated location capture system
JP2004334439A (en) Corporeal thing information management system
KR20110115205A (en) User-tailored content providing system and method using wireless recognition technology
JP2004102629A (en) Movable visitor ID acquisition device, information distribution device, information distribution method, information distribution program

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)