
US10912281B2 - Systems and methods for communicating with a guide animal - Google Patents


Info

Publication number
US10912281B2
Authority
US
United States
Prior art keywords
command
vision
user
guide animal
animal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US15/052,495
Other versions
US20170238509A1 (en)
Inventor
Rajiv Dayal
Fredrick W. Mau II
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Engineering and Manufacturing North America Inc
Original Assignee
Toyota Motor Engineering and Manufacturing North America Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Engineering and Manufacturing North America Inc filed Critical Toyota Motor Engineering and Manufacturing North America Inc
Priority to US15/052,495
Assigned to TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC. Assignors: MAU II, FREDRICK W.; DAYAL, RAJIV
Publication of US20170238509A1
Application granted
Publication of US10912281B2

Classifications

    • A01K27/009: Leads or collars, e.g. for dogs, with electric-shock, sound, magnetic- or radio-wave emitting devices
    • A01K15/021: Electronic training devices specially adapted for dogs or cats
    • A01K29/00: Other apparatus for animal husbandry
    • A61H3/061: Walking aids for blind persons with electronic detecting or guiding means
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • A61H2003/063: Walking aids for blind persons with electronic detecting or guiding means, with tactile perception
    • A61H2201/1604: Physical interface with patient: head
    • A61H2201/1609: Physical interface with patient: neck
    • A61H2201/1614: Physical interface with patient: shoulder, e.g. for neck stretching
    • A61H2201/165: Physical interface with patient: wearable interfaces
    • A61H2201/5012: Control means, computer controlled, connected to external computer devices or networks using the internet
    • A61H2201/5048: Interfaces to the user: audio interfaces, e.g. voice or music controlled
    • A61H2201/5058: Sensors or detectors
    • A61H2201/5092: Optical sensor
    • A61H2201/5097: Control means, wireless

Definitions

  • Embodiments described herein generally relate to communicating with a guide animal and, more specifically, to a vision-assist apparatus that communicates with a guide animal.
  • One embodiment of a method includes determining a first command to provide to a user to proceed to a destination, determining a second command to provide to the guide animal to correspond with the first command, and outputting the first command to direct the user toward the destination. Some embodiments include outputting the second command to direct the guide animal toward the destination, where the second command is outputted to be imperceptible by the user but is perceptible by the guide animal.
  • In another embodiment, a system includes a first output device that provides a first output type that is perceptible by a vision-impaired user and a second output device that outputs a second output type that is imperceptible by the vision-impaired user but is perceptible by the guide animal utilized by the vision-impaired user.
  • Embodiments of the system may also include a vision-assist computing device that includes a processor and a memory component.
  • the memory component may store logic that causes the vision-assist computing device to determine instructions to a destination, determine a first command to provide to the vision-impaired user to proceed to the destination, and determine a second command to provide to the guide animal to correspond with the first command.
  • the logic causes the system to provide the first command via the first output device and provide the second command via the second output device.
  • In yet another embodiment, a system includes a vision-assist apparatus that includes an output device that provides a first output type that is perceptible by a vision-impaired user and a vision-assist computing device comprising a processor and a memory component, the memory component storing logic that causes the vision-assist apparatus to determine directions to a destination, determine a first command to provide to the vision-impaired user to proceed to the destination, and determine a second command to provide to the guide animal to correspond with the first command.
  • the logic causes the vision-assist apparatus to output the first command via the first output device and provide data related to the second command to a guide animal apparatus.
  • the data is received by the guide animal apparatus, where the guide animal apparatus includes a guide animal output device.
  • the second command is output via the guide animal output device.
  • FIG. 1 depicts a communication environment for communicating with a guide animal, according to embodiments described herein;
  • FIG. 2 depicts a flowchart for communicating with a guide animal, according to embodiments described herein;
  • FIG. 3 depicts a flowchart for utilizing a vision-assist apparatus and a guide animal apparatus to communicate with a guide animal, according to embodiments described herein;
  • FIG. 4 depicts a flowchart for training a guide animal, according to embodiments described herein; and
  • FIG. 5 depicts computing infrastructure for communicating with a guide animal, according to embodiments described herein.
  • Embodiments disclosed herein include systems and methods for communicating with a guide animal. Some embodiments are directed to utilizing a vision-assist apparatus to more effectively communicate information to which the guide animal might not otherwise be aware.
  • the vision-assist apparatus may include a transducer, a speaker, and a tactile output device.
  • the transducer may be configured to output high frequency signals that are imperceptible by a user, such as a vision-impaired user, but may be recognized by the guide animal.
  • the high frequency signals correspond with the guide animal's training and may also correspond with tactile output from the tactile output device and/or lower frequency audio outputs provided to the user for navigating an environment.
  • the guide animal may have a collar or harness that includes a tactile output device that communicates with the vision-assist apparatus to provide tactile commands to the guide animal that coincide with the commands provided to the user.
  • some embodiments may be configured for training the guide animal or the user of the vision-assist apparatus.
  • the guide animal may be trained with traditional commands, as well as human inaudible high frequency signals that instruct the animal.
  • the vision-assist apparatus may also include sensors to determine the traditional commands that are provided to the guide animal by the user. When the user is using the vision-assist apparatus, the user may provide traditional commands to the guide animal. If a command provided by the user does not match the desired action for the guide animal (and thus the inaudible command from the vision-assist apparatus), output may be provided to the user for instructing that the command was incorrect and providing an indication of the correct command.
  • FIG. 1 depicts a communication environment for communicating with a guide animal 106 , according to embodiments described herein.
  • the communication environment may include a network 100 , which is coupled to a user computing device 102 , a vision-assist apparatus 104 , a guide animal apparatus 109 , and a remote computing device 110 .
  • the network 100 may include a wide area network, such as the internet, a public switched telephone network, a cellular network, and the like.
  • the network 100 may include a local area network, such as a wireless fidelity network, an Ethernet, a Bluetooth network, a Zigbee network, and the like.
  • the user computing device 102 may include a memory component 140 a that stores direction logic 144 a and vision-assist logic 144 b .
  • the direction logic 144 a may cause the user computing device 102 to receive global positioning data and/or other data and determine directions and/or a route to a destination.
  • the vision-assist logic 144 b may be configured as a mobile application and may cause the user computing device 102 to communicate with the vision-assist apparatus 104 to provide location data, user preferences for the vision-assist apparatus 104 and/or the guide animal apparatus 109 .
  • the communication between the user computing device 102 and the vision-assist apparatus 104 may include data related to directions, destinations, and/or commands that may be provided by the vision-assist apparatus 104 to the user and/or guide animal 106 .
  • the vision-assist apparatus 104 may include a vision-assist computing device 105 and a memory component 140 b for communicating with a guide animal 106 and/or guide animal apparatus 109 .
  • the vision-assist apparatus 104 may include a necklace module 104 a and/or an eyeglass module 104 b .
  • the eyeglass module 104 b may include at least one camera 122 (such as an eye tracking camera and/or environment camera), at least one speaker 124 , and/or at least one tactile output device 126 .
  • the necklace module 104 a may include at least one camera 121 a , 121 b (such as an eye tracking camera and/or an environment camera), at least one speaker 123 a , 123 b (such as an audible speaker and/or a high frequency speaker), and/or at least one tactile output device 125 a , 125 b .
  • the vision-assist apparatus 104 may also include one or more inertial measurement units, tactile input hardware, one or more microphones, one or more tactile feedback devices, one or more location sensors, one or more lights, one or more proximity sensors, one or more batteries, one or more charging ports, global positioning hardware, and/or other hardware or software.
  • Although the vision-assist apparatus 104 is depicted in FIG. 1 as including the eyeglass module 104 b and/or the necklace module 104 a , this is just an example. Some embodiments may be configured as a bracelet, handheld device, clothing, headset, etc.
  • the speakers 123 a , 123 b , 124 may be configured for providing audio output in the audible frequency range for humans (such as from about 20 Hertz to about 20 Kilohertz). Additionally, at least one of the speakers 123 a , 123 b , 124 may be configured to output signals in the inaudible frequency range for humans, but in the audible frequency range for the respective guide animal 106 . As an example, if the guide animal 106 is a canine, the frequency range of the inaudible output may reside from about 23 Kilohertz to about 64 Kilohertz, such that the guide animal 106 may perceive the command, but the user (and other persons) cannot.
  • the same speaker may provide both the user-perceptible output (e.g., a first output type) and the user-imperceptible output (e.g., a second output type).
  • different speakers are dedicated to the different types of audio outputs described herein.
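The frequency separation described above (a user-audible band of roughly 20 Hz to 20 kHz versus a canine-perceptible band of roughly 23 kHz to 64 kHz) can be sketched as a simple band check. This is an illustrative sketch only; the function names and the chosen carrier frequencies are assumptions, not part of the patent.

```python
# Band edges taken from the ranges quoted above; all names are illustrative.
HUMAN_BAND_HZ = (20.0, 20_000.0)            # audible to the user
ANIMAL_ONLY_BAND_HZ = (23_000.0, 64_000.0)  # audible to a canine, not the user

def audible_to_user(freq_hz: float) -> bool:
    """True if a tone at freq_hz falls in the human-audible band."""
    lo, hi = HUMAN_BAND_HZ
    return lo <= freq_hz <= hi

def animal_only(freq_hz: float) -> bool:
    """True if a tone is imperceptible to the user but perceptible by a canine."""
    lo, hi = ANIMAL_ONLY_BAND_HZ
    return lo <= freq_hz <= hi and not audible_to_user(freq_hz)

def pick_command_tone(for_guide_animal: bool) -> float:
    """Choose a carrier frequency for a command tone (hypothetical values)."""
    return 25_000.0 if for_guide_animal else 1_000.0
```

A shared speaker would route tones through both bands; dedicated speakers would each serve one band.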
  • the vision-assist computing device 105 may include a memory component 140 b , which stores communication logic 144 c and command logic 144 d .
  • the communication logic 144 c may cause the vision-assist apparatus 104 to communicate with the user computing device 102 , the guide animal apparatus 109 , and/or the remote computing device 110 to request instructions and/or receive instructions related to an environment or destination.
  • the command logic 144 d may cause the vision-assist apparatus 104 to provide audible, inaudible, and/or tactile instructions to the user and/or guide animal 106 .
  • the vision-assist apparatus 104 may also include wireless communication hardware and software for communicating with the guide animal apparatus 109 .
  • the guide animal 106 may include a canine (dog), ferret, pig, monkey, and/or other animal for guiding the user to a destination.
  • the guide animal apparatus 109 may be configured as a collar, harness, leash, and/or may be a stand-alone device and may provide a tactile output and/or audio output (e.g., a human perceptible and/or a human imperceptible output) to the guide animal 106 .
  • the guide animal apparatus 109 may include a guide animal speaker 127 , a guide animal tactile output device 129 , and/or a guide animal visual output device 131 (which may be attached to the head of the guide animal 106 , or otherwise placed within the peripheral vision of the guide animal 106 ) to provide audio, tactile, and/or visual representations of commands to the guide animal 106 .
  • the guide animal apparatus 109 may include a guide animal computing device 111 that contains a memory component 140 c .
  • the memory component 140 c may store guide animal command logic 144 e and guide animal detection logic 144 f .
  • the guide animal command logic 144 e may be configured to cause the guide animal computing device 111 to communicate with the vision-assist apparatus 104 for coordinating the timing of first commands, which may include commands provided to the user, with second commands, which may be provided to the guide animal 106 .
  • the guide animal detection logic 144 f may cause the guide animal computing device 111 to detect actions of the guide animal 106 and/or the user for training purposes and route calculation.
  • some embodiments may be configured for the vision-assist apparatus 104 to communicate with the guide animal apparatus 109 such that the guide animal apparatus 109 communicates with the guide animal 106 .
  • the guide animal apparatus 109 may include wireless communication hardware and/or software, a power source, a guide animal tactile output device 129 , a guide animal speaker 127 , and/or components that are coupled to the guide animal computing device 111 .
  • the vision-assist apparatus 104 may provide audible and/or tactile commands to the user and wirelessly communicate with the guide animal apparatus 109 , which provides commands to the guide animal 106 .
  • the guide animal apparatus 109 may provide the tactile and/or audio output to the guide animal 106 .
  • the remote computing device 110 may be configured as a mapping server, a global positioning server, a command server, and the like. It should be understood that the depiction of the remote computing device 110 in FIG. 1 as a single computing device is merely for simplicity. Any number of computing devices for providing mapping data, routing data, command data, and the like may be utilized.
  • FIG. 2 depicts a flowchart for communicating with a guide animal 106 , according to embodiments described herein.
  • instructions to a destination may be received.
  • the user may identify a destination via a voice command to the vision-assist apparatus 104 and/or to the user computing device 102 .
  • the vision-assist apparatus 104 may determine a current position of the user and communicate with the user computing device 102 to determine the instructions to reach the destination.
  • the user computing device 102 may communicate with the remote computing device 110 to receive mapping data and/or routing data to guide the user and guide animal 106 to the destination from the current position.
  • some embodiments may be configured such that the vision-assist apparatus 104 communicates directly with the remote computing device 110 to receive this data.
  • a first command may be determined to provide the user with instructions to proceed to the destination.
  • the vision-assist apparatus 104 and/or the user computing device 102 may determine a command for providing to the user and/or guide animal 106 .
  • the vision-assist apparatus 104 and/or the user computing device 102 may determine an audible and/or tactile command that the user can perceive.
  • the first command may include an instruction, such as a right turn instruction, a left turn instruction, a stop instruction, a proceed instruction, and/or other instructions.
  • the first command may be provided by the tactile output device 125 a , 125 b , 126 and/or the speaker 123 a , 123 b , 124 . Additionally, in some embodiments at least a portion of the first command may be provided by the user computing device 102 .
  • a second command may be determined to be provided to the guide animal 106 , where the second command corresponds with the first command.
  • the second command that is provided may be inaudible to the user, but in the audible range for the guide animal 106 .
  • the second command may communicate similar information as the first command.
  • the guide animal 106 may be trained to respond to the inaudible commands provided by the vision-assist apparatus 104 .
  • As an example, if the instructions include a command to take a left turn, the user may be provided with the first command, such as audible and/or tactile commands indicating that the user should turn left in 20 feet.
  • The second command may convey the same information to the guide animal 106 , but may be timed such that the guide animal 106 will know exactly when to turn, thereby coordinating the timing of the first command and the second command.
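The timing coordination in this example can be sketched as follows. The walking-speed figure (4 ft/s) and every name here are hypothetical, added only to illustrate how the second command might be delayed until the turn is reached:

```python
def schedule_commands(distance_ft: float, walking_speed_fps: float = 4.0):
    """Return the immediate user command, the guide-animal command, and the
    delay in seconds before the animal command should be emitted."""
    user_cmd = f"turn left in {distance_ft:.0f} feet"  # spoken/tactile, now
    animal_cmd = "inaudible-left-turn-tone"            # emitted at the turn
    delay_s = distance_ft / walking_speed_fps          # time to reach the turn
    return user_cmd, animal_cmd, delay_s
```

In a real apparatus the delay would presumably come from position tracking rather than a fixed speed estimate.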
  • the first command may be provided via the first output device.
  • the first command may be provided in an audible frequency range for the user to perceive the command and/or may be provided as a tactile output. Accordingly, the first command may be formatted for the user to traverse a predetermined route and reach the destination.
  • the user-perceptible output may be configured based on a user preference, an environmental factor, etc. As an example, the user may prefer tactile output in certain locations and/or areas of high human congestion. Similarly, some embodiments may be configured to determine an environmental noise level and adjust the volume of the audio commands accordingly. In some embodiments, tactile output may be provided in response to a determination that the environmental noise is above a predetermined threshold.
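The modality selection just described might look like the following sketch; the threshold value, the volume scaling, and all names are assumptions for illustration:

```python
def choose_user_output(noise_db: float, prefers_tactile: bool = False,
                       noise_threshold_db: float = 70.0):
    """Pick (modality, volume) for the user-perceptible command."""
    if prefers_tactile or noise_db > noise_threshold_db:
        return ("tactile", None)               # noisy or preferred: skip audio
    volume = min(1.0, 0.4 + noise_db / 200.0)  # louder rooms, louder speech
    return ("audio", round(volume, 2))
```

The key design point from the text is only the fallback to tactile output above a noise threshold; the linear volume ramp is one plausible choice among many.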
  • the second command may be provided via a second output device.
  • the second output device may be a speaker 123 a , 123 b , 124 or other device for producing a sound that is inaudible by the user, but is perceptible by the guide animal 106 .
  • the second output may be timed such that the first command and the second command may be provided at different times, but coordinated to appropriately instruct the user and the guide animal 106 .
  • the first command may indicate that a right turn should be made in 20 feet.
  • the second command may instruct the guide animal 106 to take an immediate right turn.
  • the second command in this example may be provided after the first command. It should be understood that, depending on the embodiment, not every command that is provided to the user is also provided to the guide animal (and vice versa).
  • the second command may be configured based on one or more external factors, such as user preference, environment, training, and the like.
  • some embodiments may be provided such that the speaker 123 a outputs an inaudible beep for right turns, the speaker 123 b outputs an inaudible beep for left turns, and the speaker 124 may be utilized for start and stop commands. Because the outputs originate from a predetermined source, the guide animal 106 may detect the source of the output and understand the command. Some embodiments, however, may provide different outputs based on the command. As an example, a single inaudible beep may indicate a right turn command, while a double inaudible beep may indicate a left turn command.
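The two encodings just described, source-based (which speaker emits the tone identifies the command) and pattern-based (the beep count identifies it), can be sketched as lookup tables. The speaker labels mirror the reference numerals above; everything else is an illustrative assumption:

```python
SOURCE_ENCODING = {          # command -> which speaker emits the tone
    "right": "speaker_123a",
    "left": "speaker_123b",
    "start": "speaker_124",
    "stop": "speaker_124",
}

PATTERN_ENCODING = {         # command -> number of inaudible beeps
    "right": 1,              # single beep
    "left": 2,               # double beep
}

def encode_command(command: str, use_pattern: bool = False):
    """Return (speaker, beep_count) for an inaudible guide-animal command."""
    if use_pattern:
        return ("speaker_123a", PATTERN_ENCODING[command])
    return (SOURCE_ENCODING[command], 1)
```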
  • some embodiments may be configured to determine environmental noise and customize the inaudible output accordingly.
  • the vision-assist apparatus 104 may determine that the guide animal 106 is located on the right of the user and detect environmental noise on the left of the user. As such, the vision-assist apparatus 104 may bias the output to the speaker 123 a.
  • some embodiments may be configured to detect unexpected obstacles on the route and provide at least one improvisational command for the user and guide animal 106 to avoid the obstacles and continue to the destination.
  • the camera 121 a , 121 b , 122 and/or other device on the vision-assist apparatus 104 may detect the unexpected obstacle, a current location of the unexpected obstacle, and a predicted path (if any) of the unexpected obstacle. Accordingly, the vision-assist apparatus 104 may determine a command (or series of commands) for avoiding the unexpected obstacle.
  • the vision-assist apparatus 104 may additionally provide a first command and second command to the user and guide animal 106 , respectively. The vision-assist apparatus 104 may then resume providing commands to the user and guide animal 106 to reach the destination.
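The obstacle-handling step above can be reduced to a minimal decision sketch. The geometry is deliberately simplified and every name is hypothetical; the patent leaves the avoidance logic itself unspecified:

```python
def avoid_obstacle(obstacle_offset_ft: float, drifting_right: bool):
    """Return (user_command, animal_command) to sidestep an obstacle.

    obstacle_offset_ft: lateral offset from the walking line
                        (positive = right of the user).
    drifting_right:     predicted motion of the obstacle, from the camera.
    """
    if obstacle_offset_ft > 0 or drifting_right:
        return ("obstacle ahead, step left", "inaudible-left-tone")
    return ("obstacle ahead, step right", "inaudible-right-tone")
```

The pairing matters more than the geometry: each improvisational user command is issued together with a matching guide-animal command.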
  • embodiments may operate without a predetermined destination being established and/or may operate with additional functionality.
  • embodiments of the vision-assist apparatus 104 may detect an obstacle and may alert the user and/or guide animal 106 accordingly.
  • the alert for the user may include an identification of the obstruction and/or a command, while the guide animal 106 may be provided with a command.
  • FIG. 3 depicts a flowchart for utilizing a vision-assist apparatus 104 and a guide animal apparatus 109 to communicate with a guide animal 106 , according to embodiments described herein.
  • instructions to a destination may be determined. These instructions may be determined by the vision-assist apparatus 104 and/or may be determined via communication of the vision-assist apparatus 104 with the user computing device 102 and/or the remote computing device 110 .
  • a first command may be determined.
  • the first command may be a command that is provided for the user to reach the destination.
  • a second command may be determined.
  • the second command may be a command for the guide animal 106 to reach the destination.
  • the first command may be output via a user-perceptible audio signal and/or a tactile output provided by the vision-assist apparatus 104 .
  • the second command may be provided to the guide animal apparatus 109 for the guide animal apparatus 109 to output.
  • the guide animal apparatus 109 may provide tactile output to the guide animal 106 , audio output (which may be user perceptible or user imperceptible), and/or visual output to the guide animal 106 , depending on the particular embodiment.
  • While FIG. 3 depicts the guide animal apparatus 109 providing the commands to the guide animal 106 , this is just an example. Some embodiments may be provided such that the vision-assist apparatus 104 provides one or more commands to the guide animal 106 and/or the guide animal apparatus 109 provides commands to the user.
  • FIG. 4 depicts a flowchart for training a guide animal 106 , according to embodiments described herein.
  • a training command (such as a first training command) may be determined.
  • the training command may include an instruction (such as to turn left), as well as the audible command that is provided to the user (such as “turn left”), the corresponding tactile command that is provided to the user (such as a double tap by the tactile output device 125 b ), and the inaudible command that is provided to the guide animal 106 (such as an inaudible double beep from the speaker 123 b ).
  • an inaudible training command (such as a second training command) is provided to the guide animal 106 .
  • A user-perceptible command is provided to the user. Depending on the particular training selected, the user-perceptible command may include audible and/or tactile output.
  • Whether the user and the guide animal 106 complied with the training command may then be determined. The determination may be made by the vision-assist apparatus 104 , which may track movement and/or change of position of the user.
  • output indicating that the noncompliance may be provided.
  • the speaker 123 a , 123 b , 124 may provide an indication to the user of the error and/or a manner to correct the user.
  • the vision-assist apparatus 104 may provide the user with instructions on a manner to train the guide animal 106 to correct the error.
  • FIG. 5 depicts computing infrastructure for communicating with a guide animal 106 , according to embodiments described herein.
  • the vision-assist computing device 105 includes a processor 530 , input/output hardware 532 , network interface hardware 534 , a data storage component 536 (which stores command data 538 a , device data 538 b , and/or other data), and the memory component 140 b .
  • the memory component 140 b may be configured as volatile and/or nonvolatile memory and as such, may include random access memory (including SRAM, DRAM, and/or other types of RAM), flash memory, secure digital (SD) memory, registers, compact discs (CD), digital versatile discs (DVD), and/or other types of non-transitory computer-readable mediums. Depending on the particular embodiment, these non-transitory computer-readable mediums may reside within the vision-assist computing device 105 and/or external to the vision-assist computing device 105 .
  • the memory component 140 b may store operating system logic 542 , the communication logic 144 c and the command logic 144 d .
  • the communication logic 144 c and the command logic 144 d may each include a plurality of different pieces of logic, each of which may be embodied as a computer program, firmware, and/or hardware, as an example.
  • a local communications interface 546 is also included in FIG. 5 and may be implemented as a bus or other communication interface to facilitate communication among the components of the vision-assist computing device 105 .
  • the processor 530 may include any processing component operable to receive and execute instructions (such as from a data storage component 536 and/or the memory component 140 ). As described above, the input/output hardware 532 may include and/or be configured to interface with the components of FIGS.
  • the network interface hardware 534 may include and/or be configured for communicating with any wired or wireless networking hardware, including an antenna, a modem, a LAN port, wireless fidelity (Wi-Fi) card, WiMax card, BluetoothTM module, mobile communications hardware, and/or other hardware for communicating with other networks and/or devices. From this connection, communication may be facilitated between the vision-assist computing device 105 and other computing devices (such as user computing device 102 and/or the remote computing device 110 ).
  • Wi-Fi wireless fidelity
  • WiMax wireless fidelity
  • BluetoothTM BluetoothTM module
  • the operating system logic 542 may include an operating system and/or other software for managing components of the vision-assist computing device 105 .
  • the communication logic 144 c may reside in the memory component 140 b and may be configured to cause the processor 530 to receive environmental data, as well as data from the user, the user computing device 102 , and/or the remote computing device 110 and determine a destination and/or route accordingly.
  • the command logic 144 d may be utilized to provide human perceptible and/or human imperceptible commands for the user and/or guide animal 106 .
  • FIG. 5 it should be understood that while the components in FIG. 5 are illustrated as residing within the vision-assist computing device 105 , this is merely an example. In some embodiments, one or more of the components may reside external to the vision-assist computing device 105 . It should also be understood that, while the vision-assist computing device 105 is illustrated as a single device, this is also merely an example. In some embodiments, the communication logic 144 c and the command logic 144 d may reside on different computing devices. As an example, one or more of the functionalities and/or components described herein may be provided by the user computing device 102 and/or other devices, which may be communicatively coupled to the vision-assist computing device 105 . These computing devices may also include hardware and/or software for performing the functionality described herein.
  • vision-assist computing device 105 is illustrated with the communication logic 144 c and the command logic 144 d as separate logical components, this is also an example. In some embodiments, a single piece of logic may cause the vision-assist computing device 105 to provide the described functionality.
  • embodiments described herein may encourage vision-impaired persons to use a guide animal 106 for traversing a route to a destination. Additionally, embodiments described herein may be configured to facilitate training of a user and/or guide animal 106 for use with a vision-assist apparatus 104 .

Abstract

Systems and methods for communicating with a guide animal are provided. One embodiment of a method includes determining a first command to provide to a user to proceed to a destination, determining a second command to provide to the guide animal to correspond with the first command, and outputting the first command to direct the user toward the destination. Some embodiments include outputting the second command to direct the guide animal toward the destination, where the second command is outputted to be imperceptible by the user but is perceptible by the guide animal.

Description

TECHNICAL FIELD
Embodiments described herein generally relate to communicating with a guide animal and, more specifically, to a vision-assist apparatus that communicates with a guide animal.
BACKGROUND
Oftentimes, visually impaired persons choose not to use a guide animal (such as a guide dog) because they do not remember the traditional commands for effectively using the guide animal or are uncomfortable speaking to a guide animal in public. While the guide animal may be effective when properly trained, current solutions require training of the person, as well as the dog, to provide effective communication between the two. Additionally, even well-trained guide animals may not be aware of all obstacles and dangers along a route to a destination. Thus, a need exists in the industry.
SUMMARY
Systems and methods for communicating with a guide animal are provided. One embodiment of a method includes determining a first command to provide to a user to proceed to a destination, determining a second command to provide to the guide animal to correspond with the first command, and outputting the first command to direct the user toward the destination. Some embodiments include outputting the second command to direct the guide animal toward the destination, where the second command is outputted to be imperceptible by the user but is perceptible by the guide animal.
In another embodiment, a system includes a first output device that provides a first output type that is perceptible by a vision-impaired user and a second output device that outputs a second output type that is imperceptible by the vision-impaired user but is perceptible by the guide animal utilized by the vision-impaired user. Embodiments of the system may also include a vision-assist computing device that includes a processor and a memory component. The memory component may store logic that causes the vision-assist computing device to determine instructions to a destination, determine a first command to provide to the vision-impaired user to proceed to the destination, and determine a second command to provide to the guide animal to correspond with the first command. In some embodiments, the logic causes the system to provide the first command via the first output device and provide the second command via the second output device.
In yet another embodiment, a system includes a vision-assist apparatus that includes an output device that provides a first output type that is perceptible by a vision-impaired user and a vision-assist computing device comprising a processor and a memory component, the memory component storing logic that causes the vision-assist apparatus to determine directions to a destination, determine a first command to provide to the vision-impaired user to proceed to the destination, and determine a second command to provide to the guide animal to correspond with the first command. In some embodiments, the logic causes the vision-assist apparatus to output the first command via the first output device and provide data related to the second command to a guide animal apparatus. In some embodiments, the data is received by the guide animal apparatus, where the guide animal apparatus includes a guide animal output device. In some embodiments, the second command is output via the guide animal output device.
These and additional features provided by the embodiments of the present disclosure will be more fully understood in view of the following detailed description, in conjunction with the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the disclosure. The following detailed description of the illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
FIG. 1 depicts a communication environment for communicating with a guide animal, according to embodiments described herein;
FIG. 2 depicts a flowchart for communicating with a guide animal, according to embodiments described herein;
FIG. 3 depicts a flowchart for utilizing a vision-assist apparatus and a guide animal apparatus to communicate with a guide animal, according to embodiments described herein;
FIG. 4 depicts a flowchart for training a guide animal, according to embodiments described herein; and
FIG. 5 depicts computing infrastructure for communicating with a guide animal, according to embodiments described herein.
DETAILED DESCRIPTION
Embodiments disclosed herein include systems and methods for communicating with a guide animal. Some embodiments are directed to utilizing a vision-assist apparatus to more effectively communicate information of which the guide animal might not otherwise be aware. As an example, the vision-assist apparatus may include a transducer, a speaker, and a tactile output device. The transducer may be configured to output high frequency signals that are imperceptible by a user, such as a vision-impaired user, but may be recognized by the guide animal. The high frequency signals correspond with the guide animal's training and may also correspond with tactile output from the tactile output device and/or lower frequency audio outputs provided to the user for navigating an environment. In some embodiments, the guide animal may have a collar or harness that includes a tactile output device that communicates with the vision-assist apparatus to provide tactile commands to the guide animal that coincide with the commands provided to the user.
Additionally, some embodiments may be configured for training the guide animal or the user of the vision-assist apparatus. As an example, the guide animal may be trained with traditional commands, as well as human inaudible high frequency signals that instruct the animal. Similarly, the vision-assist apparatus may also include sensors to determine the traditional commands that are provided to the guide animal by the user. When the user is using the vision-assist apparatus, the user may provide traditional commands to the guide animal. If a command provided by the user does not match the desired action for the guide animal (and thus the inaudible command from the vision-assist apparatus), output may be provided to the user for instructing that the command was incorrect and providing an indication of the correct command.
Referring now to the drawings, FIG. 1 depicts a communication environment for communicating with a guide animal 106, according to embodiments described herein. As illustrated, the communication environment may include a network 100, which is coupled to a user computing device 102, a vision-assist apparatus 104, a guide animal apparatus 109, and a remote computing device 110. The network 100 may include a wide area network, such as the internet, a public switched telephone network, a cellular network, and the like. Similarly, the network 100 may include a local area network, such as a wireless fidelity network, an Ethernet, a Bluetooth network, a Zigbee network, and the like.
Coupled to the network 100 is the user computing device 102, which may include a memory component 140 a that stores direction logic 144 a and vision-assist logic 144 b. As described in more detail below, the direction logic 144 a may cause the user computing device 102 to receive global positioning data and/or other data and determine directions and/or a route to a destination. The vision-assist logic 144 b may be configured as a mobile application and may cause the user computing device 102 to communicate with the vision-assist apparatus 104 to provide location data and user preferences for the vision-assist apparatus 104 and/or the guide animal apparatus 109. The communication between the user computing device 102 and the vision-assist apparatus 104 may include data related to directions, destinations, and/or commands that may be provided by the vision-assist apparatus 104 to the user and/or guide animal 106.
The vision-assist apparatus 104 may include a vision-assist computing device 105 and a memory component 140 b for communicating with a guide animal 106 and/or guide animal apparatus 109. The vision-assist apparatus 104 may include a necklace module 104 a and/or an eyeglass module 104 b. The eyeglass module 104 b may include at least one camera 122 (such as an eye tracking camera and/or environment camera), at least one speaker 124, and/or at least one tactile output device 126. Similarly, the necklace module 104 a may include at least one camera 121 a, 121 b (such as an eye tracking camera and/or an environment camera), at least one speaker 123 a, 123 b (such as an audible speaker and/or a high frequency speaker), and/or at least one tactile output device 125 a, 125 b. The vision-assist apparatus 104 may also include one or more inertial measurement units, tactile input hardware, one or more microphones, one or more tactile feedback devices, one or more location sensors, one or more lights, one or more proximity sensors, one or more batteries, one or more charging ports, global positioning hardware, and/or other hardware or software.
It should also be understood that while the vision-assist apparatus 104 is depicted in FIG. 1 as including the eyeglass module 104 b and/or the necklace module 104 a, this is just an example. Some embodiments may be configured as a bracelet, handheld device, clothing, headset, etc.
As described in more detail below, the speakers 123 a, 123 b, 124 may be configured for providing audio output in the audible frequency range for humans (such as from about 20 Hertz to about 20 Kilohertz). Additionally, at least one of the speakers 123 a, 123 b, 124 may be configured to output signals in the inaudible frequency range for humans, but in the audible frequency range for the respective guide animal 106. As an example, if the guide animal 106 is a canine, the frequency range of the inaudible output may range from about 23 Kilohertz to about 64 Kilohertz, such that the guide animal 106 may perceive the command, but the user (and other persons) cannot. In some embodiments, the same speaker may provide both the user-perceptible output (e.g., a first output type) and the user-imperceptible output (e.g., a second output type). In some embodiments, different speakers are dedicated to the different types of audio outputs described herein.
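The frequency split described above can be sketched in code. This is a minimal illustration only, not part of the disclosed apparatus; the function names and the 25 kHz default are hypothetical, while the frequency bands come from the ranges given in this paragraph:

```python
# Illustrative sketch (hypothetical): select an output frequency that the
# guide animal can hear but the user cannot. Ranges per the description:
# human hearing is roughly 20 Hz to 20 kHz, and the canine-perceptible,
# human-inaudible command band is roughly 23 kHz to 64 kHz.

HUMAN_AUDIBLE_HZ = (20, 20_000)


def is_user_imperceptible(freq_hz: int) -> bool:
    """True if the frequency falls outside the human audible range."""
    low, high = HUMAN_AUDIBLE_HZ
    return freq_hz < low or freq_hz > high


def select_command_frequency(preferred_hz: int = 25_000) -> int:
    """Clamp a preferred frequency into the human-inaudible,
    canine-audible band (23-64 kHz for a canine guide animal)."""
    low, high = 23_000, 64_000
    freq = min(max(preferred_hz, low), high)
    assert is_user_imperceptible(freq)
    return freq
```

A command emitted at the returned frequency would be perceptible to the guide animal 106 but not to the user or bystanders.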
Also included with the vision-assist apparatus 104 is a vision-assist computing device 105. The vision-assist computing device 105 may include a memory component 140 b, which stores communication logic 144 c and command logic 144 d. The communication logic 144 c may cause the vision-assist apparatus 104 to communicate with the user computing device 102, the guide animal apparatus 109, and/or the remote computing device 110 to request instructions and/or receive instructions related to an environment or destination. The command logic 144 d may cause the vision-assist apparatus 104 to provide audible, inaudible, and/or tactile instructions to the user and/or guide animal 106. These and other hardware and software components are described in more detail below. The vision-assist apparatus 104 may also include wireless communication hardware and software for communicating with the guide animal apparatus 109.
The guide animal 106 may include a canine (dog), ferret, pig, monkey, and/or other animal for guiding the user to a destination. As such, the guide animal apparatus 109 may be configured as a collar, harness, leash, and/or may be a stand-alone device and may provide a tactile output and/or audio output (e.g., a human perceptible and/or a human imperceptible output) to the guide animal 106. Accordingly, the guide animal apparatus 109 may include a guide animal speaker 127, a guide animal tactile output device 129, and/or a guide animal visual output device 131 (which may be attached to the head of the guide animal 106, or otherwise placed within the peripheral vision of the guide animal 106) to provide audio, tactile, and/or visual representations of commands to the guide animal 106.
Additionally, the guide animal apparatus 109 may include a guide animal computing device 111 that contains a memory component 140 c. The memory component 140 c may store guide animal command logic 144 e and guide animal detection logic 144 f. The guide animal command logic 144 e may be configured to cause the guide animal computing device 111 to communicate with the vision-assist apparatus 104 for coordinating the timing of first commands, which may include commands provided to the user, with second commands, which may be provided to the guide animal 106. The guide animal detection logic 144 f may cause the guide animal computing device 111 to detect actions of the guide animal 106 and/or the user for training purposes and route calculation.
Accordingly, some embodiments may be configured for the vision-assist apparatus 104 to communicate with the guide animal apparatus 109 such that the guide animal apparatus 109 communicates with the guide animal 106. As such, the guide animal apparatus 109 may include wireless communication hardware and/or software, a power source, a guide animal tactile output device 129, a guide animal speaker 127, and/or components that are coupled to the guide animal computing device 111. In these embodiments, the vision-assist apparatus 104 may provide audible and/or tactile commands to the user and wirelessly communicate with the guide animal apparatus 109, which provides commands to the guide animal 106. Based on the data received from the vision-assist apparatus 104, the guide animal apparatus 109 may provide the tactile and/or audio output to the guide animal 106.
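The wireless hand-off from the vision-assist apparatus 104 to the guide animal apparatus 109 can be sketched as a serialize/dispatch pair. The JSON payload format, field names, and device labels below are illustrative assumptions, not a disclosed protocol:

```python
import json

# Illustrative sketch (hypothetical message format): the vision-assist
# apparatus packages a second command, and the guide animal apparatus
# decodes it and routes it to one of its output devices.


def serialize_second_command(instruction: str, output_type: str) -> bytes:
    """Package a guide-animal command for wireless transfer to the
    guide animal apparatus 109."""
    return json.dumps({"instruction": instruction,
                       "output": output_type}).encode("utf-8")


def handle_on_apparatus(payload: bytes) -> str:
    """On the guide animal apparatus: decode the payload and pick the
    output device (tactile device 129, speaker 127, or visual device 131)."""
    msg = json.loads(payload)
    device = {"tactile": "tactile device 129",
              "audio": "speaker 127",
              "visual": "visual device 131"}[msg["output"]]
    return f"{msg['instruction']} via {device}"
```

For example, a "turn left" tactile command would be serialized on the vision-assist side and dispatched to the guide animal tactile output device 129 on receipt.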
Also coupled to the network 100 is a remote computing device 110. The remote computing device 110 may be configured as a mapping server, a global positioning server, a command server, and the like. It should be understood that the depiction of the remote computing device 110 in FIG. 1 as a single computing device is merely for simplicity. Any number of computing devices for providing mapping data, routing data, command data, and the like may be utilized.
FIG. 2 depicts a flowchart for communicating with a guide animal 106, according to embodiments described herein. As illustrated in block 250, instructions to a destination may be received. Specifically, the user may identify a destination via a voice command to the vision-assist apparatus 104 and/or to the user computing device 102. In embodiments where the vision-assist apparatus 104 receives the destination from the user, the vision-assist apparatus 104 may determine a current position of the user and communicate with the user computing device 102 to determine the instructions to reach the destination. The user computing device 102 may communicate with the remote computing device 110 to receive mapping data and/or routing data to guide the user and guide animal 106 to the destination from the current position. Similarly, some embodiments may be configured such that the vision-assist apparatus 104 communicates directly with the remote computing device 110 to receive this data.
Regardless, in block 252, a first command may be determined to provide the user with instructions to proceed to the destination. Specifically, upon receiving the directions to a destination, the vision-assist apparatus 104 and/or the user computing device 102 may determine a command for providing to the user and/or guide animal 106. Specifically, the vision-assist apparatus 104 and/or the user computing device 102 may determine an audible and/or tactile command that the user can perceive. The first command may include an instruction, such as a right turn instruction, a left turn instruction, a stop instruction, a proceed instruction, and/or other instructions. As described above, the first command may be provided by the tactile output device 125 a, 125 b, 126 and/or the speaker 123 a, 123 b, 124. Additionally, in some embodiments at least a portion of the first command may be provided by the user computing device 102.
In block 254, a second command may be determined to be provided to the guide animal 106, where the second command corresponds with the first command. Specifically, the second command that is provided may be inaudible to the user, but in the audible range for the guide animal 106. Additionally, the second command may communicate similar information as the first command. As an example, the guide animal 106 may be trained to respond to the inaudible commands provided by the vision-assist apparatus 104. As a result, if the instructions include a command to take a left turn, the user may be provided with the first command, such as audible and/or tactile commands indicating that the user should turn left in 20 feet. Accordingly, the second command may convey the same information to the guide animal 106, but may be timed such that the guide animal 106 will know exactly when to turn, thereby coordinating the timing of the first command and second command.
As illustrated in block 256, the first command may be provided via the first output device. As described above, the first command may be provided in an audible frequency range for the user to perceive the command and/or may be provided as a tactile output. Accordingly, the first command may be formatted for the user to traverse a predetermined route and reach the destination.
It should be understood that the user-perceptible output may be configured based on a user preference, an environmental factor, etc. As an example, the user may prefer tactile output in certain locations and/or areas of high human congestion. Similarly, some embodiments may be configured to determine an environmental noise level and adjust the volume of the audio commands accordingly. In some embodiments, tactile output may be provided in response to a determination that the environmental noise is above a predetermined threshold.
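The noise-based output selection described in this paragraph can be sketched as follows. The 70 dB threshold, the volume curve, and the function names are hypothetical placeholders, not values from the disclosure:

```python
# Illustrative sketch (hypothetical thresholds): pick tactile output when
# the user prefers it or ambient noise exceeds a threshold; otherwise
# scale speaker volume with the ambient noise level.


def choose_user_output(noise_db: float, prefers_tactile: bool,
                       noise_threshold_db: float = 70.0) -> str:
    """Select the first-command output type for the user."""
    if prefers_tactile or noise_db > noise_threshold_db:
        return "tactile"
    return "audio"


def adjust_volume(noise_db: float, base_volume: float = 0.5) -> float:
    """Raise audio volume with ambient noise, capped at full scale (1.0)."""
    return min(1.0, base_volume + max(0.0, noise_db - 40.0) / 60.0)
```

In a quiet room the sketch keeps the base volume, while in a loud environment it either raises the volume or falls back to tactile output entirely.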
As illustrated in block 258, the second command may be provided via a second output device. As described above, the second output device may be a speaker 123 a, 123 b, 124 or other device for producing a sound that is inaudible by the user, but is perceptible by the guide animal 106. Additionally, the second output may be timed such that the first command and the second command may be provided at different times, but coordinated to appropriately instruct the user and the guide animal 106. As an example, the first command may indicate that a right turn should be made in 20 feet. However, the second command may instruct the guide animal 106 to take an immediate right turn. As such, the second command in this example may be provided after the first command. It should be understood that, depending on the embodiment, not every command that is provided to the user is also provided to the guide animal (and vice versa).
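The timing coordination in the example above (an advance notice for the user, an at-the-turn cue for the guide animal 106) can be sketched as paired commands. The `Command` structure and the 20-foot default are illustrative assumptions:

```python
from dataclasses import dataclass

# Illustrative sketch (hypothetical data model): a first command warns the
# user ahead of the turn, while the corresponding second command fires for
# the guide animal at the turn itself.


@dataclass
class Command:
    recipient: str      # "user" or "guide_animal"
    instruction: str
    distance_ft: float  # distance remaining when the command should fire


def schedule_turn(turn: str, warn_distance_ft: float = 20.0):
    """Return the paired commands for one turn in the route."""
    first = Command("user",
                    f"turn {turn} in {warn_distance_ft:.0f} feet",
                    warn_distance_ft)
    second = Command("guide_animal", f"turn {turn}", 0.0)
    return first, second
```

The second command is thus issued after the first, once the remaining distance reaches zero, matching the coordination described in this paragraph.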
It should also be understood that the second command may be configured based on one or more external factors, such as user preference, environment, training, and the like. As an example, some embodiments may be provided such that the speaker 123 a outputs an inaudible beep for right turns, speaker 123 b outputs an inaudible beep for left turns, and the speaker 124 may be utilized for start and stop commands. Because the outputs originate from a predetermined source, the guide animal 106 may detect the source of the output and understand the command. Some embodiments, however, may provide different outputs, based on the command. As an example, a single inaudible beep may indicate a right turn command, while a double inaudible beep may indicate a left turn command.
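The source-and-pattern encoding in the examples above can be sketched as a lookup table. The specific speaker assignments for start/stop are assumptions extending the paragraph's examples:

```python
# Illustrative sketch (hypothetical encoding): map each instruction to a
# predetermined speaker and an inaudible beep count, per the examples of
# one beep for a right turn and two beeps for a left turn.

COMMAND_MAP = {
    "right": {"speaker": "123a", "beeps": 1},
    "left":  {"speaker": "123b", "beeps": 2},
    "start": {"speaker": "124",  "beeps": 1},
    "stop":  {"speaker": "124",  "beeps": 2},
}


def encode_inaudible_command(instruction: str) -> dict:
    """Look up which speaker fires and how many inaudible beeps it emits."""
    if instruction not in COMMAND_MAP:
        raise ValueError(f"no inaudible encoding for {instruction!r}")
    return COMMAND_MAP[instruction]
```

A trained guide animal 106 could then distinguish commands either by the source speaker, by the beep pattern, or by both.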
Similarly, some embodiments may be configured to determine environmental noise and customize the inaudible output accordingly. As an example, the vision-assist apparatus 104 may determine that the guide animal 106 is located on the right of the user and detect environmental noise on the left of the user. As such, the vision-assist apparatus 104 may bias the output to the speaker 123 a.
It should also be understood that some embodiments may be configured to detect unexpected obstacles on the route and provide at least one improvisational command for the user and guide animal 106 to avoid the obstacles and continue to the destination. As an example, the camera 121 a, 121 b, 122 and/or other device on the vision-assist apparatus 104 may detect the unexpected obstacle, a current location of the unexpected obstacle, and a predicted path (if any) of the unexpected obstacle. Accordingly, the vision-assist apparatus 104 may determine a command (or series of commands) for avoiding the unexpected obstacle. The vision-assist apparatus 104 may additionally provide a first command and second command to the user and guide animal 106, respectively. The vision-assist apparatus 104 may then resume providing commands to the user and guide animal 106 to reach the destination.
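The improvisational detour described above might be sketched as a simple sidestep heuristic. This is a deliberately simplistic illustration of the idea, not the disclosed avoidance logic:

```python
# Illustrative sketch (hypothetical heuristic): step away from the side on
# which the unexpected obstacle was detected, pass it, step back, and then
# resume the original route commands.


def improvise_avoidance(obstacle_side: str) -> list:
    """Return a short detour command sequence around an obstacle
    detected on the given side ("left" or "right")."""
    away = "left" if obstacle_side == "right" else "right"
    return [f"turn {away}", "proceed", f"turn {obstacle_side}",
            "resume route"]
```

Each command in the sequence would be issued twice in practice: once as a user-perceptible first command and once as a corresponding second command for the guide animal 106.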
It should be understood that some embodiments may operate without a predetermined destination being established and/or may operate with additional functionality. Specifically, embodiments of the vision-assist apparatus 104 may detect an obstacle and may alert the user and/or guide animal 106 accordingly. The alert for the user may include an identification of the obstruction and/or a command, while the guide animal 106 may be provided with a command.
FIG. 3 depicts a flowchart for utilizing a vision-assist apparatus 104 and a guide animal apparatus 109 to communicate with a guide animal 106, according to embodiments described herein. As illustrated in block 350, instructions to a destination may be determined. These instructions may be determined by the vision-assist apparatus 104 and/or may be determined via communication of the vision-assist apparatus 104 with the user computing device 102 and/or the remote computing device 110.
Regardless, in block 352, a first command may be determined. The first command may be a command that is provided for the user to reach the destination. In block 354, a second command may be determined. The second command may be a command for the guide animal 106 to reach the destination. In block 356, the first command may be output via a user-perceptible audio signal and/or a tactile output provided by the vision-assist apparatus 104. In block 358 the second command may be provided to the guide animal apparatus 109 for the guide animal apparatus 109 to output. As discussed above, the guide animal apparatus 109 may provide tactile output to the guide animal 106, audio output (which may be user perceptible or user imperceptible), and/or visual output to the guide animal 106, depending on the particular embodiment.
It should also be understood that while the embodiment of FIG. 3 depicts that the guide animal apparatus 109 provides the commands to the guide animal 106, this is just an example. Some embodiments may be provided such that the vision-assist apparatus 104 provides one or more commands to the guide animal 106 and/or the guide animal apparatus 109 provides commands to the user.
FIG. 4 depicts a flowchart for training a guide animal 106, according to embodiments described herein. As illustrated in block 450, a training command (such as a first training command) may be determined. Specifically, when there is a desire to train the user and/or guide animal 106, a training command may be determined. The training command may include an instruction (such as to turn left), as well as the audible command that is provided to the user (such as “turn left”), the corresponding tactile command that is provided to the user (such as a double tap by the tactile output device 125 b), and the inaudible command that is provided to the guide animal 106 (such as an inaudible double beep from the speaker 123 b).
Once the training command is determined, in block 452, an inaudible training command (such as a second training command) is provided to the guide animal 106. In block 454, a user-perceptible command is provided to the user. Depending on the particular training selected, the user-perceptible command may include audible and/or tactile output. In block 456, a determination may be made regarding whether the user and the guide animal 106 complied with the training command. The determination may be made by the vision-assist apparatus 104, which may track movement and/or change of position of the user. In block 458, in response to determining that the user and/or guide animal 106 did not comply with the command, output indicating the noncompliance may be provided. As an example, if it is determined that the user did not comply with the command, the speaker 123 a, 123 b, 124 may provide an indication to the user of the error and/or a manner to correct it. Similarly, if it is determined that the guide animal 106 did not comply with the command, the vision-assist apparatus 104 may provide the user with instructions on a manner to train the guide animal 106 to correct the error.
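The compliance check of blocks 456 and 458 can be sketched as a heading comparison. The 15-degree tolerance and the heading-based tracking are assumptions for illustration; the disclosure only states that movement and/or change of position is tracked:

```python
# Illustrative sketch (hypothetical compliance test): compare the observed
# movement direction to the direction the training command requested, and
# report noncompliance so corrective output can be provided.


def run_training_step(expected_heading_deg: float,
                      observed_heading_deg: float,
                      tolerance_deg: float = 15.0) -> str:
    """Return "complied" or a noncompliance message for one training step."""
    # Smallest signed angular difference, handling wrap-around at 360.
    error = abs((observed_heading_deg - expected_heading_deg + 180) % 360
                - 180)
    if error <= tolerance_deg:
        return "complied"
    return "noncompliance: issue corrective output"
```

A "noncompliance" result would trigger the corrective output of block 458, directed at the user, the guide animal 106, or both.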
FIG. 5 depicts computing infrastructure for communicating with a guide animal 106, according to embodiments described herein. The vision-assist computing device 105 includes a processor 530, input/output hardware 532, network interface hardware 534, a data storage component 536 (which stores command data 538 a, device data 538 b, and/or other data), and the memory component 140 b. The memory component 140 b may be configured as volatile and/or nonvolatile memory and as such, may include random access memory (including SRAM, DRAM, and/or other types of RAM), flash memory, secure digital (SD) memory, registers, compact discs (CD), digital versatile discs (DVD), and/or other types of non-transitory computer-readable mediums. Depending on the particular embodiment, these non-transitory computer-readable mediums may reside within the vision-assist computing device 105 and/or external to the vision-assist computing device 105.
The memory component 140 b may store operating system logic 542, the communication logic 144 c and the command logic 144 d. The communication logic 144 c and the command logic 144 d may each include a plurality of different pieces of logic, each of which may be embodied as a computer program, firmware, and/or hardware, as an example. A local communications interface 546 is also included in FIG. 5 and may be implemented as a bus or other communication interface to facilitate communication among the components of the vision-assist computing device 105.
The processor 530 may include any processing component operable to receive and execute instructions (such as from the data storage component 536 and/or the memory component 140 b). As described above, the input/output hardware 532 may include and/or be configured to interface with the components of FIGS. 1 and/or 5, including the cameras 121 a, 121 b, 122, speakers 123 a, 123 b, 124, tactile output devices 125 a, 125 b, 126, one or more inertial measurement units, tactile input hardware, one or more microphones, one or more tactile feedback devices, one or more location sensors, one or more lights, one or more proximity sensors, one or more batteries, one or more charging ports, and/or global positioning hardware, any of which may be included with the vision-assist apparatus 104.
The network interface hardware 534 may include and/or be configured for communicating with any wired or wireless networking hardware, including an antenna, a modem, a LAN port, wireless fidelity (Wi-Fi) card, WiMax card, Bluetooth™ module, mobile communications hardware, and/or other hardware for communicating with other networks and/or devices. From this connection, communication may be facilitated between the vision-assist computing device 105 and other computing devices (such as user computing device 102 and/or the remote computing device 110).
The operating system logic 542 may include an operating system and/or other software for managing components of the vision-assist computing device 105. As discussed above, the communication logic 144 c may reside in the memory component 140 b and may be configured to cause the processor 530 to receive environmental data, as well as data from the user, the user computing device 102, and/or the remote computing device 110 and determine a destination and/or route accordingly. Similarly, the command logic 144 d may be utilized to provide human perceptible and/or human imperceptible commands for the user and/or guide animal 106.
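The pairing of a user-perceptible first command with a corresponding animal-perceptible second command, as performed by the command logic 144 d, can be illustrated with a short sketch. The command table entries, tone names, and function names below are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of the command logic 144d: for each navigation step,
# pair a human-perceptible command with a corresponding human-imperceptible
# command for the guide animal, and route each to its output device.

# First command (for the user) paired with second command (for the animal).
COMMAND_PAIRS = {
    "turn_left":  ("Turn left ahead", "ultrasonic_left_tone"),
    "turn_right": ("Turn right ahead", "ultrasonic_right_tone"),
    "stop":       ("Stop here", "ultrasonic_stop_tone"),
}

def dispatch_command(step, user_device, animal_device):
    """Send the paired commands to the user and guide animal output devices."""
    user_cmd, animal_cmd = COMMAND_PAIRS[step]
    user_device.append(user_cmd)      # e.g., speaker 123a/b or tactile device
    animal_device.append(animal_cmd)  # e.g., guide animal apparatus output
    return user_cmd, animal_cmd

user_out, animal_out = [], []
dispatch_command("turn_left", user_out, animal_out)
```

A deployed system would replace the list appends with calls into the first and second output devices of claim 1.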
It should be understood that while the components in FIG. 5 are illustrated as residing within the vision-assist computing device 105, this is merely an example. In some embodiments, one or more of the components may reside external to the vision-assist computing device 105. It should also be understood that, while the vision-assist computing device 105 is illustrated as a single device, this is also merely an example. In some embodiments, the communication logic 144 c and the command logic 144 d may reside on different computing devices. As an example, one or more of the functionalities and/or components described herein may be provided by the user computing device 102 and/or other devices, which may be communicatively coupled to the vision-assist computing device 105. These computing devices may also include hardware and/or software for performing the functionality described herein.
Additionally, while the vision-assist computing device 105 is illustrated with the communication logic 144 c and the command logic 144 d as separate logical components, this is also an example. In some embodiments, a single piece of logic may cause the vision-assist computing device 105 to provide the described functionality.
As illustrated above, various embodiments for training a guide animal 106 are disclosed. Accordingly, embodiments described herein may encourage vision-impaired persons to use a guide animal 106 for traversing a route to a destination. Additionally, embodiments described herein may be configured to facilitate training of a user and/or guide animal 106 for use with a vision-assist apparatus 104.
While particular embodiments and aspects of the present disclosure have been illustrated and described herein, various other changes and modifications can be made without departing from the spirit and scope of the disclosure. Moreover, although various aspects have been described herein, such aspects need not be utilized in combination. Accordingly, it is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the embodiments shown and described herein.
It should now be understood that embodiments disclosed herein include systems, methods, and non-transitory computer-readable mediums for communicating with a guide animal. It should also be understood that these embodiments are merely exemplary and are not intended to limit the scope of this disclosure.

Claims (8)

What is claimed is:
1. A system for communicating with a guide animal, comprising:
a first output device that provides a first output type that is perceptible by a vision-impaired user;
a second output device that outputs a second output type that is imperceptible by the vision-impaired user but is perceptible by the guide animal utilized by the vision-impaired user; and
a vision-assist computing device comprising a processor and a memory component, the memory component storing logic that causes the vision-assist computing device to perform at least the following:
determine instructions to a destination;
determine a first command to provide to the vision-impaired user to proceed to the destination;
determine a second command to provide to the guide animal to correspond with the first command;
provide the first command via the first output device; and
provide the second command via the second output device.
2. The system of claim 1, wherein the first output device includes at least one of the following: a tactile output device and a speaker.
3. The system of claim 1, further comprising a user computing device that communicates with the vision-assist computing device to provide routing data associated with the destination.
4. The system of claim 1, wherein the logic further causes the vision-assist computing device to determine a manner of outputting the first command, wherein the manner of outputting includes at least one of the following: a user preference and an environmental factor.
5. The system of claim 1, wherein the logic further causes the vision-assist computing device to train at least one of the following: the vision-impaired user and the guide animal, and wherein training includes determining whether a training command was followed and, in response to determining that the training command was not followed, providing an output to indicate that the training command was not followed.
6. The system of claim 1, further comprising a guide animal apparatus that includes a guide animal tactile output device and a guide animal speaker.
7. The system of claim 6, wherein the guide animal apparatus receives the second command and outputs the second command via at least one of the following: the guide animal tactile output device and the guide animal speaker.
8. The system of claim 1, wherein the logic further causes the system to perform at least the following:
determine an unexpected obstacle on a route a user is taking to the destination; and
provide at least one improvisational command for the user and the guide animal to avoid the unexpected obstacle and continue to the destination.
US15/052,495 2016-02-24 2016-02-24 Systems and methods for communicating with a guide animal Expired - Fee Related US10912281B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/052,495 US10912281B2 (en) 2016-02-24 2016-02-24 Systems and methods for communicating with a guide animal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/052,495 US10912281B2 (en) 2016-02-24 2016-02-24 Systems and methods for communicating with a guide animal

Publications (2)

Publication Number Publication Date
US20170238509A1 US20170238509A1 (en) 2017-08-24
US10912281B2 true US10912281B2 (en) 2021-02-09

Family

ID=59630854

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/052,495 Expired - Fee Related US10912281B2 (en) 2016-02-24 2016-02-24 Systems and methods for communicating with a guide animal

Country Status (1)

Country Link
US (1) US10912281B2 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6948325B2 (en) * 2016-08-05 2021-10-13 ソニーグループ株式会社 Information processing equipment, information processing methods, and programs
CN108095987B (en) * 2017-12-06 2020-10-16 英业达科技有限公司 Vision-impaired navigation system and method thereof
JP2019144790A (en) * 2018-02-20 2019-08-29 富士ゼロックス株式会社 Information processing device and program
US11058095B2 (en) * 2018-07-24 2021-07-13 International Business Machines Corporation Working animal reaction handling
US10806125B1 (en) 2019-08-13 2020-10-20 International Business Machines Corporation Service animal navigation
CN114245710A (en) * 2019-08-15 2022-03-25 卫星动物保护有限责任公司 Correction collar using geographical positioning technology
US20230380381A1 (en) * 2022-05-24 2023-11-30 Rebecca Ann Metcalfe Comprehensive Wireless Verbal Cue Training System for Domesticated Animals

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4215490A (en) 1977-11-16 1980-08-05 Fewell William B Braille deaf-blind communicators
US5047952A (en) 1988-10-14 1991-09-10 The Board Of Trustee Of The Leland Stanford Junior University Communication system for deaf, deaf-blind, or non-vocal individuals using instrumented glove
US6232880B1 (en) * 1999-07-14 2001-05-15 The United States Of America As Represented By The Secretary Of Agriculture Animal control system using global positioning and instrumental animal conditioning
US20020046713A1 (en) * 2000-09-08 2002-04-25 Otto James R. Method for remotely controlling movement of an animal
US20070018890A1 (en) 2005-07-22 2007-01-25 Kulyukin Vladimir A Multi-sensor wayfinding device
WO2008015375A1 (en) 2006-08-04 2008-02-07 The Guide Dogs For The Blind Association Assistance device for blind and partially sighted people
CN201036605Y (en) 2007-03-30 2008-03-19 上海市杨浦区控江二村小学 Blind-guide machine dog
US7446669B2 (en) 2003-07-02 2008-11-04 Raanan Liebermann Devices for use by deaf and/or blind people
CN201251445Y (en) 2008-09-08 2009-06-03 众智瑞德科技(北京)有限公司 Voice blind guide system and portable type voice blind guide device
US20100263603A1 (en) 2009-04-16 2010-10-21 Matthew Baron Animal garment with integrated sound device
US7864991B2 (en) 2006-04-06 2011-01-04 Espre Solutions Inc. System and method for assisting a visually impaired individual
US20110307172A1 (en) 2010-06-11 2011-12-15 Tata Consultancy Services Limited Hand-held navigation aid for individuals with visual impairment
CN103226018A (en) 2013-04-03 2013-07-31 广东欧珀移动通信有限公司 Blind guiding method based on mobile terminal and mobile terminal
CN103413456A (en) 2013-08-14 2013-11-27 深圳市盛世任我行科技有限公司 Voice barrier-free guiding system for blind person
US8606316B2 (en) 2009-10-21 2013-12-10 Xerox Corporation Portable blind aid device
CN203619858U (en) 2013-11-25 2014-06-04 苏州天鸣信息科技有限公司 Voice prompt device assembly
US20140266571A1 (en) 2013-03-12 2014-09-18 Anirudh Sharma System and method for haptic based interaction
US8955462B1 (en) * 2011-06-16 2015-02-17 Wolfgis, Llc System and method for remote guidance of an animal to and from a target destination
KR20150088056A (en) 2014-01-23 2015-07-31 동서대학교산학협력단 Guide dog harness
US9111545B2 (en) 2010-05-17 2015-08-18 Tata Consultancy Services Limited Hand-held communication aid for individuals with auditory, speech and visual impairments
US20160100552A1 (en) * 2014-10-14 2016-04-14 E-Collar Technologies, Inc. Reconfigurable animal traning apparatus and system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
http://www.humanware.com/en-usa/products/deafblind_communication_solutions/deafblind_communicator, Deaf Blind Communicator; 3 pages, accessed Jan. 4, 2016.
Kishor Vijay Patil; "A Review on Voice Based Passenger Bus Predicting Arrival of Bus for Easy Navigation of Blind"; URL: http://www.ijmetmr.com/olseptember2015/KishorVijayPatil-A-43; Publication: International Journal & Magazine of Engineering, Technology, Management and Research, Sep. 2015, pp. 1384-1390, vol. No. 2, Issue No. 9.
Outreach website; URL: http://www.outreach1.org/paratransit/para_mainpage.html.
Santa Clara Valley Transportation Authority website; URL: http://www.vta.org/flex.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11849699B2 (en) 2021-07-20 2023-12-26 Canine Companions for Independence, Inc. System for alerting service animals to perform specified tasks
US12302865B2 (en) 2021-07-20 2025-05-20 Canine Companions for Independence, Inc. System for alerting service animals to perform specified tasks

Also Published As

Publication number Publication date
US20170238509A1 (en) 2017-08-24

Similar Documents

Publication Publication Date Title
US10912281B2 (en) Systems and methods for communicating with a guide animal
ES2754448T3 (en) Control of an electronic device based on speech direction
US20170254646A1 (en) Systems and methods for directing a vision-impaired user to a vehicle
US20150046018A1 (en) Autonomous moving body, obstacle sensing method, and obstacle avoiding method
US10842129B1 (en) Invisible pet fencing systems and methods
JP2014142345A5 (en)
KR20140098615A (en) Method for fitting hearing aid connected to Mobile terminal and Mobile terminal performing thereof
US12244994B2 (en) Processing of audio signals from multiple microphones
US10723352B2 (en) U-turn assistance
US20210154827A1 (en) System and Method for Assisting a Visually Impaired Individual
CN107026943A (en) voice interactive method and system
US20170289770A1 (en) Method And System Of Indoor Positioning Of A User And Delivery Of Information Thereto
US11181381B2 (en) Portable pedestrian navigation system
CN106200654A (en) The control method of unmanned plane during flying speed and device
JPWO2019049661A1 (en) Information processing equipment, information processing methods, and programs
TW202314684A (en) Processing of audio signals from multiple microphones
US9921801B2 (en) Control method and control device
US9993384B1 (en) Vision-assist systems and methods for assisting visually impaired users with navigating an environment using simultaneous audio outputs
US12487360B2 (en) Systems and methods for using an accessibility headset system for providing directions to audio and visually impaired users
CN116136415A (en) Navigation guidance method, device, electronic device and storage medium
US9591447B2 (en) Systems and methods for providing contextual environmental information
JP2015192433A (en) Voice message delivery system
WO2021209444A1 (en) A method and a system for guiding a blind or a visually impaired individual along a path
US20230243917A1 (en) Information processing device, user terminal, control method, non-transitory computer-readable medium, and information processing system
Rao et al. An electronic escort for the visually challenged

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC., KENTUCKY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DAYAL, RAJIV;MAU II, FREDRICK W.;SIGNING DATES FROM 20160215 TO 20160223;REEL/FRAME:037818/0134


STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20250209