
US20180345909A1 - Vehicle with wearable for identifying one or more vehicle occupants - Google Patents


Info

Publication number
US20180345909A1
US20180345909A1
Authority
US
United States
Prior art keywords
vehicle
user
control system
individual
earpieces
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/040,799
Inventor
Peter Vincent Boesen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bragi GmbH
Original Assignee
Bragi GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bragi GmbH
Priority to US16/040,799
Publication of US20180345909A1
Assigned to Bragi GmbH (assignor: BOESEN, Peter Vincent)
Legal status: Abandoned

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 25/00 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R 25/20 Means to switch the anti-theft system on or off
    • B60R 25/25 Means to switch the anti-theft system on or off using biometry
    • B60R 25/257 Voice recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/117 Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/6803 Head-worn items, e.g. helmets, masks, headphones or goggles
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/00174 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C 9/00563 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys using personal physical data of the operator, e.g. finger prints, retinal images, voice patterns
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/00174 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C 9/00896 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys specially adapted for particular uses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/3827 Portable transceivers
    • H04B 1/385 Transceivers carried on the body, e.g. in helmets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W 88/02 Terminal devices

Definitions

  • the illustrative embodiments relate to vehicles. More particularly, but not exclusively, the illustrative embodiments relate to a vehicle which integrates with or communicates with a wearable device such as an earpiece or a set of earpieces to identify one or more vehicle occupants.
  • Vehicles may come with various types of electronics packages. These packages may be standard or optional and include electronics associated with communications or entertainment. However, there are various problems and deficiencies with such offerings. What is needed are vehicles with improved electronics options which create, improve, or enhance safety and overall experience of vehicles.
  • Yet another object, feature, or advantage of the illustrative embodiments is to allow a vehicle to obtain biometric information about a driver or passenger using one or more wearable devices.
  • a system includes a vehicle, the vehicle comprising a control system and a wireless transceiver operatively connected to the control system.
  • the control system is configured to wirelessly communicate with a wearable device worn by a user using the wireless transceiver.
  • the control system is configured to receive biometric input from one or more sensors of the wearable device to identify an occupant of the vehicle or individual proximate the vehicle.
  • a system includes a vehicle, the vehicle comprising a control system, a first wireless transceiver operatively connected to the control system, and a wearable device for being worn by a user, a second wireless transceiver disposed within the wearable device and configured to wirelessly communicate with the first wireless transceiver.
  • the wearable device includes at least one sensor for obtaining biometric input. The wearable device is configured to identify a wearer of the wearable device using the biometric input and convey an identity of the wearer of the wearable device to the control system.
  • a system includes a vehicle, the vehicle comprising a vehicle network with a plurality of devices in operative communication with the vehicle network and a wireless transceiver operatively connected to the vehicle network.
  • the wireless transceiver is configured to wirelessly communicate with a wearable device worn by a user and after the user is identified, convey sensor data from the wearable device over the vehicle network to one or more of the plurality of devices.
  • a method includes obtaining sensor data at a wearable device, determining a user's identity based on the sensor data and if the user has appropriate access rights, communicating data or commands over a vehicle network to perform vehicle functions.
  • FIG. 1 illustrates one example of a vehicle which integrates with wearable technology.
  • FIG. 2 illustrates one example of a set of wearable devices in the form of earpieces.
  • FIG. 3 is a block diagram of one example of a wearable device in the form of an earpiece.
  • FIG. 4 illustrates a vehicle network or bus allowing different electronic modules to communicate with a wearable device.
  • FIG. 5 illustrates one example of a method.
  • The illustrative embodiments allow for wearable devices, including earpieces, to enhance the experience of vehicles and, according to some aspects, to enhance the overall safety of the vehicle. It is therefore expected that the technology described herein will make any vehicle so equipped more desirable to customers, more satisfying to customers, and potentially more profitable for the vehicle manufacturer, and that the presence or absence of such technology may drive buying decisions of the consumer. Similarly, at least some of the various aspects may be added to existing vehicles as after-market accessories to improve the safety, accessibility, or experience of existing vehicles.
  • FIG. 1 illustrates one example of use of a wearable device in conjunction with a vehicle.
  • As shown in FIG. 1, there is a vehicle 2.
  • Although the vehicle 2 shown is a full-size sedan, it is contemplated the vehicle 2 may be any of a number of types of cars, trucks, sport utility vehicles, vans, mini-vans, automotive vehicles, commercial vehicles, agricultural vehicles, construction vehicles, specialty vehicles, recreational vehicles, buses, motorcycles, aircraft, boats, ships, yachts, trains, spacecraft, or other types of vehicles.
  • the vehicle 2 may be gas-powered, diesel powered, electric, fuel cell, hydrogen, solar-powered, or human-powered.
  • the vehicle 2 may be actively operated by a driver or may be partially or completely autonomous or self-driving.
  • the vehicle 2 may have vehicle control systems 40 .
  • the vehicle control systems 40 are systems which may include any number of mechanical and electromechanical subsystems.
  • such systems may include a navigation system 42 , an entertainment system 44 , a vehicle security system 45 , an audio system 46 , a safety system 47 , a communications system 48 preferably with a wireless transceiver, a driver assistance system 49 , a passenger comfort system 50 , and engine/transmission/chassis electronics systems 51 .
  • Other examples of vehicle control sub-systems are contemplated.
  • there may be overlap between some of these different vehicle systems and the presence or absence of these vehicle systems as well as other vehicle systems may depend upon the type of vehicle, the type of fuel or propulsion system, the size of the vehicle, and other factors and variables. All or portions of the vehicle control systems 40 may be integrated together or in separate locations of the vehicle 2 .
  • examples of the driver assistance system 49 may include one or more subsystems such as a lane assist system, autopilot, a speed assist system, a blind spot detection system, a park assist system, and an adaptive cruise control system.
  • examples of the passenger comfort system 50 may include one or more subsystems such as automatic climate control, electronic seat adjustment, automatic wipers, automatic headlamps, and automatic cooling.
  • examples of the safety system 47 may include active safety systems such as air bags, hill descent control, and an emergency brake assist system. Aspects of the navigation system 42 , the entertainment system 44 , the audio system 46 , and the communications system 48 may be combined into an infotainment system.
  • One or more wearable devices, such as a set of earpieces 10 including a left earpiece 12A and a right earpiece 12B, may be in operative communication with the vehicle control system 40, such as through the communication system 48.
  • the communication system 48 may provide a Bluetooth, Wi-Fi, or BLE link to wearable devices or may otherwise provide for communications with the wearable devices through wireless communications.
  • the vehicle 2 may communicate with the wearable device(s) directly, or alternatively, or in addition, the vehicle 2 may communicate with the wearable device(s) through an intermediary device such as a mobile device 4 which may be a mobile phone, a tablet, gaming device, media device, computing device, or other type of mobile device.
  • the earpieces 10 interact with the vehicle control system 40 in any number of different ways.
  • the earpieces 10 may provide sensor data, identity information, stored information, streamed information, or other types of information to the vehicle. Based on this information, the vehicle may take any number of actions which may include one or more actions taken by the vehicle control system (or subsystems thereof).
  • the vehicle 2 may communicate sensor data, identity information, stored information, streamed information or other types of information to the earpieces 10 .
  • FIG. 2 illustrates one example of a wearable device in the form of a set of earpieces 10 in greater detail.
  • FIG. 1 illustrates a set of earpiece wearables 10 which includes a left earpiece 12 A and a right earpiece 12 B.
  • Each of the earpieces 12 A, 12 B has an earpiece wearable housing 14 A, 14 B which may be in the form of a protective shell or casing and may be an in-the-ear earpiece housing.
  • A left infrared through ultraviolet spectrometer 16A and a right infrared through ultraviolet spectrometer 16B are also shown.
  • Each earpiece 12 A, 12 B may include one or more microphones 70 A, 70 B. Note the air microphones 70 A, 70 B are outward facing so the air microphones 70 A, 70 B may capture ambient environmental sound. It is to be understood any number of microphones may be present including air conduction microphones, bone conduction microphones, or other audio sensors.
  • FIG. 3 is a block diagram illustrating an earpiece 12 .
  • the earpieces 12 may represent an over-ear headphone, headband, jewelry, or other wearable device.
  • the earpiece 12 may include one or more LEDs 20 electrically connected to an intelligent control system 30 .
  • the intelligent control system 30 may include one or more processors, microcontrollers, application specific integrated circuits, or other types of integrated circuits.
  • the intelligent control system 30 may also be electrically connected to one or more sensors 32 . Where the device is an earpiece 12 , the sensors 32 may include an inertial sensor 74 and another inertial sensor 76 .
  • Each inertial sensor 74 , 76 may include an accelerometer, a gyro sensor or gyrometer, a magnetometer, or other type of inertial sensor.
  • the sensors 32 may also include one or more contact sensors 72 , one or more bone conduction microphones 71 , one or more air conduction microphones 70 , one or more chemical sensors 79 , a pulse oximeter 76 , a temperature sensor 80 , or other physiological or biological sensor(s).
  • Further examples of physiological or biological sensors include an alcohol sensor 83 , glucose sensor 85 , or bilirubin sensor 87 . Other examples of physiological or biological sensors may also be included in the earpieces 12 .
  • These may include a blood pressure sensor 82, an electroencephalogram (EEG) 84, an Adenosine Triphosphate (ATP) sensor, a lactic acid sensor 88, a hemoglobin sensor 90, a hematocrit sensor 92, or other biological or chemical sensors.
  • a spectrometer 16 is also shown.
  • the spectrometer 16 may be an infrared (IR) through ultraviolet (UV) spectrometer although it is contemplated any number of wavelengths in the infrared, visible, X-ray, gamma ray, radio, or ultraviolet spectrums may be detected.
  • the spectrometer 16 may be adapted to measure environmental wavelengths for analysis and recommendations and thus may be located on or at the external facing side of the earpiece 12 .
  • a gesture control interface 36 is also operatively connected to or integrated into the intelligent control system 30 .
  • the gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures.
  • the emitters 82 may be one of any number of types including infrared LEDs.
  • the earpiece 12 may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction.
  • a short-range transceiver 34 using Bluetooth, BLE, UWB, Wi-Fi or other means of radio communication may also be present.
  • the short-range transceiver 34 may be used to communicate with the vehicle control system.
  • the intelligent control system 30 may be configured to convey different information using one or more of the LED(s) 20 based on context or mode of operation of the device.
  • the various sensors 32 , the processor 30 , and other electronic components may be located on one or more printed circuit boards, chips, or circuits of the earpiece 12 .
  • One or more speakers 73 may also be operatively connected to the intelligent control system 30 .
  • a magnetic induction electric conduction electromagnetic (E/M) field transceiver 37 or other type of electromagnetic field receiver is also operatively connected to the intelligent control system 30 to link the processor 30 to the electromagnetic field of the user.
  • the use of the E/M transceiver 37 allows the earpiece 12 to link electromagnetically into a personal area network or body area network or another device.
  • earpiece wearables may be used to identify one or more users.
  • Each earpiece wearable may include its own identifier (e.g., IMEI, RFID tag, unique frequency, serial number, electronic identifier, user-specified name, etc.).
  • each earpiece 12 may be used to determine or confirm identity of an individual wearing it. This may be accomplished in various ways including through voice imprint. An individual may speak, and their voice may be analyzed by the earpiece 12 and compared to known samples or metrics to identify the individual. Fingerprints, gestures, tactile feedback, height, skin conductivity, passwords, or other information may also be determined by the sensors 32 or the earpiece 12 and utilized for authentication.
  • an individual may be asked to specify other information to the earpiece 12 to confirm identity. This may include answering specific questions. For example, the earpiece 12 may ask multiple questions with yes, no, multiple choice, or free form answers which the correct individual will know but others are not likely to know. These questions may be stored within a database and are questions which the individual associated with the earpiece 12 specifically provided answers for. These questions may also be based on activities of the user which are stored on the earpiece 12 or are retrievable from a system in operative communication with the earpiece 12 . These may include information about physical activities, locations, or other activities.
  • Alternatively, instead of the earpiece performing the analysis associated with user identification and authentication, the necessary information, such as voice samples or voice or gestural responses, may be collected by the earpiece 12 and communicated to the vehicle, mobile device, or other device for performing the analysis.
  • the vehicle may be unlocked such as by a person saying “unlock” or the vehicle may be remote started and environmental controls set by a person saying, “start my car and set temperature to 72 degrees.”
  • These actions may be taken by the vehicle control system or its subsystems such as an access and security subsystem or a climate control subsystem.
  • actions may be taken based on proximity of the individual to the vehicle or based on other contextual information.
  • vehicle controls may be a part of the vehicle access and security subsystems. These may include actuators such as actuators associated with door locks or locks associated with other compartments. Other types of vehicle controls may include an ignition lock switch which may be unlocked or locked. Other types of vehicle controls may include actuators associated with windows. In addition to these functions, any number of different vehicle functions or related processes may be performed. The vehicle functions performed by a properly identified individual may be the same types of vehicle functions an individual may perform as a driver of the vehicle. Other types of vehicle controls may include any number of settings such as audio system settings, engine controls/components, temperature control settings, entertainment system settings, navigation settings, or other types of settings.
  • the earpiece 12 may also be utilized to control (e.g., initiate, end, adjust settings, etc.) vehicle tracking systems, camera systems, anti-lock brakes, traction control systems, four-wheel drive systems, electronic stability control (ESC), dynamic steering response, driver wakefulness monitoring, assured clear distance ahead, adaptive headlamps, advanced automatic collision notification, automotive night vision, blind spot monitoring, precrash systems, safe speed governing, traffic sign recognition, a dead man's switch, and so forth.
  • FIG. 4 illustrates another example.
  • a vehicle network 100 is shown.
  • the left earpiece and the right earpiece 12 A, 12 B may communicate information through a vehicle network 100 associated with the vehicle 2 .
  • commands may be communicated over the vehicle network 100 or vehicle bus to perform one or more vehicle functions.
  • Protocols which are used may include a Controller Area Network (CAN), Local Interconnect Network (LIN), local area network (LAN), personal area network (PAN), or others including proprietary network protocols or network protocol overlays.
  • Various types of electronic control modules 102, 104, 106, 108 or electronic control units may communicate over the network 100 of the vehicle 2.
  • These may include electronic modules such as an engine control unit (ECU), a transmission control unit (TCU), an anti-lock braking system (ABS), a body control module (BCM), a door control unit (DCU), an electric power steering control unit (PSCU), a human-machine interface (HMI), a powertrain control module (PCM), a speed control unit (SCU), a telematic control unit (TCU), a brake control unit, a battery management system, and numerous others.
  • the commands may represent audio or verbal commands, tactile commands (e.g., taps, swipes, etc.), head gestures, hand motions near the earpieces 12 , or another detectable user feedback.
  • the various commands may be associated with different components and functions of the electronic control modules 102 , 104 , 106 , 108 .
  • the earpieces 10 , an associated wireless device, an electronic device, or the vehicle 2 interface may be utilized to associate the commands with specific actions.
  • Databases, memory data, macros, or scripts may associate the user command, input, or feedback with the implemented action.
  • an application or set of instructions executed by the vehicle 2 may associate a head gesture, such as two head nods, with an unlock function for the driver's side door of the vehicle 2.
  • a wireless transceiver module 110 is operatively connected to a vehicle network 100 and it is the wireless transceiver module 110 which is in operative communication with one or more wearable devices, such as the wearable earpieces 10 .
  • the user is permitted to give instructions which are translated into commands which are communicated over the vehicle network 100 to an appropriate system or component of the vehicle or to communicate data such as data from one or more sensors of each of the earpieces 12 A, 12 B.
  • Data from the earpieces 10 may be used by any number of different electronic control modules or electronic control units 102 , 104 , 106 , 108 connected to the vehicle network 100 to perform any number of different vehicle functions.
  • FIG. 5 illustrates one example of a methodology.
  • the one or more wearable devices may represent one or more wireless earpieces, such as those shown and described in the various embodiments.
  • sensor data is obtained at one or more wearable devices.
  • the sensor data may be one of any number of types.
  • the sensor data may be voice data or other biometric data.
  • a determination is made of the user identity based on the sensor data. Where the sensor data is voice data this determination may be as the result of a voice print or voice sample analysis. Any number of different products or components may be used to perform this analysis. Examples of commercial products for performing such functionality include Nuance VocalPassword, Watson, Siri, Alexa, Google Voice, VoiceIT, and numerous others.
  • Other types of biometric data may be used. Where the wearable device is a pair of glasses, retina identification and authentication may be used. Where the wearable device is a pair of gloves, fingerprint analysis may be used. Similarly, wireless earpieces may be utilized to scan fingerprints as well. The determination of the user identity based on sensor data may be performed in one of several different locations based on the type of analysis and available computational resources.
  • the determination may be performed on or at the wearable device itself.
  • the determination may be performed on or at the vehicle.
  • the determination may be performed by a mobile device such as a smart phone which is in operative communication with either the wearable device(s) or the vehicle, or both.
  • In step 124, a determination is made as to whether the user has access rights. In one implementation, if the user is identified then the user has appropriate access rights. In alternative implementations, identifying the user does not necessarily give the individual all rights.
  • Where the user has appropriate access rights, or none are required, in step 126 data or commands may be communicated over the vehicle network to perform various vehicle functions. Data from the wearable device(s) may be used by any electronic control modules associated with the vehicle network to provide input to be used in any number of different decision-making processes. Similarly, commands may be given from the user to the vehicle using the wearable device, such as when the wearable device is an earpiece and the commands may be given through voice input from the user.
  • any number of actions or access may be granted for implementation utilizing one or more of the earpieces, vehicle systems, wireless devices, or other networked devices.
  • the user may receive a phone call through a wireless device within the vehicle or by a communication system within the vehicle.
  • the user may provide feedback utilizing the wireless earpieces, such as a double head nod, thereby accepting the phone call for communication through the speakers and microphones of the vehicle.
  • the communications may be communicated through the wireless earpieces and augmented by the vehicle communication systems (e.g., displaying the caller, call length, etc.).
  • the user may provide a verbal command, such as “enter sport mode”, thereby providing a command to the vehicle to adjust the performance of the vehicle (e.g., engine torque/output, transmission performance, suspension settings, etc.).
  • the wireless earpieces may be configured to listen for or receive a command at any time.
  • a “listen” mode may be activated in response to an input, such as a finger tap of the wireless earpieces, initiation of a vehicle feature, head motion, or so forth. The listen mode may prepare the wireless earpieces to receive a command, input, or feedback from the user.
  • the wireless earpieces may provide a method of monitoring biometrics of the user, such as heart rate, blood pressure, blood oxygenation, respiration rate, head position, voice output, or other measurements or readings detectable by the various sensors within the wireless earpieces and/or the vehicle.
  • the wireless earpieces may determine the user is fatigued based on the user's heart rate, respiration, and head motion to provide an alert through the vehicle systems, such as a message indicating the user should pull over communicated through the infotainment system, a heads-up display (e.g., electronic glass), or other vehicle systems.
  • the user settings may indicate that the windows are to be rolled down and the music turned up until the user can find a suitable place to stop or park.
  • the wireless earpieces may also warn the user if he or she is impaired based on a determined blood alcohol level, cognition test, slurred speech, or other relevant factors. As a result, the wireless earpieces may help protect the user from himself or herself, as well as passengers within the vehicle and third parties outside the vehicle. In one embodiment, the wireless earpieces may be configured to lock out one or more vehicle systems in response to determining the user is impaired.
  • the wireless earpieces may also indicate biometrics in the event there is an accident, health event, or so forth. For example, the wireless earpieces may send a command for the vehicle to enter an emergency pullover mode in response to determining the user is experiencing a health event, such as a heart attack, stroke, seizure, or other event or condition preventing the user from safely operating the vehicle.
  • the wireless earpieces may also send one or more communications to emergency services, emergency contacts, or so forth.
  • the wireless earpieces may be utilized to monitor a younger or inexperienced user operating the vehicle. For example, to operate the vehicle, an administrator of the vehicle may require the wireless earpieces be worn to determine the watchfulness of the user as determined by factors such as head position, conversations or audio detected, and activation/utilization of an associated cellular phone, the wireless earpieces, or the vehicle systems. As a result, the wireless earpieces may be utilized as a parental monitoring feature while the user is within the vehicle.
  • the wireless earpieces may also be utilized to perform any number of small tasks significantly enhancing the user experience, such as opening individual doors, unlocking the trunk, opening windows/sunroofs, starting the vehicle, turning off the vehicle, turning on or off the air conditioning/heater, adjusting a seat configuration, turning on a movie/music, or any number of other features commonly utilized by the user.
  • the wireless earpieces in conjunction with the vehicle systems may also learn the preferences of the user over time to perform automatic features and settings of the vehicle.
  • commands may be automatically communicated based on the identity of the user.
  • the vehicle may perform one or more vehicle functions automatically based on the identity of the user. These functions may be any number of different functions previously discussed, including functions granting or denying access to the user.
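  • For illustration only, the sketch below shows one way the identity-driven automation described above might apply stored per-user preferences once a wearer is recognized. The profile fields and vehicle subsystem calls are assumptions, not part of the patent disclosure.

```python
# Hypothetical per-identity settings applied automatically after recognition.
PROFILES = {
    "alice": {"seat_memory": 3, "temperature_c": 21, "audio_source": "podcast"},
    "bob":   {"seat_memory": 1, "temperature_c": 23, "audio_source": "radio"},
}

def apply_profile(user_id: str, vehicle) -> bool:
    profile = PROFILES.get(user_id)
    if profile is None:
        return False                                  # unknown user: change nothing
    vehicle.seats.recall(profile["seat_memory"])      # electronic seat adjustment
    vehicle.climate.set_temperature(profile["temperature_c"])
    vehicle.audio.select_source(profile["audio_source"])
    return True
```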

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Lock And Its Accessories (AREA)
  • Selective Calling Equipment (AREA)
  • Telephone Function (AREA)

Abstract

A system includes a vehicle, the vehicle comprising a control system and a wireless transceiver operatively connected to the control system. The control system is configured to wirelessly communicate with a wearable device worn by a user using the wireless transceiver. The control system is configured to receive biometric input from one or more sensors of the wearable device to identify an occupant of the vehicle or individual proximate the vehicle. The wearable device may be a wearable earpiece with one or more sensors.

Description

    PRIORITY STATEMENT
  • This application is a continuation of U.S. patent application Ser. No. 15/356,839 filed on Nov. 21, 2016, which claims priority to U.S. Provisional Patent Application No. 62/260,436, filed on Nov. 27, 2015, all of which are titled “Vehicle with Wearable for Identifying One or More Vehicle Occupants” and all of which are hereby incorporated by reference in their entirety.
  • FIELD OF THE INVENTION
  • The illustrative embodiments relate to vehicles. More particularly, but not exclusively, the illustrative embodiments relate to a vehicle which integrates with or communicates with a wearable device such as an earpiece or a set of earpieces to identify one or more vehicle occupants.
  • BACKGROUND
  • Vehicles may come with various types of electronics packages. These packages may be standard or optional and include electronics associated with communications or entertainment. However, there are various problems and deficiencies with such offerings. What is needed are vehicles with improved electronics options which create, improve, or enhance safety and overall experience of vehicles.
  • SUMMARY
  • Therefore, it is a primary object, feature, or advantage of the illustrative embodiments to improve over the state of the art.
  • It is another object, feature, or advantage of the illustrative embodiments to communicate between vehicle systems and wearable devices.
  • It is a further object, feature, or advantage of the illustrative embodiments to use wearable devices to increase safety in vehicles.
  • It is a still further object, feature, or advantage of the illustrative embodiments to allow a user to control one or more functions of a vehicle using one or more wearable devices, such as wireless earpieces.
  • It is a still further object, feature, or advantage of the illustrative embodiments to allow a vehicle to identify a driver based on the presence of a wearable device.
  • It is a still further object, feature, or advantage of the illustrative embodiments to allow a vehicle to identify one or more passengers of a vehicle based on the presence of wearable devices.
  • Yet another object, feature, or advantage of the illustrative embodiments is to allow a vehicle to obtain biometric information about a driver or passenger using one or more wearable devices.
  • It is another object, feature, or advantage of the illustrative embodiments to enhance an existing vehicle through addition of a wearable device.
  • One or more of these and/or other objects, features, or advantages of the illustrative embodiments will become apparent from the specification and claims following. No single embodiment need provide every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
  • According to one aspect a system includes a vehicle, the vehicle comprising a control system and a wireless transceiver operatively connected to the control system. The control system is configured to wirelessly communicate with a wearable device worn by a user using the wireless transceiver. The control system is configured to receive biometric input from one or more sensors of the wearable device to identify an occupant of the vehicle or individual proximate the vehicle.
  • According to another aspect a system includes a vehicle, the vehicle comprising a control system, a first wireless transceiver operatively connected to the control system, and a wearable device for being worn by a user, a second wireless transceiver disposed within the wearable device and configured to wirelessly communicate with the first wireless transceiver. The wearable device includes at least one sensor for obtaining biometric input. The wearable device is configured to identify a wearer of the wearable device using the biometric input and convey an identity of the wearer of the wearable device to the control system.
  • According to another aspect, a system includes a vehicle, the vehicle comprising a vehicle network with a plurality of devices in operative communication with the vehicle network and a wireless transceiver operatively connected to the vehicle network. The wireless transceiver is configured to wirelessly communicate with a wearable device worn by a user and after the user is identified, convey sensor data from the wearable device over the vehicle network to one or more of the plurality of devices.
  • According to yet another aspect a method includes obtaining sensor data at a wearable device, determining a user's identity based on the sensor data and if the user has appropriate access rights, communicating data or commands over a vehicle network to perform vehicle functions.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates one example of a vehicle which integrates with wearable technology.
  • FIG. 2 illustrates one example of a set of wearable devices in the form of earpieces.
  • FIG. 3 is a block diagram of one example of a wearable device in the form of an earpiece.
  • FIG. 4 illustrates a vehicle network or bus allowing different electronic modules to communicate with a wearable device.
  • FIG. 5 illustrates one example of a method.
  • DETAILED DESCRIPTION
  • Some of the most important factors in selecting a vehicle such as a car may be the technology available to enhance the experience. This may be of particular importance in certain vehicle segments, such as luxury vehicles. Another important factor in selecting a vehicle may be the available safety features. According to various aspects, the illustrative embodiments allow for wearable devices, including earpieces, to enhance the experience of vehicles and, according to some aspects, to enhance the overall safety of the vehicle. It is therefore expected that the technology described herein will make any vehicle so equipped more desirable to customers, more satisfying to customers, and potentially more profitable for the vehicle manufacturer, and that the presence or absence of such technology may drive buying decisions of the consumer. Similarly, at least some of the various aspects may be added to existing vehicles as after-market accessories to improve the safety, accessibility, or experience of existing vehicles.
  • FIG. 1 illustrates one example of use of a wearable device in conjunction with a vehicle. As shown in FIG. 1 there is a vehicle 2. Although the vehicle 2 shown is a full-size sedan, it is contemplated the vehicle 2 may be of any number of types of cars, trucks, sport utility vehicles, vans, mini-vans, automotive vehicles, commercial vehicles, agricultural vehicles, construction vehicles, specialty vehicles, recreational vehicles, buses, motorcycles, aircraft, boats, ships, yachts, trains, spacecraft, or other types of vehicles. The vehicle 2 may be gas-powered, diesel powered, electric, fuel cell, hydrogen, solar-powered, or human-powered. The vehicle 2 may be actively operated by a driver or may be partially or completely autonomous or self-driving. The vehicle 2 may have vehicle control systems 40. The vehicle control systems 40 are systems which may include any number of mechanical and electromechanical subsystems.
  • As shown in FIG. 1, such systems may include a navigation system 42, an entertainment system 44, a vehicle security system 45, an audio system 46, a safety system 47, a communications system 48 preferably with a wireless transceiver, a driver assistance system 49, a passenger comfort system 50, and engine/transmission/chassis electronics systems 51. Of course, other examples of vehicle control sub-systems are contemplated. In addition, it is to be understood there may be overlap between some of these different vehicle systems and the presence or absence of these vehicle systems as well as other vehicle systems may depend upon the type of vehicle, the type of fuel or propulsion system, the size of the vehicle, and other factors and variables. All or portions of the vehicle control systems 40 may be integrated together or in separate locations of the vehicle 2.
  • In the automotive context, examples of the driver assistance system 49 may include one or more subsystems such as a lane assist system, autopilot, a speed assist system, a blind spot detection system, a park assist system, and an adaptive cruise control system. In the automotive context, examples of the passenger comfort system 50 may include one or more subsystems such as automatic climate control, electronic seat adjustment, automatic wipers, automatic headlamps, and automatic cooling. In the automotive context, examples of the safety system 47 may include active safety systems such as air bags, hill descent control, and an emergency brake assist system. Aspects of the navigation system 42, the entertainment system 44, the audio system 46, and the communications system 48 may be combined into an infotainment system.
  • One or more wearable devices, such as a set of earpieces 10 including a left earpiece 12A and a right earpiece 12B, may be in operative communication with the vehicle control system 40, such as through the communication system 48. For example, the communication system 48 may provide a Bluetooth, Wi-Fi, or BLE link to wearable devices or may otherwise provide for communications with the wearable devices through wireless communications. The vehicle 2 may communicate with the wearable device(s) directly, or alternatively, or in addition, the vehicle 2 may communicate with the wearable device(s) through an intermediary device such as a mobile device 4, which may be a mobile phone, a tablet, gaming device, media device, computing device, or other type of mobile device.
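  • For illustration only, the following minimal Python sketch shows one way a vehicle-side communication system might prefer a direct BLE or Wi-Fi link to the earpieces 10 and fall back to relaying through a paired mobile device 4. The class and method names are assumptions, not part of the patent disclosure.

```python
# Hypothetical sketch: vehicle-side link selection for reaching a wearable.
# The radio objects and their discover()/can_reach() methods are assumed
# abstractions, not a real automotive or Bluetooth API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WearableLink:
    transport: str   # "ble-direct", "wifi-direct", or "mobile-relay"
    device_id: str   # identifier advertised by the earpiece or phone

class CommunicationSystem:
    def __init__(self, ble_radio, wifi_radio, paired_phone=None):
        self.ble_radio = ble_radio
        self.wifi_radio = wifi_radio
        self.paired_phone = paired_phone

    def connect_wearable(self, device_id: str) -> Optional[WearableLink]:
        # Prefer a direct short-range link to the earpiece.
        if self.ble_radio.discover(device_id):
            return WearableLink("ble-direct", device_id)
        if self.wifi_radio.discover(device_id):
            return WearableLink("wifi-direct", device_id)
        # Otherwise relay through an intermediary mobile device, if paired.
        if self.paired_phone is not None and self.paired_phone.can_reach(device_id):
            return WearableLink("mobile-relay", device_id)
        return None  # earpiece not reachable at the moment
```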
  • As will be explained in further details with respect to various examples, one or more of the earpieces 10 interact with the vehicle control system 40 in any number of different ways. For example, the earpieces 10 may provide sensor data, identity information, stored information, streamed information, or other types of information to the vehicle. Based on this information, the vehicle may take any number of actions which may include one or more actions taken by the vehicle control system (or subsystems thereof). In addition, the vehicle 2 may communicate sensor data, identity information, stored information, streamed information or other types of information to the earpieces 10.
  • FIG. 2 illustrates one example of a wearable device in the form of a set of earpieces 10 in greater detail. FIG. 2 illustrates a set of earpiece wearables 10 which includes a left earpiece 12A and a right earpiece 12B. Each of the earpieces 12A, 12B has an earpiece wearable housing 14A, 14B which may be in the form of a protective shell or casing and may be an in-the-ear earpiece housing. A left infrared through ultraviolet spectrometer 16A and a right infrared through ultraviolet spectrometer 16B are also shown. Each earpiece 12A, 12B may include one or more microphones 70A, 70B. Note the air microphones 70A, 70B are outward facing so the air microphones 70A, 70B may capture ambient environmental sound. It is to be understood any number of microphones may be present, including air conduction microphones, bone conduction microphones, or other audio sensors.
  • FIG. 3 is a block diagram illustrating an earpiece 12. In other embodiments, the earpieces 12 may represent an over-ear headphone, headband, jewelry, or other wearable device. The earpiece 12 may include one or more LEDs 20 electrically connected to an intelligent control system 30. The intelligent control system 30 may include one or more processors, microcontrollers, application specific integrated circuits, or other types of integrated circuits. The intelligent control system 30 may also be electrically connected to one or more sensors 32. Where the device is an earpiece 12, the sensors 32 may include an inertial sensor 74 and another inertial sensor 76. Each inertial sensor 74, 76 may include an accelerometer, a gyro sensor or gyrometer, a magnetometer, or other type of inertial sensor. The sensors 32 may also include one or more contact sensors 72, one or more bone conduction microphones 71, one or more air conduction microphones 70, one or more chemical sensors 79, a pulse oximeter 76, a temperature sensor 80, or other physiological or biological sensor(s). Further examples of physiological or biological sensors include an alcohol sensor 83, glucose sensor 85, or bilirubin sensor 87. Other examples of physiological or biological sensors may also be included in the earpieces 12. These may include a blood pressure sensor 82, an electroencephalogram (EEG) 84, an Adenosine Triphosphate (ATP) sensor, a lactic acid sensor 88, a hemoglobin sensor 90, a hematocrit sensor 92, or other biological or chemical sensors.
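  • As a purely illustrative sketch, the biometric and inertial readings named above could be bundled into a single record before transmission to the vehicle. The Python dataclass below is an assumption about one possible packet layout; the field names and units are not taken from the patent.

```python
# Hypothetical packet of earpiece sensor readings, loosely mirroring FIG. 3.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class EarpieceSensorPacket:
    device_id: str                         # which earpiece (e.g., left or right)
    timestamp_ms: int                      # sample time
    accel_g: Tuple[float, float, float]    # inertial sensor (accelerometer)
    gyro_dps: Tuple[float, float, float]   # inertial sensor (gyroscope)
    pulse_ox_pct: Optional[float] = None   # pulse oximeter reading
    temperature_c: Optional[float] = None  # temperature sensor
    blood_alcohol: Optional[float] = None  # alcohol sensor, if present
    voice_frames: List[bytes] = field(default_factory=list)  # microphone audio
```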
  • A spectrometer 16 is also shown. The spectrometer 16 may be an infrared (IR) through ultraviolet (UV) spectrometer although it is contemplated any number of wavelengths in the infrared, visible, X-ray, gamma ray, radio, or ultraviolet spectrums may be detected. The spectrometer 16 may be adapted to measure environmental wavelengths for analysis and recommendations and thus may be located on or at the external facing side of the earpiece 12.
  • A gesture control interface 36 is also operatively connected to or integrated into the intelligent control system 30. The gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures. The emitters 82 may be one of any number of types including infrared LEDs. The earpiece 12 may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction. A short-range transceiver 34 using Bluetooth, BLE, UWB, Wi-Fi or other means of radio communication may also be present. The short-range transceiver 34 may be used to communicate with the vehicle control system. In operation, the intelligent control system 30 may be configured to convey different information using one or more of the LED(s) 20 based on context or mode of operation of the device. The various sensors 32, the processor 30, and other electronic components may be located on one or more printed circuit boards, chips, or circuits of the earpiece 12. One or more speakers 73 may also be operatively connected to the intelligent control system 30.
  • A magnetic induction electric conduction electromagnetic (E/M) field transceiver 37 or other type of electromagnetic field receiver is also operatively connected to the intelligent control system 30 to link the processor 30 to the electromagnetic field of the user. The use of the E/M transceiver 37 allows the earpiece 12 to link electromagnetically into a personal area network or body area network or another device.
  • According to another aspect, earpiece wearables may be used to identify one or more users. Each earpiece wearable may include its own identifier (e.g., IMEI, RFID tag, unique frequency, serial number, electronic identifier, user-specified name, etc.). In addition, each earpiece 12 may be used to determine or confirm identity of an individual wearing it. This may be accomplished in various ways including through voice imprint. An individual may speak, and their voice may be analyzed by the earpiece 12 and compared to known samples or metrics to identify the individual. Fingerprints, gestures, tactile feedback, height, skin conductivity, passwords, or other information may also be determined by the sensors 32 or the earpiece 12 and utilized for authentication.
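  • A minimal sketch of the voice-imprint comparison described above follows. It assumes an embedding of the captured utterance is compared against enrolled templates with cosine similarity; the embedding step and the 0.8 threshold are illustrative assumptions, not the patent's method.

```python
# Hypothetical speaker identification by template matching.
import math
from typing import Dict, List, Optional

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def identify_speaker(utterance_embedding: List[float],
                     enrolled: Dict[str, List[float]],
                     threshold: float = 0.8) -> Optional[str]:
    """Return the enrolled user whose voice template best matches, or None."""
    best_user, best_score = None, 0.0
    for user, template in enrolled.items():
        score = cosine(utterance_embedding, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= threshold else None
```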
  • Other types of user identification and authentication may also be used. For example, an individual may be asked to specify other information to the earpiece 12 to confirm identity. This may include answering specific questions. For example, the earpiece 12 may ask multiple questions with yes, no, multiple choice, or free form answers which the correct individual will know but others are not likely to know. These questions may be stored within a database and are questions which the individual associated with the earpiece 12 specifically provided answers for. These questions may also be based on activities of the user which are stored on the earpiece 12 or are retrievable from a system in operative communication with the earpiece 12. These may include information about physical activities, locations, or other activities.
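  • The challenge-question fallback described above might be sketched as follows. The question store, prompting, and speech-to-text hooks are assumptions supplied by the caller.

```python
# Hypothetical challenge-question check: ask() speaks a stored question through
# the earpiece speaker and listen() returns the transcribed spoken answer.
import random
from typing import Dict

def verify_by_questions(question_store: Dict[str, str], ask, listen,
                        rounds: int = 2) -> bool:
    pairs = random.sample(list(question_store.items()),
                          k=min(rounds, len(question_store)))
    for question, expected in pairs:
        ask(question)
        answer = listen()
        if answer.strip().lower() != expected.strip().lower():
            return False  # one wrong answer fails the check
    return True
```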
  • Alternatively, instead of the earpiece performing the analysis associated with user identification and authentication, necessary information, such as voice samples or voice or gestural responses may be collected by the earpiece 12 and communicated to the vehicle, mobile device, or other device for performing the analysis.
  • Once a user has been identified, the user may be authorized to perform various functions in various ways. For example, the vehicle may be unlocked, such as by a person saying “unlock”, or the vehicle may be remote started and environmental controls set by a person saying, “start my car and set temperature to 72 degrees.” These actions may be taken by the vehicle control system or its subsystems, such as an access and security subsystem or a climate control subsystem. In addition, actions may be taken based on proximity of the individual to the vehicle or based on other contextual information.
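  • One way such spoken requests could be routed to the subsystems named above is sketched below. The subsystem objects and the phrase patterns are assumptions for illustration only.

```python
# Hypothetical routing of recognized phrases from an identified user.
import re

class CommandRouter:
    def __init__(self, security, climate, engine):
        self.security = security  # access and security subsystem
        self.climate = climate    # climate control subsystem
        self.engine = engine      # remote-start capable engine controller

    def handle(self, user_id: str, phrase: str) -> str:
        text = phrase.lower().strip()
        if text == "unlock":
            self.security.unlock_doors(requested_by=user_id)
            return "doors unlocked"
        m = re.match(r"start my car and set temperature to (\d+) degrees", text)
        if m:
            self.engine.remote_start(requested_by=user_id)
            self.climate.set_temperature(int(m.group(1)))
            return "engine started, climate set"
        return "command not recognized"
```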
  • Various types of vehicle controls may be a part of the vehicle access and security subsystems. These may include actuators such as actuators associated with door locks or locks associated with other compartments. Other types of vehicle controls may include an ignition lock switch which may be unlocked or locked. Other types of vehicle controls may include actuators associated with windows. In addition to these functions, any number of different vehicle functions or related processes may be performed. The vehicle functions performed by a properly identified individual may be the same types of vehicle functions an individual may perform as a driver of the vehicle. Other types of vehicle controls may include any number of settings such as audio system settings, engine controls/components, temperature control settings, entertainment system settings, navigation settings, or other types of settings. The earpiece 12 may also be utilized to control (e.g., initiate, end, adjust settings, etc.) vehicle tracking systems, camera systems, anti-lock brakes, traction control systems, four-wheel drive systems, electronic stability control (ESC), dynamic steering response, driver wakefulness monitoring, assured clear distance ahead, adaptive headlamps, advanced automatic collision notification, automotive night vision, blind spot monitoring, precrash systems, safe speed governing, traffic sign recognition, a dead man's switch, and so forth.
  • FIG. 4 illustrates another example. In FIG. 4, a vehicle network 100 is shown. According to one aspect, the left earpiece and the right earpiece 12A, 12B may communicate information through a vehicle network 100 associated with the vehicle 2. Thus, once an identity of a user has been established, commands may be communicated over the vehicle network 100 or vehicle bus to perform one or more vehicle functions. Protocols which are used may include a Controller Area Network (CAN), Local Interconnect Network (LIN), local area network (LAN), personal area network (PAN), or others including proprietary network protocols or network protocol overlays.
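  • As an illustration of the bus traffic this implies, the sketch below places a door-unlock command on a CAN bus using the python-can package over Linux SocketCAN. The arbitration ID, payload meaning, and channel name are assumptions; a production vehicle defines its own message catalog.

```python
# Hypothetical CAN command frame sent after the user has been identified.
import can  # python-can package

UNLOCK_DRIVER_DOOR_ID = 0x2F1    # assumed arbitration ID
UNLOCK_PAYLOAD = bytes([0x01])   # assumed payload meaning "unlock"

def send_unlock(channel: str = "can0") -> None:
    # "socketcan" is the Linux SocketCAN backend; other backends exist.
    with can.Bus(channel=channel, interface="socketcan") as bus:
        msg = can.Message(arbitration_id=UNLOCK_DRIVER_DOOR_ID,
                          data=UNLOCK_PAYLOAD,
                          is_extended_id=False)
        bus.send(msg)
```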
  • Various types of electronic control modules 102, 104, 106, 108 or electronic control units may communicate over the network 100 of the vehicle 2. These may include electronic modules such as an engine control unit (ECU), a transmission control unit (TCU), an anti-lock braking system (ABS), a body control module (BCM), a door control unit (DCU), an electric power steering control unit (PSCU), a human-machine interface (HMI), a powertrain control module (PCM), a speed control unit (SCU), a telematic control unit (TCU), a brake control unit, a battery management system, and numerous others. Any number of electronic control modules may be operatively connected to the vehicle network 100. The commands may represent audio or verbal commands, tactile commands (e.g., taps, swipes, etc.), head gestures, hand motions near the earpieces 12, or another detectable user feedback. In one embodiment, the various commands may be associated with different components and functions of the electronic control modules 102, 104, 106, 108. The earpieces 10, an associated wireless device, an electronic device, or the vehicle 2 interface may be utilized to associate the commands with specific actions. Databases, memory data, macros, or scripts may associate the user command, input, or feedback with the implemented action. For example, an application or set of instructions executed by the vehicle 2 may associate a head gesture, such as two head nods, with an unlock function for the driver's side door of the vehicle 2.
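  • The association between user inputs and implemented actions described above could be as simple as a lookup table from a detected gesture or phrase to a callable. The gesture names and vehicle interfaces below are illustrative assumptions.

```python
# Hypothetical gesture/phrase-to-action table, in the spirit of the database,
# macro, or script association described above.
from typing import Callable, Dict

def make_command_map(vehicle) -> Dict[str, Callable[[], None]]:
    return {
        "double_head_nod":      lambda: vehicle.doors.unlock("driver"),
        "swipe_forward":        lambda: vehicle.infotainment.next_track(),
        "tap_tap":              lambda: vehicle.phone.accept_call(),
        "say:enter sport mode": lambda: vehicle.powertrain.set_mode("sport"),
    }

def dispatch(command_map: Dict[str, Callable[[], None]], event: str) -> bool:
    action = command_map.get(event)
    if action is None:
        return False  # unrecognized input is ignored
    action()
    return True
```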
  • In one embodiment, a wireless transceiver module 110 is operatively connected to a vehicle network 100 and it is the wireless transceiver module 110 which is in operative communication with one or more wearable devices, such as the wearable earpieces 10. Once an earpiece 12A, 12B or the vehicle 2 has identified a user, then the user is permitted to give instructions which are translated into commands which are communicated over the vehicle network 100 to an appropriate system or component of the vehicle or to communicate data such as data from one or more sensors of each of the earpieces 12A, 12B. Data from the earpieces 10 may be used by any number of different electronic control modules or electronic control units 102, 104, 106, 108 connected to the vehicle network 100 to perform any number of different vehicle functions.
  • FIG. 5 illustrates one example of a methodology. In one embodiment, the one or more wearable devices may represent one or more wireless earpieces, such as those shown and described in the various embodiments. As shown in FIG. 5, at step 120 sensor data is obtained at one or more wearable devices. As previously explained, the sensor data may be one of any number of types. For example, the sensor data may be voice data or other biometric data. In step 122, a determination is made of the user identity based on the sensor data. Where the sensor data is voice data, this determination may be the result of a voice print or voice sample analysis. Any number of different products or components may be used to perform this analysis. Examples of commercial products for performing such functionality include Nuance VocalPassword, Watson, Siri, Alexa, Google Voice, VoiceIT, and numerous others. It should be further understood other types of biometric data may be used. For example, where the wearable device is a pair of glasses, retina identification and authentication may be used. Where the wearable device is a pair of gloves, fingerprint analysis may be used. Similarly, wireless earpieces may be utilized to scan fingerprints as well. The determination of the user identity based on sensor data may be performed in one of several different locations based on the type of analysis and available computational resources.
  • For example, the determination may be performed on or at the wearable device itself. Alternatively, the determination may be performed on or at the vehicle. Alternatively still, the determination may be performed by a mobile device, such as a smartphone, which is in operative communication with either the wearable device(s) or the vehicle, or both.
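  • A simple selection of where the identity determination runs might look like the sketch below; the ordering (wearable first, then vehicle, then mobile device) is an assumption for illustration only.

    def choose_compute_location(wearable_has_capacity, vehicle_reachable, phone_reachable):
        """Pick where to run the identity determination, preferring the wearable."""
        if wearable_has_capacity:
            return "wearable"
        if vehicle_reachable:
            return "vehicle"
        if phone_reachable:
            return "mobile_device"
        return "wearable"   # fall back to on-device analysis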
  • Once the individual has been identified or recognized, in step 124 a determination is made as to whether the user has access rights. In one implementation, if the user is identified, then the user has appropriate access rights. In alternative implementations, identifying the user does not necessarily give the individual all rights. Where the user has appropriate access rights, or none are required, in step 126 data or commands may be communicated over the vehicle network to perform various vehicle functions. Data from the wearable device(s) may be used by any electronic control modules associated with the vehicle network to provide input to be used in any number of different decision-making processes. Similarly, commands may be given from the user to the vehicle using the wearable device, such as when the wearable device is an earpiece and the commands are given through voice input from the user.
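  • The overall flow of steps 120 through 126 may be sketched as follows, under the assumption that access rights are a per-user set of permitted commands; the identify() helper from the earlier sketch (or any equivalent) is passed in, and all names are illustrative.

    ACCESS_RIGHTS = {
        "driver_01":    {"start_engine", "unlock_driver_door", "set_drive_mode_sport"},
        "passenger_02": {"open_window", "adjust_seat"},
    }

    def handle_input(sensor_data, requested_command, identify, send_to_network):
        user = identify(sensor_data)                                   # step 122: identity
        if user is None:
            return "unidentified"
        if requested_command not in ACCESS_RIGHTS.get(user, set()):   # step 124: rights
            return "access_denied"
        send_to_network(user, requested_command)                      # step 126: command
        return "ok"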
  • Any number of actions or levels of access may be granted utilizing one or more of the earpieces, vehicle systems, wireless devices, or other networked devices. In one example, the user may receive a phone call through a wireless device within the vehicle or through a communication system within the vehicle. In response to the user being authorized or authenticated, the user may provide feedback utilizing the wireless earpieces, such as a double head nod, thereby accepting the phone call for communication through the speakers and microphones of the vehicle. In addition, the communications may be communicated through the wireless earpieces and augmented by the vehicle communication systems (e.g., displaying the caller, call length, etc.).
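  • The call-acceptance interaction may be sketched as a small event handler; the gesture, routing, and function names below are hypothetical.

    def on_incoming_call(gesture, user_id, forward):
        """Accept an incoming call on a double head nod and route it to the cabin audio."""
        if gesture == "double_nod":
            forward(user_id, "accept_call", {"audio_route": "vehicle_speakers"})
            return True
        return False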
  • In another example, the user may provide a verbal command, such as “enter sport mode”, thereby providing a command to the vehicle to adjust the performance of the vehicle (e.g., engine torque/output, transmission performance, suspension settings, etc.). The wireless earpieces may be configured to listen for or receive a command at any time. In other embodiments, a “listen” mode may be activated in response to an input, such as a finger tap of the wireless earpieces, initiation of a vehicle feature, head motion, or so forth. The listen mode may prepare the wireless earpieces to receive a command, input, or feedback from the user.
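  • A hedged sketch of the "listen" mode follows: a trigger event arms the earpiece, and the next recognized phrase is resolved into a command. The event names and the resolve/forward helpers reuse the earlier illustrative sketches and are not part of the disclosure.

    class ListenMode:
        """Arms on a trigger event, then treats the next phrase as a command."""
        def __init__(self, resolve_action, forward):
            self.armed = False
            self.resolve_action = resolve_action
            self.forward = forward

        def on_trigger(self):
            self.armed = True                 # e.g. finger tap, vehicle feature, head motion

        def on_phrase(self, user_id, phrase):
            if not self.armed:
                return
            self.armed = False
            action = self.resolve_action("voice", phrase.lower())
            if action:                        # e.g. "enter sport mode" -> set_drive_mode_sport
                self.forward(user_id, action)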
  • In another example, the wireless earpieces may provide a method of monitoring biometrics of the user, such as heart rate, blood pressure, blood oxygenation, respiration rate, head position, voice output, or other measurements or readings detectable by the various sensors within the wireless earpieces and/or the vehicle. For example, the wireless earpieces may determine the user is fatigued based on the user's heart rate, respiration, and head motion and provide an alert through the vehicle systems, such as a message, communicated through the infotainment system, a heads-up display (e.g., electronic glass), or other vehicle systems, indicating the user should pull over. In such a case, the user settings may indicate that the windows are rolled down and the music is turned up until the user can find a suitable place to stop or park. The wireless earpieces may also warn the user of possible impairment based on a determined blood alcohol level, cognition test, slurred speech, or other relevant factors. As a result, the wireless earpieces may help protect the user from himself or herself, as well as passengers within the vehicle and third parties outside the vehicle. In one embodiment, the wireless earpieces may be configured to lock out one or more vehicle systems in response to determining the user is impaired.
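  • A fatigue check of this kind can be sketched as a simple scoring rule over the monitored biometrics; the thresholds and weights below are illustrative assumptions, not clinically derived values.

    def fatigue_score(heart_rate_bpm, respiration_per_min, head_nods_per_min):
        score = 0
        if heart_rate_bpm < 55:             # unusually low heart rate while driving
            score += 1
        if respiration_per_min < 10:        # slowed breathing
            score += 1
        if head_nods_per_min > 3:           # repeated head drops from inertial sensors
            score += 2
        return score

    def check_driver_state(heart_rate_bpm, respiration_per_min, head_nods_per_min, alert):
        if fatigue_score(heart_rate_bpm, respiration_per_min, head_nods_per_min) >= 3:
            alert("Driver appears fatigued; consider pulling over")   # to infotainment/HUD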
  • The wireless earpieces may also report biometrics in the event of an accident, health event, or so forth. For example, the wireless earpieces may send a command for the vehicle to enter an emergency pullover mode in response to determining the user is experiencing a health event, such as a heart attack, stroke, seizure, or other event or condition preventing the user from safely operating the vehicle. The wireless earpieces may also send one or more communications to emergency services, emergency contacts, or so forth.
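  • The emergency pullover path may be sketched as below, assuming a hypothetical "emergency_pullover" command understood by the vehicle control system and a simple notification helper.

    def on_health_event(event_type, forward, notify):
        """React to a detected health event that prevents safe operation of the vehicle."""
        critical = {"suspected_heart_attack", "suspected_stroke", "suspected_seizure"}
        if event_type in critical:
            forward("system", "emergency_pullover")                    # over the vehicle network
            notify(["emergency_services", "emergency_contact"], event_type)
            return True
        return False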
  • In another example, the wireless earpieces may be utilized to monitor a younger or inexperienced user operating the vehicle. For example, to operate the vehicle, an administrator of the vehicle may require the wireless earpieces be worn so that the watchfulness of the user may be determined based on factors such as head position, conversations or audio detected, and activation/utilization of an associated cellular phone, the wireless earpieces, or the vehicle systems. As a result, the wireless earpieces may be utilized as a parental monitoring feature while the user is within the vehicle.
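  • A watchfulness check for a monitored driver may be reduced to a weighted score over the listed factors; the weights and threshold here are purely illustrative assumptions.

    def watchfulness_ok(head_forward_ratio, phone_active, loud_conversation):
        """Return True if the monitored driver appears sufficiently attentive."""
        score = head_forward_ratio            # fraction of time the head faces the road (0..1)
        if phone_active:
            score -= 0.3                      # associated cellular phone in use
        if loud_conversation:
            score -= 0.1                      # distracting audio detected
        return score >= 0.7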
  • The wireless earpieces may also be utilized to perform any number of small tasks significantly enhancing the user experience, such as opening individual doors, unlocking the trunk, opening windows/sunroofs, starting the vehicle, turning off the vehicle, turning the air conditioning/heater on or off, adjusting a seat configuration, turning on a movie/music, or any number of other features commonly utilized by the user.
  • The wireless earpieces, in conjunction with the vehicle systems, may also learn the preferences of the user over time to automatically apply features and settings of the vehicle.
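  • One hedged way to learn preferences over time is an exponential moving average of observed settings, sketched below; the smoothing factor and setting names are assumptions.

    class PreferenceLearner:
        """Tracks per-user settings and smooths them over repeated observations."""
        def __init__(self, alpha=0.2):
            self.alpha = alpha
            self.prefs = {}                   # (user, setting) -> learned value

        def observe(self, user, setting, value):
            key = (user, setting)
            previous = self.prefs.get(key, value)
            self.prefs[key] = (1 - self.alpha) * previous + self.alpha * value

        def suggest(self, user, setting):
            return self.prefs.get((user, setting))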
  • It is further contemplated that commands may be automatically communicated based on the identity of the user. In other words, once the user has been identified, the vehicle may perform one or more vehicle functions automatically based on the identity of the user. These functions may be any of the different functions previously discussed, including functions granting or denying access to the user.
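  • Automatic, identity-triggered functions may be sketched as a per-user settings table applied on identification; the settings and the forward() helper are illustrative only.

    USER_SETTINGS = {
        "driver_01":    {"seat_position": 4, "mirror_preset": 2, "climate_c": 21},
        "passenger_02": {"seat_position": 7, "climate_c": 23},
    }

    def apply_settings_for(user_id, forward):
        """Push each stored setting for the identified user onto the vehicle network."""
        for setting, value in USER_SETTINGS.get(user_id, {}).items():
            forward(user_id, "set_" + setting, {"value": value})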
  • Various methods, systems, and apparatus have been shown and described relating to vehicles with wearable integration or communication. The present invention is not to be limited to these specific examples but contemplates any number of related methods, systems, and apparatus, and these examples may vary based on the specific type of vehicle, the specific type of wearable device, and other considerations.

Claims (8)

What is claimed is:
1. A system comprising:
a vehicle, the vehicle comprising a control system; and
a wireless transceiver operatively connected to the control system;
a set of wireless earpieces in operative communication with the wireless transceiver, the set of wireless earpieces including one or more sensors for providing biometric input;
wherein the control system is configured to receive biometric input from the set of wireless earpieces and identify an individual using the set of wireless earpieces by using the biometric input, wherein the individual is an occupant of the vehicle;
wherein the set of wireless earpieces is configured to confirm identity of the individual using the wireless earpieces.
2. The system of claim 1 wherein the control system is configured to provide access to the vehicle after the individual has been identified by the control system using the biometric input and the identity of the individual is confirmed using the wireless earpieces.
3. The system of claim 1 wherein the access is provided by unlocking an ignition of the vehicle.
4. The system of claim 1 wherein the access is provided by opening a door or compartment of the vehicle.
5. The system of claim 1 wherein the control system is configured to deny access to the vehicle after identifying the individual.
6. The system of claim 1 wherein the control system is configured to alter one or more vehicle settings based on an identity of the individual.
7. The system of claim 1 wherein the one or more sensors comprises a microphone and wherein the biometric input comprises voice input from the individual.
8. The system of claim 1 wherein the biometric input comprises inertial data.
US16/040,799 2015-11-27 2018-07-20 Vehicle with wearable for identifying one or more vehicle occupants Abandoned US20180345909A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/040,799 US20180345909A1 (en) 2015-11-27 2018-07-20 Vehicle with wearable for identifying one or more vehicle occupants

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562260436P 2015-11-27 2015-11-27
US15/356,839 US10040423B2 (en) 2015-11-27 2016-11-21 Vehicle with wearable for identifying one or more vehicle occupants
US16/040,799 US20180345909A1 (en) 2015-11-27 2018-07-20 Vehicle with wearable for identifying one or more vehicle occupants

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/356,839 Continuation US10040423B2 (en) 2015-11-27 2016-11-21 Vehicle with wearable for identifying one or more vehicle occupants

Publications (1)

Publication Number Publication Date
US20180345909A1 true US20180345909A1 (en) 2018-12-06

Family

ID=57588954

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/356,839 Active US10040423B2 (en) 2015-11-27 2016-11-21 Vehicle with wearable for identifying one or more vehicle occupants
US16/040,799 Abandoned US20180345909A1 (en) 2015-11-27 2018-07-20 Vehicle with wearable for identifying one or more vehicle occupants

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/356,839 Active US10040423B2 (en) 2015-11-27 2016-11-21 Vehicle with wearable for identifying one or more vehicle occupants

Country Status (3)

Country Link
US (2) US10040423B2 (en)
EP (1) EP3416554A1 (en)
WO (1) WO2017089537A1 (en)

Families Citing this family (135)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9854372B2 (en) 2015-08-29 2017-12-26 Bragi GmbH Production line PCB serial programming and testing method and system
US9949008B2 (en) 2015-08-29 2018-04-17 Bragi GmbH Reproduction of ambient environmental sound for acoustic transparency of ear canal device system and method
US10122421B2 (en) 2015-08-29 2018-11-06 Bragi GmbH Multimodal communication system using induction and radio and method
US9972895B2 (en) 2015-08-29 2018-05-15 Bragi GmbH Antenna for use in a wearable device
US9905088B2 (en) 2015-08-29 2018-02-27 Bragi GmbH Responsive visual communication system and method
US9949013B2 (en) 2015-08-29 2018-04-17 Bragi GmbH Near field gesture control system and method
US9843853B2 (en) 2015-08-29 2017-12-12 Bragi GmbH Power control for battery powered personal area network device system and method
US9980189B2 (en) 2015-10-20 2018-05-22 Bragi GmbH Diversity bluetooth system and method
US10104458B2 (en) 2015-10-20 2018-10-16 Bragi GmbH Enhanced biometric control systems for detection of emergency events system and method
US9866941B2 (en) 2015-10-20 2018-01-09 Bragi GmbH Multi-point multiple sensor array for data sensing and processing system and method
US10040423B2 (en) * 2015-11-27 2018-08-07 Bragi GmbH Vehicle with wearable for identifying one or more vehicle occupants
US10715518B2 (en) * 2015-12-08 2020-07-14 Lenovo (Singapore) Pte. Ltd. Determination of device with which to establish communication based on biometric input
US9980033B2 (en) 2015-12-21 2018-05-22 Bragi GmbH Microphone natural speech capture voice dictation system and method
US9939891B2 (en) 2015-12-21 2018-04-10 Bragi GmbH Voice dictation systems using earpiece microphone system and method
US10085091B2 (en) 2016-02-09 2018-09-25 Bragi GmbH Ambient volume modification through environmental microphone feedback loop system and method
US10097919B2 (en) 2016-02-22 2018-10-09 Sonos, Inc. Music service selection
US10264030B2 (en) 2016-02-22 2019-04-16 Sonos, Inc. Networked microphone device control
US9965247B2 (en) 2016-02-22 2018-05-08 Sonos, Inc. Voice controlled media playback system based on user profile
US9947316B2 (en) 2016-02-22 2018-04-17 Sonos, Inc. Voice control of a media playback system
US9811314B2 (en) 2016-02-22 2017-11-07 Sonos, Inc. Metadata exchange involving a networked playback system and a networked microphone system
US10095470B2 (en) 2016-02-22 2018-10-09 Sonos, Inc. Audio response playback
US10085082B2 (en) 2016-03-11 2018-09-25 Bragi GmbH Earpiece with GPS receiver
US10045116B2 (en) 2016-03-14 2018-08-07 Bragi GmbH Explosive sound pressure level active noise cancellation utilizing completely wireless earpieces system and method
US10052065B2 (en) 2016-03-23 2018-08-21 Bragi GmbH Earpiece life monitor with capability of automatic notification system and method
US10015579B2 (en) 2016-04-08 2018-07-03 Bragi GmbH Audio accelerometric feedback through bilateral ear worn device system and method
US10013542B2 (en) 2016-04-28 2018-07-03 Bragi GmbH Biometric interface system and method
US9978390B2 (en) 2016-06-09 2018-05-22 Sonos, Inc. Dynamic player selection for audio signal processing
US10045110B2 (en) 2016-07-06 2018-08-07 Bragi GmbH Selective sound field environment processing system and method
US10201309B2 (en) 2016-07-06 2019-02-12 Bragi GmbH Detection of physiological data using radar/lidar of wireless earpieces
US10621583B2 (en) 2016-07-07 2020-04-14 Bragi GmbH Wearable earpiece multifactorial biometric analysis system and method
US10516930B2 (en) 2016-07-07 2019-12-24 Bragi GmbH Comparative analysis of sensors to control power status for wireless earpieces
US10134399B2 (en) 2016-07-15 2018-11-20 Sonos, Inc. Contextualization of voice inputs
US10115400B2 (en) 2016-08-05 2018-10-30 Sonos, Inc. Multiple voice services
US10397686B2 (en) 2016-08-15 2019-08-27 Bragi GmbH Detection of movement adjacent an earpiece device
US10977348B2 (en) 2016-08-24 2021-04-13 Bragi GmbH Digital signature using phonometry and compiled biometric data system and method
US10104464B2 (en) 2016-08-25 2018-10-16 Bragi GmbH Wireless earpiece and smart glasses system and method
US10409091B2 (en) 2016-08-25 2019-09-10 Bragi GmbH Wearable with lenses
US11086593B2 (en) 2016-08-26 2021-08-10 Bragi GmbH Voice assistant for wireless earpieces
US10313779B2 (en) 2016-08-26 2019-06-04 Bragi GmbH Voice assistant system for wireless earpieces
US11200026B2 (en) 2016-08-26 2021-12-14 Bragi GmbH Wireless earpiece with a passive virtual assistant
US10887679B2 (en) 2016-08-26 2021-01-05 Bragi GmbH Earpiece for audiograms
US10200780B2 (en) 2016-08-29 2019-02-05 Bragi GmbH Method and apparatus for conveying battery life of wireless earpiece
US11490858B2 (en) 2016-08-31 2022-11-08 Bragi GmbH Disposable sensor array wearable device sleeve system and method
US10580282B2 (en) 2016-09-12 2020-03-03 Bragi GmbH Ear based contextual environment and biometric pattern recognition system and method
US10598506B2 (en) 2016-09-12 2020-03-24 Bragi GmbH Audio navigation using short range bilateral earpieces
US10852829B2 (en) 2016-09-13 2020-12-01 Bragi GmbH Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method
US11283742B2 (en) 2016-09-27 2022-03-22 Bragi GmbH Audio-based social media platform
US9942678B1 (en) 2016-09-27 2018-04-10 Sonos, Inc. Audio playback settings for voice interaction
US10460095B2 (en) 2016-09-30 2019-10-29 Bragi GmbH Earpiece with biometric identifiers
US10049184B2 (en) 2016-10-07 2018-08-14 Bragi GmbH Software application transmission via body interface using a wearable device in conjunction with removable body sensor arrays system and method
US10181323B2 (en) 2016-10-19 2019-01-15 Sonos, Inc. Arbitration-based voice recognition
US10455313B2 (en) 2016-10-31 2019-10-22 Bragi GmbH Wireless earpiece with force feedback
US10698983B2 (en) 2016-10-31 2020-06-30 Bragi GmbH Wireless earpiece with a medical engine
US10771877B2 (en) 2016-10-31 2020-09-08 Bragi GmbH Dual earpieces for same ear
US10942701B2 (en) 2016-10-31 2021-03-09 Bragi GmbH Input and edit functions utilizing accelerometer based earpiece movement system and method
US10117604B2 (en) 2016-11-02 2018-11-06 Bragi GmbH 3D sound positioning with distributed sensors
US10617297B2 (en) 2016-11-02 2020-04-14 Bragi GmbH Earpiece with in-ear electrodes
US10205814B2 (en) 2016-11-03 2019-02-12 Bragi GmbH Wireless earpiece with walkie-talkie functionality
US10225638B2 (en) 2016-11-03 2019-03-05 Bragi GmbH Ear piece with pseudolite connectivity
US10062373B2 (en) 2016-11-03 2018-08-28 Bragi GmbH Selective audio isolation from body generated sound system and method
US10821361B2 (en) 2016-11-03 2020-11-03 Bragi GmbH Gaming with earpiece 3D audio
US10058282B2 (en) 2016-11-04 2018-08-28 Bragi GmbH Manual operation assistance with earpiece with 3D sound cues
US10045117B2 (en) 2016-11-04 2018-08-07 Bragi GmbH Earpiece with modified ambient environment over-ride function
US10045112B2 (en) 2016-11-04 2018-08-07 Bragi GmbH Earpiece with added ambient environment
US10063957B2 (en) 2016-11-04 2018-08-28 Bragi GmbH Earpiece with source selection within ambient environment
US10506327B2 (en) 2016-12-27 2019-12-10 Bragi GmbH Ambient environmental sound field manipulation based on user defined voice and audio recognition pattern analysis system and method
US10405081B2 (en) 2017-02-08 2019-09-03 Bragi GmbH Intelligent wireless headset system
US10582290B2 (en) 2017-02-21 2020-03-03 Bragi GmbH Earpiece with tap functionality
DE102017103931A1 (en) * 2017-02-24 2018-08-30 Eisenmann Se Conveyor system and method for the simultaneous transport of workpieces and workers
US10771881B2 (en) 2017-02-27 2020-09-08 Bragi GmbH Earpiece with audio 3D menu
US11544104B2 (en) 2017-03-22 2023-01-03 Bragi GmbH Load sharing between wireless earpieces
US11380430B2 (en) 2017-03-22 2022-07-05 Bragi GmbH System and method for populating electronic medical records with wireless earpieces
US11694771B2 (en) 2017-03-22 2023-07-04 Bragi GmbH System and method for populating electronic health records with wireless earpieces
US10575086B2 (en) 2017-03-22 2020-02-25 Bragi GmbH System and method for sharing wireless earpieces
US11183181B2 (en) * 2017-03-27 2021-11-23 Sonos, Inc. Systems and methods of multiple voice services
US10708699B2 (en) 2017-05-03 2020-07-07 Bragi GmbH Hearing aid with added functionality
US11116415B2 (en) 2017-06-07 2021-09-14 Bragi GmbH Use of body-worn radar for biometric measurements, contextual awareness and identification
US11013445B2 (en) 2017-06-08 2021-05-25 Bragi GmbH Wireless earpiece with transcranial stimulation
US10475449B2 (en) 2017-08-07 2019-11-12 Sonos, Inc. Wake-word detection suppression
US10048930B1 (en) 2017-09-08 2018-08-14 Sonos, Inc. Dynamic computation of system response volume
US10344960B2 (en) 2017-09-19 2019-07-09 Bragi GmbH Wireless earpiece controlled medical headlight
US11272367B2 (en) 2017-09-20 2022-03-08 Bragi GmbH Wireless earpieces for hub communications
JP6915481B2 (en) * 2017-09-27 2021-08-04 トヨタ自動車株式会社 Vehicle control system
US10446165B2 (en) 2017-09-27 2019-10-15 Sonos, Inc. Robust short-time fourier transform acoustic echo cancellation during audio playback
US10621981B2 (en) 2017-09-28 2020-04-14 Sonos, Inc. Tone interference cancellation
US10482868B2 (en) 2017-09-28 2019-11-19 Sonos, Inc. Multi-channel acoustic echo cancellation
US10051366B1 (en) 2017-09-28 2018-08-14 Sonos, Inc. Three-dimensional beam forming with a microphone array
US10466962B2 (en) 2017-09-29 2019-11-05 Sonos, Inc. Media playback system with voice assistance
US10880650B2 (en) 2017-12-10 2020-12-29 Sonos, Inc. Network microphone devices with automatic do not disturb actuation capabilities
US10818290B2 (en) 2017-12-11 2020-10-27 Sonos, Inc. Home graph
WO2019152722A1 (en) 2018-01-31 2019-08-08 Sonos, Inc. Device designation of playback and network microphone device arrangements
US11175880B2 (en) 2018-05-10 2021-11-16 Sonos, Inc. Systems and methods for voice-assisted media content selection
US10959029B2 (en) 2018-05-25 2021-03-23 Sonos, Inc. Determining and adapting to changes in microphone performance of playback devices
US10681460B2 (en) 2018-06-28 2020-06-09 Sonos, Inc. Systems and methods for associating playback devices with voice assistant services
US11076035B2 (en) 2018-08-28 2021-07-27 Sonos, Inc. Do not disturb feature for audio notifications
US10461710B1 (en) 2018-08-28 2019-10-29 Sonos, Inc. Media playback system with maximum volume setting
US10587430B1 (en) 2018-09-14 2020-03-10 Sonos, Inc. Networked devices, systems, and methods for associating playback devices based on sound codes
US11024331B2 (en) 2018-09-21 2021-06-01 Sonos, Inc. Voice detection optimization using sound metadata
US10811015B2 (en) 2018-09-25 2020-10-20 Sonos, Inc. Voice detection optimization based on selected voice assistant service
US11100923B2 (en) 2018-09-28 2021-08-24 Sonos, Inc. Systems and methods for selective wake word detection using neural network models
US10692518B2 (en) 2018-09-29 2020-06-23 Sonos, Inc. Linear filtering for noise-suppressed speech detection via multiple network microphone devices
US11899519B2 (en) 2018-10-23 2024-02-13 Sonos, Inc. Multiple stage network microphone device with reduced power consumption and processing load
EP3654249A1 (en) 2018-11-15 2020-05-20 Snips Dilated convolutions and gating for efficient keyword spotting
US11490843B2 (en) 2018-11-16 2022-11-08 Toyota Motor North America, Inc. Vehicle occupant health monitor system and method
US11183183B2 (en) 2018-12-07 2021-11-23 Sonos, Inc. Systems and methods of operating media playback systems having multiple voice assistant services
US11132989B2 (en) 2018-12-13 2021-09-28 Sonos, Inc. Networked microphone devices, systems, and methods of localized arbitration
US10602268B1 (en) 2018-12-20 2020-03-24 Sonos, Inc. Optimization of network microphone devices using noise classification
US10867604B2 (en) 2019-02-08 2020-12-15 Sonos, Inc. Devices, systems, and methods for distributed voice processing
US11315556B2 (en) 2019-02-08 2022-04-26 Sonos, Inc. Devices, systems, and methods for distributed voice processing by transmitting sound data associated with a wake word to an appropriate device for identification
US11120794B2 (en) 2019-05-03 2021-09-14 Sonos, Inc. Voice assistant persistence across multiple network microphone devices
US10586540B1 (en) 2019-06-12 2020-03-10 Sonos, Inc. Network microphone device with command keyword conditioning
US11361756B2 (en) 2019-06-12 2022-06-14 Sonos, Inc. Conditional wake word eventing based on environment
US11200894B2 (en) 2019-06-12 2021-12-14 Sonos, Inc. Network microphone device with command keyword eventing
US10871943B1 (en) 2019-07-31 2020-12-22 Sonos, Inc. Noise classification for event detection
US11138975B2 (en) 2019-07-31 2021-10-05 Sonos, Inc. Locally distributed keyword detection
US11138969B2 (en) 2019-07-31 2021-10-05 Sonos, Inc. Locally distributed keyword detection
US10937539B1 (en) * 2019-08-08 2021-03-02 Toyota Motor North America, Inc. Automated activity suggestions based on wearable connectivity with vehicle systems
US11189286B2 (en) 2019-10-22 2021-11-30 Sonos, Inc. VAS toggle based on device orientation
US11200900B2 (en) 2019-12-20 2021-12-14 Sonos, Inc. Offline voice control
US11562740B2 (en) 2020-01-07 2023-01-24 Sonos, Inc. Voice verification for media playback
US11556307B2 (en) 2020-01-31 2023-01-17 Sonos, Inc. Local voice data processing
US11308958B2 (en) 2020-02-07 2022-04-19 Sonos, Inc. Localized wakeword verification
US11482224B2 (en) 2020-05-20 2022-10-25 Sonos, Inc. Command keywords with input detection windowing
US11727919B2 (en) 2020-05-20 2023-08-15 Sonos, Inc. Memory allocation for keyword spotting engines
US11308962B2 (en) 2020-05-20 2022-04-19 Sonos, Inc. Input detection windowing
US12387716B2 (en) 2020-06-08 2025-08-12 Sonos, Inc. Wakewordless voice quickstarts
US11698771B2 (en) 2020-08-25 2023-07-11 Sonos, Inc. Vocal guidance engines for playback devices
US12283269B2 (en) 2020-10-16 2025-04-22 Sonos, Inc. Intent inference in audiovisual communication sessions
US11984123B2 (en) 2020-11-12 2024-05-14 Sonos, Inc. Network device interaction by range
EP4116946A1 (en) * 2021-07-06 2023-01-11 Nxp B.V. Access control system
WO2023056258A1 (en) 2021-09-30 2023-04-06 Sonos, Inc. Conflict management for wake-word detection processes
US12327556B2 (en) 2021-09-30 2025-06-10 Sonos, Inc. Enabling and disabling microphones and voice assistants
US12327549B2 (en) 2022-02-09 2025-06-10 Sonos, Inc. Gatekeeping for voice intent processing
US12344193B2 (en) 2022-12-22 2025-07-01 Ford Global Technologies, Llc Wearable virtual keys for vehicle access
DE102023204984B4 (en) * 2023-05-26 2025-06-12 Sivantos Pte. Ltd. Access control procedure for a restricted area using a hearing aid

Family Cites Families (137)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3934100A (en) 1974-04-22 1976-01-20 Seeburg Corporation Acoustic coupler for use with auditory equipment
US4150262A (en) 1974-11-18 1979-04-17 Hiroshi Ono Piezoelectric bone conductive in ear voice sounds transmitting and receiving apparatus
JPS5850078B2 (en) 1979-05-04 1983-11-08 株式会社 弦エンジニアリング Vibration pickup type ear microphone transmitting device and transmitting/receiving device
JPS56152395A (en) 1980-04-24 1981-11-25 Gen Eng:Kk Ear microphone of simultaneous transmitting and receiving type
US4375016A (en) 1980-04-28 1983-02-22 Qualitone Hearing Aids Inc. Vented ear tip for hearing aid and adapter coupler therefore
US4588867A (en) 1982-04-27 1986-05-13 Masao Konomi Ear microphone
JPS6068734U (en) 1983-10-18 1985-05-15 株式会社岩田エレクトリツク handset
US4682180A (en) 1985-09-23 1987-07-21 American Telephone And Telegraph Company At&T Bell Laboratories Multidirectional feed and flush-mounted surface wave antenna
US4791673A (en) 1986-12-04 1988-12-13 Schreiber Simeon B Bone conduction audio listening device and method
US4865044A (en) 1987-03-09 1989-09-12 Wallace Thomas L Temperature-sensing system for cattle
US5201007A (en) 1988-09-15 1993-04-06 Epic Corporation Apparatus and method for conveying amplified sound to ear
US5298692A (en) 1990-11-09 1994-03-29 Kabushiki Kaisha Pilot Earpiece for insertion in an ear canal, and an earphone, microphone, and earphone/microphone combination comprising the same
US5191602A (en) 1991-01-09 1993-03-02 Plantronics, Inc. Cellular telephone headset
US5295193A (en) 1992-01-22 1994-03-15 Hiroshi Ono Device for picking up bone-conducted sound in external auditory meatus and communication device using the same
US5343532A (en) 1992-03-09 1994-08-30 Shugart Iii M Wilbert Hearing aid device
US5280524A (en) 1992-05-11 1994-01-18 Jabra Corporation Bone conductive ear microphone and method
JP3499239B2 (en) 1992-05-11 2004-02-23 ジャブラ・コーポレーション Unidirectional ear microphone and method
JPH06292195A (en) 1993-03-31 1994-10-18 Matsushita Electric Ind Co Ltd Portable radio type tv telephone
US5497339A (en) 1993-11-15 1996-03-05 Ete, Inc. Portable apparatus for providing multiple integrated communication media
US5933506A (en) 1994-05-18 1999-08-03 Nippon Telegraph And Telephone Corporation Transmitter-receiver having ear-piece type acoustic transducing part
US5749072A (en) 1994-06-03 1998-05-05 Motorola Inc. Communications device responsive to spoken commands and methods of using same
US5613222A (en) 1994-06-06 1997-03-18 The Creative Solutions Company Cellular telephone headset for hand-free communication
US6339754B1 (en) 1995-02-14 2002-01-15 America Online, Inc. System for automated translation of speech
US5692059A (en) 1995-02-24 1997-11-25 Kruger; Frederick M. Two active element in-the-ear microphone system
WO1996037052A1 (en) 1995-05-18 1996-11-21 Aura Communications, Inc. Short-range magnetic communication system
US5721783A (en) 1995-06-07 1998-02-24 Anderson; James C. Hearing aid with wireless remote processor
US5606621A (en) 1995-06-14 1997-02-25 Siemens Hearing Instruments, Inc. Hybrid behind-the-ear and completely-in-canal hearing aid
US6081724A (en) 1996-01-31 2000-06-27 Qualcomm Incorporated Portable communication device and accessory system
JP3815513B2 (en) 1996-08-19 2006-08-30 ソニー株式会社 earphone
US5802167A (en) 1996-11-12 1998-09-01 Hong; Chu-Chai Hands-free device for use with a cellular telephone in a car to permit hands-free operation of the cellular telephone
US6112103A (en) 1996-12-03 2000-08-29 Puthuff; Steven H. Personal communication device
IL119948A (en) 1996-12-31 2004-09-27 News Datacom Ltd Voice activated communication system and program guide
US6111569A (en) 1997-02-21 2000-08-29 Compaq Computer Corporation Computer-based universal remote control system
US5987146A (en) 1997-04-03 1999-11-16 Resound Corporation Ear canal microphone
US6021207A (en) 1997-04-03 2000-02-01 Resound Corporation Wireless open ear canal earpiece
US6181801B1 (en) 1997-04-03 2001-01-30 Resound Corporation Wired open ear canal earpiece
DE19721982C2 (en) 1997-05-26 2001-08-02 Siemens Audiologische Technik Communication system for users of a portable hearing aid
US5929774A (en) 1997-06-13 1999-07-27 Charlton; Norman J Combination pager, organizer and radio
US5916181A (en) 1997-10-24 1999-06-29 Creative Sports Designs, Inc. Head gear for detecting head motion and providing an indication of head movement
US6167039A (en) 1997-12-17 2000-12-26 Telefonaktiebolget Lm Ericsson Mobile station having plural antenna elements and interference suppression
US6041130A (en) 1998-06-23 2000-03-21 Mci Communications Corporation Headset with multiple connections
US6054989A (en) 1998-09-14 2000-04-25 Microsoft Corporation Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which provides spatialized audio
US6519448B1 (en) 1998-09-30 2003-02-11 William A. Dress Personal, self-programming, short-range transceiver system
US20030034874A1 (en) 1998-10-29 2003-02-20 W. Stephen G. Mann System or architecture for secure mail transport and verifiable delivery, or apparatus for mail security
US20020030637A1 (en) 1998-10-29 2002-03-14 Mann W. Stephen G. Aremac-based means and apparatus for interaction with computer, or one or more other people, through a camera
US6275789B1 (en) 1998-12-18 2001-08-14 Leo Moser Method and apparatus for performing full bidirectional translation between a source language and a linked alternative language
US20010005197A1 (en) 1998-12-21 2001-06-28 Animesh Mishra Remotely controlling electronic devices
EP1017252A3 (en) 1998-12-31 2006-05-31 Resistance Technology, Inc. Hearing aid system
US6208934B1 (en) 1999-01-19 2001-03-27 Navigation Technologies Corp. Method and system for providing walking instructions with route guidance in a navigation program
US6542721B2 (en) 1999-10-11 2003-04-01 Peter V. Boesen Cellular telephone, personal digital assistant and pager unit
US6560468B1 (en) 1999-05-10 2003-05-06 Peter V. Boesen Cellular telephone, personal digital assistant, and pager unit with capability of short range radio frequency transmissions
USD468299S1 (en) 1999-05-10 2003-01-07 Peter V. Boesen Communication device
US6879698B2 (en) 1999-05-10 2005-04-12 Peter V. Boesen Cellular telephone, personal digital assistant with voice communication unit
US6094492A (en) 1999-05-10 2000-07-25 Boesen; Peter V. Bone conduction voice transmission apparatus and system
US6952483B2 (en) 1999-05-10 2005-10-04 Genisus Systems, Inc. Voice transmission apparatus with UWB
US6738485B1 (en) 1999-05-10 2004-05-18 Peter V. Boesen Apparatus, method and system for ultra short range communication
US6823195B1 (en) 2000-06-30 2004-11-23 Peter V. Boesen Ultra short range communication with sensing device and method
US6208372B1 (en) 1999-07-29 2001-03-27 Netergy Networks, Inc. Remote electromechanical control of a video communications system
US6470893B1 (en) 2000-05-15 2002-10-29 Peter V. Boesen Wireless biopotential sensing device and method with capability of short-range radio frequency transmission and reception
US6694180B1 (en) 1999-10-11 2004-02-17 Peter V. Boesen Wireless biopotential sensing device and method with capability of short-range radio frequency transmission and reception
US7508411B2 (en) 1999-10-11 2009-03-24 S.P. Technologies Llp Personal communications device
US6852084B1 (en) 2000-04-28 2005-02-08 Peter V. Boesen Wireless physiological pressure sensor and transmitter with capability of short range radio frequency transmissions
US8140357B1 (en) 2000-04-26 2012-03-20 Boesen Peter V Point of service billing and records system
US7047196B2 (en) 2000-06-08 2006-05-16 Agiletv Corporation System and method of voice recognition near a wireline node of a network supporting cable television and/or video delivery
JP2002083152A (en) 2000-06-30 2002-03-22 Victor Co Of Japan Ltd Content distribution system, portable terminal player and content provider
KR100387918B1 (en) 2000-07-11 2003-06-18 이수성 Interpreter
US6784873B1 (en) 2000-08-04 2004-08-31 Peter V. Boesen Method and medium for computer readable keyboard display incapable of user termination
JP4135307B2 (en) 2000-10-17 2008-08-20 株式会社日立製作所 Voice interpretation service method and voice interpretation server
US6472978B1 (en) 2000-11-24 2002-10-29 Yokogawa Electric Corporation Traffic system to prevent from accidents
US20020076073A1 (en) 2000-12-19 2002-06-20 Taenzer Jon C. Automatically switched hearing aid communications earpiece
US6987986B2 (en) 2001-06-21 2006-01-17 Boesen Peter V Cellular telephone, personal digital assistant with dual lines for simultaneous uses
USD468300S1 (en) 2001-06-26 2003-01-07 Peter V. Boesen Communication device
USD464039S1 (en) 2001-06-26 2002-10-08 Peter V. Boesen Communication device
US20030065504A1 (en) 2001-10-02 2003-04-03 Jessica Kraemer Instant verbal translator
US6664713B2 (en) 2001-12-04 2003-12-16 Peter V. Boesen Single chip device for voice communications
US7539504B2 (en) 2001-12-05 2009-05-26 Espre Solutions, Inc. Wireless telepresence collaboration system
US8527280B2 (en) 2001-12-13 2013-09-03 Peter V. Boesen Voice communication device with foreign language translation
US20030218064A1 (en) 2002-03-12 2003-11-27 Storcard, Inc. Multi-purpose personal portable electronic system
US7030856B2 (en) 2002-10-15 2006-04-18 Sony Corporation Method and system for controlling a display device
DE10253192A1 (en) 2002-11-15 2004-05-27 Philips Intellectual Property & Standards Gmbh Anti-collision system for use with road vehicle has position determining computer with GPS receiver and has radio transmitter ending signals to equipment carried by pedestrians
US7107010B2 (en) 2003-04-16 2006-09-12 Nokia Corporation Short-range radio terminal adapted for data streaming and real time services
US20050017842A1 (en) 2003-07-25 2005-01-27 Bryan Dematteo Adjustment apparatus for adjusting customizable vehicle components
DE10334203A1 (en) 2003-07-26 2005-03-10 Volkswagen Ag Interactive traffic handling method, by informing respective road users of current movements of other road users by direct intercommunication
US7136282B1 (en) 2004-01-06 2006-11-14 Carlton Rebeske Tablet laptop and interactive conferencing station system
US7558744B2 (en) 2004-01-23 2009-07-07 Razumov Sergey N Multimedia terminal for product ordering
US20050251455A1 (en) 2004-05-10 2005-11-10 Boesen Peter V Method and system for purchasing access to a recording
US8526646B2 (en) 2004-05-10 2013-09-03 Peter V. Boesen Communication device
US20060074808A1 (en) 2004-05-10 2006-04-06 Boesen Peter V Method and system for purchasing access to a recording
EP1757125B1 (en) 2004-06-14 2011-05-25 Nokia Corporation Automated application-selective processing of information obtained through wireless data communication links
US7925506B2 (en) 2004-10-05 2011-04-12 Inago Corporation Speech recognition accuracy via concept to keyword mapping
US7183932B2 (en) 2005-03-21 2007-02-27 Toyota Technical Center Usa, Inc Inter-vehicle drowsy driver advisory system
WO2007034371A2 (en) 2005-09-22 2007-03-29 Koninklijke Philips Electronics N.V. Method and apparatus for acoustical outer ear characterization
US20130090744A1 (en) 2006-06-12 2013-04-11 Bao Tran Mesh network access controller
US8194865B2 (en) 2007-02-22 2012-06-05 Personics Holdings Inc. Method and device for sound detection and audio control
US8068925B2 (en) 2007-06-28 2011-11-29 Apple Inc. Dynamic routing of audio among multiple audio devices
US8125348B2 (en) 2007-06-29 2012-02-28 Verizon Patent And Licensing Inc. Automobile beacon, system and associated method
US8108143B1 (en) 2007-12-20 2012-01-31 U-Blox Ag Navigation system enabled wireless headset
US8473081B2 (en) 2007-12-25 2013-06-25 Personics Holdings, Inc. Method and system for event reminder using an earpiece
US20090191920A1 (en) 2008-01-29 2009-07-30 Paul Regen Multi-Function Electronic Ear Piece
US8319620B2 (en) 2008-06-19 2012-11-27 Personics Holdings Inc. Ambient situation awareness system and method for vehicles
JP5245894B2 (en) 2009-02-16 2013-07-24 富士通モバイルコミュニケーションズ株式会社 Mobile communication device
CN102484461A (en) 2009-07-02 2012-05-30 骨声通信有限公司 A system and a method for providing sound signals
US8855918B2 (en) 2009-09-25 2014-10-07 Mitac International Corp. Methods of assisting a user with selecting a route after a personal navigation device transitions from driving mode to walking mode
US8253589B2 (en) 2009-10-20 2012-08-28 GM Global Technology Operations LLC Vehicle to entity communication
AU2011220382A1 (en) 2010-02-28 2012-10-18 Microsoft Corporation Local advertising content on an interactive head-mounted eyepiece
DE102010003429A1 (en) 2010-03-30 2011-10-06 Bayerische Motoren Werke Aktiengesellschaft Method for communication between e.g. mobile telephone and mobile transmitter in car, involves performing given action e.g. type of road user, location information such as global positioning system data, dependent on received signal of car
US9880014B2 (en) 2010-11-24 2018-01-30 Telenav, Inc. Navigation system with session transfer mechanism and method of operation thereof
DE102011118966A1 (en) 2011-11-19 2013-05-23 Valeo Schalter Und Sensoren Gmbh Communication apparatus e.g. mobile telephone, for e.g. wirelessly transmitting data for e.g. motor car, has sensor for detecting velocity of apparatus that receives and analyzes data containing information about current speed road users
CN104321618A (en) 2012-03-16 2015-01-28 观致汽车有限公司 Navigation system and method for different mobility modes
EP2669634A1 (en) 2012-05-30 2013-12-04 GN Store Nord A/S A personal navigation system with a hearing device
US9638537B2 (en) 2012-06-21 2017-05-02 Cellepathy Inc. Interface selection in navigation guidance systems
US8929573B2 (en) 2012-09-14 2015-01-06 Bose Corporation Powered headset accessory devices
SE537958C2 (en) 2012-09-24 2015-12-08 Scania Cv Ab Procedure, measuring device and control unit for adapting vehicle train control
US20140163771A1 (en) 2012-12-10 2014-06-12 Ford Global Technologies, Llc Occupant interaction with vehicle system using brought-in devices
US9391580B2 (en) 2012-12-31 2016-07-12 Cellco Paternership Ambient audio injection
US9210493B2 (en) 2013-03-14 2015-12-08 Cirrus Logic, Inc. Wireless earpiece with local audio cache
US9751534B2 (en) 2013-03-15 2017-09-05 Honda Motor Co., Ltd. System and method for responding to driver state
US9081944B2 (en) 2013-06-21 2015-07-14 General Motors Llc Access control for personalized user information maintained by a telematics unit
JP6429368B2 (en) 2013-08-02 2018-11-28 本田技研工業株式会社 Inter-vehicle communication system and method
US9279696B2 (en) 2013-10-25 2016-03-08 Qualcomm Incorporated Automatic handover of positioning parameters from a navigation device to a mobile device
EP3072317B1 (en) 2013-11-22 2018-05-16 Qualcomm Incorporated System and method for configuring an interior of a vehicle based on preferences provided with multiple mobile computing devices within the vehicle
WO2015110587A1 (en) 2014-01-24 2015-07-30 Hviid Nikolaj Multifunctional headphone system for sports activities
DE102014100824A1 (en) 2014-01-24 2015-07-30 Nikolaj Hviid Independent multifunctional headphones for sports activities
US9037125B1 (en) 2014-04-07 2015-05-19 Google Inc. Detecting driving with a wearable computing device
US20150379859A1 (en) 2014-06-30 2015-12-31 Denso International America, Inc. Method and device for locating automotive key fob, portable computing device, and vehicle
US10024667B2 (en) 2014-08-01 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
WO2016032990A1 (en) 2014-08-26 2016-03-03 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
CN204244472U (en) 2014-12-19 2015-04-01 中国长江三峡集团公司 A kind of vehicle-mounted road background sound is adopted and is broadcast safety device
US9272711B1 (en) 2014-12-31 2016-03-01 Volkswagen Ag Congestion-friendly adaptive cruise control
KR101711835B1 (en) 2015-01-30 2017-03-03 엘지전자 주식회사 Vehicle, Vehicle operating method and wearable device operating method
CN104837094A (en) 2015-04-24 2015-08-12 成都迈奥信息技术有限公司 Bluetooth earphone integrated with navigation function
US9510159B1 (en) 2015-05-15 2016-11-29 Ford Global Technologies, Llc Determining vehicle occupant location
KR20170025179A (en) 2015-08-27 2017-03-08 엘지전자 주식회사 The pedestrian crash prevention system and operation method thereof
US9905088B2 (en) 2015-08-29 2018-02-27 Bragi GmbH Responsive visual communication system and method
US10937407B2 (en) 2015-10-26 2021-03-02 Staton Techiya, Llc Biometric, physiological or environmental monitoring using a closed chamber
US20170153636A1 (en) * 2015-11-27 2017-06-01 Bragi GmbH Vehicle with wearable integration or communication
US20170153114A1 (en) 2015-11-27 2017-06-01 Bragi GmbH Vehicle with interaction between vehicle navigation system and wearable devices

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6140939A (en) * 1995-04-14 2000-10-31 Flick; Kenneth E. Biometric characteristic vehicle control system having verification and reset features
US6041410A (en) * 1997-12-22 2000-03-21 Trw Inc. Personal identification fob
US20020057810A1 (en) * 1999-05-10 2002-05-16 Boesen Peter V. Computer and voice communication unit with handsfree device
US20030002705A1 (en) * 1999-05-10 2003-01-02 Boesen Peter V. Earpiece with an inertial sensor
US20040124968A1 (en) * 2002-12-19 2004-07-01 Shinsaku Inada Boarding environment controlling system, boarding environment controlling apparatus, and boarding environment controlling method
US20100075631A1 (en) * 2006-03-20 2010-03-25 Black Gerald R Mobile communication device
US20080146890A1 (en) * 2006-12-19 2008-06-19 Valencell, Inc. Telemetric apparatus for health and environmental monitoring
US20090128286A1 (en) * 2007-11-20 2009-05-21 Vitito Christopher J System for controlling the use of electronic devices within an automobile
US20090154739A1 (en) * 2007-12-13 2009-06-18 Samuel Zellner Systems and methods employing multiple individual wireless earbuds for a common audio source
US20110215921A1 (en) * 2009-06-22 2011-09-08 Mourad Ben Ayed Systems for wireless authentication based on bluetooth proximity
US8610585B1 (en) * 2009-12-07 2013-12-17 Matthew Kielbasa Electronic alerting device and associated method
US20150028996A1 (en) * 2013-07-25 2015-01-29 Bionym Inc. Preauthorized wearable biometric device, system and method for use thereof
US10040423B2 (en) * 2015-11-27 2018-08-07 Bragi GmbH Vehicle with wearable for identifying one or more vehicle occupants
US10099636B2 (en) * 2015-11-27 2018-10-16 Bragi GmbH System and method for determining a user role and user settings associated with a vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2616446A (en) * 2022-03-09 2023-09-13 Continental Automotive Tech Gmbh Apparatus capable of being used as a primary device and an auxiliary device, and a system in association thereto

Also Published As

Publication number Publication date
EP3416554A1 (en) 2018-12-26
US20170151930A1 (en) 2017-06-01
WO2017089537A1 (en) 2017-06-01
US10040423B2 (en) 2018-08-07

Similar Documents

Publication Publication Date Title
US10040423B2 (en) Vehicle with wearable for identifying one or more vehicle occupants
US10155524B2 (en) Vehicle with wearable for identifying role of one or more users and adjustment of user settings
US10099636B2 (en) System and method for determining a user role and user settings associated with a vehicle
US20170151959A1 (en) Autonomous vehicle with interactions with wearable devices
US20170151957A1 (en) Vehicle with interactions with wearable device to provide health or physical monitoring
US20170153636A1 (en) Vehicle with wearable integration or communication
US9978278B2 (en) Vehicle to vehicle communications using ear pieces
US20170155998A1 (en) Vehicle with display system for interacting with wearable device
US20180040093A1 (en) Vehicle request using wearable earpiece
US20170156000A1 (en) Vehicle with ear piece to provide audio safety
US20180034951A1 (en) Earpiece with vehicle forced settings
US20190007767A1 (en) Vehicle with interaction between entertainment systems and wearable devices
US20170153114A1 (en) Vehicle with interaction between vehicle navigation system and wearable devices
US10220854B2 (en) System and method for identifying at least one passenger of a vehicle by a pattern of movement
US20170330044A1 (en) Thermal monitoring in autonomous-driving vehicles
US12208805B2 (en) Method provided in a car that can automatically take actions in the event of health risk
EP3380011A1 (en) Vehicle with wearable integration or communication
CN120024347A (en) Vehicle safety management method and device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: BRAGI GMBH, GERMANY

Free format text: EMPLOYMENT DOCUMENT;ASSIGNOR:BOESEN, PETER VINCENT;REEL/FRAME:049672/0188

Effective date: 20190603

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION