WO2018226220A1 - Vehicle occupant sleep state management - Google Patents
- Publication number
- WO2018226220A1 (PCT/US2017/036357)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- occupant
- user
- computer
- sleep state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/096741—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
- A61B5/0064—Body surface scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4809—Sleep detection, i.e. determining whether a subject is asleep or not
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4812—Detecting sleep stages or cycles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Definitions
- Figure 1 is a block diagram of an example vehicle occupant sleep state management system.
- Figure 2 illustrates an example vehicle occupant sleep state management process.
- a system comprises a computer programmed to identify a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and perform an action based on a location of the vehicle and the identified occupant sleep state.
- the system can further comprise a portable computing device programmed to provide occupant motion data to the computer.
- the computer can be a portable computing device.
- the portable computing device can be programmed to receive vehicle motion data from a vehicle computer.
- the computer can be further programmed to receive occupant motion data including at least one of accelerometer data and image data, and to measure the motion of the vehicle occupant according to the received occupant motion data.
- the computer can be further programmed to receive vehicle operation data and to determine the measurement of vehicle motion according to the received vehicle operation data.
- the action can be initiating a communication with a remote server based on the vehicle occupant sleep state and/or controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
- the computer can be further programmed to perform the action based in part upon a determination that the location of the vehicle is within a predetermined distance of a target location.
- the identified sleep state can be one of not asleep, asleep, and unconscious.
- a method comprises identifying a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and performing an action based on a location of the vehicle and the identified occupant sleep state.
- the method can further comprise providing, from a portable computing device, occupant motion data to a vehicle computer.
- the method can further comprise providing, from a vehicle computer, the vehicle motion data to a portable computing device.
- the method can further comprise receiving occupant motion data including at least one of accelerometer data and image data, and measuring the motion of the vehicle occupant according to the received occupant motion data.
- the method can further comprise receiving vehicle operation data and determining the measurement of vehicle motion according to the received vehicle operation data.
- the method can further comprise that the action is initiating a communication with a remote server based on the vehicle occupant sleep state.
- the method can further comprise that the action is controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
- the method can further comprise performing the action based in part upon a determination that the location of the vehicle is within a predetermined distance of a target location.
- the method can further comprise that the identified sleep state is one of not asleep, asleep, and unconscious.
- the method can be executed according to program instructions stored on a computer. Further disclosed is a computer programmed to execute the method, including any of the foregoing.
- FIG. 1 is a block diagram illustrating an example system 100 for determining and acting on a vehicle 101 occupant sleep state.
- a vehicle 101 computing device (or computer) 105 can be programmed to receive data concerning a vehicle 101 occupant or user 145.
- the vehicle 101 includes sensors 110 to provide collected data 115 to the computing device 105, including data relating to vehicle operations (e.g., speed, location, route traveled, etc.) and/or data 115, e.g., images or the like, relating to the user 145.
- the user 145 typically carries a user device 150, which includes sensors 155, e.g., a camera, an accelerometer, a location sensor, etc., to provide collected data 160.
- the computer 105 can be programmed to analyze the collected data 115, 160 to determine a user 145 sleep state, as well as one or more instructions to actuate one or more vehicle 101 subsystems 120.
- the user device 150 can be programmed to analyze the collected data 115, 160 to determine a user 145 sleep state, and to specify the sleep state to the computing device 105.
- a vehicle 101 computing device 105 includes a processor and a memory that includes volatile and non-volatile memory.
- the computing device 105 is generally programmed, i.e., the memory stores instructions executable by the processor, for various tasks as disclosed herein.
- the computing device 105 is typically programmed for communications on a vehicle 101 network or communications bus, as is known. Via the network, bus, and/or other wired or wireless mechanisms (e.g., a wired or wireless local area network in the vehicle 101), the computing device 105 may transmit messages to various devices in a vehicle 101 and/or receive messages from the various devices, e.g., controllers, actuators, sensors, etc., including sensors 110.
- the vehicle network or bus may be used for communications between devices represented as the computing device 105 in this disclosure. In addition, the computing device 105 may be programmed for communicating with the user device 150, e.g., via a close-range wireless protocol such as NFC (Near Field Communication), BLUETOOTH®, or the like.
- the computing device 105 may further communicate with other devices via the network 125 which as discussed below can include various wired and/or wireless networking technologies, e.g., cellular, Ethernet, wired and/or wireless packet networks, etc.
- the computer 105 typically includes data storage that may be of any known type, e.g., hard disk drives, solid state drives, servers, or any volatile or non-volatile media.
- the computer 105 may also store collected data off-vehicle in a remote server 130.
- the computer 105 may thereby store collected data 115 sent from sensors 110.
- Sensors 110 may include a variety of devices.
- various controllers in a vehicle 101 may operate as sensors 110 to provide data 115 via the vehicle 101 network or bus, e.g., data 115 relating to vehicle speed, acceleration, position, system and/or component status, directional heading, etc.
- the sensors 110 could include short range radar, long range radar, LIDAR (Light Detection and Ranging) sensor, capacitive sensors, and/or ultrasonic transducers.
- other sensors 110 could include cameras, microphones, or the like, e.g., disposed in a vehicle 101 cabin, to provide images or audio of a user 145 occupying the vehicle 101 cabin.
- Collected data 115 may include a variety of data collected in a vehicle 101. Examples of collected data 115 are provided above, and moreover, data 115 are generally collected using one or more sensors 110, and may additionally include data calculated therefrom in the computing device 105, and/or at the server 130. In general, collected data 115 may include any data that may be gathered by the sensors 110 and/or computed from such data.
- the vehicle 101 includes a plurality of subsystems 120.
- the subsystems 120 control vehicle 101 components; the subsystems 120 can include, e.g., a steering subsystem, a propulsion subsystem (including, e.g., an internal combustion engine and/or electric motor), a brake subsystem, a park assist subsystem, an adaptive cruise control subsystem, etc.
- the computing device 105 may actuate the subsystems 120 to control the vehicle 101 components, e.g., to stop the vehicle 101, steer, etc.
- the computing device 105 may be programmed to operate some or all of the subsystems 120 with limited or no input from a human operator, i.e., the computing device 105 may be programmed to operate the subsystems 120 as a virtual operator.
- the computing device 105 can ignore input from the human operator with respect to subsystems 120 selected for control by the virtual operator, which provides instructions, e.g., via a vehicle 101 communications bus and/or to electronic control units (ECUs) as are known, to actuate vehicle 101 components, e.g., to apply brakes, change a steering wheel angle, etc.
- the computing device 105 may ignore the movement of the steering wheel and steer the vehicle 101 according to its programming.
- an autonomous mode is defined as one in which each of vehicle 101 propulsion (e.g., via a powertrain including an electric motor and/or an internal combustion engine), braking, and steering are controlled by the computing device 105; in a semi-autonomous mode the computing device 105 controls one or two of vehicle 101 propulsion, braking, and steering.
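The mode definition above can be sketched as a simple classification over which of propulsion, braking, and steering the computing device 105 controls; the function name and the returned labels are illustrative, not from the source.

```python
def operation_mode(controls_propulsion, controls_braking, controls_steering):
    """Classify the operating mode per the definition above: autonomous when
    the computing device controls all three of propulsion, braking, and
    steering; semi-autonomous when it controls one or two; manual otherwise."""
    n = sum([controls_propulsion, controls_braking, controls_steering])
    if n == 3:
        return "autonomous"
    if n >= 1:
        return "semi-autonomous"
    return "manual"
```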
- the system 100 may further include a network 125 connected to a server 130 and a data store 135.
- the server 130 is geographically remote from the vehicle 101.
- geographically remote typically means a distance of a mile and usually much more, but at a minimum means that the server 130 is not in a same building or vehicle with the vehicle computer 105.
- geographically remote means that communications with the server 130 can happen only via a wide area network, such as the network 125, whether wired or wireless.
- the computer 105 may further be programmed to communicate with one or more remote sites such as the server 130, via the network 125, such remote site possibly including a data store 135.
- the network 125 represents one or more mechanisms by which a vehicle computer 105 may communicate with a remote server 130.
- the network 125 may be one or more of various wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms and any desired network topology (or topologies when multiple communication mechanisms are utilized).
- Exemplary communication networks include wireless communication networks (e.g., using BLUETOOTH, IEEE 802.11, etc.), Dedicated Short Range Communications (DSRC), local area networks (LAN), and/or wide area networks (WAN), including the Internet, providing data communication services.
- the user 145 typically carries a user device 150 that communicates with the computer 105.
- the user device 150 is a computing device that includes a memory, a processor, a display, and one or more input mechanisms, such as a touchscreen, buttons, etc., as well as hardware and software for wireless communications such as described herein.
- the user device 150 is often a wearable (i.e., fitted to be worn on a part of a human body such as a wrist) computing device in the form of a bracelet, pendant, glasses, etc., but may be any suitable portable computing device of a size and weight so as to be carried by a user.
- the device 150 may be a portable computer, tablet computer, mobile phone, e.g., a smart phone, etc.
- the user device 150 includes capabilities for wireless communications, e.g., using IEEE 802.11, BLUETOOTH, and/or other wireless protocols such as described herein.
- the user device 150 includes sensors 155 that can collect data 160 about the user 145.
- the device 150 sensors 155 can be a variety of devices, e.g., one or more of a heart rate sensor, a galvanic skin response sensor, a camera, a microphone, an accelerometer, a gyroscope, a location (e.g., global positioning system) sensor, etc.
- Figure 2 illustrates an example process 200 for determining and acting on a user 145 in a sleep state in a vehicle 101. Except for the blocks 225 and 230, which typically are carried out by a vehicle 101 computing device 105, steps of the process 200 can variously be carried out by the vehicle 101 computing device 105 and/or a user device 150.
- the process 200 begins in a block 205, in which vehicle operation data is received.
- Vehicle operation data is defined as data 115 received from the vehicle 101 sensors 110 and/or subsystems 120 describing a physical state or condition of the vehicle 101 or its surrounding environment.
- sensors 110 and/or electronic control units (ECUs) included in subsystems 120 such as braking, steering, powertrain, etc., can provide data relating to vehicle 101 wheel speed, engine speed, steering angle, cabin inside temperature, vehicle outside temperature, precipitation, acceleration, and many other values.
- one or more measures of vehicle 101 motion are detected from the vehicle operation data 115.
- the computer 105 may determine, from vehicle operation data including vehicle motion data 115, vehicle 101 acceleration and/or jerk, acceleration being a first derivative of vehicle 101 speed, and jerk being a second derivative of vehicle 101 speed.
- a measure of vehicle 101 motion e.g., acceleration or jerk, is typically determined for a specific moment in time.
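The acceleration and jerk measures described above can be sketched with simple finite differences over uniformly sampled speed data; the helper name `motion_measures`, the sampling interval, and the sample values are illustrative assumptions, since the source does not give formulas.

```python
def motion_measures(speeds, dt):
    """Return (acceleration, jerk) series from uniformly sampled speeds.

    Acceleration is the first derivative of speed and jerk is the second
    derivative of speed, matching the definitions in the description.
    """
    accel = [(speeds[i + 1] - speeds[i]) / dt for i in range(len(speeds) - 1)]
    jerk = [(accel[i + 1] - accel[i]) / dt for i in range(len(accel) - 1)]
    return accel, jerk

# Speed rising 1 m/s every 0.1 s: constant acceleration, zero jerk.
accel, jerk = motion_measures([0.0, 1.0, 2.0, 3.0], dt=0.1)
```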
- one or more measures of user 145 motion are determined, e.g., acceleration and/or jerk as described above.
- the user 145 motion measures can be determined according to user device 150 sensor 155 data 160.
- the user device 150 can include an accelerometer sensor 155 and/or a GPS sensor 155 that can be used to determine user 145 motion over time when the user 145 is wearing or carrying the user device 150, and which can provide occupant motion data 160 from which motion measurements, e.g., acceleration and/or jerk, can be determined for one or more moments in time.
- a user 145 sleep state is estimated.
- a first estimate may be to use a two-way audio system, consisting of speakers, a microphone, and a speech processing unit, to query the user 145 to respond to or acknowledge the query. If the user does not respond, then the user 145 sleep state may be estimated in a variety of ways.
- the user could be determined to be asleep by a comparison of a measurement or measurements of user 145 motion to a measurement or measurements of vehicle 101 motion, e.g., when a measurement of user 145 motion differs from a measurement of vehicle 101 motion by more than a predetermined threshold, e.g., when user 145 acceleration is less than vehicle 101 acceleration by more than a predetermined threshold and/or when user 145 jerk is greater than vehicle 101 jerk by more than a predetermined threshold.
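The motion comparison described above can be sketched as follows; the threshold values and the function name `estimate_asleep` are assumptions, since the source states only that a difference exceeding a predetermined threshold indicates sleep.

```python
ACCEL_THRESHOLD = 0.5  # m/s^2, illustrative
JERK_THRESHOLD = 1.0   # m/s^3, illustrative

def estimate_asleep(user_accel, vehicle_accel, user_jerk, vehicle_jerk):
    """True when occupant motion diverges from vehicle motion enough to
    suggest the occupant is not bracing against the vehicle's movement:
    user acceleration below vehicle acceleration by more than a threshold,
    and/or user jerk above vehicle jerk by more than a threshold."""
    accel_divergence = (vehicle_accel - user_accel) > ACCEL_THRESHOLD
    jerk_divergence = (user_jerk - vehicle_jerk) > JERK_THRESHOLD
    return accel_divergence or jerk_divergence
```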
- the camera sensor 155 of the user device 150 and/or a vehicle camera sensor 110 can provide image data 115, 160 of a user 145.
- the computer 105 and/or user device 150 can be programmed to analyze the image data 115, 160 to determine user 145 motion (or lack thereof) in the vehicle 101. For example, successive images of a user 145 can show whether a user is moving or still within a specified period of time; if the images show that the user has not moved more than a predetermined threshold from a position at a first time in a time window, then it may be determined that the user 145 is asleep. Further for example, image data can be analyzed to determine user 145 motion relative to vehicle 101 motion.
- images can show motion of the user 145 relative to a vehicle 101 seat, e.g., whether the user 145 torso and head do or do not move longitudinally in the vehicle (typically within predefined parameters to account for normal waking movement). Identifying that the user 145 head and torso do move longitudinally in the vehicle may indicate that the user 145 is in a sleep state, because a sleeping user 145 may not exercise muscular control to remain in a same longitudinal position when the vehicle 101 decelerates (negative acceleration, e.g., brakes to a stop at an intersection).
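One plausible way to implement the image-based stillness check above is frame differencing over a window of successive cabin images. This sketch models frames as 2-D lists of grayscale values rather than real camera output, and the stillness threshold is an assumption.

```python
def mean_abs_diff(frame_a, frame_b):
    """Mean absolute pixel difference between two equal-sized frames."""
    total, count = 0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

def user_still(frames, threshold=2.0):
    """True if no successive frame pair shows motion above the threshold,
    suggesting the occupant has not moved within the time window."""
    return all(
        mean_abs_diff(frames[i], frames[i + 1]) <= threshold
        for i in range(len(frames) - 1)
    )
```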
- collected data 160 from a user device 150 could be used to determine a user sleep state. For example, if a time between one or more of a last time when user input was received, a time when a screensaver or the like was activated, a time when the user device 150 transitioned to a power-saving mode, etc., and a current time exceeds a predetermined threshold, a determination could be made that the user 145 is asleep.
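The device-inactivity heuristic above can be sketched as a simple elapsed-time check; the 10-minute threshold and the function name are illustrative assumptions.

```python
INACTIVITY_THRESHOLD_S = 600  # 10 minutes, illustrative

def inactive_suggests_sleep(last_activity_time_s, current_time_s,
                            threshold_s=INACTIVITY_THRESHOLD_S):
    """True when the time since the last device event (user input,
    screensaver activation, power-saving transition) exceeds the
    predetermined threshold."""
    return (current_time_s - last_activity_time_s) > threshold_s
```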
- vehicle 101 collected data 115 and/or user 145 device 150 collected data 160 could be used to determine whether a user 145 had spoken within a predetermined amount of time.
- speech recognition techniques as are known could be used to detect sounds in a vehicle 101 cabin and determine whether the sounds were speech from a user 145. In order to discern that the detected speech was from the user 145, and not sound from outside the vehicle (e.g., through an open window) or sound emanating from a user device 150 in possession of the user 145, the cabin cameras could be used to confirm that facial/lip movement correlates with detected speech. In most cases, barring sleep disorders that result in persons speaking while asleep, detected speech from the user 145 would indicate that the user 145 is awake.
- the computer 105 could employ some form of question and answer interaction, e.g., queries to a user seeking information about the user's day and/or surroundings in order to detect whether they are alert and aware. For example, based on a calendar entry, the computer 105 could ask "Who did you meet with today?" or, based on weather data, could ask "Is it raining?" Further, the computer 105 could communicate with the device 150 to send audible alerts to a user's device 150, which would be particularly effective if the user has the device 150 in media mode and is listening to media such as music on headphones. The device 150 media could be paused before sending the alert.
- biometric data such as respiration or heart rate data
- a rate of respiration and/or a heart rate can be indicative that a person is asleep, as is known, e.g., refer to micro-motion radar used in baby monitors or use of pulse detection software for post processing of facial camera images.
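As one hedged sketch of the biometric indicator above: resting respiration and heart rates typically drop during sleep, so readings below waking baselines can serve as one signal among the others described. The cutoff values and function name here are illustrative assumptions, not clinical values or values from the source.

```python
def biometrics_suggest_sleep(respiration_rate_bpm, heart_rate_bpm,
                             resp_cutoff=12.0, hr_cutoff=55.0):
    """True when both respiration rate and heart rate fall below the
    assumed waking cutoffs, one possible indicator of sleep."""
    return respiration_rate_bpm < resp_cutoff and heart_rate_bpm < hr_cutoff
```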
- user 145 sleep state estimation could be binary or the computer 105 could be programmed to determine that the user 145 is in one of multiple possible sleep states.
- the user 145 sleep state estimation may be binary, i.e., a user 145 may be estimated to be one of "asleep" or "not asleep.”
- the user 145 sleep estimation may be selected as one of a plurality of possible sleep states, e.g., as illustrated in the table below.
- the computer 105 determines, based on the estimated user sleep state determined in the block 220, whether an action is needed, i.e., whether to actuate one or more components or subsystems in the vehicle 101. This determination is typically made in conjunction with a location of the vehicle 101, e.g., determined according to the operation data received in the block 205. However, some estimations of a user 145 sleep state, e.g., that the user 145 is comatose, may trigger an action regardless of and/or without determining a vehicle 101 location.
- the determination of whether an action is needed may be based on determining that the user 145 is asleep and the vehicle 101 is at a location requiring the user 145 to be awake. For example, the vehicle 101 may be within a predetermined distance or time of arrival (e.g., five minutes, 10 minutes, etc.) of a target location, e.g., a user 145 destination. If the user 145 is asleep at such location, the computer 105 may be programmed to take an action to awaken the user 145, e.g., so that the user 145 may exit the vehicle 101 at the target location.
- the vehicle 101 may be at an accident location, an emergency stop location, etc., whereupon the computer 105 may be programmed to take an action to awaken a sleeping user 145 so that the user 145 can attend to a possible emergency situation.
- the computer 105 may determine that an action such as moving the vehicle 101 to a safe location, sending a request for emergency assistance, etc. should be taken.
- the computer 105 may be programmed to determine whether an action is to be taken based on the vehicle 101 location in combination with a specific estimated sleep state. For example, the computer 105 could be programmed to determine to awaken a user when a vehicle 101 is within a predetermined distance or time of arrival of a user 145 destination, but the predetermined distance or time threshold could vary according to the user sleep state. For example, a drowsy state, a light sleep state, an REM sleep state, etc. could have increasing distance and/or time thresholds with respect to a target location such as a user 145 destination to trigger an action based on the user 145 sleep state.
- a block 230 is executed next. Otherwise, the process 200 returns to the block 205.
- the process 200 could end after the block 225, e.g., if it is determined not to take an action and the vehicle 101 has arrived at a target location such as a user 145 destination.
- the computer 105 instructs a vehicle 101 component or subsystem, and/or sends an instruction to the user device 150, to take an action based on the estimated user 145 sleep stale.
- the action may be determined according to the estimated sleep state. For example, if the user 145 is drowsy, the action may be providing a visual alert via a vehicle 101 human machine interface (HMI) and/or a display of the user device 150. if a user 145 is lightly asleep, then an audio alert or alarm could be initiated via the vehicle 101 HMI and/or the user device 150. Such audio alert or alarm, could be provided more loudly or aggressively if the user 145 is in an REM sleep state.
- HMI human machine interface
- the controller 105 may first request help of other vehicle 101 occupants to wake user 145. If none accepts, the controller 105 may advise the vehicle will sound an alarm one to attempt to wake the user 145. Other passengers may then request a PAUSE to any subsequent alarms until they have exited the vehicle. Once other occupants have exited the vehicle and/or when user 145 is in higher levels of sleep states, i.e., a sleep state is estimated to be a deeper sleep, more aggressive action may be taken.
- the computer 105 could actuate haptic output, e.g., vibration of a vehicle 101 seat, could actuate vehicle 101 brakes to provide brake pulses to awaken the user 145, activating vehicle steering to arouse the user, controlling the vehicle propulsion, e.g. powertrain, e.g., to slow the vehicle, to arouse the user, etc.
- haptic output e.g., vibration of a vehicle 101 seat
- vehicle 101 brakes to provide brake pulses to awaken the user 145
- the vehicle propulsion e.g. powertrain, e.g., to slow the vehicle, to arouse the user, etc.
- the computer 105 in the block 230 could actuate, possibly after providing a visual and/or audible alarm, the vehicle 101 HMI and/or user device 152 request user 145 input to confirm that the user 145 is awake. Upon receiving such input, the process 200 could then exit the block 230. However, if such input was not received within a predetermined amount of time, e.g., five seconds, 10 seconds, etc., the computer 105 could actuate more aggressive visual and/or audible alarms, and could again request user 145 input, to confirm that the user 145 is awake. As a means of preventing other passengers from respond to the confirmation request so as to stop the alarm, the controller 105 may ask for specific information that only user 145 would know such as final destination, reservation order number, etc.
- the computer 105 could be programmed to determine that the user 145 is un-arousable, e.g., comatose, sleep state, and to take action accordingly, e.g., sending a request for emergency assistance via the server 130.
- the action could be determined according to a vehicle 101 location in combination with the user 145 sleep stale. For example, if the user 145 is asleep and the vehicle 101 is at an accident scene, the computer 105 could send a message to the server 130 requesting emergency assistance.
- location it is to be understood that the location could be determined in a known manner, e.g., according to geo-coordinates such as are known.
- GPS global positioning system
- the location could be determined in a known manner, e.g., according to geo-coordinates such as are known.
- GPS global positioning system
- the adverb "substantially" modifying an adjective means that a shape, structure, measurement, value, calculation, etc. may deviate from an exact described geometry, distance, measurement, value, calculation, etc., because of imperfections in materials, machining, manufacturing, sensor measurements, computations, processing time, communications time, etc.
- Computing devices 105, 150, etc. generally each include instructions executable by one or more computing devices such as those identified above, and for carrying out blocks or steps of processes described above.
- the terms "computing device” and “computer” may be used interchangeably in this disclosure.
- Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Perl, HTML, etc.
- a processor receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein.
- Such instructions and other data may be stored and transmitted using a variety of computer-readable media.
- a file in the computing device 105 is generally a collection of data stored on a computer readable medium, such as a storage medium, a random access memory, etc.
- a computer-readable medium includes any medium that participates in providing data (e.g., instructions), which may be read by a computer. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, etc.
- Non-volatile media include, for example, optical or magnetic disks and other persistent memory.
- Volatile media include dynamic random access memory (DRAM), which typically constitutes a main memory.
- Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
Landscapes
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Atmospheric Sciences (AREA)
- Medical Informatics (AREA)
- Transportation (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Emergency Management (AREA)
- Business, Economics & Management (AREA)
- Heart & Thoracic Surgery (AREA)
- Mechanical Engineering (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Educational Technology (AREA)
- Hospice & Palliative Care (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
A computer is programmed to identify a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion, and to perform an action based on a location of the vehicle and the identified occupant sleep state.
Description
VEHICLE OCCUPANT SLEEP STATE MANAGEMENT
BACKGROUND
[0001] Users riding in vehicles often fall asleep, particularly at night or during long trips. Problems arise when users are not awake at certain locations, such as a location of a user's destination, or a location of an emergency shutdown, accident, or the like. A problem exists in that some vehicles, e.g., autonomous rideshare vehicles, cannot rely on a user awakening at a location such as a destination, end of the service day storage of the vehicle, etc., but lack means to ensure that the user is awake at the location.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figure 1 is a block diagram of an example vehicle occupant sleep state management system.
[0003] Figure 2 illustrates an example vehicle occupant sleep state management process.
DETAILED DESCRIPTION
[0004] A system comprises a computer programmed to identify a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and perform an action based on a location of the vehicle and the identified occupant sleep state.
[0005] The system can further comprise a portable computing device programmed to provide occupant motion data to the computer.
[0006] The computer can be a portable computing device. The portable computing device can be programmed to receive vehicle motion data from a vehicle computer.
[0007] The computer can be further programmed to receive occupant motion data including at least one of accelerometer data and image data, and to measure the motion of the vehicle occupant according to the received occupant motion data.
[0008] The computer can be further programmed to receive vehicle operation data and to determine the measurement of vehicle motion according to the received vehicle operation data.
[0009] The action can be initiating a communication with a remote server based on the vehicle occupant sleep state and/or controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
[0010] The computer can be further programmed to perform the action based in part upon a determination that the location of the vehicle is within a predetermined distance of a target location.
[0011] The identified sleep state can be one of not asleep, asleep, and unconscious.
[0012] A method comprises identifying a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and performing an action based on a location of the vehicle and the identified occupant sleep state.
[0013] The method can further comprise providing, from a portable computing device, occupant motion data to a vehicle computer.
[0014] The method can further comprise providing, from a vehicle computer, the vehicle motion data to a portable computing device.
[0015] The method can further comprise receiving occupant motion data including at least one of accelerometer data and image data, and measuring the motion of the vehicle occupant according to the received occupant motion data.
[0016] The method can further comprise receiving vehicle operation data and determining the measurement of vehicle motion according to the received vehicle operation data.
[0017] The method can further comprise that the action is initiating a communication with a remote server based on the vehicle occupant sleep state.
[0018] The method can further comprise that the action is controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
[0019] The method can further comprise performing the action based in part upon a
determination that the location of the vehicle is within a predetermined distance of a target location.
[0020] The method can further comprise that the identified sleep state is one of not asleep, asleep, and unconscious.
[0021] The method, including any of the foregoing, can be executed according to program instructions stored on a computer. Further disclosed is a computer programmed to execute the method, including any of the foregoing.
[0022] Figure 1 is a block diagram illustrating an example system 100 for determining and acting on a vehicle 101 occupant sleep state. A vehicle 101 computing device (or computer) 105 can be programmed to receive data concerning a vehicle 101 occupant or user 145. The vehicle 101 includes sensors 110 to provide collected data 115 to the computing device 105, including
data relating to vehicle operations (e.g., speed, location, route traveled, etc.) and/or data 115, e.g., images or the like, relating to the user 145. Further, the user 145 typically carries a user device 150, which includes sensors 155, e.g., a camera, an accelerometer, a location sensor, etc., to provide collected data 160. The computer 105 can be programmed to analyze the collected data 115, 160 to determine a user 145 sleep state, as well as one or more instructions to actuate one or more vehicle 101 subsystems 120. Alternatively or additionally, the user device 150 can be programmed to analyze the collected data 115, 160 to determine a user 145 sleep state, and to specify the sleep state to the computing device 105.
[0023] A vehicle 101 computing device 105 includes a processor and a memory that includes volatile and non-volatile memory. The computing device 105 is generally programmed, i.e., the memory stores instructions executable by the processor, for various tasks as disclosed herein. For example, the computing device 105 is typically programmed for communications on a vehicle 101 network or communications bus, as is known. Via the network, bus, and/or other wired or wireless mechanisms (e.g., a wired or wireless local area network in the vehicle 101), the computing device 105 may transmit messages to various devices in a vehicle 101 and/or receive messages from the various devices, e.g., controllers, actuators, sensors, etc., including sensors 110. Alternatively or additionally, in cases where the computing device 105 actually comprises multiple devices, the vehicle network or bus may be used for communications between devices represented as the computing device 105 in this disclosure. In addition, the computing device 105 may be programmed for communicating with the user device 150, e.g., via a close-range wireless protocol such as NFC (Near Field Communication), BLUETOOTH®, or the like. The computing device 105 may further communicate with other devices via the network 125, which, as discussed below, can include various wired and/or wireless networking technologies, e.g., cellular, Ethernet, wired and/or wireless packet networks, etc.
[0024] The computer 105 typically includes data storage that may be of any known type, e.g., hard disk drives, solid state drives, servers, or any volatile or non-volatile media. The computer 105 may also store collected data off-vehicle in a remote server 130. The computer 105 may thereby store collected data 115 sent from sensors 110.
[0025] Sensors 110 may include a variety of devices. For example, as is known, various controllers in a vehicle 101 may operate as sensors 110 to provide data 115 via the vehicle 101 network or bus, e.g., data 115 relating to vehicle speed, acceleration, position, system and/or component status, directional heading, etc. The sensors 110 could include short range radar, long
range radar, LIDAR (Light Detection and Ranging) sensors, capacitive sensors, and/or ultrasonic transducers. Further, other sensors 110 could include cameras, microphones, or the like, e.g., disposed in a vehicle 101 cabin, to provide images or audio of a user 145 occupying the vehicle 101 cabin.
[0026] Collected data 115 may include a variety of data collected in a vehicle 101. Examples of collected data 115 are provided above; moreover, data 115 are generally collected using one or more sensors 110, and may additionally include data calculated therefrom in the computing device 105 and/or at the server 130. In general, collected data 115 may include any data that may be gathered by the sensors 110 and/or computed from such data.
[0027] The vehicle 101 includes a plurality of subsystems 120. The subsystems 120 control vehicle 101 components; the subsystems 120 can include, e.g., a steering subsystem, a propulsion subsystem (including, e.g., an internal combustion engine and/or electric motor), a brake subsystem, a park assist subsystem, an adaptive cruise control subsystem, etc. The computing device 105 may actuate the subsystems 120 to control the vehicle 101 components, e.g., to stop the vehicle 101, steer, etc.
[0028] The computing device 105 may be programmed to operate some or all of the subsystems 120 with limited or no input from a human operator, i.e., the computing device 105 may be programmed to operate the subsystems 120 as a virtual operator. When the computing device 105 operates the subsystems 120 as a virtual operator, the computing device 105 can ignore input from the human operator with respect to subsystems 120 selected for control by the virtual operator, which provides instructions, e.g., via a vehicle 101 communications bus and/or to electronic control units (ECUs) as are known, to actuate vehicle 101 components, e.g., to apply brakes, change a steering wheel angle, etc. For example, if the human operator attempts to turn a steering wheel during virtual operator steering operation, the computing device 105 may ignore the movement of the steering wheel and steer the vehicle 101 according to its programming.
[0029] For purposes of this disclosure, an autonomous mode is defined as one in which each of vehicle 101 propulsion (e.g., via a powertrain including an electric motor and/or an internal combustion engine), braking, and steering are controlled by the computing device 105; in a semi-autonomous mode the computing device 105 controls one or two of vehicle 101 propulsion, braking, and steering.
[0030] The system 100 may further include a network 125 connected to a server 130 and a data store 135. The server 130 is geographically remote from the vehicle 101. For purposes of this
disclosure, "geographically remote" typically means a distance of at least a mile, and usually much more, but at a minimum means that the server 130 is not in a same building or vehicle with the vehicle computer 105. Further, "geographically remote" means that communications with the server 130 can happen only via a wide area network such as the network 125; wired or wireless
communications with intermediate routers, gateways, etc., are not possible under this scenario.
[0031] Accordingly, the computer 105 may further be programmed to communicate with one or more remote sites such as the server 130, via the network 125, such remote site possibly including a data store 135. The network 125 represents one or more mechanisms by which a vehicle computer 105 may communicate with a remote server 130. Accordingly, the network 125 may be one or more of various wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms and any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communication networks include wireless communication networks (e.g., using BLUETOOTH, IEEE 802.11, etc.), Dedicated Short Range Communication (DSRC), local area networks (LAN) and/or wide area networks (WAN), including the Internet, providing data communication services.
[0032] The user 145 typically carries a user device 150 that communicates with the computer 105. The user device 150 is a computing device that includes a memory, a processor, a display, and one or more input mechanisms, such as a touchscreen, buttons, etc., as well as hardware and software for wireless communications such as described herein. The user device 150 is often a wearable (i.e., fitted to be worn on a part of a human body such as a wrist) computing device in the form of a bracelet, pendant, glasses, etc., but may be any suitable portable computing device of a size and weight so as to be carried by a user. For example, the device 150 may be a portable computer, tablet computer, mobile phone, e.g., a smart phone, etc. As mentioned above, the user device 150 includes capabilities for wireless communications using IEEE 802.11,
BLUETOOTH, DSRC, and/or cellular communications protocols, etc.
[0033] The user device 150 includes sensors 155 that can collect data 160 about the user 145. The device 150 sensors 155 can be a variety of devices, e.g., one or more of a heart rate sensor, a galvanic skin response sensor, a camera, a microphone, an accelerometer, a gyroscope, a location (e.g., global positioning system) sensor, etc.
[0034] Figure 2 illustrates an example process 200 for determining and acting on a user 145 in a sleep state in a vehicle 101. Except for the blocks 225 and 230, which typically are carried out by a vehicle 101 computing device 105, steps of the process 200 can variously be carried out by the vehicle 101 computing device 105 and/or a user device 150.
[0035] The process 200 begins in a block 205, in which vehicle operation data is received.
Vehicle operation data is defined as data 115 received from the vehicle 101 sensors 110 and/or subsystems 120 describing a physical state or condition of the vehicle 101 or its surrounding environment. For example, sensors 110 and/or electronic control units (ECUs) included in subsystems 120 such as braking, steering, powertrain, etc., can provide data relating to vehicle 101 wheel speed, engine speed, steering angle, cabin inside temperature, vehicle outside temperature, precipitation, acceleration, and many other values.
[0036] Next, in a block 210, one or more measures of vehicle 101 motion are detected from the vehicle operation data 115. In particular, the computer 105 may determine, from vehicle operation data including vehicle motion data 115, vehicle 101 acceleration and/or jerk, acceleration being a first derivative of vehicle 101 speed, and jerk being a second derivative of vehicle 101 speed. A measure of vehicle 101 motion, e.g., acceleration or jerk, is typically determined for a specific moment in time.
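The finite-difference computation of acceleration (first derivative of speed) and jerk (second derivative of speed) described in the block 210 can be sketched as follows. This is a minimal illustration only, assuming uniformly sampled speed data; the function and parameter names are hypothetical and not part of the disclosure:

```python
from typing import List, Tuple

def motion_measures(speeds: List[float], dt: float) -> Tuple[List[float], List[float]]:
    """Estimate acceleration (first derivative of speed) and jerk (second
    derivative of speed) from speed samples taken every dt seconds."""
    accel = [(speeds[i + 1] - speeds[i]) / dt for i in range(len(speeds) - 1)]
    jerk = [(accel[i + 1] - accel[i]) / dt for i in range(len(accel) - 1)]
    return accel, jerk
```

In practice the computer 105 could apply the same computation to vehicle 101 speed data and to occupant motion data, yielding comparable motion measures for a given moment in time.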
[0037] Next, in a block 215, one or more measures of user 145 motion are determined, e.g., acceleration and/or jerk as described above. The user 145 motion measures can be determined according to user device 150 sensor 155 data 160. For example, the user device 150 can include an accelerometer sensor 155 and/or a GPS sensor 155 that can be used to determine user 145 motion over time when the user 145 is wearing or carrying the user device 150, and which can provide occupant motion data 160 from which motion measurements, e.g., acceleration and/or jerk, can be determined for one or more moments in time.
[0038] Next, in a block 220, a user 145 sleep state is estimated. A first estimate may be to use a 2-way audio system, consisting of speakers, a microphone, and a speech processing unit, to query the user 145 to respond to or acknowledge the query. If the user does not respond, then the user 145 sleep state may be estimated in a variety of ways. For example, the user could be determined to be asleep by a comparison of a measurement or measurements of user 145 motion to a measurement or measurements of vehicle 101 motion, e.g., when a measurement of user 145 motion differs from a measurement of vehicle 101 motion by more than a predetermined threshold, e.g., when user 145 acceleration is less than vehicle 101 acceleration by more than a
predetermined threshold and/or when user 145 jerk is greater than vehicle 101 jerk by more than a predetermined threshold.
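The threshold comparison described in paragraph [0038] might be sketched as follows; the disclosure does not specify threshold values, so the defaults below are arbitrary placeholders, and the function name is hypothetical:

```python
def estimate_sleep_binary(user_accel: float, vehicle_accel: float,
                          user_jerk: float, vehicle_jerk: float,
                          accel_threshold: float = 0.5,
                          jerk_threshold: float = 0.5) -> str:
    """Binary estimate: "asleep" when occupant motion diverges from vehicle
    motion by more than a predetermined threshold, else "not asleep"."""
    accel_lags = (vehicle_accel - user_accel) > accel_threshold   # occupant accelerates less than the vehicle
    jerk_exceeds = (user_jerk - vehicle_jerk) > jerk_threshold    # occupant jerk exceeds the vehicle's
    return "asleep" if (accel_lags or jerk_exceeds) else "not asleep"
```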
[0039] Alternatively or additionally, the camera sensor 155 of the user device 150 and/or a vehicle camera sensor 110 can provide image data 115, 160 of a user 145. The computer 105 and/or user device 150 can be programmed to analyze the image data 115, 160 to determine user 145 motion (or lack thereof) in the vehicle 101. For example, successive images of a user 145 can show whether a user is moving or still within a specified period of time; if the images show that the user has not moved more than a predetermined threshold from a position at a first time in a time window, then it may be determined that the user 145 is asleep. Further for example, image data can be analyzed to determine user 145 motion relative to vehicle 101 motion. For example, as a vehicle 101 experiences negative acceleration (e.g., brakes to a stop at an intersection), images can show motion of the user 145 relative to a vehicle 101 seat, e.g., whether the user 145 torso and head do or do not move longitudinally in the vehicle (typically within predefined parameters to account for normal waking movement). Identifying that the user 145 head and torso do move longitudinally in the vehicle may indicate that the user 145 is in a sleep state, because a sleeping user 145 may not exercise muscular control to remain in a same longitudinal position when the vehicle 101 decelerates.
[0040] Yet further alternatively or additionally, collected data 160 from a user device 150 could be used to determine a user sleep state. For example, if a time between one or more of a last time when user input was received, a time when a screensaver or the like was activated, a time when the user device 150 transitioned to a power-saving mode, etc., and a current time exceeds a predetermined threshold, a determination could be made that the user 145 is asleep.
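The device-idle heuristic of paragraph [0040] reduces to an elapsed-time check; the 15-minute default below is an arbitrary placeholder, since the disclosure leaves the threshold unspecified:

```python
import time
from typing import Optional

def device_idle_suggests_sleep(last_activity_ts: float,
                               idle_threshold_s: float = 900.0,
                               now: Optional[float] = None) -> bool:
    """True when the time since the last user-device event (input received,
    screensaver activated, power-saving transition) exceeds the threshold,
    suggesting the user may be asleep."""
    now = time.time() if now is None else now
    return (now - last_activity_ts) > idle_threshold_s
```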
[0041] Yet further alternatively or additionally, vehicle 101 collected data 115 and/or user 145 device 150 collected data 160 could be used to determine whether a user 145 had spoken within a predetermined amount of time. For example, speech recognition techniques as are known could be used to detect sounds in a vehicle 101 cabin and determine whether the sounds were speech from a user 145. In order to discern that the detected speech was from the user 145 and not sound from outside the vehicle (e.g., through an open window) or sound emanating from a user device 150 in possession of the user 145, the cabin cameras could be used to confirm that facial/lip movement correlates with detected speech. In most cases, barring sleep disorders that result in persons speaking while asleep, detected speech from the user 145 would indicate that the user 145 is awake. To detect users who may possibly be speaking while asleep, the computer 105 could
employ some form of question-and-answer interaction, e.g., queries to a user seeking information about the user's day and/or surroundings, in order to detect whether the user is alert and aware. For example, based on a calendar entry, the computer 105 could ask "Who did you meet with today?" or, based on weather data, could ask, "Is it raining?" Further, the controller 105 could communicate with the device 150 to send audible alerts to a user's device 150, which would be particularly effective if the user has the device 150 in media mode and is listening to media such as music on headphones. The device 150 media could be paused before sending the alert.
[0042] Yet further alternatively or additionally, biometric data, such as respiration or heart rate data, could be used to determine a user 145 sleep state, e.g., a rate of respiration and/or a heart rate can be indicative that a person is asleep, as is known, e.g., refer to micro-motion radar used in baby monitors or use of pulse detection software for post processing of facial camera images.
[0043] Further, user 145 sleep state estimation could be binary or the computer 105 could be programmed to determine that the user 145 is in one of multiple possible sleep states. In a first example, the user 145 sleep state estimation may be binary, i.e., a user 145 may be estimated to be one of "asleep" or "not asleep." In a second example, the user 145 sleep estimation may be selected as one of a plurality of possible sleep states, e.g., as illustrated in the table below.
[0044] Following the block 220, in a block 225, the computer 105 determines, based on the estimated user sleep state determined in the block 220, whether an action is needed, i.e., whether to actuate one or more components or subsystems in the vehicle 101. This determination is typically made in conjunction with a location of the vehicle 101, e.g., determined according to the operation data received in the block 205. However, some estimations of a user 145 sleep state, e.g., that the user 145 is comatose, may trigger an action regardless of a vehicle 101 location, and/or without determining one.
[0045] In implementations in which a sleep state estimation is binary, i.e., "asleep" or "not asleep," the determination of whether an action is needed may be based on determining that the user 145 is asleep and the vehicle 101 is at a location requiring the user 145 to be awake. For example, the vehicle 101 may be within a predetermined distance or time of arrival (e.g., five minutes, 10 minutes, etc.) of a target location, e.g., a user 145 destination. If the user 145 is asleep at such location, the computer 105 may be programmed to take an action to awaken the user 145, e.g., so that the user 145 may exit the vehicle 101 at the target location. In another example, the vehicle 101 may be at an accident location, an emergency stop location, etc., whereupon the computer 105 may be programmed to take an action to awaken a sleeping user 145 so that the user 145 can attend to a possible emergency situation. Alternatively or additionally, if the vehicle 101 is at an emergency location and the user 145 is asleep, the computer 105 may determine that an action such as moving the vehicle 101 to a safe location, sending a request for emergency assistance, etc. should be taken.
[0046] In implementations in which multiple, i.e., more than two, sleep state estimations are possible, the computer 105 may be programmed to determine whether an action is to be taken based on the vehicle 101 location in combination with a specific estimated sleep state. For example, the computer 105 could be programmed to determine to awaken a user when a vehicle 101 is within a predetermined distance or time of arrival of a user 145 destination, but the predetermined distance or time threshold could vary according to the user sleep state. For example, a drowsy state, a light sleep state, an REM sleep state, etc. could have increasing distance and/or time thresholds with respect to a target location such as a user 145 destination to trigger an action based on the user 145 sleep state.
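The per-state thresholds of paragraph [0046] can be sketched as a simple lookup; the state labels and the numeric values below are hypothetical, since the disclosure states only that deeper sleep states use larger thresholds:

```python
# Hypothetical time-of-arrival thresholds (seconds): deeper sleep states
# trigger the wake-up action farther (in time) from the destination.
WAKE_THRESHOLDS_S = {"drowsy": 120, "light_sleep": 300, "rem_sleep": 600}

def should_wake(sleep_state: str, seconds_to_arrival: float) -> bool:
    """Trigger the wake-up action when the vehicle is within the
    sleep-state-specific time-of-arrival threshold of the target location."""
    threshold = WAKE_THRESHOLDS_S.get(sleep_state)
    return threshold is not None and seconds_to_arrival <= threshold
```

An analogous table could hold distance thresholds instead of, or in addition to, time-of-arrival thresholds.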
[0047] If it is determined to take an action in response to a user 145 sleep state, then a block 230 is executed next. Otherwise, the process 200 returns to the block 205. Alternatively, although not
shown in Figure 2, the process 200 could end after the block 225, e.g., if it is determined not to take an action and the vehicle 101 has arrived at a target location such as a user 145 destination.
[0048] In the block 230, the computer 105 instructs a vehicle 101 component or subsystem, and/or sends an instruction to the user device 150, to take an action based on the estimated user 145 sleep state. The action may be determined according to the estimated sleep state. For example, if the user 145 is drowsy, the action may be providing a visual alert via a vehicle 101 human machine interface (HMI) and/or a display of the user device 150. If a user 145 is lightly asleep, then an audio alert or alarm could be initiated via the vehicle 101 HMI and/or the user device 150. Such audio alert or alarm could be provided more loudly or aggressively if the user 145 is in an REM sleep state. In one example, the controller 105 may first request the help of other vehicle 101 occupants to wake the user 145. If no occupant accepts, the controller 105 may advise that the vehicle will sound an alarm tone to attempt to wake the user 145. Other passengers may then request a PAUSE to any subsequent alarms until they have exited the vehicle. Once other occupants have exited the vehicle and/or when the user 145 is in a higher-level sleep state, i.e., a sleep state estimated to be a deeper sleep, more aggressive action may be taken. For example, the computer 105 could actuate haptic output, e.g., vibration of a vehicle 101 seat; actuate vehicle 101 brakes to provide brake pulses to awaken the user 145; activate vehicle steering to arouse the user; control the vehicle propulsion, e.g., the powertrain, to slow the vehicle to arouse the user; etc.
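The escalation described in paragraph [0048] can be sketched as an ordered ladder of actions; the state and action names below are illustrative labels, not identifiers from the disclosure:

```python
# Hypothetical escalation ladder: milder alerts for lighter sleep states,
# haptic and brake-pulse actuation reserved for deeper ones.
ESCALATION = [
    ("drowsy",      ["hmi_visual_alert", "device_visual_alert"]),
    ("light_sleep", ["hmi_audio_alert", "device_audio_alert"]),
    ("rem_sleep",   ["loud_audio_alarm"]),
    ("deep_sleep",  ["seat_vibration", "brake_pulses"]),
]

def actions_for(sleep_state: str) -> list:
    """Collect every action up to and including the estimated state's level,
    so a deeper sleep state accumulates the milder alerts as well."""
    actions: list = []
    for state, acts in ESCALATION:
        actions.extend(acts)
        if state == sleep_state:
            return actions
    return []  # unknown state: take no escalated action
```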
[0049] In a further example, the computer 105 in the block 230 could actuate, possibly after providing a visual and/or audible alarm, the vehicle 101 HMI and/or the user device 150 to request user 145 input confirming that the user 145 is awake. Upon receiving such input, the process 200 could then exit the block 230. However, if such input is not received within a predetermined amount of time, e.g., five seconds, 10 seconds, etc., the computer 105 could actuate more aggressive visual and/or audible alarms, and could again request user 145 input to confirm that the user 145 is awake. As a means of preventing other passengers from responding to the confirmation request so as to stop the alarm, the computer 105 may ask for specific information that only the user 145 would know, such as the final destination, a reservation order number, etc.
Moreover, if, after a predetermined number of requests for user 145 input, no input is received, the computer 105 could be programmed to determine that the user 145 is in an un-arousable, e.g., comatose, sleep state, and to take action accordingly, e.g., sending a request for emergency assistance via the server 130.
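The confirmation-and-escalation loop of paragraph [0049] and the passage above might look like the following sketch. Here `get_response`, the challenge answer, and the retry count are illustrative assumptions; the patent specifies only the behavior (timed requests, escalation, and an un-arousable determination after repeated failures).

```python
# Hypothetical confirmation loop: request input, escalate after each
# timeout, and declare the occupant unresponsive after max_attempts.
# The challenge answer (e.g. the final destination) guards against
# other passengers answering to silence the alarm.
def confirm_awake(get_response, expected_answer, max_attempts=3):
    """get_response(attempt) returns the occupant's answer, or None on timeout.

    Returns "awake" once the expected answer is given; otherwise
    "unresponsive", so the caller can, e.g., request emergency assistance.
    """
    for attempt in range(1, max_attempts + 1):
        # In a real system, alarm intensity could scale with `attempt`.
        answer = get_response(attempt)
        if answer == expected_answer:
            return "awake"
    return "unresponsive"
```

The "unresponsive" result corresponds to the un-arousable determination above, after which a message could be sent to the server 130.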
[0050] Moreover, as mentioned above, the action could be determined according to a vehicle 101 location in combination with the user 145 sleep state. For example, if the user 145 is asleep and the vehicle 101 is at an accident scene, the computer 105 could send a message to the server 130 requesting emergency assistance.
[0051] Following the block 230, the process 200 ends.
[0052] When this disclosure refers to a "location," it is to be understood that the location could be determined in a known manner, e.g., according to geo-coordinates such as are known. For example, global positioning system (GPS) devices can determine latitude and longitude with great precision, and could be used to determine locations discussed herein.
[0053] As used herein, the adverb "substantially" modifying an adjective means that a shape, structure, measurement, value, calculation, etc. may deviate from an exact described geometry, distance, measurement, value, calculation, etc., because of imperfections in materials, machining, manufacturing, sensor measurements, computations, processing time, communications time, etc.
[0054] The article "a" modifying a noun should be understood as meaning one or more unless stated otherwise, or context requires otherwise. The phrase "based on" encompasses being partly or entirely based on.
[0055] Computing devices 105, 150, etc., generally each include instructions executable by one or more computing devices such as those identified above, and for carrying out blocks or steps of processes described above. The terms "computing device" and "computer" may be used interchangeably in this disclosure. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Perl, HTML, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media. A file in the computing device 105 is generally a collection of data stored on a computer-readable medium, such as a storage medium, a random access memory, etc.
[0056] A computer-readable medium includes any medium that participates in providing data (e.g., instructions), which may be read by a computer. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, etc. Non-volatile media include, for example, optical or magnetic disks and other persistent memory. Volatile media include dynamic random access memory (DRAM), which typically constitutes a main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
[0057] With regard to the media, processes, systems, methods, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. For example, in the process 200, one or more of the steps could be omitted, or the steps could be executed in a different order than shown in Figure 2. In other words, the descriptions of systems and/or processes herein are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the disclosed subject matter.
[0058] Accordingly, it is to be understood that the present disclosure, including the above description and the accompanying figures and below claims, is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent to those of skill in the art upon reading the above description. The scope of the invention should be determined, not with reference to the above description, but should instead be determined with reference to claims appended hereto and/or included in a non-provisional patent application based hereon, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the arts discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the disclosed subject matter is capable of modification and variation.
Claims
1. A system, comprising a computer programmed to:
identify a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and
perform an action based on a location of the vehicle and the identified occupant sleep state.
2. The system of claim 1, further comprising a portable computing device programmed to provide occupant motion data to the computer.
3. The system of claim 1, wherein the computer is a portable computing device.
4. The system of claim 3, wherein the computer is further programmed to receive vehicle motion data from a vehicle computer.
5. The system of claim 1, wherein the computer is further programmed to receive occupant motion data including at least one of accelerometer data and image data, and to measure the motion of the vehicle occupant according to the received occupant motion data.
6. The system of claim 1, wherein the computer is further programmed to receive vehicle operation data and to determine the measurement of vehicle motion according to the received vehicle operation data.
7. The system of claim 1, wherein the action is initiating a communication with a remote server based on the vehicle occupant sleep state.
8. The system of claim 1, wherein the action is controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
9. The system of claim 1, wherein the computer is further programmed to perform the action based in part upon a determination that the location of the vehicle is within a predetermined distance of a target location.
10. The system of claim 1, wherein the identified sleep state is one of not asleep, asleep, and unconscious.
11. A method, comprising:
identifying a vehicle occupant sleep state based at least in part on a comparison of a measurement of vehicle motion to a measurement of occupant motion; and
performing an action based on a location of the vehicle and the identified occupant sleep state.
12. The method of claim 11, further comprising providing, from a portable computing device, occupant motion data to a vehicle computer.
13. The method of claim 11, further comprising providing, from a vehicle computer, the vehicle motion data to a portable computing device.
14. The method of claim 11, further comprising receiving occupant motion data including at least one of accelerometer data and image data, and measuring the motion of the vehicle occupant according to the received occupant motion data.
15. The method of claim 11, further comprising receiving vehicle operation data and determining the measurement of vehicle motion according to the received vehicle operation data.
16. The method of claim 11, wherein the action is initiating a communication with a remote server based on the vehicle occupant sleep state.
17. The method of claim 11, wherein the action is controlling at least one of vehicle powertrain, brakes, and steering based on the occupant sleep state.
18. The method of claim 11, further comprising performing the action based in part upon a determination that the location of the vehicle is within a predetermined distance of a target location.
19. The method of claim 11, wherein the identified sleep state is one of not asleep, asleep, and unconscious.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2017/036357 WO2018226220A1 (en) | 2017-06-07 | 2017-06-07 | Vehicle occupant sleep state management |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018226220A1 true WO2018226220A1 (en) | 2018-12-13 |
Family
ID=64567258
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2017/036357 Ceased WO2018226220A1 (en) | 2017-06-07 | 2017-06-07 | Vehicle occupant sleep state management |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018226220A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160001781A1 (en) * | 2013-03-15 | 2016-01-07 | Honda Motor Co., Ltd. | System and method for responding to driver state |
| US20160071393A1 (en) * | 2014-09-09 | 2016-03-10 | Torvec, Inc. | Systems, methods, and apparatus for monitoring alertness of an individual utilizing a wearable device and providing notification |
| US20160311440A1 (en) * | 2015-04-22 | 2016-10-27 | Motorola Mobility Llc | Drowsy driver detection |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109895781A (en) * | 2019-03-18 | 2019-06-18 | 百度在线网络技术(北京)有限公司 | Method for controlling a vehicle and device |
| CN113928328A (en) * | 2020-06-29 | 2022-01-14 | 美光科技公司 | Impaired driving assistance |
| DE102020214556A1 (en) | 2020-11-19 | 2022-05-19 | Volkswagen Aktiengesellschaft | Communication system for a vehicle for dealing with an occupant's sleep disorder |
| WO2022106176A1 (en) | 2020-11-19 | 2022-05-27 | Volkswagen Aktiengesellschaft | Communication system for a vehicle for acting in the event of a sleeping disorder with an occupant |
| CN113509150A (en) * | 2021-08-09 | 2021-10-19 | 恒大恒驰新能源汽车研究院(上海)有限公司 | In-vehicle sleep monitoring method and device and electronic equipment |
| CN113509150B (en) * | 2021-08-09 | 2023-07-14 | 恒大恒驰新能源汽车研究院(上海)有限公司 | In-vehicle sleep monitoring method and device and electronic equipment |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12202489B2 (en) | Information processing device, moving apparatus, method, and program | |
| KR102669020B1 (en) | Information processing devices, mobile devices, and methods, and programs | |
| JP7424305B2 (en) | Information processing device, information processing method, and program | |
| US20230211810A1 (en) | Information processing device, mobile device, information processing system, and method | |
| JP7080598B2 (en) | Vehicle control device and vehicle control method | |
| CN113665528B (en) | Autonomous vehicle safety system and method | |
| KR20210088565A (en) | Information processing devices, mobile devices and methods, and programs | |
| CN109690609A (en) | Passenger assistance device, method and program | |
| KR102143211B1 (en) | A method and system for preventing drowsiness driving and keeping vehicle safe | |
| US11866073B2 (en) | Information processing device, information processing system, and information processing method for wearable information terminal for a driver of an automatic driving vehicle | |
| JP7376381B2 (en) | Vehicle control device and vehicle control system | |
| JP2017136922A (en) | Vehicle control device, on-vehicle device controller, map information generation device, vehicle control method, and on-vehicle device control method | |
| US11912267B2 (en) | Collision avoidance system for vehicle interactions | |
| WO2018226220A1 (en) | Vehicle occupant sleep state management | |
| WO2018046015A1 (en) | Alarm method, device and terminal for vehicle | |
| WO2019131116A1 (en) | Information processing device, moving device and method, and program | |
| JP2017146744A (en) | Driver state determination device | |
| US10589741B2 (en) | Enhanced collision avoidance | |
| US20170131714A1 (en) | Vehicle control based on connectivity of a portable device | |
| JP5688809B2 (en) | Driver state estimation device | |
| JP2008068665A (en) | Vehicle control apparatus and vehicle control method | |
| WO2012043373A1 (en) | Information processing system, information processing method, and information processing program | |
| CN108351343A (en) | Enhanced Messaging | |
| JP7238193B2 (en) | Vehicle control device and vehicle control method | |
| JP2022006479A (en) | Stroke management system, arrival schedule time monitoring method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17912688; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17912688; Country of ref document: EP; Kind code of ref document: A1 |