
WO2018097818A1 - Virtual reality interface to an autonomous vehicle - Google Patents

Virtual reality interface to an autonomous vehicle

Info

Publication number
WO2018097818A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
trajectory
controller
virtual
detecting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2016/063376
Other languages
French (fr)
Inventor
Scott Vincent Myers
Harpreetsingh Banvait
Mohamed Ahmad
Lisa Scaria
Rao Nikhil Nagraj
Alexandru Mihai Gurghian
Parsa Mahmoudieh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to PCT/US2016/063376
Publication of WO2018097818A1

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/002Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/211Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays producing three-dimensional [3D] effects, e.g. stereoscopic images
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20Optical features of instruments
    • B60K2360/29Holographic features
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20Optical features of instruments
    • B60K2360/33Illumination features
    • B60K2360/334Projection means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/213Virtual instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23Head-up displays [HUD]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/25Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using haptic output

Definitions

  • This invention relates to incorporating human control inputs into autonomous vehicle operation.
  • Autonomous vehicles are the subject of much research and development. Such vehicles include a set of sensors and control logic that enables the identification and avoidance of obstacles and navigation to a destination. Such vehicles may lack conventional control interfaces, such as a steering wheel, brake pedal, accelerator pedal, and the like. Some people may prefer to exert some control over the autonomous vehicle. In prior approaches, a video game-style controller is provided to an occupant of the autonomous vehicle. However, this is an unfamiliar interface to those used to conventional automobile controls and may become a projectile and cause harm during an accident.
  • FIG. 1A is a schematic block diagram of a system for implementing embodiments of the invention.
  • FIG. 1B is a schematic block diagram of a vehicle including interior sensors for implementing embodiments of the invention.
  • FIG. 2 is a schematic block diagram of an example computing device suitable for implementing methods in accordance with embodiments of the invention.
  • FIG. 3 is a process flow diagram of a method for providing a virtual interface to an autonomous vehicle in accordance with an embodiment of the present invention.
  • FIG. 4 is a process flow diagram of a method for providing feedback to a user of a virtual interface to an autonomous vehicle in accordance with embodiments of the present invention.
  • a vehicle 100 may house a controller 102.
  • the vehicle 100 may include any vehicle known in the art.
  • the vehicle 100 may have all of the structures and features of any vehicle known in the art including wheels, a drive train coupled to the wheels, an engine coupled to the drive train, a steering system, a braking system, and other systems known in the art to be included in a vehicle.
  • the controller 102 may perform autonomous navigation and collision avoidance.
  • the controller 102 may receive one or more outputs from one or more exterior sensors 104.
  • one or more cameras 106a may be mounted to the vehicle 100 and output image streams received to the controller 102.
  • the exterior sensors 104 may include sensors such as an ultrasonic sensor 106b, a RADAR (Radio Detection and Ranging) sensor 106c, a LIDAR (Light Detection and Ranging) sensor 106d, a SONAR (Sound Navigation and Ranging) sensor 106e, and the like.
  • the controller 102 may execute an autonomous operation module 108 that receives the outputs of the exterior sensors 104.
  • the autonomous operation module 108 may include an obstacle identification module 110a, a collision prediction module 110b, and a decision module 110c.
  • the obstacle identification module 110a analyzes the outputs of the exterior sensors and identifies potential obstacles, including people, animals, vehicles, buildings, curbs, and other objects and structures. In particular, the obstacle identification module 110a may identify vehicle images in the sensor outputs.
  • the collision prediction module 110b predicts which obstacle images are likely to collide with the vehicle 100 based on its current trajectory or current intended path. The collision prediction module 110b may evaluate the likelihood of collision with objects identified by the obstacle identification module 110a.
  • the decision module 110c may make a decision to stop, accelerate, turn, etc. in order to avoid obstacles.
  • the manner in which the collision prediction module 110b predicts potential collisions and the manner in which the decision module 110c takes action to avoid potential collisions may be according to any method or system known in the art of autonomous vehicles.
  • the decision module 110c may control the trajectory of the vehicle by actuating one or more actuators 112 controlling the direction and speed of the vehicle 100.
  • the actuators 112 may include a steering actuator 114a, an accelerator actuator 114b, and a brake actuator 114c.
  • the configuration of the actuators 114a-114c may be according to any implementation of such actuators known in the art of autonomous vehicles.
  • the autonomous operation module 108 may perform autonomous navigation to a specified location, autonomous parking, and other automated driving activities known in the art.
  • the autonomous operation module 108 may be further programmed to receive control inputs from an occupant of the vehicle 100 and to incorporate these inputs into the otherwise autonomous operation of the vehicle 100.
  • the vehicle 100 may include one or more output devices 116 for presenting the virtual interface and providing feedback to the occupant.
  • the output devices 116 may include a virtual reality (VR) display 118a, which may be embodied as a VR headset, holographic projector, heads up display (HUD), or other type of display device.
  • Output devices 116 may include one or more haptic devices providing tactile feedback to the occupant, such as one or more haptic gloves 118b, one or more haptic shoes 118c, or a seat 118d incorporating one or more haptic feedback devices.
  • Other output devices 116 may also be included such as speakers, lights, screens, and the like.
  • User interaction with the virtual controls may be detected by way of interior sensors 120, such as one or more cameras 122a or a LIDAR sensor 122b.
  • interior sensors 120 may include an electro-chemical sensor 122c.
  • Other interior sensors 120 may include an ultrasonic sensor, microphone, or other type of sensing device.
  • one or more cameras 122a may be positioned having regions likely to contain the occupant's hands and feet within the cameras' field of view.
  • the LIDAR sensor 122b may be positioned having the front of the occupant in its field of view.
  • the electro-chemical sensor 122c may be mounted to be close to the occupant's mouth in order to detect potential impairment due to alcohol.
  • the electro-chemical sensor 122c may be mounted to the ceiling of the vehicle 100.
  • the autonomous operation module 108 may include a virtual control module 110d that renders control structures in an output of the VR display 118a.
  • the virtual control module 110d may render a three-dimensional (3D) representation of a steering wheel 124 with a virtual position within the interior of the vehicle 100 that is accessible to the occupant.
  • the virtual control module 110d may render brake and accelerator pedals 126 at virtual locations in the interior of the vehicle 100 corresponding to the conventional location of these control structures.
  • Other control structures such as a shift lever, turn signal lever, and the like may also be rendered by the virtual control module 110d at their conventional locations.
  • the autonomous operation module 108 may further include an input module 110e.
  • the input module 110e receives outputs of one or more of the output devices and identifies the locations of the occupant's hands and/or feet. The input module 110e compares these locations with the locations of the virtual controls to determine whether the user is interacting with the virtual controls. If so, then movement of the occupant's hands or feet may be interpreted as interaction with the virtual controls.
  • the autonomous operation module 108 may include a validation module 110f.
  • Inputs from the occupant detected by the input module 110e may be interpreted as steering, acceleration, and braking inputs.
  • the validation module 110f evaluates whether actuating the steering, accelerator, or brake according to these inputs would result in collision with an obstacle or operation of the vehicle 100 outside of its operational limits, i.e. result in loss of stability or control. If not, the actuators 112 are activated according to the inputs. If so, then validation module 110f may ignore the inputs or activate the actuators 112 according to a modified version of the inputs that are sufficient to avoid collision and remain within operational limits of the vehicle 100.
  • the autonomous operation module 108 may include a feedback module 110g.
  • the feedback module 110g may provide feedback as a warning indicating that an input has been found to indicate likely collision or exceeding the operational limits of the vehicle 100.
  • the feedback module 110g may also provide feedback as guidance to instruct the occupant to turn, brake, accelerate, etc. in order to avoid collision or exceeding the operational limits of the vehicle 100.
  • Feedback may be in the form of perceptible outputs generated by the haptic devices 118b-118d or another output device such as a speaker, visible light, screen, VR display 118a, or other device.
  • the autonomous operation module 108 may include an occupant state module 110h.
  • the occupant state module 110h may evaluate outputs of the interior sensors 120 to determine a state of impairment of the occupant. For example, an output of the electro-chemical sensor 122c may be evaluated to determine whether alcohol or other impairing substance is detected. Outputs of the other sensors 122a, 122b may be evaluated to determine behavior indicating impairment. For example, video feeds from the cameras 122a and a sequence of point clouds from the LIDAR sensor 122b may be input to a machine learning algorithm, such as a deep neural network (DNN) or other type of machine learning algorithm, that has been trained to identify behavior indicating impairment from drunkenness, ingestion of narcotics, fatigue, or other causes.
  • Fig. 2 is a block diagram illustrating an example computing device 200.
  • Computing device 200 may be used to perform various procedures, such as those discussed herein.
  • the controller 102 may have some or all of the attributes of the computing device 200.
  • Computing device 200 includes one or more processor(s) 202, one or more memory device(s) 204, one or more interface(s) 206, one or more mass storage device(s) 208, one or more Input/Output (I/O) device(s) 210, and a display device 230 all of which are coupled to a bus 212.
  • Processor(s) 202 include one or more processors or controllers that execute instructions stored in memory device(s) 204 and/or mass storage device(s) 208.
  • Processor(s) 202 may also include various types of computer-readable media, such as cache memory.
  • Memory device(s) 204 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 214) and/or nonvolatile memory (e.g., read-only memory (ROM) 216). Memory device(s) 204 may also include rewritable ROM, such as Flash memory.
  • Mass storage device(s) 208 include various computer readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in Fig. 2, a particular mass storage device is a hard disk drive 224. Various drives may also be included in mass storage device(s) 208 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 208 include removable media 226 and/or non-removable media.
  • I/O device(s) 210 include various devices that allow data and/or other information to be input to or retrieved from computing device 200.
  • Example I/O device(s) 210 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
  • Display device 230 includes any type of device capable of displaying information to one or more users of computing device 200. Examples of display device 230 include a monitor, display terminal, video projection device, and the like.
  • Interface(s) 206 include various interfaces that allow computing device 200 to interact with other systems, devices, or computing environments.
  • Example interface(s) 206 include any number of different network interfaces 220, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet.
  • Other interface(s) include user interface 218 and peripheral device interface 222.
  • the interface(s) 206 may also include one or more peripheral interfaces such as interfaces for printers, pointing devices (mice, track pad, etc.), keyboards, and the like.
  • Bus 212 allows processor(s) 202, memory device(s) 204, interface(s) 206, mass storage device(s) 208, I/O device(s) 210, and display device 230 to communicate with one another, as well as other devices or components coupled to bus 212.
  • Bus 212 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
  • programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 200, and are executed by processor(s) 202.
  • the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware.
  • the illustrated method 300 may be executed by the controller 102 in order to present virtual controls, receive inputs, and process received inputs.
  • the method 300 may be initiated by receiving 302 a request from an occupant of the vehicle to drive the vehicle 100.
  • the request may be received by pressing a button, receiving selection of an element in a user interface, receiving a voice instruction, or other input.
  • the method 300 may include detecting 304 the occupant using one or more of the interior sensors 120 and evaluating 306 whether a state of the occupant is acceptable. This may include evaluating video images of the occupant from the cameras 122a, point clouds from the LIDAR sensor 122b, and a chemical signature from the electro-chemical sensor 122c. As noted above, one or more of these outputs may be input to a DNN, which outputs an estimate of whether the occupant is impaired. If this output indicates impairment, the occupant may be found 306 to not be in a state to drive the vehicle 100. In that case, the method 300 may end. In other embodiments, the method 300 may include presenting the virtual controls but ignoring any inputs received.
  • the occupant may be measured to determine where to position the virtual controls.
  • the arms and legs of the occupant may be detected in a point cloud from the LIDAR sensor 122b and the lengths thereof estimated 308.
  • the positions of the virtual controls may then be determined 310 from the lengths of the occupant's arms and legs.
  • the virtual controls may then be displayed 312 at these positions.
  • a virtual steering wheel 124 may be rendered in the VR display 118a within comfortable reach of the occupant's arms and one or more virtual pedals 126 positioned within comfortable reach of the occupant's feet.
  • virtual controls may be displayed 312 superimposed on a field of view of the occupant, i.e. rendered on an image of the field of view of the VR display 118a.
  • the virtual controls may be superimposed on a rendering 314 of outputs of the exterior sensors 104. For example, a rendering of obstacles, vehicles, the road, etc. as detected in an output of the LIDAR sensor 106d.
  • the method 300 may further include receiving 316 occupant inputs. As noted above, this may include detecting movement of the occupant's hands in the position of the virtual steering wheel 124. For example, the occupant may position the occupant's hands over the virtual steering wheel 124 and simulate clockwise or counterclockwise movement of the virtual steering wheel 124. This movement may then be detected using the interior sensors 120 and interpreted as an instruction to cause right or left turning, respectively, of the vehicle 100.
  • movement of the occupant's right foot, for example, down in the location of a virtual pedal 126 corresponding to the brake, may be detected and interpreted as an instruction to increase braking force. Upward movement of the right foot in the location of the virtual pedal 126 corresponding to the brake may be detected and interpreted as an instruction to decrease braking force.
  • Movement of the occupant's right foot, for example, down in the location of a virtual pedal 126 corresponding to the accelerator, may be detected and interpreted as an instruction to increase acceleration. Upward movement of the right foot in the location of the virtual pedal 126 corresponding to the accelerator may be detected and interpreted as an instruction to decrease acceleration.
  • Simulated shifting or actuation of other virtual controls may likewise be interpreted as instructions to invoke the corresponding functions of these controls.
  • Inputs received may then be validated 318. As noted above, this may include estimating whether executing the inputs received at step 316 would result in collision with an object detected using the exterior sensors 104 or would result in the vehicle 100 exceeding its operational limits. For example, for a turning input, this may include evaluating whether altering the trajectory in the direction indicated by the occupant would cause the trajectory to be incident on a detected obstacle. For an acceleration or turning input, this may further include evaluating whether turning or acceleration with a magnitude corresponding to a user input would result in instability or loss of control.
  • the validation step 318 may include determining a valid trajectory in view of the user inputs. For example, this may include ignoring the user inputs and continuing on an autonomous trajectory previously determined by the controller 102. This may include reducing the magnitude of a turning, braking, or acceleration input such that the validated trajectory stays within an acceptable tolerance away from obstacles and remains within operational limits of the vehicle 100.
  • the method 300 may include autonomously traversing 320 the validated trajectory as determined at step 318.
  • the method 300 may include providing 322 feedback.
  • Providing feedback may include executing some or all of the method 400 of Fig. 4.
  • the method 400 may include evaluating 402 whether a user input will result in a potential collision.
  • the method 400 may include determining 404 an alternative trajectory that will avoid the potential collision.
  • the method 400 may include providing 406 haptic feedback according to the alternative trajectory. For example, if the alternative trajectory is to the left, a left haptic glove 118b may be caused to produce a perceptible output. If an alternative trajectory requires braking, a right haptic shoe 118c may be caused to produce a perceptible output.
  • instructions, e.g. right or left arrows, may be displayed on the VR display 118a.
  • the method 400 may include evaluating 408 whether an occupant input would result in inputs outside the operational limits of the vehicle 100. If so, the method 400 may include determining and traversing 410 an alternative trajectory that remains within operational limits, e.g. with reduced speed, larger turning radius, more braking force, or other modifications. The method 400 may further include generating 412 a haptic warning communicating to the occupant that the attempted action is unsafe. For example, a series of repeated outputs to the haptic seat 118d or other haptic devices 118b-118c may be generated. An alert may also be output to the VR display 118a.
  • Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer- executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer- executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD- ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network.
  • a "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • Transmissions media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the disclosure may be practiced in network computing environments with many types of computer system configurations, including, an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
  • the disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code.
  • At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium.
  • Such software when executed in one or more data processing devices, causes a device to operate as described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)

Abstract

An autonomous vehicle includes a VR display in which virtual controls are rendered, such as a virtual steering wheel, virtual accelerator pedal, and virtual brake pedal. User interaction with the virtual controls is detected and interpreted as control inputs. These control inputs are validated to determine whether they would result in collision or unstable operation of the vehicle. The vehicle controller may then implement the control inputs, modify them, or ignore them in order to avoid unsafe conditions. Feedback to the user may be provided by means of haptic gloves, shoes, or haptic devices in a seat of the vehicle. The user's state may be evaluated for impairment prior to enabling or executing control inputs.

Description

Title: VIRTUAL REALITY INTERFACE TO AN AUTONOMOUS VEHICLE
BACKGROUND
FIELD OF THE INVENTION
[001] This invention relates to incorporating human control inputs into autonomous vehicle operation.
BACKGROUND OF THE INVENTION
[002] Autonomous vehicles are the subject of much research and development. Such vehicles include a set of sensors and control logic that enables the identification and avoidance of obstacles and navigation to a destination. Such vehicles may lack conventional control interfaces, such as a steering wheel, brake pedal, accelerator pedal, and the like. Some people may prefer to exert some control over the autonomous vehicle. In prior approaches, a video game-style controller is provided to an occupant of the autonomous vehicle. However, this is an unfamiliar interface to those used to conventional automobile controls and may become a projectile and cause harm during an accident.
[003] The systems and methods disclosed herein provide an improved approach for providing an interface to an autonomous vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[004] In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
[005] Fig. 1A is a schematic block diagram of a system for implementing embodiments of the invention;
[006] Fig. 1B is a schematic block diagram of a vehicle including interior sensors for implementing embodiments of the invention;
[007] Fig. 2 is a schematic block diagram of an example computing device suitable for implementing methods in accordance with embodiments of the invention;
[008] Fig. 3 is a process flow diagram of a method for providing a virtual interface to an autonomous vehicle in accordance with an embodiment of the present invention; and
[009] Fig. 4 is a process flow diagram of a method for providing feedback to a user of a virtual interface to an autonomous vehicle in accordance with embodiments of the present invention.
[0010] Referring to Figs. 1A and 1B, a vehicle 100 (see Fig. 1B) may house a controller 102. The vehicle 100 may include any vehicle known in the art. The vehicle 100 may have all of the structures and features of any vehicle known in the art including wheels, a drive train coupled to the wheels, an engine coupled to the drive train, a steering system, a braking system, and other systems known in the art to be included in a vehicle.
[0011] As discussed in greater detail herein, the controller 102 may perform autonomous navigation and collision avoidance. The controller 102 may receive one or more outputs from one or more exterior sensors 104. For example, one or more cameras 106a may be mounted to the vehicle 100 and output image streams received to the controller 102.
[0012] The exterior sensors 104 may include sensors such as an ultrasonic sensor 106b, a RADAR (Radio Detection and Ranging) sensor 106c, a LIDAR (Light Detection and Ranging) sensor 106d, a SONAR (Sound Navigation and Ranging) sensor 106e, and the like.
[0013] The controller 102 may execute an autonomous operation module 108 that receives the outputs of the exterior sensors 104. The autonomous operation module 108 may include an obstacle identification module 110a, a collision prediction module 110b, and a decision module 110c. The obstacle identification module 110a analyzes the outputs of the exterior sensors and identifies potential obstacles, including people, animals, vehicles, buildings, curbs, and other objects and structures. In particular, the obstacle identification module 110a may identify vehicle images in the sensor outputs.
[0014] The collision prediction module 110b predicts which obstacle images are likely to collide with the vehicle 100 based on its current trajectory or current intended path. The collision prediction module 110b may evaluate the likelihood of collision with objects identified by the obstacle identification module 110a. The decision module 110c may make a decision to stop, accelerate, turn, etc. in order to avoid obstacles. The manner in which the collision prediction module 110b predicts potential collisions and the manner in which the decision module 110c takes action to avoid potential collisions may be according to any method or system known in the art of autonomous vehicles.
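By way of illustration only, the identify-predict-decide flow of the modules 110a-110c described above can be sketched as a simple loop. The classes, geometry, and safety margin below are assumptions made for this example and are not taken from the present application.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Obstacle:
    x: float        # metres ahead of the vehicle
    y: float        # lateral offset in metres
    radius: float   # approximate extent in metres

@dataclass
class Trajectory:
    heading: float  # radians, 0 = straight ahead
    speed: float    # metres per second

def identify_obstacles(returns: List[Tuple[float, float]]) -> List[Obstacle]:
    # Stand-in for the obstacle identification module 110a: treat each
    # clustered sensor return as a single obstacle of assumed size.
    return [Obstacle(x, y, 0.5) for (x, y) in returns]

def predicts_collision(traj: Trajectory, obs: Obstacle, horizon_s: float = 3.0) -> bool:
    # Stand-in for the collision prediction module 110b: project the current
    # trajectory forward and check the clearance to the obstacle.
    dx, dy = math.cos(traj.heading), math.sin(traj.heading)
    path_len = traj.speed * horizon_s
    t = max(0.0, min(path_len, obs.x * dx + obs.y * dy))  # closest point on the path
    clearance = math.hypot(obs.x - t * dx, obs.y - t * dy)
    return clearance < obs.radius + 1.0                   # 1 m safety margin (assumed)

def decide(traj: Trajectory, obstacles: List[Obstacle]) -> Trajectory:
    # Stand-in for the decision module 110c: slow down if a collision is predicted.
    if any(predicts_collision(traj, o) for o in obstacles):
        return Trajectory(traj.heading, max(0.0, traj.speed - 5.0))
    return traj

obstacles = identify_obstacles([(20.0, 0.5), (40.0, -3.0)])
print(decide(Trajectory(heading=0.0, speed=10.0), obstacles))
```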
[0015] The decision module 110c may control the trajectory of the vehicle by actuating one or more actuators 112 controlling the direction and speed of the vehicle 100. For example, the actuators 112 may include a steering actuator 114a, an accelerator actuator 114b, and a brake actuator 114c. The configuration of the actuators 114a-114c may be according to any implementation of such actuators known in the art of autonomous vehicles.
[0016] In embodiments disclosed herein, the autonomous operation module 108 may perform autonomous navigation to a specified location, autonomous parking, and other automated driving activities known in the art.
[0017] The autonomous operation module 108 may be further programmed to receive control inputs from an occupant of the vehicle 100 and to incorporate these inputs into the otherwise autonomous operation of the vehicle 100.
[0018] These inputs may be received through interaction with virtual controls that are not physically present in the vehicle 100. To facilitate this, the vehicle 100 may include one or more output devices 116 for presenting the virtual interface and providing feedback to the occupant. For example, the output devices 116 may include a virtual reality (VR) display 118a, which may be embodied as a VR headset, holographic projector, heads up display (HUD), or other type of display device.
[0019] Output devices 116 may include one or more haptic devices providing tactile feedback to the occupant, such as one or more haptic gloves 118b, one or more haptic shoes 118c, or a seat 118d incorporating one or more haptic feedback devices. Other output devices 116 may also be included such as speakers, lights, screens, and the like.
[0020] User interaction with the virtual controls may be detected by way of interior sensors 120, such as one or more cameras 122a or a LIDAR sensor 122b. In some embodiments, interior sensors 120 may include an electro-chemical sensor 122c. Other interior sensors 120 may include an ultrasonic sensor, microphone, or other type of sensing device.
[0021] As shown in Fig. 1B, one or more cameras 122a may be positioned having regions likely to contain the occupant's hands and feet within the cameras' field of view. Likewise, the LIDAR sensor 122b may be positioned having the front of the occupant in its field of view. The electro-chemical sensor 122c may be mounted to be close to the occupant's mouth in order to detect potential impairment due to alcohol. For example, the electro-chemical sensor 122c may be mounted to the ceiling of the vehicle 100.
[0022] The autonomous operation module 108 may include a virtual control module 110d that renders control structures in an output of the VR display 118a. For example, the virtual control module 110d may render a three-dimensional (3D) representation of a steering wheel 124 with a virtual position within the interior of the vehicle 100 that is accessible to the occupant. In a like manner, the virtual control module 110d may render brake and accelerator pedals 126 at virtual locations in the interior of the vehicle 100 corresponding to the conventional location of these control structures. Other control structures such as a shift lever, turn signal lever, and the like may also be rendered by the virtual control module 110d at their conventional locations.
[0023] The autonomous operation module 108 may further include an input module 110e. The input module 110e receives outputs of one or more of the output devices and identifies the locations of the occupant's hands and/or feet. The input module 110e compares these locations with the locations of the virtual controls to determine whether the user is interacting with the virtual controls. If so, then movement of the occupant's hands or feet may be interpreted as interaction with the virtual controls.
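A minimal sketch of the proximity test the input module 110e might perform is shown below, assuming tracked 3D positions in a vehicle-interior coordinate frame and a hypothetical grab radius; the function and variable names are illustrative, not part of the application.

```python
import math

def is_engaged(tracked_pos, control_pos, grab_radius_m=0.12):
    # True when a tracked hand or foot is within an assumed grab radius of a
    # virtual control's rendered position (vehicle-interior coordinates, metres).
    return math.dist(tracked_pos, control_pos) <= grab_radius_m

virtual_wheel_rim = (0.45, -0.35, 0.70)   # assumed rendering position of the wheel rim
left_hand = (0.44, -0.36, 0.71)           # position reported by the interior sensors

if is_engaged(left_hand, virtual_wheel_rim):
    print("hand engaged with the virtual steering wheel: interpret its motion as input")
```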
[0024] The autonomous operation module 108 may include a validation module 110f. Inputs from the occupant detected by the input module 110e may be interpreted as steering, acceleration, and braking inputs. The validation module 110f evaluates whether actuating the steering, accelerator, or brake according to these inputs would result in collision with an obstacle or operation of the vehicle 100 outside of its operational limits, i.e. result in loss of stability or control. If not, the actuators 112 are activated according to the inputs. If so, then validation module 110f may ignore the inputs or activate the actuators 112 according to a modified version of the inputs that are sufficient to avoid collision and remain within operational limits of the vehicle 100.
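A hedged sketch of the pass-through-or-fall-back decision attributed to the validation module 110f follows, assuming the collision and operational-limit checks are available as predicates supplied elsewhere; the names and command layout are illustrative assumptions.

```python
from typing import Callable, Tuple

Command = Tuple[float, float]   # (steering angle in radians, acceleration in m/s^2)

def validate(requested: Command,
             autonomous_plan: Command,
             collides: Callable[[Command], bool],
             within_limits: Callable[[Command], bool]) -> Command:
    # Pass the occupant's command through unchanged when it is safe,
    # otherwise keep the previously planned autonomous command.  Both
    # predicates are assumed to come from the collision prediction logic
    # and a vehicle dynamics model.
    if within_limits(requested) and not collides(requested):
        return requested
    return autonomous_plan

# toy predicates for illustration only
applied = validate(requested=(0.1, 1.0), autonomous_plan=(0.0, 0.0),
                   collides=lambda c: False, within_limits=lambda c: abs(c[0]) < 0.5)
print(applied)   # -> (0.1, 1.0): the occupant's input is applied
```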
[0025] The autonomous operation module 108 may include a feedback module 110g. The feedback module 110g may provide feedback as a warning indicating that an input has been found to indicate likely collision or exceeding the operational limits of the vehicle 100. The feedback module 110g may also provide feedback as guidance to instruct the occupant to turn, brake, accelerate, etc. in order to avoid collision or exceeding the operational limits of the vehicle 100. Feedback may be in the form of perceptible outputs generated by the haptic devices 118b-118d or another output device such as a speaker, visible light, screen, VR display 118a, or other device.
[001] The autonomous operation module 108 may include an occupant state module 110h. The occupant state module 110h may evaluate outputs of the interior sensors 120 to determine a state of impairment of the occupant. For example, an output of the electro-chemical sensor 122c may be evaluated to determine whether alcohol or other impairing substance is detected. Outputs of the other sensors 122a, 122b may be evaluated to determine behavior indicating impairment. For example, video feeds from the cameras 122a and a sequence of point clouds from the LIDAR sensor 122b may be input to a machine learning algorithm, such as a deep neural network (DNN) or other type of machine learning algorithm, that has been trained to identify behavior indicating impairment from drunkenness, ingestion of narcotics, fatigue, or other causes.
[002] Where the occupant state module 110h indicates impairment, inputs detected by the input module 110e may be ignored. Alternatively, the virtual control module 110d may refrain from presenting the virtual controls.
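For illustration, the impairment decision could combine the electro-chemical reading with a behavior score produced by the trained model described above. The function, thresholds, and score range below are assumptions for this example, not values from the application.

```python
def occupant_impaired(breath_alcohol_ppm: float, behaviour_score: float,
                      alcohol_limit_ppm: float = 150.0,
                      behaviour_threshold: float = 0.5) -> bool:
    # Stand-in for the occupant state module 110h.  behaviour_score would come
    # from a trained model (e.g. a DNN over camera frames and LIDAR point
    # clouds); here it is just a number in [0, 1].  Both thresholds are
    # illustrative, not calibrated values.
    return (breath_alcohol_ppm > alcohol_limit_ppm
            or behaviour_score > behaviour_threshold)

print(occupant_impaired(breath_alcohol_ppm=20.0, behaviour_score=0.1))    # False: present controls
print(occupant_impaired(breath_alcohol_ppm=300.0, behaviour_score=0.1))   # True: ignore or hide controls
```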
[003] Fig. 2 is a block diagram illustrating an example computing device 200. Computing device 200 may be used to perform various procedures, such as those discussed herein. The controller 102 may have some or all of the attributes of the computing device 200.
[004] Computing device 200 includes one or more processor(s) 202, one or more memory device(s) 204, one or more interface(s) 206, one or more mass storage device(s) 208, one or more Input/Output (I/O) device(s) 210, and a display device 230 all of which are coupled to a bus 212. Processor(s) 202 include one or more processors or controllers that execute instructions stored in memory device(s) 204 and/or mass storage device(s) 208. Processor(s) 202 may also include various types of computer-readable media, such as cache memory.
[005] Memory device(s) 204 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 214) and/or nonvolatile memory (e.g., read-only memory (ROM) 216). Memory device(s) 204 may also include rewritable ROM, such as Flash memory.
[006] Mass storage device(s) 208 include various computer readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in Fig. 2, a particular mass storage device is a hard disk drive 224. Various drives may also be included in mass storage device(s) 208 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 208 include removable media 226 and/or non-removable media.
[007] I/O device(s) 210 include various devices that allow data and/or other information to be input to or retrieved from computing device 200. Example I/O device(s) 210 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
[008] Display device 230 includes any type of device capable of displaying information to one or more users of computing device 200. Examples of display device 230 include a monitor, display terminal, video projection device, and the like.
[009] Interface(s) 206 include various interfaces that allow computing device 200 to interact with other systems, devices, or computing environments. Example interface(s) 206 include any number of different network interfaces 220, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include user interface 218 and peripheral device interface 222. The interface(s) 206 may also include one or more peripheral interfaces such as interfaces for printers, pointing devices (mice, track pad, etc.), keyboards, and the like.
[0010] Bus 212 allows processor(s) 202, memory device(s) 204, interface(s) 206, mass storage device(s) 208, I/O device(s) 210, and display device 230 to communicate with one another, as well as other devices or components coupled to bus 212. Bus 212 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
[0011] For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 200, and are executed by processor(s) 202. Alternatively, the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
[0012] Referring to Fig. 3, the illustrated method 300 may be executed by the controller 102 in order to present virtual controls, receive inputs, and process received inputs.
[0013] The method 300 may be initiated by receiving 302 a request from an occupant of the vehicle to drive the vehicle 100. The request may be received by pressing a button, receiving selection of an element in a user interface, receiving a voice instruction, or other input.
[0014] In response to receiving 302 the request, the method 300 may include detecting 304 the occupant using one or more of the interior sensors 120 and evaluating 306 whether a state of the occupant is acceptable. This may include evaluating video images of the occupant from the cameras 122a, point clouds from the LIDAR sensor 122b, and a chemical signature from the electro-chemical sensor 122c. As noted above, one or more of these outputs may be input to a DNN, which outputs an estimate of whether the occupant is impaired. If this output indicates impairment, the occupant may be found 306 to not be in a state to drive the vehicle 100. In that case, the method 300 may end. In other embodiments, the method 300 may include presenting the virtual controls but ignoring any inputs received.
[0015] If the occupant state is found 306 acceptable, some or all of the following steps of the method 300 may be executed.
[0016] In some embodiments, the occupant may be measured to determine where to position the virtual controls. For example, the arms and legs of the occupant may be detected in a point cloud from the LIDAR sensor 122b and the lengths thereof estimated 308. The positions of the virtual controls may then be determined 310 from the lengths of the occupant's arms and legs. The virtual controls may then be displayed 312 at these positions. In particular, as shown in Fig. 1B, a virtual steering wheel 124 may be rendered in the VR display 118a within comfortable reach of the occupant's arms and one or more virtual pedals 126 positioned within comfortable reach of the occupant's feet.
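One possible way to turn the estimated arm and leg lengths into rendering positions for the virtual controls is sketched below; the seat geometry, coordinate frame, and comfort factor are assumptions made for illustration only.

```python
def place_virtual_controls(arm_length_m: float, leg_length_m: float,
                           shoulder=(0.0, -0.35, 0.95), hip=(0.2, -0.35, 0.55),
                           reach_fraction=0.75):
    # Place the virtual wheel and pedals at a comfortable fraction of the
    # occupant's estimated reach.  Coordinates (x forward, y lateral, z up,
    # metres) and the 0.75 factor are assumptions for this sketch.
    wheel = (shoulder[0] + reach_fraction * arm_length_m, shoulder[1], shoulder[2] - 0.20)
    pedals = (hip[0] + reach_fraction * leg_length_m, hip[1], 0.10)
    return {"steering_wheel_124": wheel, "pedals_126": pedals}

print(place_virtual_controls(arm_length_m=0.65, leg_length_m=0.95))
```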
[0017] In some embodiments, virtual controls may be displayed 312 superimposed on a field of view of the occupant, i.e. rendered on an image of the field of view of the VR display 118a. In some embodiments, the virtual controls may be superimposed on a rendering 314 of outputs of the exterior sensors 104. For example, a rendering of obstacles, vehicles, the road, etc. as detected in an output of the LIDAR sensor 106d.
[0018] The method 300 may further include receiving 316 occupant inputs. As noted above, this may include detecting movement of the occupant's hands in the position of the virtual steering wheel 124. For example, the occupant may position the occupant's hands over the virtual steering wheel 124 and simulate clockwise or counterclockwise movement of the virtual steering wheel 124. This movement may then be detected using the interior sensors 120 and interpreted as an instruction to cause right or left turning, respectively, of the vehicle 100.
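A simplified sketch of how hand motion about the virtual steering wheel 124 might be converted into a turn command follows; the two-dimensional geometry and sign convention are assumptions made for this example.

```python
import math

def steering_delta(prev_hand, curr_hand, wheel_centre):
    # Convert the angular displacement of a tracked hand about the virtual
    # wheel centre into a steering command; positive = clockwise = turn right.
    # 2D positions in the plane of the virtual wheel are assumed.
    a_prev = math.atan2(prev_hand[1] - wheel_centre[1], prev_hand[0] - wheel_centre[0])
    a_curr = math.atan2(curr_hand[1] - wheel_centre[1], curr_hand[0] - wheel_centre[0])
    return a_prev - a_curr

# hand moves from the top of the wheel rim towards its right-hand side (clockwise)
print(steering_delta(prev_hand=(0.0, 0.18), curr_hand=(0.18, 0.0), wheel_centre=(0.0, 0.0)))
```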
[0019] In a similar manner, movement of the occupant's right foot, for example, down in the location of a virtual pedal 126 corresponding to the brake, may be detected and interpreted as an instruction to increase braking force. Upward movement of the right foot in the location of the virtual pedal 126 corresponding to the brake may be detected and interpreted as an instruction to decrease braking force.
[0020] Movement of the occupant's right foot, for example, down in the location of a virtual pedal 126 corresponding to the accelerator, may be detected and interpreted as an instruction to increase acceleration. Upward movement of the right foot in the location of the virtual pedal 126 corresponding to the accelerator may be detected and interpreted as an instruction to decrease acceleration.
[0021] Simulated shifting or actuation of other virtual controls may likewise be interpreted as instructions to invoke the corresponding functions of these controls.
[0022] Inputs received may then be validated 318. As noted above, this may include estimating whether executing the inputs received at step 316 would result in collision with an object detected using the exterior sensors 104 or would result in the vehicle 100 exceeding its operational limits. For example, for a turning input, this may include evaluating whether altering the trajectory in the direction indicated by the occupant would cause the trajectory to be incident on a detected obstacle. For an acceleration or turning input, this may further include evaluating whether turning or acceleration with a magnitude corresponding to a user input would result in instability or loss of control.
[0023] The validation step 318 may include determining a valid trajectory in view of the user inputs. For example, this may include ignoring the user inputs and continuing on an autonomous trajectory previously determined by the controller 102. This may include reducing the magnitude of a turning, braking, or acceleration input such that the validated trajectory stays within an acceptable tolerance away from obstacles and remains within operational limits of the vehicle 100.
[0024] The method 300 may include autonomously traversing 320 the validated trajectory as determined at step 318.
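The magnitude reduction mentioned in the validation step 318 could, for example, be a simple scaling loop such as the sketch below; the predicate and scaling factor are assumptions for illustration, not the method of the application.

```python
def moderate(requested: float, is_safe, step: float = 0.9) -> float:
    # Reduce the magnitude of a steering, braking, or acceleration request
    # until the predicted trajectory is acceptable, falling back to zero
    # (i.e. keeping the prior autonomous plan) if no reduction is enough.
    # is_safe is an assumed predicate combining obstacle clearance and the
    # vehicle's operational limits.
    scale = 1.0
    while scale > 1e-3:
        if is_safe(requested * scale):
            return requested * scale
        scale *= step
    return 0.0

# toy predicate: any steering request beyond 0.2 rad is treated as unsafe
print(moderate(0.5, is_safe=lambda s: abs(s) <= 0.2))   # ~0.19
```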
[0025] In some embodiments, the method 300 may include providing 322 feedback. Providing feedback may include executing some or all of the method 400 of Fig. 4.
[0026] Referring to Fig. 4, the method 400 may include evaluating 402 whether a user input will result in a potential collision. The method 400 may include determining 404 an alternative trajectory that will avoid the potential collision. The method 400 may include providing 406 haptic feedback according to the alternative trajectory. For example, if the alternative trajectory is to the left, a left haptic glove 118b may be caused to produce a perceptible output. If an alternative trajectory requires braking, a right haptic shoe 118c may be caused to produce a perceptible output. In some embodiments, instructions, e.g. right or left arrows, may be displayed on the VR display 118a.
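Steps 402 through 406 might be routed to the haptic gloves and shoes along the lines of the sketch below; the `pulse` interface, the right-glove case, and the device mapping are assumed placeholders for whatever driver the haptic hardware actually exposes.

```python
def provide_haptic_feedback(alt_direction, requires_braking, devices):
    """Signal an alternative trajectory (step 406) through haptic devices.

    devices is an assumed mapping such as {"left_glove": ..., "right_glove": ...,
    "right_shoe": ...}, where each value exposes a pulse(duration_s) method.
    """
    if alt_direction == "left":
        devices["left_glove"].pulse(0.2)   # alternative trajectory is to the left
    elif alt_direction == "right":
        devices["right_glove"].pulse(0.2)  # alternative trajectory is to the right
    if requires_braking:
        devices["right_shoe"].pulse(0.2)   # alternative trajectory requires braking
```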
[0027] The method 400 may include evaluating 408 whether an occupant input would result in inputs outside the operational limits of the vehicle 100. If so, the method 400 may include determining and traversing 410 an alternative trajectory that remains within operational limits, e.g. with reduced speed, larger turning radius, more braking force, or other modifications. The method 400 may further include generating 412 a haptic warning communicating to the occupant that the occupant is attempting to perform an unsafe action. For example, a series of repeated outputs to the haptic seat 118d or other haptic devices 118b-118c may be generated. An alert may also be output to the VR display 118a.
[0028] In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0029] Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
[0030] Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives ("SSDs") (e.g., based on RAM), Flash memory, phase-change memory ("PCM"), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
[0031] An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
[0032] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0033] Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including, an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0034] Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
[0035] It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
[0036] At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
[0037] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.

Claims

CLAIMS:
1. A method comprising:
displaying, by a controller, virtual controls to an occupant of a vehicle;
detecting, by the controller, user movement with respect to a virtual control; and
in response to detecting the user movement, altering an autonomous trajectory of the vehicle.
2. The method of claim 1, wherein detecting user movement with respect to the virtual control comprises:
detecting movement of a hand of the occupant with respect to a location corresponding to a virtual steering wheel.
3. The method of claim 1, wherein detecting user movement with respect to the virtual control comprises:
detecting movement of a foot of the occupant with respect to a location corresponding to a virtual accelerator pedal.
4. The method of claim 1, wherein detecting user movement with respect to the virtual control comprises:
detecting movement of at least one of a hand and a foot of the occupant in an output of a camera.
5. The method of claim 1, wherein detecting user movement with respect to the virtual control comprises:
detecting movement of at least one of a hand and a foot of the occupant in an output of a light detection and ranging (LIDAR) sensor.
6. The method of claim 1, further comprising:
determining, by the controller, a trajectory according to the user movement;
identifying, by the controller, at least one of (a) potential for movement outside of operational movements of the vehicle in the trajectory and (b) a potential collision on the trajectory; and
in response to identifying at least one of (a) and (b)—
modifying the trajectory to obtain an updated trajectory; and
autonomously causing the vehicle to follow the updated trajectory.
7. The method of claim 1, further comprising:
determining, by the controller, a trajectory according to the user movement;
identifying, by the controller, at least one of (a) potential for movement outside of operational movements of the vehicle in the trajectory and (b) a potential collision on the trajectory; and
causing, by the controller, a haptic feedback device to generate a perceptible output in response to detecting at least one of (a) and (b).
8. The method of claim 1, further comprising:
detecting, by the controller, an exterior environment of the vehicle in outputs of one or more exterior sensors;
displaying, by the controller, a rendering of the exterior environment according to the outputs of the one or more exterior sensors in a virtual reality display; and
displaying, by the controller, a rendering of the virtual control in the virtual reality display.
9. The method of claim 8, wherein the virtual reality display is a virtual reality headset.
10. The method of claim 8, wherein altering the autonomous trajectory of the vehicle comprises activating, by the controller, at least one of a brake actuator, accelerator actuator, and steering actuator.
11. A vehicle comprising:
one or more interior sensors;
one or more actuators including at least one of a brake actuator, an accelerator actuator, and a steering actuator;
a controller coupled to the one or more actuators, the controller programmed to:
display virtual controls to an occupant of the vehicle;
detect user movement with respect to a virtual control; and
in response to detecting the user movement, alter an autonomous trajectory of the vehicle using the one or more actuators.
12. The vehicle of claim 11, wherein the controller is further programmed to detect user movement with respect to the virtual control by detecting movement of a hand of the occupant with respect to a location corresponding to a virtual steering wheel in one or more outputs of the one or more interior sensors.
13. The vehicle of claim 11, wherein the controller is further programmed to detect user movement with respect to the virtual control by detecting movement of a foot of the occupant with respect to a location corresponding to a virtual accelerator pedal.
14. The vehicle of claim 11, wherein the one or more interior sensors comprise an interior camera.
15. The vehicle of claim 11, wherein the one or more interior sensors comprise a light detection and ranging (LIDAR) sensor.
16. The vehicle of claim 11, wherein the controller is further programmed to:
determine a trajectory according to the user movement;
evaluate whether at least one of (a) there is potential for movement outside of operational movements of the vehicle in the trajectory and (b) there is a potential collision on the trajectory; and
if at least one of (a) and (b)—
modify the trajectory to obtain an updated trajectory; and
autonomously cause the vehicle to follow the updated trajectory.
17. The vehicle of claim 11, wherein the controller is further programmed to:
determine a trajectory according to the user movement;
evaluate whether at least one of (a) there is potential for movement outside of operational movements of the vehicle in the trajectory and (b) there is a potential collision on the trajectory; and
if at least one of (a) and (b), cause a haptic feedback device to generate a perceptible output.
18. The vehicle of claim 11, wherein the controller is further programmed to:
detect an exterior environment of the vehicle in outputs of one or more exterior sensors;
display a rendering of the exterior environment according to the outputs of the one or more exterior sensors in a virtual reality display; and
display a rendering of the virtual control in the virtual reality display.
19. The vehicle of claim 18, wherein the virtual reality display is a virtual reality headset.
20. The vehicle of claim 11, wherein the controller is further programmed to receive outputs from a plurality of external sensors and perform autonomous collision avoidance while altering the trajectory in response to detecting the user movement.
PCT/US2016/063376 2016-11-22 2016-11-22 Virtual reality interface to an autonomous vehicle Ceased WO2018097818A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/063376 WO2018097818A1 (en) 2016-11-22 2016-11-22 Virtual reality interface to an autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/063376 WO2018097818A1 (en) 2016-11-22 2016-11-22 Virtual reality interface to an autonomous vehicle

Publications (1)

Publication Number Publication Date
WO2018097818A1 true WO2018097818A1 (en) 2018-05-31

Family

ID=62196243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/063376 Ceased WO2018097818A1 (en) 2016-11-22 2016-11-22 Virtual reality interface to an autonomous vehicle

Country Status (1)

Country Link
WO (1) WO2018097818A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140236393A1 (en) * 2011-01-05 2014-08-21 Orbotix, Inc. Orienting a user interface of a controller for operating a self-propelled device
US20130204457A1 (en) * 2012-02-06 2013-08-08 Ford Global Technologies, Llc Interacting with vehicle controls through gesture recognition
US20150149088A1 (en) * 2013-11-22 2015-05-28 Ford Global Technologies, Llc In-vehicle path verification
US20160090104A1 (en) * 2014-09-30 2016-03-31 Continental Automotive Systems, Inc. Hands accelerating control system
US20160272242A1 (en) * 2015-03-16 2016-09-22 Thunder Power Hong Kong Ltd. Vehicle control system for controlling steering of vehicle

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045826A (en) * 2019-04-01 2019-07-23 北京小马智行科技有限公司 Virtual reality experience methods, devices and systems applied to vehicle
KR102202638B1 (en) * 2019-10-01 2021-01-14 주식회사대성엘텍 Driving assistance and driving assistance method thereof
US20230406363A1 (en) * 2022-06-20 2023-12-21 International Business Machines Corporation Virtual steering wheel with autonomous vehicle
US12091058B2 (en) * 2022-06-20 2024-09-17 International Business Machines Corporation Virtual steering wheel with autonomous vehicle
US20240118691A1 (en) * 2022-10-06 2024-04-11 GM Global Technology Operations LLC Augmented reality experience for passenger control of autonomous vehicle
WO2026008432A1 (en) * 2024-07-04 2026-01-08 Huf Hülsbeck & Fürst Gmbh & Co. Kg System for identifying gestures

Similar Documents

Publication Publication Date Title
CN110641472B (en) Neural Network Based Safety Monitoring System for Autonomous Vehicles
CN108089571B (en) Method and system for predicting vehicle traffic behavior of unmanned vehicles to make driving decisions
US10343602B2 (en) Spatial auditory alerts for a vehicle
CN108885836B (en) Driving assistance device, driving assistance system, driving assistance method, control device, vehicle, and medium
US11613249B2 (en) Automatic navigation using deep reinforcement learning
KR102249122B1 (en) Self-driving vehicle control takeover mechanism of human driver using electrodes
CN107577227B (en) Method, apparatus and data processing system for operating an unmanned vehicle
JP6544320B2 (en) Control system and control method of autonomous driving vehicle
US10793165B2 (en) Driving assistance method, and driving assistance device, driving control device, vehicle, driving assistance program, and recording medium using said method
WO2018097818A1 (en) Virtual reality interface to an autonomous vehicle
US20180126984A1 (en) Object tracking using sensor fusion within a probabilistic framework
CN111587197A (en) Using Driving Pattern Recognition to Tune Electric Vehicle Powertrains
JP2015051761A (en) Drive support technique for active vehicle control
KR20180114547A (en) METHOD AND SYSTEM FOR CONTROLLING AUTOMATIC TRIVING VEHICLE RETURNING TO AUTOMATIC TRAVEL MODE
JP2019043495A (en) Automatic operation adjusting device, automatic operation adjusting system, and automatic operation adjusting method
JP2019501435A (en) Method and system for building a surrounding environment for determining travel of an autonomous vehicle
JP2021059327A (en) Safe transition from autonomous driving mode to manual driving mode along with support of autonomous driving system
JP6267275B2 (en) Method and apparatus for controlling a vehicle having automatic driving control capability
JP2018118672A (en) Information processing system, information processing method, program and vehicle
JP6906175B2 (en) Driving support method and driving support device, automatic driving control device, vehicle, program, driving support system using it
JP2024507093A (en) Checking physical feedback from driving assistance systems about traffic events
CN105151044A (en) Method and device for aided driving of vehicle
US20200017116A1 (en) Anomaly Detector For Vehicle Control Signals
JP4239809B2 (en) Vehicle driving support device
US11501561B2 (en) Occupant monitoring device, occupant monitoring method, and occupant monitoring program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922346

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16922346

Country of ref document: EP

Kind code of ref document: A1