
GB2592397A - Method and system for mitigating motion sickness of users in a moving vehicle - Google Patents


Info

Publication number
GB2592397A
GB2592397A (application GB2002743.9A; also published as GB202002743D0)
Authority
GB
United Kingdom
Prior art keywords
motion
vehicle
user
virtual reality
reality device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2002743.9A
Other versions
GB202002743D0 (en)
Inventor
Reyes Anthony
Gautam Utkarsh
Saney Kavita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority application: GB2002743.9A
Publication of GB202002743D0
Publication of GB2592397A
Legal status: Withdrawn

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/213 Virtual instruments
    • B60K35/22 Display screens
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information, e.g. video entertainment or vehicle dynamics information
    • B60K35/80 Arrangements for controlling instruments
    • B60K35/85 Arrangements for transferring vehicle- or driver-related data
    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/017 Head-up displays, head mounted
    • B60K2360/1523 Matrix displays
    • B60K2360/166 Navigation
    • B60K2360/167 Vehicle dynamics information
    • B60K2360/175 Autonomous driving
    • B60K2360/55 Remote control arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method for mitigating motion sickness of a user travelling in a vehicle, wherein destination location information is received by a motion mitigation reality device which is worn by the user and able to share virtual reality content with a second virtual reality device associated with a remote user 401. Real-time data relating to the movement of the vehicle in a physical environment is obtained by the motion mitigation reality device from sensors on the vehicle or a user device associated with the user 403. A motion change is determined based on the real-time data 405, and a corresponding motion change is performed on the virtual reality content, similar to the motion change of the vehicle, such that motion sickness of the user is mitigated 407. The user and the remote user may be able to communicate during the shared VR session. The motion change may be a rotation of the virtual reality content in a direction based on the motion change in the vehicle. The real-time data may be one of speed or vehicle trajectory, estimated travel time, real-time traffic data, climate controls, navigation details, media, head unit functions, autonomous driving commands and social experiences.

Description

Intellectual Property Office Application No. GB2002743.9 RTM Date: 11 August 2020
The following terms are registered trade marks and should be read as such wherever they occur in this document: Wi-Fi
Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
METHOD AND SYSTEM FOR MITIGATING MOTION SICKNESS OF USERS IN A MOVING VEHICLE
PREAMBLE OF THE DESCRIPTION:
[0001] The following specification particularly describes the invention and the manner in which it is to be performed.
DESCRIPTION OF THE INVENTION:
TECHNICAL FIELD
[0002] The present disclosure generally relates to an eyewear device. Particularly, but not exclusively, the disclosure relates to a method and system for mitigating motion sickness of users in a moving vehicle.
BACKGROUND OF THE DISCLOSURE
[0003] Motion sickness, for example nausea or queasiness, is a commonly experienced problem for people riding as passengers in any type of moving vehicle. It occurs because of a mismatch between the movement the body senses and the movement the eyes perceive. For example, a passenger travelling along a winding road in a vehicle experiences linear and angular acceleration as the vehicle travels along the curve. The sensed acceleration caused by the motion of the vehicle does not match the visual perception of the user, who may then experience the symptoms of motion sickness. Further, motion sickness may be caused and/or exacerbated for the passenger by other factors. For example, reading a book, viewing a video in the vehicle, etc., may induce or heighten feelings of dizziness, nausea, or motion sickness generally.
[0004] Virtual Reality (VR) and Augmented Reality (AR) techniques have gained popularity and ubiquity in society as the quality of the technology has increased and the price of equipment has decreased. Typically, VR is a simulated experience that can be similar to or completely different from the real world. VR uses VR devices to allow users to experience and interact with a virtual environment. For the same reasons, in-vehicle VR and AR have also gained popularity. However, problems related to motion sickness still prevail during use of VR devices in the vehicle. Existing systems do not prevent motion sickness with in-vehicle VR systems, and they fail to compensate for differences between the movement experienced in the vehicle and the perceived movement of the vehicle. Further, no provision is made for sharing a virtual reality experience with remote users while travelling in the vehicle. Thus, there is a requirement for an efficient technique to prevent motion sickness with the use of VR devices in a moving vehicle.
SUMMARY OF THE DISCLOSURE
[0005] In one non-limiting embodiment of the present disclosure, there is provided a method for mitigating motion sickness of a user travelling in a vehicle. The method comprises receiving information related to a destination location from a user present in the vehicle. The user wears a motion mitigation reality device and is enabled to share virtual reality content displayed in the motion mitigation reality device with a second virtual reality device associated with a remote user. The method includes obtaining real-time data related to movement of the vehicle in a physical environment while travelling on a route leading to the destination location, the real-time data being received from one or more sensors configured in the vehicle and a user device associated with the user. Further, a motion change is detected in the vehicle based on the obtained real-time data. Thereafter, the method comprises performing a motion change on the displayed virtual reality content similar to the motion change detected in the vehicle to mitigate the motion sickness of the user.
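The claimed steps — obtain real-time motion data, detect a motion change, mirror it onto the VR content — can be sketched as a simple control loop. This is a hypothetical illustration only: the function name, the heading-based representation of vehicle motion, and the detection threshold are assumptions, not details from the patent.

```python
def mitigate_motion_sickness(headings, rotate, threshold=5.0):
    """Process a sequence of vehicle heading samples (in degrees).

    Whenever the heading changes by at least `threshold` degrees,
    call rotate(delta) so the VR scene turns with the vehicle,
    reducing the visual/vestibular mismatch.
    Returns the list of rotation deltas that were applied.
    """
    applied = []
    previous = headings[0]
    for heading in headings[1:]:
        delta = heading - previous
        if abs(delta) >= threshold:  # a motion change was detected
            rotate(delta)            # mirror it onto the VR content
            applied.append(delta)
            previous = heading       # only re-baseline after a real change
    return applied
```

In a real device the `headings` sequence would come from the vehicle's sensing unit or the user device's IMU, and `rotate` would drive the headset's rendering pipeline.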
100061 In an embodiment of the disclosure, the information related to a destination location comprises route details to the destination location.
[0007] In an embodiment of the disclosure, the real-time data obtained from the one or more sensors comprises speed and vehicle trajectory, estimated travel time, real-time traffic, climate controls, navigation details, media, head unit functions, autonomous driving commands and social experiences.
[0008] In an embodiment of the disclosure, the motion change comprises rotating the virtual reality content in one or more directions based on the motion change in the vehicle.
[0009] In an embodiment of the disclosure, sharing of the virtual reality content comprises providing the remote user with an access to the virtual reality content and enabling a communication between the user and the remote user during a shared virtual reality session.
[0010] In another non-limiting embodiment of the disclosure, there is provided a motion mitigation reality device for mitigating motion sickness of a user travelling in a vehicle. The motion mitigation reality device comprises a processor and a memory communicatively coupled to the processor, where the memory stores processor-executable instructions which, on execution, may cause the motion mitigation reality device to receive information related to a destination location from a user present in the vehicle. The motion mitigation reality device is worn by the user and is enabled to share virtual reality content displayed in the motion mitigation reality device with a second virtual reality device associated with a remote user. The motion mitigation reality device obtains real-time data, related to movement of the vehicle in a physical environment while travelling on a route leading to the destination location, from one or more sensors configured in the vehicle and a user device associated with the user. Further, the motion mitigation reality device detects a motion change in the vehicle based on the obtained real-time data. Thereafter, the motion mitigation reality device performs the motion change on the displayed virtual reality content similar to the motion change detected in the vehicle to mitigate the motion sickness.
[0011] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0012] The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures wherein like reference numerals represent like elements, and in which:
[0013] Fig. 1 illustrates an exemplary environment for mitigating motion sickness of a user travelling in a vehicle in accordance with some embodiments of the present disclosure;
[0014] Fig. 2 shows a detailed block diagram of a motion mitigation reality device in accordance with some embodiments of the present disclosure;
[0015] Fig. 3 shows an exemplary representation of an in-vehicle virtual reality device for mitigating motion sickness of a user in accordance with some embodiments of the present disclosure; and
[0016] Fig. 4 illustrates a flowchart showing a method for mitigating motion sickness of a user while travelling in a vehicle in accordance with some embodiments of the present disclosure.
[0017] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0018] While the embodiments in the disclosure are subject to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the figures and will be described below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0019] It is to be noted that a person skilled in the art would be motivated from the present disclosure and may modify various configurations of circuits, system, architecture and method of updating vehicle functions, which may vary from application to application. However, such modification should be construed within the scope and spirit of the present disclosure. Accordingly, the drawings show only those specific details that are pertinent to understand the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
[0020] The terms "comprises", "comprising", or any other variations thereof used in the disclosure, are intended to cover a non-exclusive inclusion, such that a device, system, method that comprises a list of components does not include only those components but may include other components not expressly listed or inherent to such system, or assembly, or device. In other words, one or more elements in a system proceeded by "comprises.., a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or mechanism.
[0021] The present disclosure provides a method and a motion mitigation reality device for mitigating motion sickness of a user travelling in a vehicle. Particularly, the motion mitigation reality device alleviates motion sickness of the user wearing a virtual reality device/headset (herein referred to as the motion mitigation reality device) while travelling in the vehicle. The user in the vehicle is enabled to share virtual reality content with a second virtual reality device of a remote user. The remote user may be located anywhere outside the vehicle from which a connection with the user inside the vehicle can be established. When the user is travelling in the vehicle, the motion mitigation reality device receives information regarding the destination location from the user. As the vehicle moves, real-time data is obtained for the movement of the vehicle in a physical environment on the route leading to the destination location. If a motion change is detected in the vehicle based on the obtained real-time data, the motion mitigation reality device performs the motion change on the virtual reality content displayed in the motion mitigation reality device similar to the motion change detected in the vehicle, thereby compensating for the difference between the movement experienced in the vehicle and the perceived movement of the vehicle. Thus, the present disclosure provides an efficient virtual reality experience to users by mitigating the motion sickness of the user.
[0022] The following paragraphs describe the present disclosure with reference to FIG. 1 to FIG. 4. In the figures, the same element or elements which have same functions are indicated by the same reference numbers. It is to be noted that, the vehicle is not completely illustrated in the figures for the purpose of simplicity. One skilled in the art would appreciate that the method as disclosed in the present disclosure may be used for providing dynamic ride sharing service.
[0023] Fig.1 illustrates an exemplary environment for mitigating motion sickness of a user travelling in a vehicle in accordance with some embodiments of the present disclosure.
[0024] As shown in Fig. 1, an environment 100 includes a motion mitigation reality device 101. The motion mitigation reality device 101 is connected through a communication network 109 to a vehicle 103 and a user device 105. In an embodiment, the vehicle 103 may be any automobile, including an autonomous vehicle. The vehicle 103 includes a sensing unit 107. In an embodiment, the vehicle 103 may include any other units/components not shown explicitly herein. The sensing unit 107 may include one or more sensors configured at different locations in the vehicle 103. The one or more sensors may include, but are not limited to, a 360-degree camera, Light Detection and Ranging (LIDAR), a control unit, a climate sensor and the like. A person skilled in the art would understand that any other vehicle sensors, not mentioned explicitly, may also be used to obtain vehicle information in the present disclosure. Further, the user device 105 may be associated with a user sitting in the vehicle 103. "User" in this document refers to one or more passengers of a vehicle. The user device 105 may include, but is not limited to, smartphones, tablets, IoT devices and the like. The user device 105 is used for obtaining inertial measurements from Inertial Measurement Unit (IMU) sensors present in the user device 105. Further, the communication network 109 may include, but is not limited to, a direct interconnection, an e-commerce network, a Peer-to-Peer (P2P) network, a Local Area Network (LAN), a Wide Area Network (WAN), a wireless network (for example, using Wireless Application Protocol), the Internet, Wi-Fi and the like.
[0025] The motion mitigation reality device 101 is configured to provide virtual reality experience to the user and for mitigating motion sickness of the user in the vehicle 103. In an embodiment, the motion mitigation reality device 101 may be a virtual reality device such as, a virtual reality headset or any equivalent thereof. Thus, the motion mitigation reality device 101 may be worn by the user.
[0026] Further, the motion mitigation reality device 101 may include a sensing unit 115 which may include sensors such as IMU sensor, at least one Central Processing Unit ("CPU" or "processor") 113 and a memory 111 for storing instructions executable by the at least one processor 113. The processor 113 may include at least one data processor for executing program components for executing user or system-generated requests. The memory 111 is communicatively coupled to the processor 113.
[0027] The motion mitigation reality device 101 is initiated to receive information regarding a destination location from the user on detecting movement of the vehicle 103. For instance, the user may input the destination location into a navigation system of the vehicle 103, and the motion mitigation reality device 101 may then extract route details from the map navigation system of the vehicle 103. The user in the vehicle 103 uses the motion mitigation reality device 101 for a virtual experience while travelling in the vehicle 103. In one embodiment, the motion mitigation reality device 101 is configured to enable the user to share virtual reality content displayed in the motion mitigation reality device 101 with a second virtual reality device associated with a remote user. The sharing is enabled by providing the remote user with access to the virtual reality content and enabling communication between the user and the remote user during a shared virtual reality session. For example, the user may be watching a movie on an augmented screen created in the motion mitigation reality device 101 and wish to share the movie experience with another user at a remote location. In such a case, the user may be enabled to select a contact associated with the remote user from the motion mitigation reality device 101. The remote user may appear as an avatar in the virtual reality session.
[0028] Further, as the vehicle 103 moves, the motion mitigation reality device 101 obtains real-time data related to the movement of the vehicle 103, particularly the movement of the vehicle 103 in a physical environment associated with the route leading to the destination location. The real-time data is received from the sensing unit 107 configured in the vehicle 103 and the user device 105 associated with the user. Based on the obtained real-time data, the motion mitigation reality device 101 detects a motion change in the vehicle 103. For instance, the motion change in the vehicle 103 may include a change in the direction of the movement of the vehicle 103, for example a left/right turn, a U-turn and the like. The motion mitigation reality device 101 then performs a motion change on the displayed virtual reality content similar to the motion change detected in the vehicle 103. In an embodiment, the motion change on the displayed virtual reality content may include rotating the virtual reality content in one or more directions based on the motion change in the vehicle. For instance, consider that the vehicle 103 makes a left turn. In such a case, the virtual reality scene also rotates to the left so that there is no disparity between the virtual motion of the scene inside the virtual reality and the physical motion of the vehicle.
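The scene rotation described above can be illustrated with a minimal ground-plane sketch. This is an assumption-laden example, not the patent's implementation: the (x, z) coordinate convention, the function name, and rotation about a single vertical axis (yaw only) are all illustrative choices.

```python
import math

def rotate_scene_yaw(points, yaw_degrees):
    """Rotate scene points, given as (x, z) ground-plane coordinates,
    about the vertical axis by the vehicle's yaw change.

    Applying the vehicle's own turn to the virtual scene keeps the
    visual motion consistent with the felt motion.
    """
    theta = math.radians(yaw_degrees)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # Standard 2D rotation applied to every point in the scene.
    return [(x * cos_t - z * sin_t, x * sin_t + z * cos_t)
            for x, z in points]
```

A full headset renderer would apply the equivalent rotation to the camera or world transform each frame, typically as a quaternion, but the geometric idea is the same.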
[0029] Fig.2 shows a detailed block diagram of a motion mitigation reality device in accordance with some embodiments of the present disclosure.
[0030] The motion mitigation reality device 101 may include data 200 and one or more modules 209 which are described herein in detail. In an embodiment, data 200 may be stored within the memory 111. The data 200 may include, for example, location data 201, sensor data 203, virtual reality session data 205 and other data 207 related to the invention.
[0031] The location data 201 may include information regarding destination location of the user while travelling in the vehicle 103. The information may include route details, estimated travel time to the destination location and the like.
[0032] The sensor data 203 may include sensor information from the sensing unit 107 of the vehicle 103. The sensor data 203 may include, vehicle cluster information including speed and vehicle trajectory information, traffic information, vehicle orientation, direction and the like. Further, the sensor data 203 may include data such as, inertial measurements obtained from the sensing unit 115 of the motion mitigation reality device 101 and the user device 105.
[0033] The virtual reality session data 205 may include a virtual scene created based on the destination location of the user. The virtual reality session data 205 may also include one or more motion changes performed on the virtual reality content.
[0034] The other data 207 may store data, including temporary data and temporary files, generated by modules 209 for performing the various functions of the motion mitigation reality device 101.
[0035] In an embodiment, the data 200 in the memory 111 are processed by the one or more modules 209 present within the memory 111 of the motion mitigation reality device 101. In an embodiment, the one or more modules 209 may be implemented as dedicated units. As used herein, the term module refers to an application-specific integrated circuit (ASIC), an electronic circuit, a field-programmable gate array (FPGA), a Programmable System-on-Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. In some implementations, the one or more modules 209 may be communicatively coupled to the processor 113 for performing one or more functions of the motion mitigation reality device 101. The said modules 209, when configured with the functionality defined in the present disclosure, will result in novel hardware.
[0036] In one implementation, the one or more modules 209 may include, but are not limited to, a receiving module 211, a data obtaining module 213, a motion detection module 215 and a virtual content rendering module 217. The one or more modules 209 may also include other modules 219 to perform various miscellaneous functionalities of the motion mitigation reality device 101. In an embodiment, the other modules 219 may include a sharing module for enabling the user to share the virtual reality content with the second virtual reality device of the remote user.
[0037] The receiving module 211 may receive the information regarding the destination location of the user. The information may include the route details for the destination location. Further, the receiving module 211 may also receive the real-time sensor data from the sensing unit 107 of the vehicle 103 and the user device 105.
[0038] The data obtaining module 213 may obtain real-time data from the sensing unit 107 configured in the vehicle 103 and the user device 105. The real-time data may be related to the movement of the vehicle 103 in the physical environment while travelling in the route leading to the destination location. The real-time data may include, but is not limited to, the speed and vehicle trajectory, estimated travel time, real-time traffic, climate controls, navigation details, media, head unit functions, autonomous driving commands and social experiences.
[0039] The motion detection module 215 may detect the motion change in the vehicle 103 based on the real-time data obtained from the sensing unit 107 configured in the vehicle 103 and the user device 105. For instance, the motion detection module 215 may detect if the vehicle 103 makes a left/right turn, U-turn and the like.
[0040] The virtual content rendering module 217 may render the virtual content on the motion mitigation reality device 101 based on the motion detection. Particularly, on detection of the motion change, the virtual content rendering module 217 may perform, on the displayed virtual reality content, a motion change similar to the one detected in the vehicle 103 in order to mitigate the motion sickness. Thus, the user experiences a similar motion change in the virtual reality as well.
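As a concrete illustration of the interplay between the motion detection module 215 and the virtual content rendering module 217, the turn-mirroring step described above can be sketched as follows. The function names, the heading-based representation and the turn threshold are illustrative assumptions for this sketch and do not appear in the disclosure:

```python
import math

TURN_THRESHOLD_RAD = math.radians(2.0)  # assumed minimum heading change that counts as a turn

def detect_motion_change(prev_heading_rad, curr_heading_rad):
    """Detect a turn from two successive vehicle heading samples (module 215)."""
    delta = curr_heading_rad - prev_heading_rad
    # normalise to (-pi, pi] so a wrap-around at +/-180 degrees is handled
    delta = math.atan2(math.sin(delta), math.cos(delta))
    if abs(delta) < TURN_THRESHOLD_RAD:
        return 0.0  # no significant motion change detected
    return delta    # sign convention (left vs. right) is an assumption

def mirror_onto_scene(scene_yaw_rad, heading_delta_rad):
    """Apply the same rotation to the displayed virtual scene (module 217)."""
    return scene_yaw_rad + heading_delta_rad
```

For example, a 10-degree change in vehicle heading would rotate the displayed scene by the same 10 degrees, so the visual motion matches the vestibular motion.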
[0041] Fig.3 shows an exemplary representation of an in-vehicle virtual reality device for mitigating motion sickness of a user in accordance with some embodiments of the present disclosure.
[0042] Referring now to Fig.3, an exemplary representation 300 of a user 302 travelling in a vehicle, i.e., a car 301, is illustrated. The exemplary representation 300 includes the car 301, the user 302 inside the car 301, a motion mitigation reality device 304 worn by the user 302, a sensing unit 306 of the car 301 and a user device, i.e., a mobile phone 305 of the user 302. In an embodiment, the user 302 may be seated in the front row or the back row while travelling in the car 301.
[0043] Consider that the user 302 is seated in the front row of the car 301 and provides destination information to a navigation system of the car 301 in order to start the travel. While seated in the car 301, the user 302 starts a virtual session by wearing the motion mitigation reality device 304. The virtual session may be related to entertainment (such as a movie, gaming, etc.), utility (such as navigation), and the like. In an embodiment, the motion mitigation reality device 304 may be configured to enable the user 302 to share the virtual reality content with a second virtual reality device of a remote user 307 as shown in Fig.3. For instance, consider that the user 302 selects a movie which starts playing in the motion mitigation reality device 304. The user 302 may wish to share the movie with the remote user 307 located away from the user 302. In an embodiment, on selecting the remote user 307, the remote user 307 may appear as an avatar in the virtual scene and the motion mitigation reality device 304 may enable the user 302 and the remote user 307 to communicate while viewing the movie.
[0044] Further, the motion mitigation reality device 304 may fetch route information from the navigation system of the car 301 to recreate the virtual session based on the destination information of the user 302.
[0045] As the car 301 moves, the motion mitigation reality device 304 may obtain real-time data related to the movement of the car 301 while travelling in a route leading to the destination location from the sensing unit 306 and the mobile phone 305 of the user 302. The sensing unit 306 may include, but is not limited to, a 360-degree camera, a LIDAR and the like. Based on the real-time data, the motion mitigation reality device 304 detects any motion change in the car 301. Consider that the car 301 takes a right turn. In such a case, the motion mitigation reality device 304 may rotate the displayed virtual reality content towards the right in order to mitigate the motion sickness.
[0046] For instance, consider that the user 302 is sitting in a virtual living room and watching a movie. The user 302 may have a control to play or pause the movie. Consider that the car 301 is travelling on a winding path and experiences a linear and an angular acceleration as the car 301 travels along curves. In such a case, any turns by the car 301 may cause the living room scene to be rotated according to the motion change in the car 301. This eliminates the disparity between the virtual rotation of the scene inside the motion mitigation reality device 304 and the real-time physical movement of the car 301. In an embodiment, the mobile phone 305 may be connected with a server. In an embodiment, applications running on the motion mitigation reality device 304 may act as a client and connect to the server for the latest IMU data. In such a case, the applications may compute the difference between the rotation measured by the IMU of the mobile phone 305 and that of the motion mitigation reality device 304. Hence, the motion mitigation reality device 304 rotates the virtual scene by the measured amount of rotation, ensuring that the screen is in the correct/ideal position relative to the user 302.
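The IMU-difference correction described in paragraph [0046] can be sketched as follows. Reducing the rotation to a single yaw angle, and the function names themselves, are simplifying assumptions for illustration rather than details taken from the disclosure:

```python
import math

def yaw_difference_rad(vehicle_imu_yaw_rad, headset_imu_yaw_rad):
    """Difference between the rotation measured by the phone's IMU (vehicle
    frame) and by the headset's IMU (head frame), normalised to (-pi, pi]."""
    diff = vehicle_imu_yaw_rad - headset_imu_yaw_rad
    return math.atan2(math.sin(diff), math.cos(diff))

def corrected_scene_yaw(scene_yaw_rad, vehicle_imu_yaw_rad, headset_imu_yaw_rad):
    """Rotate the virtual scene by the measured difference so that the virtual
    living room stays aligned with the car rather than with the user's head."""
    return scene_yaw_rad + yaw_difference_rad(vehicle_imu_yaw_rad, headset_imu_yaw_rad)
```

A full implementation would use quaternions from both IMUs rather than a single yaw angle, but the same subtract-and-apply pattern holds.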
[0047] Fig.4 illustrates a flowchart showing a method for mitigating motion sickness of a user while travelling in a vehicle in accordance with some embodiments of the present disclosure.
[0048] As illustrated in FIG. 4, the method 400 comprises one or more blocks illustrating a method for mitigating motion sickness of the user travelling in the vehicle. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
[0049] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0050] At block 401, the information regarding the destination location is received by the receiving module 211 from the user.
[0051] At block 403, the real-time data related to the movement of the vehicle while travelling to the destination location is obtained from the sensing unit 107 of the vehicle 103 and the user device 105.
[0052] At block 405, the motion change is detected by the motion detection module 215 based on the obtained real-time data.
[0053] At block 407, a motion change similar to the motion change detected in the vehicle 103 is performed by the virtual content rendering module 217 onto the displayed virtual reality content to mitigate the motion sickness.
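The four blocks of method 400 can be summarised in a short end-to-end sketch. The class name, method names and the sensor-dictionary layout are illustrative assumptions, not part of the claimed method:

```python
class MotionMitigationDevice:
    """Minimal sketch of blocks 401-407: receive a destination, obtain
    real-time data, detect a motion change, mirror it onto the VR content."""

    def __init__(self):
        self.scene_rotation = 0.0  # current rotation of the displayed VR content
        self.destination = None

    def receive_destination(self, destination):          # block 401
        self.destination = destination

    def obtain_real_time_data(self, vehicle_sensors):    # block 403
        # e.g. {'speed': 40.0, 'heading_delta': 0.2} from the vehicle and user device
        return vehicle_sensors

    def detect_motion_change(self, data):                # block 405
        return data.get("heading_delta", 0.0)

    def perform_motion_change(self, motion_change):      # block 407
        self.scene_rotation += motion_change
        return self.scene_rotation

# Example run through one iteration of the method
device = MotionMitigationDevice()
device.receive_destination("Stuttgart")
data = device.obtain_real_time_data({"speed": 40.0, "heading_delta": 0.2})
device.perform_motion_change(device.detect_motion_change(data))
```

In practice blocks 403 to 407 would run continuously in a loop for the duration of the trip, one iteration per sensor update.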
[0054] An embodiment of the present disclosure provides a better user experience by providing control access of the virtual reality device to the users.

[0055] An embodiment of the present disclosure provides a shared virtual experience to the remote user 307.
[0056] An embodiment of the present disclosure provides mirroring of the user's physical trip route with the virtual experience to alleviate motion sickness of the user.
EQUIVALENTS
[0057] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0058] It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.).
[0059] It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations.
[0060] However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations)" without other modifiers, typically means at least two recitations, or two or more recitations).
[0061] Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g.) "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B) or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B".
[0062] In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
[0063] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS
Reference Number Description
Environment
101 Motion mitigation reality device
103 Vehicle
105 User device
107 Sensing unit
109 Communication network
111 Memory
113 Processor
115 Sensing unit
200 Data
201 Location data
203 Sensor data
205 Virtual reality session data
207 Other data
209 Modules
211 Receiving module
213 Data obtaining module
215 Motion detection module
217 Virtual content rendering module
219 Other modules
301 Car
302 User
304 Motion mitigation reality device
305 Mobile device
306 Sensing unit of car
307 Remote user

Claims (10)

Claims: We claim:
1. A method for mitigating motion sickness of a user while travelling in a vehicle, the method comprising:
receiving, by a motion mitigation reality device, information related to a destination location from a user present in the vehicle, wherein the motion mitigation reality device is worn by the user and is enabled to share virtual reality content displayed in the motion mitigation reality device with a second virtual reality device associated with a remote user;
obtaining, by the motion mitigation reality device, real-time data related to movement of the vehicle in a physical environment while travelling in a route leading to the destination location from one or more sensors configured in the vehicle and a user device associated with the user;
detecting, by the motion mitigation reality device, a motion change in the vehicle based on the obtained real-time data; and
performing, by the motion mitigation reality device, the motion change onto the displayed virtual reality content similar to the motion change detected in the vehicle to mitigate the motion sickness.
2. The method as claimed in claim 1, wherein the information related to a destination location comprises route details to the destination location.
3. The method as claimed in claim 1, wherein the real-time data obtained from the one or more sensors comprises speed and vehicle trajectory, estimated travel time, real-time traffic, climate controls, navigation details, media, head unit functions, autonomous driving commands and social experiences.
4. The method as claimed in claim 1, wherein the motion change comprises rotating the virtual reality content in one or more directions based on the motion change in the vehicle.
5. The method as claimed in claim 1, wherein sharing of the virtual reality content comprises providing the remote user with an access to the virtual reality content and enabling a communication between the user and the remote user during a shared virtual reality session.
6. A motion mitigating reality device for mitigating motion sickness of a user travelling in a vehicle, comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
receive information related to a destination location from a user present in the vehicle, wherein the motion mitigation reality device is worn by the user and is enabled to share virtual reality content displayed in the motion mitigation reality device with a second virtual reality device associated with a remote user;
obtain real-time data related to movement of the vehicle in a physical environment while travelling in a route leading to the destination location from one or more sensors configured in the vehicle and a user device associated with the user;
detect a motion change in the vehicle based on the obtained real-time data; and
perform the motion change onto the displayed virtual reality content similar to the motion change detected in the vehicle to mitigate the motion sickness.
7. The motion mitigating reality device as claimed in claim 6, wherein the information related to a destination location comprises route details to the destination location.
8. The motion mitigating reality device as claimed in claim 6, wherein the real-time data obtained from the one or more sensors comprises speed and vehicle trajectory, estimated travel time, real-time traffic, climate controls, navigation details, media, head unit functions, autonomous driving commands and social experiences.
9. The motion mitigating reality device as claimed in claim 6, wherein the motion change comprises rotating the virtual reality content in one or more directions based on the motion change in the vehicle.
10. The motion mitigating reality device as claimed in claim 6, wherein sharing of the virtual reality content comprises providing the remote user with an access to the virtual reality content and enabling a communication between the user and the remote user during a shared virtual reality session.
GB2002743.9A 2020-02-27 2020-02-27 Method and system for mitigating motion sickness of users in a moving vehicle Withdrawn GB2592397A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2002743.9A GB2592397A (en) 2020-02-27 2020-02-27 Method and system for mitigating motion sickness of users in a moving vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2002743.9A GB2592397A (en) 2020-02-27 2020-02-27 Method and system for mitigating motion sickness of users in a moving vehicle

Publications (2)

Publication Number Publication Date
GB202002743D0 GB202002743D0 (en) 2020-04-15
GB2592397A true GB2592397A (en) 2021-09-01

Family

ID=70278773

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2002743.9A Withdrawn GB2592397A (en) 2020-02-27 2020-02-27 Method and system for mitigating motion sickness of users in a moving vehicle

Country Status (1)

Country Link
GB (1) GB2592397A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023284382A1 (en) * 2021-07-16 2023-01-19 Oppo广东移动通信有限公司 Display method and apparatus, head-mounted augmented reality device, and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114245100B (en) * 2021-12-16 2023-12-01 北京圣威特科技有限公司 VR film playing control method and device based on roller coaster

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170203768A1 (en) * 2013-10-03 2017-07-20 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
US20170277412A1 (en) * 2016-03-28 2017-09-28 Interactive Intelligence Group, Inc. Method for use of virtual reality in a contact center environment
US20170352185A1 (en) * 2016-06-02 2017-12-07 Dennis Rommel BONILLA ACEVEDO System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation
US20180314486A1 (en) * 2017-04-28 2018-11-01 Microsoft Technology Licensing, Llc Streaming of Augmented/Virtual Reality Spatial Audio/Video
WO2019068477A1 (en) * 2017-10-04 2019-04-11 Audi Ag KINETOSIS-FREE VIEWING OF A DIGITAL CONTENT IN A VEHICLE
WO2019185173A1 (en) * 2018-03-28 2019-10-03 Telefonaktiebolaget Lm Ericsson (Publ) Head-mounted display and method to reduce visually induced motion sickness in a connected remote display
KR102049045B1 (en) * 2018-05-17 2019-11-26 주식회사 패러렐월드 System and method for providing virtual reality for remote co-playing

Also Published As

Publication number Publication date
GB202002743D0 (en) 2020-04-15

Similar Documents

Publication Publication Date Title
EP3244591B1 (en) System and method for providing augmented virtual reality content in autonomous vehicles
US11285874B2 (en) Providing visual references to prevent motion sickness in vehicles
US10762597B2 (en) Generation apparatus, generation method, reproduction apparatus, and reproduction method
CN108737801B (en) Realistic motion correction for vehicle projection
US20180225875A1 (en) Augmented reality in vehicle platforms
JP7002648B2 (en) Viewing digital content in a vehicle without vehicle sickness
US11328489B2 (en) Augmented reality user interface including dual representation of physical location
CN109644256A (en) Vehicle carrying video system
CN112977460B (en) Method and apparatus for preventing motion sickness when viewing image content in a traveling vehicle
CN109426345A (en) VR equipment and its adaptive display method
GB2592397A (en) Method and system for mitigating motion sickness of users in a moving vehicle
US9283349B2 (en) Methods and apparatus for device based prevention of kinetosis
KR101813018B1 (en) Appartus for providing 3d contents linked to vehicle and method thereof
US20250004683A1 (en) Method, apparatus and computer program product for alleviating motion sickness in a user viewing an image on a display
US20250050054A1 (en) Method, apparatus and computer program product for selecting content for display during a journey to alleviate motion sickness
US11853232B2 (en) Device, method and computer program
JP7535758B2 (en) Display System
US11714496B2 (en) Apparatus, method and computer program for controlling scrolling of content
JP7725543B2 (en) Servers and Systems
JP7442726B1 (en) Information processing device and information processing method
CN112585563A (en) Method for operating a mobile portable output device in a motor vehicle, background processing device, mobile output device and motor vehicle
US20240399861A1 (en) Dynamic audio channel configuration for a vehicle
US20230015904A1 (en) System and method for providing visual assistance to an individual suffering from motion sickness
CN119649343A (en) Method, device, vehicle and storage medium for controlling vehicle screen
CN119649258A (en) Method, device, vehicle and storage medium for adjusting image

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)