SYSTEM AND METHOD FOR PREVENTION OR REDUCTION OF MOTION
SICKNESS
Statement of Related Cases
[0001] This application claims priority to U.S. Provisional Application Serial No. 62/570,289, filed October 10, 2017, whose entire disclosure is incorporated herein by reference.
Field of the Invention
[0002] The present invention relates to the prevention or reduction of motion sickness brought about by movement and, more particularly, to a motion sickness prevention or reduction system and method that encourages a subject to generate a compensatory movement in response to a measured movement.
Background of the Invention
[0003] Motion sickness is generally a result of a sensory mismatch between inner ear sensation, visual information, and other body senses. When the inner ear, visual information, and other body senses become desynchronized (i.e., mismatched), the body has a similar response to having been poisoned, resulting in vomiting, vertigo, and/or other symptoms. For example, passengers in vehicles are more likely to experience motion sickness due to the sensory mismatch between the passenger's body senses (e.g., inner ear) and the passenger's actions (e.g., passenger sitting passively). In contrast, drivers or operators of vehicles are much less likely to experience motion sickness due to the
correspondence between body senses and the driver's actions (e.g., controlling the vehicle with expected accelerations, motion, etc.).
[0004] Current treatment options include medications and static visuals. Medications treat symptoms of motion sickness but do not address the root cause (i.e., the desynchronization of senses). Static visuals can provide some relief, but fail to provide synchronization between all three of: (1) an inner ear sensation; (2) visual information; and (3) other body senses. Static visuals fail to provide compensation for frequencies that commonly cause motion sickness such as, for example, 1 vibration every 3-10 seconds (e.g., ocean waves, vehicle movement on mountain roads, etc.). Thus, there is a need for a system and method to prevent or reduce motion sickness that does not rely on medication and that is capable of compensating for movement frequencies that commonly cause motion sickness.
SUMMARY OF THE INVENTION
[0005] An object of the invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
[0006] The present invention provides a system and method for preventing or reducing motion sickness in a subject that measures at least one degree of movement that is experienced by the subject and the system, calculates a compensatory movement in response to the at least one degree of movement, and generates a visual image that encourages the subject to generate the calculated compensatory movement.
[0007] In various embodiments, a method for preventing or reducing motion sickness is disclosed. The method includes a step of measuring at least one degree of movement. The measured degree of movement is a shared movement experienced by both the system executing the method and a subject. The at least one degree of movement can be a translational movement (e.g., forward, back, lateral, etc.) and/or a rotational movement. The method also includes calculating a compensatory movement in response to the at least one measured degree of movement. The compensatory movement can be a similar, opposite, and/or composite movement based on the at least one measured degree of movement. The method also includes generating an interactive visual image based on the compensatory movement and presenting the interactive visual image to the subject. The interactive visual image encourages the subject to initiate the compensatory movement.
[0008] In various embodiments, a system for preventing or reducing motion sickness is disclosed. The system includes a processor, a motion sensor, a visual output device, and an input device. The motion sensor is configured to measure at least one degree of movement of the system. The at least one degree of movement corresponds to a shared movement experienced by the system and a user. The processor receives the measured movement and generates a compensatory movement. The compensatory movement can be a similar, opposite, and/or composite movement based on the measured movement. The processor generates a visual image on the visual output device based on the calculated compensatory movement. The visual image encourages a user to move the system to generate the compensatory movement. The compensatory movement compensates for movement experienced by the user and avoids motion sickness in the user.
[0009] An embodiment of the invention is a system for preventing or reducing motion sickness in a subject, comprising a motion sensor adapted to measure movement experienced by the subject and the system; a visual output device; an input device; and a processor in signal communication with the motion sensor and the visual output device, wherein the processor is configured to: receive a motion measurement from the motion sensor, calculate a compensatory movement in response to the motion measurement, generate an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via the input device, and display the interactive visual image on the visual output device, wherein the calculated compensatory movement and the displayed interactive visual image are adapted to synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
[0010] Another embodiment of the invention is a method of preventing or reducing motion sickness in a subject, comprising measuring motion experienced by the subject; calculating a compensatory movement in response to the measured motion; generating an interactive visual image based on the calculated compensatory movement, wherein the interactive visual image encourages the subject to initiate the compensatory movement via an input device; and displaying the interactive visual image to the subject; wherein the calculated compensatory movement and the displayed interactive visual image are adapted to
synchronize inner ear information, visual information and body information of the subject when the subject initiates the compensatory movement.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The invention will be described in detail with reference to the following drawings in which like reference numerals refer to like elements wherein:
[0012] Figure 1A is a block diagram of a system for preventing or reducing motion sickness in a subject, in accordance with an illustrative embodiment of the present invention;
[0013] Figure 1B is a schematic diagram showing the system of Fig. 1A positioned within and/or on a vehicle, in accordance with an illustrative embodiment of the present invention;
[0014] Figure 2 is a flowchart of a motion sickness compensation method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention;
[0015] Figure 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention;
[0016] Figure 4 is a schematic diagram of the input/output subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0017] Figure 5 is a schematic diagram of the communications interface of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0018] Figure 6 is a schematic diagram of the memory subsystem of Figure 3, in accordance with an illustrative embodiment of the present invention;
[0019] Figure 7 is a schematic diagram of the motion sickness compensation system of the present invention in a vehicle and that utilizes a handheld input device, in accordance with an illustrative embodiment of the present invention;
[0020] Figure 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention; and
[0021] Figures 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system of the present invention, in accordance with an illustrative embodiment of the present invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0022] In the following detailed description of various embodiments of the system and method of the present invention, numerous specific details are set forth in order to provide a thorough understanding of various aspects of one or more embodiments. However, the one or more embodiments may be practiced without some or all of these specific details. In other instances, well-known methods, procedures, and/or components have not been described in detail so as not to unnecessarily obscure aspects of embodiments.
[0023] Articles "a" and "an" are used herein to refer to one or to more than one (i.e. at least one) of the grammatical object of the article. By way of example, "an element" means
at least one element and can include more than one element. Unless otherwise defined, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
[0024] The drawing figures are not necessarily to scale and certain features of the invention may be shown exaggerated in scale or in somewhat schematic form in the interest of clarity and conciseness. In this description, relative terms such as "horizontal," "vertical," "up," "down," "top," "bottom," as well as derivatives thereof (e.g., "horizontally," "downwardly," "upwardly," etc.) should be construed to refer to the orientation as then described or as shown in the drawing figure under discussion. These relative terms are for convenience of description and normally are not intended to require a particular orientation.
[0025] Terms including "inwardly" versus "outwardly," "longitudinal" versus "lateral" and the like are to be interpreted relative to one another or relative to an axis of elongation, or an axis or center of rotation, as appropriate. Terms concerning electrical attachments, coupling and the like, such as "electrically connected," "electrically coupled," or "in signal communication" refer to a relationship wherein elements are electrically coupled to one another either directly or indirectly through intervening elements and through any combination of wired or wireless communication channels. The terms "user," "subject" and "passenger" are used interchangeably and refer to individuals that utilize the present invention for reducing or preventing motion sickness. The terms "movement" and "motion" are also used interchangeably.
[0026] While preferred embodiments are disclosed, still other embodiments of the system and method of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments. As will be realized, the following disclosure is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the present invention. Also, the reference or non-reference to a particular embodiment of the invention shall not be interpreted to limit the scope of the present invention.
[0027] Figure 1A is a block diagram of a system 2 for preventing or reducing motion sickness in a subject, and Figure 1B is a schematic diagram showing the system 2 positioned within and/or on a vehicle 6, in accordance with an illustrative embodiment of the present invention. The system 2 includes a motion sensor 8, a processor 14, an input device 9 and a visual output device 12.
[0028] The motion sensor 8 includes one or more sensors that sense inertial motion. For example, the motion sensor 8 can include one or more of the following sensor types: accelerometer(s); global positioning system (GPS); gyroscope(s); mechanical sensor(s); inclinometer(s); vibration sensor(s); altimeter(s); optical-based sensor(s); and image-based sensor(s). The motion sensor 8 is configured to detect movement (e.g., acceleration) in one or more directions (e.g., degrees of movement). The one or more directions can correspond to one or more of six degrees of movement, including translational movement 15 (e.g., up/down, left/right, forward/backward) and/or rotational movement 16 (e.g., pitch, yaw, roll). In some embodiments, the motion sensor 8 includes a plurality of sensors each
configured to measure one or more degrees of movement. The motion M measured by the motion sensor 8 is the same movement experienced by a user 4 of the system 2.
[0029] As illustrated in Fig. 1B, the system 2 is positioned within and/or on a vehicle 6 such as, for example, a ground-based vehicle (e.g., car, bus, truck, train, snowmobile, etc.), a water-based vehicle (e.g., boat, ship, submarine, jet-ski, etc.), a flight-based vehicle (e.g., airplane, helicopter, glider, rocket, etc.), and/or any other vehicle 6. A user 4 is positioned within and/or on the vehicle 6 and experiences movement (e.g., acceleration and/or deceleration) of the vehicle 6.
[0030] In some cases, movement of the vehicle 6 can cause motion sickness due to a disconnect between the motion of the vehicle 6 and the actions of the user 4. The system 2 is configured to compensate for the movement of the vehicle 6 to prevent or reduce motion sickness in the user 4.
[0031] The processor 14 is configured to receive a motion measurement M from the motion sensor 8 and to generate a compensatory output. The motion sensor 8 can be integrated into system 2, as shown in Figs. 1A and 1B, or can be positioned remotely from system 2, as depicted by motion sensor 8' in Fig. 1B. In addition, motion sensors can be both integrated into system 2 and also located remotely from system 2.
[0032] The compensatory output generated by processor 14 can be any suitable output configured to provide motion compensation to a user 4, such as, for example, a visual output displayed by the visual output device 12. The processor 14 preferably generates a visual output and displays it via the visual output device 12 in response to
motion M sensed by the motion sensor 8. The visual output generated by processor 14 and displayed by the visual output device 12 is adapted to encourage a user 4 to provide a compensatory input to the system 2 via input device 9, such that the compensatory input corresponds to motion experienced by the user 4.
[0033] The visual output device 12 can be any type of visual display known in the art, including, but not limited to, a virtual reality display, a liquid crystal display, an organic light-emitting diode (OLED) display, a cathode ray tube display, a head-mounted display, a projection type display, a holographic display or any other display known in the art.
[0034] The visual output generated by the processor 14 and displayed by the visual output device 12 can be, for example, a game that encourages a user 4 to provide an input via the input device 9 to move an object on a path corresponding to the direction of the measured movement. The input device 9 can be any type of input device known in the art, including, but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or a motion sensor that detects tilt, lateral movement and/or shaking, etc. of the visual output device 12. In general, any input device known in the art that is capable of allowing the user 4 to enter the recommended input into the system 2 may be used. Further, the input device 9 described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
[0035] The visual output is generated by the processor 14 so as to synchronize movements of the user 4 with movement of the vehicle 6, such that information generated by an inner ear of the user 4 correlates to actions taken by the user 4 thereby preventing or
reducing motion sickness. The visual output generated by the processor 14 and displayed by the visual output device 12 can include any suitable visual output, such as terrain, targets, tasks, and/or any other suitable visual output. The compensatory input suggested by the processor 14 can relate to game actions, such as, for example, shooting/tracking of a target, movement along a path, collection of one or more objects, placing of one or more objects, etc.
[0036] As discussed above, the visual output generated by the processor 14 and displayed by the visual output device 12 is based on a compensatory movement calculated by the processor 14 in response to a motion measurement M received from the motion sensor 8. The compensatory movement calculated by the processor 14 can include a similar, opposite, and/or composite movement based on the sensed movement M. For example, in some embodiments, the compensatory movement can be movement in a similar direction to the movement M but at a smaller amplitude. As another example, the compensatory movement can be movement in a single direction derived from sensed movement M in multiple directions. It will be appreciated that the compensatory movement calculated by the processor 14 and the corresponding visual output generated by the processor 14 can be any suitable compensatory movement adapted to elicit an appropriate compensatory input from a user 4 via the corresponding visual output displayed by the visual output device 12.
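The similar, opposite, and composite compensatory movements described above can be sketched in code. The following is a minimal, hypothetical illustration only; the function name, the 0.5 scaling factor, and the dominant-axis rule used for the composite case are assumptions for illustration, not taken from this disclosure:

```python
# Illustrative sketch of one way a processor could derive a compensatory
# movement from a sensed motion vector. The scale factor and the
# dominant-axis composite rule are hypothetical choices.

def compensatory_movement(sensed, mode="similar", scale=0.5):
    """Return a compensatory movement for a sensed (x, y, z) movement.

    mode: "similar"   -> same direction, reduced amplitude
          "opposite"  -> reversed direction, reduced amplitude
          "composite" -> single-direction movement derived from
                         sensed movement in multiple directions
    """
    if mode == "similar":
        return tuple(scale * v for v in sensed)
    if mode == "opposite":
        return tuple(-scale * v for v in sensed)
    if mode == "composite":
        # Collapse multi-axis motion onto the axis with the largest
        # magnitude -- a simple stand-in for a composite movement.
        dominant = max(range(len(sensed)), key=lambda i: abs(sensed[i]))
        out = [0.0] * len(sensed)
        out[dominant] = scale * sensed[dominant]
        return tuple(out)
    raise ValueError("unknown mode: " + mode)
```

For example, a sensed movement of (2.0, 0.0, -4.0) yields a "similar" compensatory movement of (1.0, 0.0, -2.0), i.e., the same direction at half the amplitude.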
[0037] The system 2 can be incorporated into and/or used in conjunction with any suitable computing device, vehicle, and/or other system. For example, the system 2 can be wholly and/or partially incorporated into a mobile computing device (e.g., a mobile phone, tablet computer, laptop computer, etc.), a computing device formed integrally with a vehicle (e.g., an in-flight entertainment system, a vehicle entertainment system, a vehicle display, etc.), a user wearable device, and/or any other suitable system. Although specific embodiments are discussed herein, including specific devices and/or vehicles, it will be appreciated that the system 2 can be incorporated into any suitable computing device and/or vehicle while still falling within the scope of the present invention.
[0038] For example, if the system 2 is incorporated into a tablet computer, the motion sensor 8 is preferably incorporated into the tablet computer. The tablet computer's main processor can be adapted to function as the processor 14 of system 2 or, alternatively, a separate processor can be incorporated into the tablet computer to perform the functions of processor 14. The visual output device 12 would be the tablet computer's display, and the input device can be the tablet computer's touchscreen and/or motion sensors in the tablet computer that detect movement of the tablet computer by a user 4.
[0039] Figure 2 is a flowchart of a method for reducing or preventing motion sickness, in accordance with an illustrative embodiment of the present invention. The method 100 is discussed with reference to Figs. 1A, 1B and 2.
[0040] At step 102, movement M (e.g., acceleration) is measured by the motion sensor 8 of system 2. The movement M is generated by a vehicle 6, such as a ground-based vehicle, water-based vehicle, flight-based vehicle, and/or other vehicle. The measured motion M can include motion in one or more degrees of movement, such as, for example, six degrees of movement. The motion sensor 8 can be integrated into system 2, the vehicle
6, a user's 4 clothing, and/or any other suitable device configured to detect motion experienced by the user 4.
[0041] At step 104, a compensatory output is generated by the processor 14 based on the measured motion M. The compensatory output can be any suitable output adapted to provide motion compensation to a user 4, and is preferably a visual output. The compensatory output is adapted to elicit a compensatory input from a user 4.
[0042] At step 106, a compensatory input is received by the processor 14. The compensatory input is generated by a user 4 via input device 9. The compensatory input is based on sensed movement M and is adapted to synchronize movement of the user 4 with experienced movement M. For example, in some embodiments, the compensatory input is adapted to mimic the movement of an operator of the vehicle 6. The compensatory input is adapted to reduce or prevent motion sickness in the user 4.
[0043] At step 108, the visual output of the system 2, via the visual output device 12, is updated in response to the compensatory input by the user 4. For example, in some embodiments the visual output is a game or other interactive visual output and the compensatory input by the user 4 via the input device 9 is translated to movement of one or more elements within the game.
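Steps 102 through 108 above can be sketched as a simple control loop. This is a hypothetical illustration only; the `sensor`, `display`, and `input_device` objects and the 0.5 attenuation factor are invented stand-ins for the purpose of the sketch, not an API defined by this disclosure:

```python
# Hypothetical sketch of the method of Fig. 2 (steps 102-108). The
# sensor/display/input_device interfaces are assumed, not specified here.

def compensation_loop(sensor, display, input_device, steps=1):
    for _ in range(steps):
        motion = sensor.read()                    # step 102: measure movement M
        target = tuple(-0.5 * v for v in motion)  # step 104: compute compensatory output
        display.show(target)                      #           ...and present it to the user
        user_input = input_device.read()          # step 106: receive compensatory input
        display.update(user_input)                # step 108: reflect the input in the visual output
```

In a game embodiment, `display.show` would render the target (e.g., a path or object to track) and `display.update` would move the in-game element according to the user's input.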
[0044] Figure 3 is a schematic diagram of a motion sickness compensation system for preventing or reducing motion sickness in a subject, in accordance with another illustrative embodiment of the present invention. The motion sickness compensation system 2a is a representative device and may comprise a processor subsystem 204, an input/output
subsystem 206, a memory subsystem 208, a communications interface 210, and a system bus 212. In some embodiments, one or more of the motion sickness compensation system 2a components may be combined or omitted such as, for example, not including the communications interface 210. In some embodiments, the motion sickness compensation system 2a may comprise other components not combined or included in the components shown in Fig. 3.
[0045] For example, the motion sickness compensation system 2a may also comprise a power subsystem (not shown). In other embodiments, the motion sickness compensation system 2a may comprise multiple iterations of the components shown in Fig. 3. For example, the motion sickness compensation system 2a may comprise multiple memory subsystems 208. For the sake of conciseness and clarity, and not limitation, one of each of the listed components is shown in Fig. 3.
[0046] Processor subsystem 204 has the same functionality as the processor 14 described above in connection with Fig. 1A, and thus the following description of processor subsystem 204 applies equally to the processor 14 of Fig. 1A. The processor subsystem 204 may comprise any processing circuitry operative to control the operations and performance of the motion sickness compensation system 2a. In various aspects, the processor subsystem 204 may be implemented as a general purpose processor, a chip multiprocessor (CMP), a dedicated processor, an embedded processor, a digital signal processor (DSP), a network processor, an input/output (I/O) processor, a media access control (MAC) processor, a radio baseband processor, a co-processor, a microprocessor such as a complex
instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, and/or a very long instruction word (VLIW) microprocessor, or other processing device. The processor subsystem 204 also may be implemented by a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic device (PLD), and so forth.
[0047] In various aspects, the processor subsystem 204 may be arranged to run an operating system (OS) and various applications. Examples of an OS comprise, for example, operating systems generally known under the trade names of Apple OS, Microsoft Windows OS, Android OS, and any other proprietary or open source OS. Examples of applications comprise, for example, a telephone application, a camera (e.g., digital camera, video camera) application, a browser application, a multimedia player application, a gaming application, a messaging application (e.g., email, short message, multimedia), a viewer application, and so forth.
[0048] In some embodiments, the motion sickness compensation system 2a may comprise a system bus 212 that couples various system components including the processing subsystem 204, the input/output subsystem 206, the memory subsystem 208, and/or the communications subsystem 210. The system bus 212 can be any of several types of bus structure(s) including a memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 9-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer System Interface (SCSI) or other proprietary bus, or any custom bus suitable for computing device applications.
[0049] Figure 4 is a schematic diagram of the input/output subsystem 206 of Figure 3, in accordance with an illustrative embodiment of the present invention. The input/output subsystem 206 may comprise any suitable mechanism or component to at least enable a user to provide input to the motion sickness compensation system 2a and to enable the motion sickness compensation system 2a to provide output to the user. For example, the input/output subsystem 206 may comprise any suitable input mechanism, including but not limited to, a button, keypad, keyboard, click wheel, touchscreen, or motion sensor. In some embodiments, the input/output subsystem 206 may comprise a capacitive sensing mechanism, or a multi-touch capacitive sensing mechanism. It will be appreciated that any of the input mechanisms described herein may be implemented as physical mechanical components, virtual elements, and/or combinations thereof.
[0050] The input/output subsystem 206 preferably comprises a visual peripheral output device 214 for providing a display visible to the user. Visual peripheral output device 214 has the same functionality as the visual output device 12 described above in connection with Fig. 1A, and thus the following description of the visual peripheral output device 214 applies equally to the visual output device 12 of Fig. 1A. The visual peripheral output device 214 may comprise a screen such as, for example, a Liquid Crystal Display (LCD) screen, incorporated into the motion sickness compensation system 2a. As another example, the visual peripheral output device 214 may comprise a movable display or projecting system for providing a display of content on a surface remote from the motion sickness compensation system 2a. In some embodiments, the visual peripheral output device 214 can comprise a coder/decoder, also known as a Codec, to convert digital media data into analog signals. For example, the visual peripheral output device 214 may comprise video Codecs, audio Codecs, or any other suitable type of Codec.
[0051] The visual peripheral output device 214 also may comprise display drivers, circuitry for driving display drivers, or both. The visual peripheral output device 214 may be operative to display content under the direction of the processor subsystem 204. For example, the visual peripheral output device 214 may be able to play media playback information, application screens for an application implemented on the motion sickness compensation system 2a, information regarding ongoing communications operations, information regarding incoming communications requests, or device operation screens, to name only a few.
[0052] The input/output subsystem 206 preferably comprises a motion sensor 216. Motion sensor 216 has the same functionality as the motion sensor 8 described above in connection with Fig. 1A, and thus the following description of the motion sensor 216 applies equally to the motion sensor 8 of Fig. 1A. The motion sensor 216 may comprise any suitable motion sensor operative to detect movement of the motion sickness compensation system 2a. For example, the motion sensor 216 may be operative to detect acceleration or deceleration of the motion sickness compensation system 2a as manipulated by a user and/or as caused by a vehicle 6.
[0053] In some embodiments, the motion sensor 216 may comprise one or more three-axis acceleration motion sensors (e.g., an accelerometer) operative to detect linear acceleration in three directions (i.e., the x (or left/right) direction, the y (or up/down) direction, and the z (or forward/backward) direction). As another example, the motion sensor 216 may comprise one or more two-axis acceleration motion sensors which may be operative to detect linear acceleration only along each of the x (or left/right) and y (or up/down) directions (or any other pair of directions). In some embodiments, the motion sensor 216 may comprise an electrostatic capacitance (capacitance-coupling) accelerometer that is based on silicon micro-machined MEMS (Micro Electro Mechanical Systems) technology, a piezoelectric type accelerometer, a piezoresistance type accelerometer, or any other suitable accelerometer.
[0054] In some embodiments, the motion sensor 216 may be operative to directly detect rotation, rotational movement, angular displacement, tilt, position, orientation, motion along a non-linear (e.g., arcuate) path, or any other non-linear motions. Alternatively, when the motion sensor 216 is a linear motion sensor, additional processing may be used to indirectly detect some or all of the non-linear motions. By comparing the linear output of the motion sensor 216 with a gravity vector (i.e., a static acceleration), the motion sensor 216 may be operative to calculate the tilt of the motion sickness compensation system 2a with respect to the y-axis. In some embodiments, the motion sensor 216 may instead or in
addition comprise one or more gyro-motion sensors or gyroscopes for detecting rotational movement. For example, the motion sensor 216 may comprise a rotating or vibrating element.
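The tilt calculation described above, comparing a linear accelerometer's static output against the gravity vector, can be sketched as follows. This is an illustrative sketch only; the axis convention (a device at rest, untilted, reads (0, 0, 1) in units of g) and the function name are assumptions, not taken from this disclosure:

```python
import math

# Illustrative sketch: recover device tilt from static accelerometer
# readings by comparing the measured vector against gravity. Readings
# are assumed to be in units of g; axis convention is assumed.

def tilt_about_y_degrees(ax, ay, az):
    """Estimate tilt (in degrees) about the y-axis from a static
    accelerometer reading (ax, ay, az), where an untilted device at
    rest reads (0, 0, 1)."""
    return math.degrees(math.atan2(ax, az))
```

For an untilted device this returns 0 degrees, and for a device rotated a quarter turn so gravity falls along the x-axis it returns 90 degrees.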
[0055] In some embodiments, the motion sensor 216 may comprise one or more controllers (not shown) coupled to the accelerometers or gyroscopes. The controllers may be used to calculate a moving vector of the motion sickness compensation system 2a. The moving vector may be determined according to one or more predetermined formulas based on the movement data (e.g., x, y, and z axis movement information) provided by the accelerometers or gyroscopes.
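One possible "predetermined formula" of the kind mentioned above is simple numerical integration of accelerometer samples into a velocity (moving) vector. The following sketch is a hypothetical illustration, not the controller's actual formula, and the fixed time step is an assumption:

```python
# Illustrative sketch: a controller integrating (x, y, z) acceleration
# samples over a fixed time step dt to estimate a moving (velocity)
# vector. Real controllers may apply filtering and drift correction.

def moving_vector(samples, dt):
    """Integrate acceleration samples (each an (ax, ay, az) tuple)
    over time step dt, returning an estimated velocity vector."""
    vx = vy = vz = 0.0
    for ax, ay, az in samples:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
    return (vx, vy, vz)
```

For example, two samples of (1.0, 0.0, 0.0) and (1.0, 2.0, 0.0) at a 0.5 s time step integrate to a moving vector of (1.0, 1.0, 0.0).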
[0056] In some embodiments, the input/output subsystem 206 may comprise a virtual input/output system 218. The virtual input/output system 218 is capable of providing input/output options by combining one or more input/output components to create a virtual input type. For example, the virtual input/output system 218 may enable a user to input information through an on-screen keyboard which utilizes the touchscreen and mimics the operation of a physical keyboard, or using the motion sensor 216 to control a pointer on the screen instead of utilizing the touchscreen. As another example, the virtual input/output system 218 may enable alternative methods of input and output to enable use of the device by persons having various disabilities. For example, the virtual input/output system 218 may convert on-screen text to spoken words to enable reading-impaired persons to operate the device.
[0057] In some embodiments, the input/output subsystem 206 may comprise specialized output circuitry associated with output devices such as, for example, an audio peripheral output device 220. The audio peripheral output device 220 may comprise an audio output including one or more speakers integrated into the motion sickness compensation system 2a. The speakers may be, for example, mono or stereo speakers. The audio peripheral output device 220 also may comprise an audio component remotely coupled to audio peripheral output device 220 such as, for example, a headset, headphones, and/or ear buds which may be coupled to the audio peripheral output device 220 through the communications subsystem 210.
[0058] Figure 5 is a schematic diagram of the communications interface 210 of Figure 3, in accordance with an illustrative embodiment of the present invention. The communications interface 210 may comprise any suitable hardware, software, or combination of hardware and software that is capable of coupling the motion sickness compensation system 2a to one or more networks and/or devices. The communications interface 210 may be arranged to operate with any suitable technique for controlling information signals using a desired set of communications protocols, services, or operating procedures. The communications interface 210 may comprise the appropriate physical connectors to connect with a corresponding communications medium, whether wired or wireless.
[0059] In various aspects, a network may comprise local area networks (LAN) as well as wide area networks (WAN) including, without limitation, the Internet, wired channels, wireless channels, communication devices including telephones, computers, wire, radio, optical or other electromagnetic channels, and combinations thereof, including other devices and/or components capable of, or associated with, communicating data. For example, the communication environments comprise in-body communications, various devices, and various modes of communications such as wireless communications, wired communications, and combinations of the same.
[0060] Wireless communication modes comprise any mode of communication between points (e.g., nodes) that utilize, at least in part, wireless technology including various protocols and combinations of protocols associated with wireless transmission, data, and devices. The points comprise, for example, wireless devices such as wireless headsets, audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers.
[0061] Wired communication modes comprise any mode of communication between points that utilize wired technology including various protocols and combinations of protocols associated with wired transmission, data, and devices. The points comprise, for example, devices such as audio and multimedia devices and equipment, such as audio players and multimedia players, telephones, including mobile telephones and cordless telephones, and computers and computer-related devices and components, such as printers. In various implementations, the wired communication modules may communicate in accordance with a number of wired protocols. Examples of wired protocols may comprise Universal Serial Bus
(USB) communication, RS-232, RS-422, RS-423, RS-485 serial protocols, FireWire, Ethernet, Fibre Channel, MIDI, ATA, Serial ATA, PCI Express, T-1 (and variants), Industry Standard Architecture (ISA) parallel communication, Small Computer System Interface (SCSI) communication, or Peripheral Component Interconnect (PCI) communication, to name only a few examples.
[0062] Accordingly, in various aspects, the communications interface 210 may comprise one or more interfaces such as, for example, a wireless communications interface 222, a wired communications interface 224, a network interface, a transmit interface, a receive interface, a media interface, a system interface 226, a component interface, a switching interface, a chip interface, a controller, and so forth. When implemented by a wireless device or within a wireless system, for example, the communications interface 210 may comprise a wireless interface 222 comprising one or more antennas 228, transmitters, receivers, transceivers, amplifiers, filters, control logic, and so forth.
[0063] In various aspects, the communications interface 210 may provide voice and/or data communications functionality in accordance with different types of cellular radiotelephone systems. In various implementations, the described aspects may communicate over wireless shared media in accordance with a number of wireless protocols. Examples of wireless protocols may comprise various wireless local area network (WLAN) protocols, including the Institute of Electrical and Electronics Engineers (IEEE) 802.xx series of protocols, such as IEEE 802.11a/b/g/n, IEEE 802.16, IEEE 802.20, and so forth. Other examples of wireless protocols may comprise various wireless wide area network
(WWAN) protocols, such as GSM cellular radiotelephone system protocols with GPRS, CDMA cellular radiotelephone communication systems with 1xRTT, EDGE systems, EV-DO systems, EV-DV systems, HSDPA systems, and so forth. Further examples of wireless protocols may comprise wireless personal area network (PAN) protocols, such as an Infrared protocol, a protocol from the Bluetooth Special Interest Group (SIG) series of protocols, including Bluetooth Specification versions v1.0, v1.1, v1.2, v2.0, v2.0 with Enhanced Data Rate (EDR), as well as one or more Bluetooth Profiles, and so forth. Yet another example of wireless protocols may comprise near-field communication techniques and protocols, such as electromagnetic induction (EMI) techniques. An example of EMI techniques may comprise passive or active radio-frequency identification (RFID) protocols and devices. Other suitable protocols may comprise Ultra Wide Band (UWB), Digital Office (DO), Digital Home, Trusted Platform Module (TPM), ZigBee, and so forth.
[0064] In various implementations, the described aspects may comprise part of a cellular communication system. Examples of cellular communication systems may comprise CDMA cellular radiotelephone communication systems, GSM cellular radiotelephone systems, North American Digital Cellular (NADC) cellular radiotelephone systems, Time Division Multiple Access (TDMA) cellular radiotelephone systems, Extended-TDMA (E-TDMA) cellular radiotelephone systems, Narrowband Advanced Mobile Phone Service (NAMPS) cellular radiotelephone systems, third generation (3G) wireless standards systems such as WCDMA, CDMA-2000, UMTS cellular radiotelephone systems compliant with the
Third-Generation Partnership Project (3GPP), fourth generation (4G) wireless standards, and so forth.
[0065] Figure 6 is a schematic diagram of the memory subsystem 208 of Figure 3, in accordance with an illustrative embodiment of the present invention. The memory subsystem 208 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile/non-volatile memory and removable/non-removable memory. The memory subsystem 208 may comprise at least one non-volatile memory unit 230 and a local bus 234. The non-volatile memory unit 230 is capable of storing one or more software programs 232_1-232_n. The software programs 232_1-232_n may contain, for example, applications, user data, device data, and/or configuration data, or combinations thereof, to name only a few. The software programs 232_1-232_n may contain instructions executable by the various components of the motion sickness compensation system 2a.
[0066] In various aspects, the memory subsystem 208 may comprise any machine-readable or computer-readable media capable of storing data, including both volatile/non-volatile memory and removable/non-removable memory. For example, memory may comprise read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDR-RAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory (e.g., NOR or NAND flash memory), content addressable memory (CAM), polymer memory (e.g., ferroelectric
polymer memory), phase-change memory (e.g., ovonic memory), ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, disk memory (e.g., floppy disk, hard drive, optical disk, magnetic disk), or card (e.g., magnetic card, optical card), or any other type of media suitable for storing information.
[0067] In some embodiments, the memory subsystem 208 may contain a software program for motion sickness compensation using the capabilities of the motion sickness compensation system 2a and the motion sensor 204, as discussed in connection with FIGS. 1-2. In one embodiment, the memory subsystem 208 may contain an instruction set, in the form of a file 232_n, for executing a method of motion sickness compensation. The instruction set may be stored in any acceptable form of machine-readable instructions, including source code in various appropriate programming languages. Some examples of programming languages that may be used to store the instruction set comprise, but are not limited to: Java, C, C++, C#, Python, Objective-C, Visual Basic, or .NET programming. In some embodiments, a compiler or interpreter is included to convert the instruction set into machine-executable code for execution by the processing subsystem 204.
[0068] Figure 7 is a schematic diagram of the motion sickness compensation system of the present invention in a vehicle and that utilizes a handheld input device, in accordance with an illustrative embodiment of the present invention. The system 2b is similar to systems 2 and 2a described above, and detailed descriptions of common components are not repeated herein. The system 2b includes a handheld input device 9 configured to receive one or more inputs from a user 4. The handheld input device 9 can include any suitable
inputs such as one or more buttons, d-pads, joysticks, motion sensors, toggles, triggers, and/or any other suitable input device. Visual output devices 12 are positioned in line with the line of sight of a user 4.
[0069] In some embodiments, local processors 14 are configured to receive one or more inputs from respective motion sensors, such as a local motion sensor 8a and/or a central motion sensor 8b. The motion sensors 8a, 8b are configured to detect motion in one or more degrees of freedom of a vehicle 6 containing the user 4 and the system 2b. The central motion sensor 8b can provide a signal indicative of the one or more degrees of freedom of the vehicle 6 to the local processors 14. The local processors 14 generate images on the visual output devices 12 (e.g., displays) configured to elicit compensatory movement and/or inputs from a respective user 4.
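One possible mapping from the central motion sensor's degree-of-freedom signal to a display command for the local processors is sketched below. The gain, the clamping range, and the function name are hypothetical; the specification does not define how the sensor signal is converted into an image:

```python
def target_horizon_tilt(roll_rate_dps, gain=0.5, max_tilt_deg=30.0):
    """Map a vehicle roll rate (degrees/second) reported by the central
    motion sensor to a target tilt angle for the simulated horizon,
    clamped to a safe display range. Gain and limits are assumed values."""
    tilt = gain * roll_rate_dps
    return max(-max_tilt_deg, min(max_tilt_deg, tilt))
```

A local processor could evaluate such a function each frame to decide how far to tilt the image shown on its visual output device 12.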
[0070] In some embodiments, the system 2b includes an audio output system 36, such as, for example, headphones. The audio output system 36 is configured to provide an audio component to the user 4 to further reduce motion sickness. For example, in some embodiments, the audio output system 36 is configured to provide an auditory distraction, such as game sounds, music, etc., in conjunction with the visual output from the visual output device 12. The audio output can be matched to the visual output of the visual output device 12, can be user-selected, and/or can be generated by the system 2b.
[0071] Figure 8 is a schematic diagram of a handheld computing device that incorporates the motion sickness compensation system of the present invention, in
accordance with an illustrative embodiment of the present invention. The system 2c is similar to systems 2, 2a and 2b discussed above, and detailed descriptions of common components are not repeated herein. The system 2c includes a handheld computing device 40 containing one or more of the elements of the system 2c. For example, in some embodiments, the handheld computing device 40 includes a processor 44, a motion sensor 8, a visual output device 12, one or more input devices, and/or any other suitable elements of the system 2. In some embodiments, the visual output device 12 includes a touchscreen or other input device formed integrally with, above, and/or below the visual output device 12 to receive input from a user 4.
[0072] Figures 9A-9E are schematic diagrams of sample display screens generated by the motion sickness compensation system 2c of Fig. 8, in accordance with an illustrative embodiment of the present invention. Fig. 9A illustrates an initial or starting state of a visual output device 12. The visual output device 12 is configured to display a simulated vehicle or target 50 aligned with a simulated horizon 52.
[0073] Fig. 9B illustrates the visual output device 12 during an angular acceleration (e.g., a turn) of the vehicle 6 containing system 2c. The visual output device 12 shows the simulated horizon 52 tilted in response to the angular acceleration. For example, in the illustrated embodiment, the vehicle 6 has a leftward angular acceleration (e.g., a left turn) which corresponds to a lowered left-side and raised right-side of the simulated horizon 52. It will be appreciated that tilting or rotation of the simulated horizon 52 can be in any suitable direction corresponding to movement, such as angular acceleration, of the vehicle 6.
[0074] The tilted simulated horizon 52 encourages a user 4 to provide an input to align the simulated vehicle 50 with the simulated horizon 52. For example, in Fig. 9C, the handheld computing device 40 includes the visual output device 12 and one or more inputs (not shown). A user 4 tilts the handheld computing device 40 to rotate the simulated vehicle 50 into alignment with the simulated horizon 52. The movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological systems of the user 4. For example, in some embodiments, the input device is configured to mimic a control of the vehicle 6 (such as a steering wheel) to compensate for motion experienced by a passenger in the vehicle.
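The compensatory-alignment behavior of Figs. 9B-9C can be illustrated with a minimal sketch. The tolerance value and the function names are assumptions for illustration; the specification does not define a numeric alignment criterion:

```python
def residual_misalignment(horizon_tilt_deg, device_tilt_deg):
    """Misalignment remaining after the user's compensatory tilt of the
    handheld device; zero means the simulated vehicle is aligned with
    the simulated horizon."""
    return horizon_tilt_deg - device_tilt_deg

def is_aligned(horizon_tilt_deg, device_tilt_deg, tolerance_deg=2.0):
    """Return True when the user's input has brought the simulated
    vehicle within an assumed tolerance of the simulated horizon."""
    return abs(residual_misalignment(horizon_tilt_deg, device_tilt_deg)) <= tolerance_deg
```

Under this sketch, a horizon tilted 10 degrees by a left turn is compensated when the user tilts the device to within the tolerance of that angle.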
[0075] Fig. 9D illustrates the visual output device 12 during a linear acceleration (e.g., increasing/decreasing speed). The simulated horizon 52 is raised relative to a center line of the visual output device 12. The simulated horizon 52 can be moved at a rate proportional to the movement/acceleration of the vehicle 6. In other embodiments, the simulated horizon 52 is moved at a fixed rate unrelated to the magnitude of the movement/acceleration of the vehicle 6. Movement of the simulated horizon 52 encourages compensatory input from the user 4.
[0076] For example, as shown in Fig. 9E, a user 4 can tilt and/or raise the handheld computing device 40 to compensate for movement of the simulated horizon 52. The movement of the handheld computing device 40 (or other input) provides simulated movement control that corresponds to the movement detected by one or more biological
systems of the user 4. Although the embodiments discussed illustrate movement of a handheld computing device 40, it will be appreciated that the compensatory input can be provided by any suitable input device, such as a controller, a handheld device, a head-mounted device, a gesture tracking device, and/or any other suitable input.
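The linear-acceleration behavior of Figs. 9D-9E, including the proportional-rate and fixed-rate embodiments of paragraph [0075], can be sketched as follows. The gain, fixed rate, frame interval, and function name are illustrative assumptions not taken from the specification:

```python
def horizon_offset(accel_forward, proportional=True, k=5.0,
                   fixed_rate=3.0, dt=0.1):
    """Per-frame vertical offset (in pixels) of the simulated horizon 52
    for a forward linear acceleration (m/s^2). In the proportional mode
    the offset scales with the measured acceleration; in the fixed mode
    the horizon moves at a constant rate whenever any acceleration is
    detected. All constants are assumed example values."""
    if proportional:
        return k * accel_forward * dt
    return fixed_rate * dt if accel_forward != 0 else 0.0
```

Either mode shifts the simulated horizon away from the center line, prompting the user 4 to supply the compensatory tilt or raise described above.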
[0077] The foregoing embodiments and advantages are merely exemplary, and are not to be construed as limiting the present invention. The description of the present invention is intended to be illustrative, and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art. Various changes may be made without departing from the spirit and scope of the invention.