WO2016017062A1 - Information processing for motion sickness prevention in an image display system
- Publication number
- WO2016017062A1 (PCT/JP2015/003033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- contents
- abnormality
- head
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/04—Diagnosis, testing or measuring for television systems or their details for receivers
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/12—Adjusting pupillary distance of binocular pairs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/285—Analysis of motion using a sequence of stereo image pairs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/002—Eyestrain reduction by processing stereoscopic signals or controlling stereoscopic devices
Definitions
- the technology disclosed in embodiments of the present description relates to an information processing apparatus, an information processing method, a computer program, and an image display system, which perform processing on image information that is to be displayed on a screen fixed to a head or a face of a user.
- The head mounted display includes, for example, an image display unit for each of the left and right eyes and, when used together with headphones, can control both the visual and the auditory senses.
- The head mounted display can project different images to the left and right eyes; by displaying parallax images to the left and right eyes, a 3D image can be presented.
- Such a type of head mounted display forms a virtual image on a retina of each of the eyes for the user to view.
- a virtual image is formed on the object side.
- A head mounted display with a wide angle of visibility has been proposed in which a magnified virtual image of a display image is formed in each pupil of a user by disposing a virtual-image optical system with a wide angle of visibility 25 mm in front of each pupil and by disposing a display panel having an effective pixel range of 0.7 inches further in front of each wide-angle optical system (see PTL 1, for example).
- Head mounted displays have been proposed in which a head motion tracking device including a gyro sensor is attached to the head so that the user can experience a realistic wide field-of-view image that follows the movement of the user's head (see PTL 2 and PTL 3, for example).
- an image processing apparatus may include a control device configured to detect an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed to the display; and generate a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
- an image processing method may include detecting, by a processing device, an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed to the display; and generating, by the processing device, a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
- a non-transitory storage medium may be recorded with a program executable by a computer.
- the program may include detecting an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed to the display; and generating a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
- an image processing apparatus may include a control device configured to: detect an abnormality in accordance with information indicating movement of an image of contents to be displayed, using a threshold value indicated in metadata related to the contents.
- an image processing method may include detecting, by a processing device, an abnormality in accordance with information indicating movement of an image of contents to be displayed, using a threshold value indicated in metadata of the contents.
- One or more embodiments of the technology disclosed in the present description can provide an excellent information processing apparatus, information processing method, computer program, and image display system capable of preventing simulation sickness during viewing by processing an image that is to be displayed on a screen fixed to a head or a face of a user.
- FIG. 1 is a diagram schematically illustrating an exemplary configuration of an image display system 100 to which an embodiment of the technology disclosed in the present disclosure has been applied.
- FIG. 2 is a diagram schematically illustrating a modification of the image display system 100.
- FIG. 3 is a diagram illustrating a state in which a user mounting a head mounted display on the head is viewed from the front.
- FIG. 4 is a diagram illustrating a state in which the user wearing the head mounted display illustrated in FIG. 3 is viewed from above.
- FIG. 5 is a diagram illustrating a modification of the image display system 100 using the head mounted display.
- FIG. 6 is a diagram illustrating an exemplary functional configuration of the image display system 100 illustrated in FIG. 5.
- FIG. 7 is a diagram for describing a mechanism that displays an image that follows the movement of the head of the user with the display device 400.
- FIG. 8 is a diagram illustrating a procedure for cutting out, from a wide visual field image, an image having a display angle of view that matches the position and orientation of the head of the user.
- FIG. 9 is a diagram illustrating an exemplary functional configuration that automatically detects an abnormal image.
- FIG. 10 is a diagram illustrating a coordinate system of the head of the user.
- FIG. 11 illustrates another exemplary functional configuration that automatically detects an abnormal image.
- FIG. 12 illustrates yet another exemplary functional configuration that automatically detects an abnormal image.
- FIG. 13 is a diagram exemplifying optical flows generated in the plane of the image.
- FIG. 1 schematically illustrates an exemplary configuration of an image display system 100 to which an embodiment of the technology disclosed in the present disclosure has been applied.
- An image display system 100 illustrated in the drawing includes a head motion tracking device 200, a rendering device 300, and a display device 400.
- the head motion tracking device 200 is used by being mounted on a head of a user viewing an image that is displayed by the display device 400 and outputs position and orientation information of the head of the user at predetermined transmission periods to the rendering device 300.
- the head motion tracking device 200 includes a sensor unit 201, a position and orientation computation unit 202, and a communication unit 203 that transmits the obtained orientation information to the rendering device 300.
- the sensor unit 201 is constituted by combining a plurality of sensor elements such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor.
- the sensors are a triaxial gyro sensor, a triaxial acceleration sensor, and a triaxial geomagnetic sensor that are capable of detecting nine axes in total.
- the position and orientation computation unit 202 computes the position and orientation information of the head of the user on the basis of the result of the detection in nine axes detected by the sensor unit 201.
- the communication unit 203 transmits the computed orientation information to the rendering device 300.
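- The position and orientation computation itself is not detailed in this excerpt. As an illustration only, the minimal complementary-filter sketch below shows one common way a nine-axis reading (gyro, accelerometer, magnetometer) can be fused into a roll/pitch/yaw estimate; the function name, blending factor, and the crude heading formula are assumptions, not the patent's own algorithm.

```python
import numpy as np

def complementary_filter(prev_rpy, gyro, accel, mag, dt, alpha=0.98):
    """One update of a roll/pitch/yaw estimate from 9-axis sensor readings.

    prev_rpy : previous (roll, pitch, yaw) in radians
    gyro     : angular velocity (rad/s) about the x, y, z axes
    accel    : accelerometer reading, gravity-dominated when the head is still
    mag      : magnetometer reading
    dt       : sampling interval in seconds
    """
    # Short-term estimate: integrate the gyro.
    roll_g, pitch_g, yaw_g = np.asarray(prev_rpy) + np.asarray(gyro) * dt

    # Long-term references: gravity gives roll/pitch, the magnetic field gives yaw.
    ax, ay, az = accel
    roll_a = np.arctan2(ay, az)
    pitch_a = np.arctan2(-ax, np.hypot(ay, az))
    mx, my, _ = mag
    yaw_m = np.arctan2(-my, mx)  # crude heading, ignoring tilt compensation

    # Blend: trust the gyro over short intervals, the absolute references over long ones.
    roll = alpha * roll_g + (1 - alpha) * roll_a
    pitch = alpha * pitch_g + (1 - alpha) * pitch_a
    yaw = alpha * yaw_g + (1 - alpha) * yaw_m
    return np.array([roll, pitch, yaw])
```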
- the head motion tracking device 200 and the rendering device 300 are interconnected through wireless communication such as Bluetooth (registered trademark) communication. Needless to say, rather than through wireless communication, the head motion tracking device 200 and the rendering device 300 may be connected to each other through a high-speed cable interface such as a Universal Serial Bus (USB).
- the rendering device 300 performs rendering processing of the image that is to be displayed on the display device 400.
- The rendering device 300 is configured, for example, as an Android (registered trademark) based terminal such as a smartphone or a tablet computer, as a personal computer, or as a game machine; however, the rendering device 300 is not limited to these devices.
- the rendering device 300 includes a first communication unit 301 that receives orientation information from the head motion tracking device 200, a rendering processor 302 that performs rendering processing of an image on the basis of the orientation information, a second communication unit 303 that transmits the rendered image to the display device 400, and an image source 304 that is a supply source of image data.
- the first communication unit 301 receives orientation information from the head motion tracking device 200 through Bluetooth (registered trademark) communication or the like. As described above, the orientation information is expressed by a rotation matrix.
- the image source 304 includes, for example, storage devices, such as a hard disk drive (HDD) and a solid state drive (SSD), that record image contents, a media reproduction device that reproduces a recording medium such as a Blu-ray (registered trademark) disc, a broadcasting tuner that tunes and receives a digital broadcasting signal, and a communication interface that receives image contents from an Internet server and the like.
- a video see-through image taken by an outside camera may be the image source 304.
- From the image data of the image source 304, the rendering processor 302 renders an image that is to be displayed on the display device 400 side.
- The rendering processor 302 renders an image that has been extracted, at a display angle of view corresponding to the orientation information received by the first communication unit 301, from, for example, an original entire celestial sphere (omnidirectional) image or an original 4K image having a wide angle of view supplied from the image source 304.
- the rendering device 300 and the display device 400 are connected to each other by a cable such as a High-Definition Multimedia Interface (HDMI, registered trademark) cable or a Mobile High-definition Link (MHL) cable.
- connection may be made through wireless communication, such as wireless HD or Miracast.
- the second communication unit 303 uses either one of the channels and transmits the image data rendered by the rendering processor 302 to the display device 400 in an uncompressed state.
- the display device 400 includes a communication unit 401 that receives an image from the rendering device 300 and a display unit 402 that displays the received image.
- the display device 400 is configured as a head mounted display that is fixed to a head or a face of a user viewing an image, for example.
- the communication unit 401 receives an uncompressed image data from the rendering device 300 through a channel such as a High-Definition Multimedia Interface (HDMI, registered trademark) cable or a Mobile High-definition Link (MHL) cable.
- the display unit 402 displays the received image data on a screen.
- When the display device 400 is configured as a head mounted display, the display unit 402 will include, for example, left and right screens that are fixed to the left and right eyes of the user such that an image for the left eye and an image for the right eye are displayed.
- the display unit 402 is configured by a display panel such as a micro display including an organic electro-luminescence (EL) device or a liquid crystal display, or a laser scanning display such as a direct imaging retina display, for example.
- the display unit 402 includes a virtual image optical unit that magnifies and projects a display image of the display unit 402 and that forms a magnified virtual image having a predetermined angle of view in the pupils of the user.
- an image that has been extracted so as to have a display angle of view that corresponds to the position and orientation information of the head of the user is rendered from, for example, an original entire celestial sphere image or an original 4K image having a wide angle of view.
- a display area in the original image is moved so as to cancel out the orientation angle of the head of the user. Accordingly, an image that follows the movement of the head can be reproduced and the user can have an experience of looking out over a large screen.
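- A minimal sketch of this cut-out operation is given below, assuming the wide field-of-view source is an equirectangular panorama and that a rectangular crop rotated by the roll angle is an acceptable approximation of the display angle of view; the function name, the field-of-view value, and the use of scipy for the in-plane rotation are illustrative assumptions, not the patent's implementation.

```python
import numpy as np
from scipy.ndimage import rotate

def cut_out_view(panorama, yaw_deg, pitch_deg, roll_deg, fov_deg=90.0):
    """Cut a display viewport out of an equirectangular panorama (H x W x 3,
    covering 360 deg horizontally and 180 deg vertically) so that the crop
    follows the head orientation and the scene appears stable to the viewer."""
    h, w = panorama.shape[:2]
    view_w = int(w * fov_deg / 360.0)
    view_h = int(h * fov_deg / 180.0)

    # Move the crop center in accordance with the pan (yaw) and tilt (pitch) of the head.
    cx = int((yaw_deg % 360.0) / 360.0 * w)
    cy = int(np.clip((90.0 - pitch_deg) / 180.0, 0.0, 1.0) * (h - 1))

    xs = np.arange(cx - view_w // 2, cx + view_w // 2) % w                 # wrap around horizontally
    ys = np.clip(np.arange(cy - view_h // 2, cy + view_h // 2), 0, h - 1)  # clamp vertically
    view = panorama[np.ix_(ys, xs)]

    # Compensate the roll component by rotating the cut-out image in the plane.
    return rotate(view, -roll_deg, reshape=False, order=0)
```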
- the display device 400 may be configured so as to change the audio output in accordance with the movement of the image.
- FIG. 2 schematically illustrates a modification of the image display system 100.
- the image display system 100 includes three separate devices, namely, the head motion tracking device 200, the rendering device 300, and the display device 400; however, in the example illustrated in FIG. 2, the function of the rendering device 300 is equipped in the display device 400.
- configuring the head motion tracking device 200 as an optional product that is externally attached to the display device 400 leads to reduction in size, weight, and cost of the display device 400.
- FIGS. 3 and 4 each illustrate an appearance configuration of the display device 400.
- the display device 400 is configured as a head mounted display that is fixed to a head or a face of a user viewing an image.
- FIG. 3 illustrates a state in which the user mounting the head mounted display on the head is viewed from the front
- FIG. 4 illustrates a state in which the user wearing the head mounted display is viewed from above.
- The head mounted display, mounted on the head or the face of the user, directly covers the eyes of the user and can provide a sense of immersion to the user viewing the image. Furthermore, since the display image cannot be seen from the outside (in other words, by others) while information is displayed, protection of privacy is facilitated. Unlike the optical see-through type, a user wearing an immersive head mounted display cannot directly view the scenery of the actual world. If an outside camera that images the scenery in the line-of-sight direction of the user is provided, displaying the captured image allows the user to indirectly view the scenery of the actual world (in other words, the scenery is displayed through video see-through).
- The head mounted display illustrated in FIG. 3 has a structure shaped like a pair of glasses and is configured to directly cover the left and right eyes of the user wearing it.
- Display panels that the user views are disposed on the inner side of a head mounted display body and at positions opposing the left and right eyes.
- the display panels are each configured by a micro display such as an organic EL device or a liquid crystal display, or by a laser scanning display such as a direct imaging retina display, for example.
- Microphones are installed in the vicinities of the left and right ends of the head mounted display body. By placing microphones on the left and right in a substantially symmetrical manner and recognizing only the sound localized at the center (the voice of the user), the user's voice can be separated from ambient noise and from the speech of others so that, for example, malfunction during control performed through voice input can be prevented.
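- The voice-separation method itself is not specified here. As a loose illustration of separating center-localized sound from off-center ambience with a symmetric left/right microphone pair, a simple mid/side decomposition could look as follows; this is an assumption, not the patent's processing.

```python
import numpy as np

def split_mid_side(left, right):
    """Split symmetric left/right microphone signals into a mid channel, which
    emphasizes sound localized at the center (the wearer's voice), and a side
    channel, which carries off-center ambient sound and other speakers."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    mid = 0.5 * (left + right)
    side = 0.5 * (left - right)
    return mid, side
```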
- Touch panels, on which the user can perform touch input with a fingertip or the like, are disposed on the outer side of the head mounted display body.
- a pair of left and right touch panels are provided; however, a single or three or more touch panels may be provided.
- the head mounted display includes, on the side opposing the face of the user, display panels for the left and right eyes.
- the display panels are each configured by a micro display such as an organic EL device or a liquid crystal display, or by a laser scanning display such as a direct imaging retina display, for example.
- the display image on the display panel is viewed by the left and right eyes of the user as a magnified virtual image.
- an interpupillary distance adjustment mechanism is equipped between the display panel for the right eye and the display panel for the left eye.
- FIG. 5 illustrates a modification of the image display system 100 using the head mounted display.
- The illustrated image display system 100 includes the display device (head mounted display) 400 worn by the user on the head or the face, the head motion tracking device 200 (not shown in FIG. 5), and an imaging device 500 that is mounted in a mobile device 600 such as a multirotor.
- the mobile device 600 may be a radio controlled device that is remotely controlled wirelessly by the user through a controller 700, or may be a mobile object piloted by another user or a mobile object that is driven autonomously.
- There is a first person view (FPV) technology in which piloting is performed while viewing a first-person viewpoint (pilot viewpoint) image taken with a wireless camera mounted on a radio controlled device such as a helicopter.
- A proposal has been made of a mobile object controller that includes a mobile object equipped with an imaging device and a wearable PC operated by an operator to remotely control the mobile object (see PTL 4, for example).
- a signal that controls the operation of the mobile object is received to control the operation of the mobile object itself
- a signal that controls the equipped imaging device is received to control the imaging operation
- a video signal and an audio signal that the imaging device outputs are transmitted to the wearable PC.
- A signal that controls the operation of the mobile object is generated in accordance with the control of the operator and, furthermore, a signal that controls the operation of the imaging device is generated in accordance with the voice of the operator.
- the signals are wirelessly transmitted to the mobile object and an output signal of the imaging device is wirelessly received to reproduce a video signal.
- the video signal is displayed on the monitor screen.
- FIG. 6 illustrates an exemplary functional configuration of the image display system 100 illustrated in FIG. 5.
- The illustrated image display system 100 includes three devices, namely, the head motion tracking device 200 that is mounted on the head of the user, the display device 400 that is worn on the head or the face of the user, and the imaging device 500 that is mounted in the mobile object (not shown in FIG. 6).
- the head motion tracking device 200 is used by being mounted on the head of the user viewing an image displayed with the display device 400 and outputs position and orientation information of the head of the user at predetermined transmission periods to the display device 400.
- the head motion tracking device 200 includes the sensor unit 201, the position and orientation computation unit 202, and the communication unit 203.
- the sensor unit 201 is constituted by combining a plurality of sensor elements such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor and detects the orientation angle of the head of the user.
- the sensors are a triaxial gyro sensor, a triaxial acceleration sensor, and a triaxial geomagnetic sensor that are capable of detecting nine axes in total.
- the position and orientation computation unit 202 computes the position and orientation information of the head of the user on the basis of the result of the detection in nine axes with the sensor unit 201.
- the head motion tracking device 200 and the display device 400 are interconnected through wireless communication such as Bluetooth (registered trademark) communication.
- the head motion tracking device 200 and the display device 400 may be connected to each other through a high-speed cable interface such as a Universal Serial Bus (USB).
- the position and orientation information of the head of a user that has been obtained in the position and orientation computation unit 202 is transmitted to the display device 400 through the communication unit 203.
- the imaging device 500 includes an omnidirectional camera 501 and a communication unit 502 and is used by being equipped in the mobile device 600.
- the omnidirectional camera 501 is configured by, for example, disposing a plurality of cameras radially so that the main axis directions thereof are each oriented outwards; accordingly, the imaging range is made omnidirectional.
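- As a rough illustration of how radially arranged cameras with equal yaw spacing could be combined into a single 360-degree strip, a naive sketch is shown below; an actual omnidirectional camera would warp, align, and blend the overlapping views rather than simply concatenating them, so this is an assumption for illustration only.

```python
import numpy as np

def assemble_panorama(camera_frames):
    """Naively place frames from radially arranged cameras (equal yaw spacing,
    identical frame height) side by side into one 360-degree strip."""
    return np.concatenate(list(camera_frames), axis=1)
```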
- For the specific configuration of an omnidirectional camera that can be applied to the image display system 100 according to the present embodiment, refer to the description of Patent Application No. 2014-128020, which has already been assigned to the present applicant.
- an embodiment of the technology disclosed in the present description is not limited to a configuration of a specific omnidirectional camera.
- the imaging device 500 and the display device 400 are interconnected through wireless communication such as Wireless Fidelity (Wi-Fi).
- Image information taken by the omnidirectional camera 501 is transmitted to the display device 400 through the communication unit 502.
- the display device 400 is configured as a head mounted display, for example.
- the head motion tracking device 200 is configured as an independent device with respect to the display device 400 (for example, the head motion tracking device 200 is manufactured and sold as an optional product of the head mounted display); however, the head mounted display may be configured such that the head motion tracking device 200 and the display device 400 are integral with each other.
- the display device 400 includes the first communication unit 301, the second communication unit 303, the rendering processor 302, and the display unit 402.
- When the display device 400 is configured as a head mounted display, the display unit 402 will include, for example, left and right screens that are fixed to the left and right eyes of the user such that an image for the left eye and an image for the right eye are displayed.
- the display unit 402 is configured by a display panel such as a micro display including an organic electro-luminescence (EL) device or a liquid crystal display, or a laser scanning display such as a direct imaging retina display, for example.
- the display unit 402 includes a virtual image optical unit (not shown) that magnifies and projects a display image of the display unit 402 and that forms a magnified virtual image having a predetermined angle of view in the pupils of the user.
- The first communication unit 301 receives the position and orientation information of the head of the user from the head motion tracking device 200 through the communication unit 203. Furthermore, the second communication unit 303 receives the image information taken by the omnidirectional camera 501 from the imaging device 500 through the communication unit 502.
- the rendering processor 302 renders, from the omnidirectional image, an image that has been extracted so as to have a display angle of view that corresponds to the position and orientation information of the head of the user.
- a display area in the original image is moved so as to cancel out the orientation angle of the head of the user such that an image that follows the movement of the head can be reproduced and the user can have an experience of looking out over a large screen.
- FIG. 7 illustrates a mechanism for displaying, with the display device 400, an image that follows the movement of the head of the user in the image display system 100 described above.
- the rendering device 300 moves the center of an area 702 that is to be extracted from an omnidirectional image or an original 4K image 701 having a wide angle of view, for example, so as to follow the orientation of the head of the user and renders an image of the area 702 that has been extracted so as to have a predetermined angle of view around the above center position.
- the rendering device 300 rotates an area 702-1 in accordance with a roll component of the head motion of the user, moves an area 702-2 in accordance with a tilt component of the head motion of the user, and moves an area 702-3 in accordance with a pan component of the head motion of the user such that the display area is moved so as to cancel out the movement of the head that has been detected by the head motion tracking device 200.
- On the display device 400 side, an image in which the display area moves in the original image 701 so as to follow the movement of the head of the user can be presented.
- FIG. 8 illustrates a procedure for cutting out, from a wide field-of-view image, an image having a display angle of view that matches the position and orientation of the head of the user.
- a wide field-of-view image is input from the image source 304 (F801).
- The sensor unit 201 detects the orientation angle of the head of the user and, on the basis of the result of the detection by the sensor unit 201, the position and orientation computation unit 202 computes an orientation angle q_h of the head of the user (F802). Then, the computed head orientation angle q_h is transmitted to the rendering device 300 through the communication unit 203.
- The rendering processor 302 cuts out, from the wide field-of-view image, a display angle of view corresponding to the head orientation angle q_h of the user and renders an image (F803).
- scaling and deformation may be performed.
- An image in which the display angle of view is changed in accordance with the viewpoint position and the angle of visibility of the user is referred to as a "free viewpoint image".
- the rendering device 300 transmits the free viewpoint image that the rendering processor 302 has rendered to the display device 400 through the first communication unit 301 and displaying is performed in the display device 400 (F804).
- an angle of visibility is computed in accordance with position and orientation information of the head of the user detected by the head motion tracking device 200 and a display angle of view that matches the angle of visibility is extracted from the original wide field-of-view image.
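- The following hypothetical main loop ties steps F801 to F804 together; the callables passed in (pose reading, viewport cut-out, frame display) are placeholders for the components described above, and the fixed update period is an assumption rather than a value from the patent.

```python
import time

def free_viewpoint_loop(read_head_pose, cut_out_view, show_frame,
                        wide_fov_image, period_s=1.0 / 60.0, running=lambda: True):
    """Hypothetical loop tying together steps F801-F804."""
    while running():
        yaw, pitch, roll = read_head_pose()                     # F802: pose from the tracker
        frame = cut_out_view(wide_fov_image, yaw, pitch, roll)  # F803: render the display angle of view
        show_frame(frame)                                       # F804: present the free viewpoint image
        time.sleep(period_s)                                    # wait for the next transmission period
```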
- In the image display system 100, when the user is viewing the free viewpoint image or the wide field-of-view image, the user cannot avoid seeing an image that may cause simulation sickness unintended by the user.
- In particular, when the display device 400 is used while fixed to the head or face of the user, as is the case with a head mounted display, simulation sickness is easily caused even in a relatively short time.
- Accordingly, an abnormal image that may cause simulation sickness is automatically detected so that an appropriate simulation sickness prevention operation can be performed.
- FIG. 9 illustrates an exemplary functional configuration that automatically detects an abnormal image.
- the illustrated abnormality detection function can be incorporated in the rendering processor 302, for example.
- An abnormality detection unit 901, to which the position and orientation information of the head is input from the head motion tracking device 200, detects whether the free viewpoint image that has been rendered with the procedure illustrated in FIGS. 7 and 8 is an image that may cause simulation sickness unintended by the user.
- the abnormality detection unit 901 detects that the free viewpoint image will become an abnormal free viewpoint image on the basis of the above equations (1) and (2).
- When detecting that the free viewpoint image will become an abnormal free viewpoint image, the abnormality detection unit 901 outputs a detection signal 902 to the display device 400 (or the display unit 402) and instructs an appropriate simulation sickness prevention operation to be executed. Note that the details of the simulation sickness prevention operation will be described later.
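- Equations (1) and (2) referred to above are not reproduced in this excerpt. As an illustration only, the sketch below flags a head-pose update as abnormal when its angular velocity or angular acceleration exceeds a limit; the threshold values and the function name are assumptions, not the criteria used by the patent.

```python
import numpy as np

# Assumed limits; the actual criteria of equations (1) and (2) are not given in this excerpt.
MAX_ANGULAR_VELOCITY = np.deg2rad(120.0)      # rad/s
MAX_ANGULAR_ACCELERATION = np.deg2rad(720.0)  # rad/s^2

def detect_abnormal_pose(prev_rpy, curr_rpy, prev_velocity, dt):
    """Return (is_abnormal, angular_velocity) for one head-pose update."""
    velocity = (np.asarray(curr_rpy) - np.asarray(prev_rpy)) / dt
    acceleration = (velocity - np.asarray(prev_velocity)) / dt
    is_abnormal = bool(np.any(np.abs(velocity) > MAX_ANGULAR_VELOCITY) or
                       np.any(np.abs(acceleration) > MAX_ANGULAR_ACCELERATION))
    return is_abnormal, velocity
```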
- FIG. 11 illustrates another exemplary functional configuration that automatically detects an abnormal image.
- the illustrated abnormality detection function can be incorporated in the rendering processor 302, for example.
- An abnormality detection unit 1101, to which the position and orientation information of the head is input from the head motion tracking device 200, detects whether the free viewpoint image that has been rendered with the procedure illustrated in FIGS. 7 and 8 is an image that may cause simulation sickness unintended by the user.
- The difference from the exemplary configuration illustrated in FIG. 9 is that a movement information acquisition unit 1102 that acquires movement information is provided.
- the movement information that the movement information acquisition unit 1102 acquires is information related to the movement of the image of the original contents on which the rendering processor 302 performs processing, such as the free viewpoint image, and is provided as metadata accompanying the contents, for example.
- When detecting that the free viewpoint image will become an abnormal free viewpoint image, the abnormality detection unit 1101 outputs a detection signal 1103 to the display device 400 (or the display unit 402) and instructs an appropriate simulation sickness prevention operation to be executed. Note that the details of the simulation sickness prevention operation will be described later.
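- A hypothetical sketch of such a metadata-based check is given below: the per-frame content motion carried in the metadata is compared with the measured head motion, and image motion that the head did not make is flagged when it exceeds a threshold. The metadata field name and the default threshold are assumptions, not values from the patent.

```python
def detect_abnormal_from_metadata(content_motion_deg_s, head_motion_deg_s, metadata):
    """Flag image motion that the head did not make when it exceeds the threshold
    carried in the content metadata (field name and default are assumptions)."""
    threshold = metadata.get("motion_threshold_deg_s", 60.0)
    unexpected_motion = abs(content_motion_deg_s - head_motion_deg_s)
    return unexpected_motion > threshold
```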
- FIG. 12 illustrates yet another exemplary functional configuration that automatically detects an abnormal image.
- the illustrated abnormality detection function may be incorporated in the rendering processor 302, for example.
- An image information acquisition unit 1202 acquires an image that is to be displayed on the display device 400 from the image source 304; for example, an image reproduced by a Blu-ray disc player is acquired. Then, an abnormality detection unit 1201 analyzes the image that the image information acquisition unit 1202 has acquired and detects whether it is an image that causes simulation sickness unintended by the user.
- each of the picture elements has a different optical flow.
- When detecting that an abnormal image will be displayed on the display device 400, the abnormality detection unit 1201 outputs a detection signal 1203 to the display device 400 (or the display unit 402) and instructs an appropriate simulation sickness prevention operation to be executed.
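- One way to realize this analysis is sketched below, under the assumption that OpenCV's Farneback dense optical flow and a mean-magnitude threshold are acceptable stand-ins for the analysis described in the patent; the threshold value is an assumption.

```python
import cv2
import numpy as np

def detect_abnormal_flow(prev_frame_bgr, curr_frame_bgr, max_mean_flow_px=20.0):
    """Flag a frame pair as abnormal when the mean optical-flow magnitude is large."""
    prev_gray = cv2.cvtColor(prev_frame_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame_bgr, cv2.COLOR_BGR2GRAY)
    # Dense Farneback flow: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)  # per-pixel flow length in pixels
    return float(magnitude.mean()) > max_mean_flow_px
```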
- the image display system 100 may be operated by combining the functional configuration illustrated in FIG. 12 that detects an abnormal image with the functional configuration illustrated in FIG. 9 or FIG. 11.
- the simulation sickness prevention operation that is performed in the display device 400 in accordance with the detection of abnormality in the image will be exemplified below.
- (1) The display unit 402 is blacked out.
- (2) A message screen indicating that an abnormality has been detected is displayed.
- (3) Display of the moving image is temporarily stopped.
- (4) Following of the position and orientation of the head in the free viewpoint image is temporarily stopped.
- (5) Video see-through display is performed.
- (6) Power of the display device 400 is turned off.
- (7) The state in which the display device 400 is fixed to the head or face of the user is canceled.
- Operations (1) to (6), which respond to the detection of an abnormality in the image, control the display on the display unit 402 and prevent simulation sickness unintended by the user from occurring.
- Operations (1) to (4) and (6) can be applied to display devices in general (including large-screen displays and multifunctional terminals such as smartphones and tablet computers) that display an image following the head motion.
- Operation (5) is a method for performing video see-through display by imaging the scenery in the line-of-sight direction of the user with an outside camera when the display device 400 is configured as a head mounted display (see FIGS. 3 and 4). The user not only can avoid seeing an image causing unintended simulation sickness, but can also avoid danger by indirectly viewing the scenery of the actual world.
- Operation (7) is a method that uses a mechanical operation. When the display device 400 is configured as a head mounted display (see FIGS. 3 and 4), this can be achieved by a mechanism that, for example, removes the fitted head mounted display from the user or removes only the display units rather than the whole head mounted display.
- A head mounted display has been proposed (see PTL 5, for example) in which the display surface is supported by a movable member so that it can be opened and closed; setting the display surface to the open state places the head mounted display in a second state in which the peripheral visual field of the user can be obtained.
- the above head mounted display can be applied to an embodiment of the technology disclosed in the present description.
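- The sketch below shows, purely as an assumed interface, how a received detection signal could be dispatched to one of the prevention operations (1) to (7) listed above; every method name on the display object is hypothetical and not defined by the patent.

```python
def run_sickness_prevention(display, operation):
    """Dispatch a received detection signal to one of the operations (1)-(7);
    every method name on the display object is hypothetical."""
    actions = {
        1: display.black_out,                                        # (1) black out the screen
        2: lambda: display.show_message("Abnormal image detected"),  # (2) show a message
        3: display.pause_video,                                      # (3) stop the moving image
        4: display.freeze_head_tracking,                             # (4) stop following the head pose
        5: display.enable_video_see_through,                         # (5) show the outside camera
        6: display.power_off,                                        # (6) turn the display off
        7: display.release_mount,                                    # (7) undo the fixation to the head or face
    }
    actions[operation]()
```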
- An embodiment of the technology disclosed in the present description is preferably applied to cases in which a free viewpoint image or a wide field-of-view image is viewed with an immersive head mounted display; however, needless to say, the technology can also be applied to a transmissive (see-through) head mounted display.
- an embodiment of the technology disclosed in the present description may be applied in a similar manner to a case in which the free viewpoint image is viewed not with a head mounted display but by fixing the screen of an information terminal, such as a smartphone or a tablet computer, on the head or the face and, furthermore, in a case in which an image with a wide angle of visibility is viewed with a large-screen display.
- an information terminal such as a smartphone or a tablet computer
- An image processing apparatus including: a control device configured to: detect an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed to the display; and generate a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
- the operation for prevention of simulation sickness includes at least one of blacking out the display, displaying a message screen indicating that the abnormality is detected, temporarily stopping display of the image of contents, temporarily stopping the generating of the free viewpoint image in accordance with the position or orientation information, causing the display to be see-through, turning off power of the display or canceling a state in which the display is fixed to a head or face of a user.
- the information indicating the movement of the image of the contents is provided as metadata accompanying the contents.
- the free viewpoint image is generated from an omnidirectional image or a wide field-of-view image.
- the free viewpoint image is an image of an area extracted from the omnidirectional image or the wide field-of-view image such that the free viewpoint image has a predetermined angle of view around a center of the area.
- the position or orientation information of the display is indicated in information from a sensor.
- the information from the sensor indicates movement of the display in a direction and orientation of the display.
- the information from the sensor indicates movement of a head of a user in a direction and orientation of the head of the user.
- An image processing method including: detecting, by a processing device, an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed to the display; and generating, by the processing device, a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
- (19) An image processing apparatus including: a control device configured to detect an abnormality in accordance with information indicating movement of an image of contents to be displayed, using a threshold value indicated in metadata related to the contents.
- (20) The apparatus according to (19), wherein the metadata is provided accompanying the contents.
- the threshold value is a time function associated with a scene of the contents.
- An image processing method including: detecting, by a processing device, an abnormality in accordance with information indicating movement of an image of contents to be displayed, using a threshold value indicated in metadata of the contents.
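- The idea that the threshold value is a time function associated with a scene of the contents could be realized, for example, by metadata that assigns each scene its own threshold and by looking the threshold up from the playback time, as in the following sketch; the field names and values are assumptions for illustration only.

```python
# Hypothetical metadata: each scene carries its own abnormality threshold, so the
# threshold effectively becomes a function of playback time.
scene_metadata = [
    {"start_s": 0.0,   "end_s": 120.0, "motion_threshold_deg_s": 60.0},
    {"start_s": 120.0, "end_s": 180.0, "motion_threshold_deg_s": 20.0},  # e.g. a fast action scene
]

def threshold_at(time_s, scenes=scene_metadata, default=60.0):
    """Return the abnormality threshold to apply at a given playback time."""
    for scene in scenes:
        if scene["start_s"] <= time_s < scene["end_s"]:
            return scene["motion_threshold_deg_s"]
    return default
```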
- 100 image display system, 200 head motion tracking device, 201 sensor unit, 202 position and orientation computation unit, 203 communication unit, 300 rendering device, 301 first communication unit, 302 rendering processor, 303 second communication unit, 304 image source, 400 display device, 401 communication unit, 402 display unit, 500 imaging device, 501 omnidirectional camera, 502 communication unit, 600 mobile device, 700 controller, 901 abnormality detection unit, 1101 abnormality detection unit, 1102 movement information acquisition unit, 1201 abnormality detection unit, 1202 image information acquisition unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Geometry (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
An image processing apparatus may include a control device configured to detect an abnormality in accordance with at least one of (i) position or orientation information of a display or (ii) information indicating movement of an image of contents to be displayed on the display; and to generate a free viewpoint image in accordance with the image of contents and the position or orientation information of the display.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP15734264.3A EP3175324A1 (fr) | 2014-07-28 | 2015-06-17 | Traitement d'informations pour une prévention de mal des transports dans un système d'affichage d'image |
| US15/318,116 US20170111636A1 (en) | 2014-07-28 | 2015-06-17 | Information processing apparatus, information processing method, computer program, and image display system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014-153351 | 2014-07-28 | ||
| JP2014153351A JP2016031439A (ja) | 2014-07-28 | 2014-07-28 | 情報処理装置及び情報処理方法、コンピューター・プログラム、並びに画像表示システム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016017062A1 true WO2016017062A1 (fr) | 2016-02-04 |
Family
ID=53510959
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/003033 Ceased WO2016017062A1 (fr) | 2014-07-28 | 2015-06-17 | Traitement d'informations pour une prévention de mal des transports dans un système d'affichage d'image |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20170111636A1 (fr) |
| EP (1) | EP3175324A1 (fr) |
| JP (1) | JP2016031439A (fr) |
| WO (1) | WO2016017062A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108040179A (zh) * | 2017-12-25 | 2018-05-15 | 广东欧珀移动通信有限公司 | 屏幕切换方法及相关产品 |
| EP3322186A1 (fr) * | 2016-11-14 | 2018-05-16 | Thomson Licensing | Procédé et dispositif de transmission des données représentatives d'une image |
| GB2574487A (en) * | 2018-10-26 | 2019-12-11 | Kagenova Ltd | Method and system for providing at least a portion of content having six degrees of freedom motion |
| GB2575932A (en) * | 2018-10-26 | 2020-01-29 | Kagenova Ltd | Method and system for providing at least a portion of content having six degrees of freedom motion |
| US10586514B2 (en) | 2017-05-01 | 2020-03-10 | Elbit Systems Ltd | Head mounted display device, system and method |
| EP3677993A4 (fr) * | 2017-08-29 | 2020-09-09 | Sony Corporation | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Families Citing this family (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6367166B2 (ja) * | 2015-09-01 | 2018-08-01 | 株式会社東芝 | 電子機器及び方法 |
| US20170115489A1 (en) * | 2015-10-26 | 2017-04-27 | Xinda Hu | Head mounted display device with multiple segment display and optics |
| JP6087453B1 (ja) * | 2016-02-04 | 2017-03-01 | 株式会社コロプラ | 仮想空間の提供方法、およびプログラム |
| KR102561860B1 (ko) * | 2016-10-25 | 2023-08-02 | 삼성전자주식회사 | 전자장치 및 그 제어방법 |
| CN110300994B (zh) | 2017-02-23 | 2023-07-04 | 索尼公司 | 图像处理装置、图像处理方法以及图像系统 |
| JP6955351B2 (ja) * | 2017-03-16 | 2021-10-27 | ヤフー株式会社 | 保護装置、保護方法および保護プログラム |
| CN109387939B (zh) * | 2017-08-09 | 2021-02-12 | 中强光电股份有限公司 | 近眼式显示装置及其显示影像的校正方法 |
| JP6934806B2 (ja) * | 2017-11-02 | 2021-09-15 | キヤノン株式会社 | 表示装置、表示装置の制御方法 |
| US11125997B2 (en) | 2017-11-10 | 2021-09-21 | Sony Interactive Entertainment Inc. | Information processing apparatus, information processing method, and program |
| JP7059662B2 (ja) * | 2018-02-02 | 2022-04-26 | トヨタ自動車株式会社 | 遠隔操作システム、及びその通信方法 |
| JP2019152980A (ja) * | 2018-03-01 | 2019-09-12 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
| US10558038B2 (en) * | 2018-03-16 | 2020-02-11 | Sharp Kabushiki Kaisha | Interpupillary distance adjustment mechanism for a compact head-mounted display system |
| CN109192158A (zh) * | 2018-09-21 | 2019-01-11 | 重庆惠科金渝光电科技有限公司 | 显示面板的控制方法、显示面板及存储介质 |
| US11043194B2 (en) | 2019-02-27 | 2021-06-22 | Nintendo Co., Ltd. | Image display system, storage medium having stored therein image display program, image display method, and display device |
| CN110232711B (zh) * | 2019-06-05 | 2021-08-13 | 中国科学院自动化研究所 | 面向海产品抓取的双目视觉实时感知定位方法、系统、装置 |
| JP6655751B1 (ja) * | 2019-07-25 | 2020-02-26 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | 映像表示制御装置、方法およびプログラム |
| US11333888B2 (en) | 2019-08-05 | 2022-05-17 | Facebook Technologies, Llc | Automatic position determination of head mounted display optics |
| US10991343B2 (en) * | 2019-08-05 | 2021-04-27 | Facebook Technologies, Llc | Automatic image alignment with head mounted display optics |
| US11023041B1 (en) * | 2019-11-07 | 2021-06-01 | Varjo Technologies Oy | System and method for producing images based on gaze direction and field of view |
| JP7467094B2 (ja) * | 2019-12-09 | 2024-04-15 | キヤノン株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| JP7643946B2 (ja) * | 2021-06-11 | 2025-03-11 | 株式会社デンソーテン | 情報処理装置、情報処理方法および情報処理プログラム |
| JP2024115563A (ja) * | 2021-07-12 | 2024-08-27 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| JP2025095216A (ja) * | 2023-12-14 | 2025-06-26 | 本田技研工業株式会社 | 車両用映像表示装置 |
| JP7784586B1 (ja) * | 2025-04-30 | 2025-12-11 | ナブテスコ株式会社 | 映像処理システム |
Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09106322A (ja) | 1995-10-09 | 1997-04-22 | Data Tec:Kk | ヘッドマウントディスプレイにおける姿勢角検出装置 |
| JP2001209426A (ja) | 2000-01-26 | 2001-08-03 | Nippon Telegr & Teleph Corp <Ntt> | 移動体制御装置 |
| EP1998554A1 (fr) * | 2006-03-23 | 2008-12-03 | Panasonic Corporation | Appareil d'imagerie de contenu |
| JP2010256534A (ja) | 2009-04-23 | 2010-11-11 | Fujifilm Corp | 全方位画像表示用ヘッドマウントディスプレイ装置 |
| US20120121138A1 (en) * | 2010-11-17 | 2012-05-17 | Fedorovskaya Elena A | Method of identifying motion sickness |
| JP2012141461A (ja) | 2010-12-29 | 2012-07-26 | Sony Corp | ヘッド・マウント・ディスプレイ |
| US20120320224A1 (en) * | 2011-06-14 | 2012-12-20 | Olympus Corporation | Information processing device, server system, image processing system, and information storage device |
| US20130093843A1 (en) * | 2010-03-10 | 2013-04-18 | Sung-Moon Chun | Method for configuring stereoscopic moving picture file |
| US20130235169A1 (en) * | 2011-06-16 | 2013-09-12 | Panasonic Corporation | Head-mounted display and position gap adjustment method |
| US20130241955A1 (en) * | 2010-11-09 | 2013-09-19 | Fujifilm Corporation | Augmented reality providing apparatus |
| JP2013200325A (ja) | 2012-03-23 | 2013-10-03 | Sony Corp | ヘッドマウントディスプレイ |
| JP2014120044A (ja) * | 2012-12-18 | 2014-06-30 | Dainippon Printing Co Ltd | 映像出力装置、映像出力方法、及びプログラム |
| JP2014128020A (ja) | 2012-12-27 | 2014-07-07 | Fujitsu Ltd | 移動局装置及び通信方法 |
Family Cites Families (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07216621A (ja) * | 1994-02-09 | 1995-08-15 | Sega Enterp Ltd | 頭部装着用具 |
| JPH11161190A (ja) * | 1997-11-25 | 1999-06-18 | Seiko Epson Corp | 頭部装着型表示装置 |
| JP3902907B2 (ja) * | 2000-06-29 | 2007-04-11 | キヤノン株式会社 | 画像処理装置及び方法と画像形成装置 |
| US20070121423A1 (en) * | 2001-12-20 | 2007-05-31 | Daniel Rioux | Head-mounted display apparatus for profiling system |
| JP3793142B2 (ja) * | 2002-11-15 | 2006-07-05 | 株式会社東芝 | 動画像加工方法及び装置 |
| JP2004219664A (ja) * | 2003-01-14 | 2004-08-05 | Sumitomo Electric Ind Ltd | 情報表示システム及び情報表示方法 |
| JP4285287B2 (ja) * | 2004-03-17 | 2009-06-24 | セイコーエプソン株式会社 | 画像処理装置、画像処理方法およびそのプログラム、記録媒体 |
| JP2006128780A (ja) * | 2004-10-26 | 2006-05-18 | Konica Minolta Photo Imaging Inc | デジタルカメラ |
| US20070012143A1 (en) * | 2005-07-13 | 2007-01-18 | Tracy Gary E | Socket for socket wrench |
| JP4525692B2 (ja) * | 2007-03-27 | 2010-08-18 | 株式会社日立製作所 | 画像処理装置、画像処理方法、画像表示装置 |
| KR101265956B1 (ko) * | 2007-11-02 | 2013-05-22 | 삼성전자주식회사 | 블록 기반의 영상 복원 시스템 및 방법 |
| JP5099697B2 (ja) * | 2008-03-31 | 2012-12-19 | 国立大学法人 名古屋工業大学 | 音声・ビデオ出力方式、音声・ビデオ出力方式実現プログラム及び音声・ビデオ出力装置 |
| JP2010050645A (ja) * | 2008-08-20 | 2010-03-04 | Olympus Corp | 画像処理装置、画像処理方法及び画像処理プログラム |
| JP2010257266A (ja) * | 2009-04-27 | 2010-11-11 | Sharp Corp | コンテンツ出力システム、サーバー装置、コンテンツ出力装置、コンテンツ出力方法、コンテンツ出力プログラム、及びコンテンツ出力プログラムを記憶した記録媒体 |
| JP5779968B2 (ja) * | 2010-05-19 | 2015-09-16 | リコーイメージング株式会社 | 天体自動追尾撮影方法及びカメラ |
| US9632315B2 (en) * | 2010-10-21 | 2017-04-25 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more fresnel lenses |
| US9615064B2 (en) * | 2010-12-30 | 2017-04-04 | Pelco, Inc. | Tracking moving objects using a camera network |
| WO2013150789A1 (fr) * | 2012-04-05 | 2013-10-10 | パナソニック株式会社 | Dispositif d'analyse vidéo, procédé d'analyse vidéo, programme et circuit intégré |
| US9978180B2 (en) * | 2016-01-25 | 2018-05-22 | Microsoft Technology Licensing, Llc | Frame projection for augmented reality environments |
| US9978181B2 (en) * | 2016-05-25 | 2018-05-22 | Ubisoft Entertainment | System for virtual reality display |
-
2014
- 2014-07-28 JP JP2014153351A patent/JP2016031439A/ja active Pending
-
2015
- 2015-06-17 US US15/318,116 patent/US20170111636A1/en not_active Abandoned
- 2015-06-17 EP EP15734264.3A patent/EP3175324A1/fr not_active Withdrawn
- 2015-06-17 WO PCT/JP2015/003033 patent/WO2016017062A1/fr not_active Ceased
Patent Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09106322A (ja) | 1995-10-09 | 1997-04-22 | Data Tec:Kk | ヘッドマウントディスプレイにおける姿勢角検出装置 |
| JP2001209426A (ja) | 2000-01-26 | 2001-08-03 | Nippon Telegr & Teleph Corp <Ntt> | 移動体制御装置 |
| EP1998554A1 (fr) * | 2006-03-23 | 2008-12-03 | Panasonic Corporation | Appareil d'imagerie de contenu |
| JP2010256534A (ja) | 2009-04-23 | 2010-11-11 | Fujifilm Corp | 全方位画像表示用ヘッドマウントディスプレイ装置 |
| US20130093843A1 (en) * | 2010-03-10 | 2013-04-18 | Sung-Moon Chun | Method for configuring stereoscopic moving picture file |
| US20130241955A1 (en) * | 2010-11-09 | 2013-09-19 | Fujifilm Corporation | Augmented reality providing apparatus |
| US20120121138A1 (en) * | 2010-11-17 | 2012-05-17 | Fedorovskaya Elena A | Method of identifying motion sickness |
| JP2012141461A (ja) | 2010-12-29 | 2012-07-26 | Sony Corp | ヘッド・マウント・ディスプレイ |
| US20120320224A1 (en) * | 2011-06-14 | 2012-12-20 | Olympus Corporation | Information processing device, server system, image processing system, and information storage device |
| US20130235169A1 (en) * | 2011-06-16 | 2013-09-12 | Panasonic Corporation | Head-mounted display and position gap adjustment method |
| JP2013200325A (ja) | 2012-03-23 | 2013-10-03 | Sony Corp | ヘッドマウントディスプレイ |
| JP2014120044A (ja) * | 2012-12-18 | 2014-06-30 | Dainippon Printing Co Ltd | 映像出力装置、映像出力方法、及びプログラム |
| JP2014128020A (ja) | 2012-12-27 | 2014-07-07 | Fujitsu Ltd | 移動局装置及び通信方法 |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3322186A1 (fr) * | 2016-11-14 | 2018-05-16 | Thomson Licensing | Procédé et dispositif de transmission des données représentatives d'une image |
| WO2018086960A1 (fr) * | 2016-11-14 | 2018-05-17 | Thomson Licensing | Procédé et dispositif de transmission de données représentant une image |
| US10586514B2 (en) | 2017-05-01 | 2020-03-10 | Elbit Systems Ltd | Head mounted display device, system and method |
| US11004425B2 (en) | 2017-05-01 | 2021-05-11 | Elbit Systems Ltd. | Head mounted display device, system and method |
| EP3677993A4 (fr) * | 2017-08-29 | 2020-09-09 | Sony Corporation | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
| CN108040179A (zh) * | 2017-12-25 | 2018-05-15 | 广东欧珀移动通信有限公司 | 屏幕切换方法及相关产品 |
| GB2574487A (en) * | 2018-10-26 | 2019-12-11 | Kagenova Ltd | Method and system for providing at least a portion of content having six degrees of freedom motion |
| GB2575932A (en) * | 2018-10-26 | 2020-01-29 | Kagenova Ltd | Method and system for providing at least a portion of content having six degrees of freedom motion |
| GB2575932B (en) * | 2018-10-26 | 2020-09-09 | Kagenova Ltd | Method and system for providing at least a portion of content having six degrees of freedom motion |
| US10902554B2 (en) | 2018-10-26 | 2021-01-26 | Kagenova Limited | Method and system for providing at least a portion of content having six degrees of freedom motion |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3175324A1 (fr) | 2017-06-07 |
| US20170111636A1 (en) | 2017-04-20 |
| JP2016031439A (ja) | 2016-03-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2016017062A1 (fr) | Traitement d'informations pour une prévention de mal des transports dans un système d'affichage d'image | |
| US10692300B2 (en) | Information processing apparatus, information processing method, and image display system | |
| EP3029552B1 (fr) | Système de réalité virtuelle et procédé de commande de modes de fonctionnement d'un tel système | |
| KR102233223B1 (ko) | 화상 표시 장치 및 화상 표시 방법, 화상 출력 장치 및 화상 출력 방법과, 화상 표시 시스템 | |
| EP3008548B1 (fr) | Systèmes et appareil de casque | |
| US11190756B2 (en) | Head-mountable display system | |
| US11435593B1 (en) | Systems and methods for selectively augmenting artificial-reality experiences with views of real-world environments | |
| JP6540691B2 (ja) | 頭部位置検出装置及び頭部位置検出方法、画像処理装置及び画像処理方法、表示装置、並びにコンピューター・プログラム | |
| JP6378781B2 (ja) | 頭部装着型表示装置、及び映像表示システム | |
| US9582073B2 (en) | Image processing device and image processing method, display device and display method, computer program, and image display system | |
| JPWO2016017245A1 (ja) | 情報処理装置及び情報処理方法、並びに画像表示システム | |
| CN111602082A (zh) | 用于包括传感器集成电路的头戴式显示器的位置跟踪系统 | |
| GB2517057A (en) | Head-mountable apparatus and systems | |
| WO2016199731A1 (fr) | Visiocasque, procédé de commande d'affichage et programme | |
| US20240196045A1 (en) | Video display system, observation device, information processing method, and recording medium | |
| US12205502B2 (en) | Information processing apparatus and information processing method | |
| WO2016158080A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme | |
| WO2018096315A1 (fr) | Réalité virtuelle | |
| JP2018190078A (ja) | コンテンツ表示プログラム、コンピュータ装置、コンテンツ表示方法、及びコンテンツ表示システム | |
| WO2016082063A1 (fr) | Dispositif de commande de casque d'affichage 3d |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15734264; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 15318116; Country of ref document: US |
| | REEP | Request for entry into the european phase | Ref document number: 2015734264; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2015734264; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |