
WO2016052061A1 - Visiocasque - Google Patents

Visiocasque (Head-Mounted Display)

Info

Publication number
WO2016052061A1
WO2016052061A1 (application PCT/JP2015/074907 / JP2015074907W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
proximity sensor
screen
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/074907
Other languages
English (en)
Japanese (ja)
Inventor
加藤 剛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc filed Critical Konica Minolta Inc
Publication of WO2016052061A1 (fr)
Anticipated expiration: status Critical
Current legal status: Ceased

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00-G02B26/00, G02B30/00
    • G02B27/02 Viewing or reading apparatus
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/64 Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • The present invention relates to a head-mounted display.
  • In recent years, eyeglass-type head-mounted displays (HMDs) have been increasing as products that general users can use.
  • Such an HMD has a see-through display installed in front of the user's eyes; the user can visually recognize the image (virtual image) displayed on the display and, through the same display, the real image of the outside world.
  • A smartphone or the like employs a touch-type image display screen as a user interface, and a predetermined input can be performed by touching the screen.
  • As an example using a non-contact user interface, Patent Document 1 discloses a head-mounted display in which, when a user's hand enters the imaging range of a camera, the hand is extracted from the captured image, its gesture is recognized, and the display image is switched to the image on the next page.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an HMD having a non-contact user interface that allows intuitive operation and has low power consumption and a low risk of erroneous detection.
  • A head-mounted display reflecting one aspect of the present invention includes: a mounting member worn on the user's head; a display unit that has a see-through display member with a screen on which an image can be displayed so as to be visible to the user, the display unit being supported by the mounting member so that the display member is positioned in front of at least one of the user's eyes; a proximity sensor that is supported by the mounting member, detects the presence of an object in a detection area within a proximity range in front of it, and generates an output; and a control device that controls the screen display of the display unit based on the output of the proximity sensor.
  • When the mounting member is worn on the user's head, the detection area of the proximity sensor is located within the visual field of the eye of the user facing the display member.
  • According to this head-mounted display, it is possible to provide a head-mounted display having a non-contact user interface with low power consumption and a low risk of erroneous detection.
  • In addition, intuitive operation is possible, so a head-mounted display having a user interface with excellent operability can be provided.
  • In the following, “HMD” stands for head-mounted display.
  • FIG. 1 is a perspective view of the HMD 100 according to the present embodiment.
  • FIG. 2 is a front view of the HMD 100 according to the present embodiment.
  • FIG. 3 is a view of the HMD 100 according to the present embodiment as viewed from above.
  • the right side and the left side of the HMD 100 refer to the right side and the left side for the user wearing the HMD 100.
  • The HMD 100 of this embodiment has a frame 101 as a mounting member.
  • The frame 101, which is U-shaped when viewed from above, has a front part 101a to which two spectacle lenses 102 are attached, and side parts 101b and 101c extending rearward from both ends of the front part 101a.
  • The two spectacle lenses 102 attached to the frame 101 may or may not have refractive power.
  • A cylindrical main body 103 is fixed to the front part 101a of the frame 101, above the spectacle lens 102 on the right side (it may be on the left side, according to the user's dominant eye).
  • The main body 103 is provided with the display unit 104.
  • Inside the main body 103, a display control unit 104DR (see FIG. 6, described later) that controls display on the display unit 104 based on instructions from the processor 121 (described later) is disposed. If necessary, display units may be arranged in front of both eyes.
  • FIG. 4 is a schematic cross-sectional view showing the configuration of the display unit 104.
  • The display unit 104 includes an image forming unit 104A and an image display unit 104B.
  • The image forming unit 104A is incorporated in the main body 103 and includes a light source 104a, a unidirectional diffuser 104b, a condenser lens 104c, and a display element 104d.
  • The image display unit 104B is a so-called see-through display member: a generally plate-shaped assembly of the eyepiece prism 104f, the deflecting prism 104g, and the hologram optical element 104h, arranged to extend downward from the main body 103 in parallel with one spectacle lens 102 (see FIG. 1).
  • The light source 104a has the function of illuminating the display element 104d.
  • The light source 104a emits light having a predetermined wavelength width, so the image light obtained by illuminating the display element 104d also has a predetermined wavelength width; when the hologram optical element 104h diffracts this image light, the user can observe the image over the entire observation angle of view at the position of the pupil B. Further, since the peak wavelength of each color of the light source 104a is set near the peak wavelength of the diffraction efficiency of the hologram optical element 104h, light use efficiency is improved.
  • Since the light source 104a is composed of LEDs that emit RGB light, its cost can be kept low, and when the display element 104d is illuminated, a color image can be displayed on the display element 104d and visually recognized by the user.
  • Since each of the RGB LED elements has a narrow emission wavelength width, using a plurality of such LED elements enables bright image display with high color reproducibility.
  • The display element 104d displays an image by modulating the light emitted from the light source 104a in accordance with image data, and is configured as a transmissive liquid crystal display element having, in a matrix, pixels that serve as light-transmitting regions. Note that the display element 104d may instead be of a reflective type.
  • The eyepiece prism 104f totally reflects the image light from the display element 104d, incident through its base end face PL1, between the opposed parallel inner side face PL2 and outer side face PL3, and guides it to the user's pupil via the hologram optical element 104h; it also transmits external light to the user's pupil. Together with the deflecting prism 104g, it is made of, for example, acrylic resin.
  • The eyepiece prism 104f and the deflecting prism 104g are joined with an adhesive, with the hologram optical element 104h sandwiched between their inclined surfaces PL4 and PL5, which are inclined with respect to the inner surface PL2 and the outer surface PL3.
  • Joined in this way, the deflecting prism 104g forms with the eyepiece prism 104f a substantially parallel flat plate, which prevents distortion in the external image observed by the user through the display unit 104.
  • The hologram optical element 104h is a volume-phase reflection hologram that diffracts and reflects the image light (light at wavelengths corresponding to the three primary colors) emitted from the display element 104d, guiding it to the pupil B so that the image displayed on the display element 104d is enlarged and led to the user's pupil as a virtual image.
  • The hologram optical element 104h diffracts (reflects) light in, for example, three wavelength ranges, 465 ± 5 nm (B light), 521 ± 5 nm (G light), and 634 ± 5 nm (R light), each defined by the peak wavelength of the diffraction efficiency and the wavelength width at half the maximum diffraction efficiency.
  • Here, the peak wavelength of diffraction efficiency is the wavelength at which the diffraction efficiency reaches its peak, and the wavelength width at half maximum is the wavelength width over which the diffraction efficiency is at least half of that peak.
  • The reflection-type hologram optical element 104h has high wavelength selectivity: it diffracts and reflects only light with wavelengths in the above ranges (near the exposure wavelengths), while transmitting external light at other wavelengths, so high external light transmittance can be realized.
  • The light emitted from the light source 104a is diffused by the unidirectional diffuser 104b, condensed by the condenser lens 104c, and enters the display element 104d.
  • The light incident on the display element 104d is modulated pixel by pixel based on the image data input from the display control unit 104DR and emitted as image light, so that a color image is displayed on the display element 104d.
  • The image light from the display element 104d enters the eyepiece prism 104f from its base end face PL1, is totally reflected a plurality of times by the inner side face PL2 and the outer side face PL3, and enters the hologram optical element 104h.
  • The light incident on the hologram optical element 104h is reflected there, passes through the inner side face PL2, and reaches the pupil B.
  • At the position of the pupil B, the user can observe an enlarged virtual image of the image displayed on the display element 104d and visually recognize it as a screen formed on the image display unit 104B.
  • In this case, the hologram optical element 104h can be regarded as constituting the screen, or a screen can be considered to be formed on the inner surface PL2. Note that “screen” may also refer to the image being displayed.
  • Since the eyepiece prism 104f, the deflecting prism 104g, and the hologram optical element 104h transmit almost all external light, the user can observe an external field image (real image) through them. The virtual image of the image displayed on the display element 104d is therefore observed overlapping a part of the external image; in this manner, the user of the HMD 100 can simultaneously observe the image provided by the display element 104d and the external image via the hologram optical element 104h. Note that when the display unit 104 is in the non-display state, the image display unit 104B becomes transparent and only the external image can be observed.
  • In the present embodiment, the display unit 104 is configured by combining a light source, a liquid crystal display element, and an optical system; however, a self-luminous display element (for example, an organic EL display element) may be used instead. Further, a transmissive organic EL display panel, which is transparent in the non-emitting state, may be used.
  • On the front face of the main body 103, the proximity sensor 105 is disposed near the center and the lens 106a of the camera 106 is disposed near the side, both facing forward.
  • Here, a “proximity sensor” is a sensor that outputs a signal by detecting whether an object, for example a part of the human body, is present in a detection region within the proximity range in front of the sensor's detection surface, in order to detect that the object has come close to the user's eyes.
  • The “proximity range” refers to a range within 200 mm of the detection surface of the proximity sensor. If the distance from the proximity sensor is 200 mm or less, the user can move the palm into and out of the field of view with the arm bent, so gestures using the hand are easy to perform, and there is less risk of erroneously detecting a human body or furniture other than the user.
  • When an object enters the detection area within the proximity range in front of the proximity sensor, the control device determines from the sensor's output signal that the object is present; in other words, an effective signal is output from the proximity sensor to the control device.
  • A passive proximity sensor has a detection unit that detects invisible light or electromagnetic waves emitted from an object as it approaches.
  • Examples of passive proximity sensors are pyroelectric sensors, which detect invisible light such as infrared rays emitted from an approaching human body, and capacitance sensors, which detect the change in electrostatic capacitance between the sensor and an approaching human body.
  • An active proximity sensor includes a projection unit for invisible light or sound waves and a detection unit that receives the invisible light or sound waves reflected back from an object.
  • Active proximity sensors include infrared sensors, which project infrared rays and receive the infrared rays reflected by an object; laser sensors, which project a laser beam and receive the beam reflected by an object; and ultrasonic sensors, which project ultrasonic waves and receive the waves reflected by an object. Note that passive proximity sensors excel in low power consumption, whereas active proximity sensors can more easily achieve reliable detection and can detect the hand even when the user is wearing gloves, for example. A plurality of types of proximity sensors may be used in combination.
  • Proximity sensors are generally smaller and cheaper than cameras, and consume less power.
  • A proximity sensor cannot perform complicated detection such as recognizing the shape of an object, but it can determine the approach or withdrawal of an object, so the HMD can be operated by passing or holding a hand in front of it; the complicated image processing required for gesture recognition by analysis of camera images is also unnecessary.
  • FIG. 5 is an enlarged view of the proximity sensor 105 used in the present embodiment as viewed from the front.
  • The proximity sensor 105 includes a light receiving unit 105a for invisible light, such as infrared light, emitted from a human body.
  • The light receiving unit 105a has light receiving regions RA to RD as detection units arranged in 2 rows and 2 columns; when invisible light is received, each of the light receiving regions RA to RD individually outputs a corresponding signal.
  • A right sub-main body 107 is attached to the right side portion 101b of the frame 101, and a left sub-main body 108 is attached to the left side portion 101c of the frame 101.
  • The right sub-main body 107 and the left sub-main body 108 have an elongated plate shape and have elongated protrusions 107a and 108a, respectively, on their inner sides. The elongated protrusion 107a engages an elongated hole 101d in the side of the frame 101, so the right sub-main body 107 is attached to the frame 101 in a positioned state, and the elongated protrusion 108a engages an elongated hole 101e, so the left sub-main body 108 is likewise attached in a positioned state.
  • An acceleration sensor 109 and a gyro 110 (see FIG. 6 to be described later) as an attitude detection device that generates an output corresponding to the attitude are mounted in the right sub-main body unit 107.
  • A sound unit 111 (see FIG. 6, described later) having a speaker (or earphone) and a microphone is housed in the left sub-main body 108.
  • The main body 103 and the right sub-main body 107 are connected so as to be able to transmit signals through a wiring HS, and the main body 103 and the left sub-main body 108 are connected so as to be able to transmit signals through a wiring (not shown).
  • As schematically illustrated in FIG. 1, the right sub-main body 107 is connected to the control unit CTU via a cord CD extending from its rear end.
  • The acceleration sensor 109 and the gyro 110 may be integrated into a six-axis sensor, and the attitude of the HMD can be determined from the output signals they produce.
  • The main body 103 and the left sub-main body 108 may instead be configured to be connected wirelessly.
  • FIG. 6 is a block diagram of main circuits of the HMD 100.
  • The control unit CTU has a processor 121, an operation unit 122, a communication unit 123 that exchanges data with the outside, a ROM 124 that stores programs, a RAM 125 that stores image data, and a battery 126 that supplies power to each unit.
  • An application processor of the kind used in smartphones can serve as the processor 121, but the type of processor is not limited. For example, an application processor that has the hardware necessary for image processing, such as a GPU and a codec, built in as standard is well suited to a compact HMD.
  • The processor 121 and the display control unit 104DR constitute a control device that controls screen display based on the output of the proximity sensor 105; the output signal of the proximity sensor 105 is input to the processor 121, and the processor 121 controls image display on the display unit 104 via the display control unit 104DR.
  • The processor 121 also receives the signals of the gyro 110 and the acceleration sensor 109, which serve as the attitude detection device, and the output signal of the microphone of the sound unit 111.
  • The processor 121 receives power from the battery 126, operates in accordance with a program stored in the ROM 124, and, in response to an operation input such as power-on from the operation unit 122, takes in image data from the camera 106 and stores it in the RAM 125; communication with the outside can be performed via the communication unit 123 as necessary. Furthermore, as described later, the processor 121 can detect a gesture operation in a non-contact manner and execute image control corresponding to the detected gesture, and it can execute a non-display mode, also described later, through the display control unit 104DR.
  • FIG. 7 is a front view when the user US wears the HMD 100 of the present embodiment.
  • FIG. 8 is a side view (a) and a top view (b) when the user US wears the HMD 100, and shows it together with the user's hand.
  • FIG. 9 is a diagram showing an image visually recognized by the user US through the see-through type image display unit 104B.
  • A gesture operation is an operation in which at least the hand HD of the user US approaches or withdraws, and it can be detected by the processor 121 of the HMD 100 via the proximity sensor 105.
  • As shown in FIG. 9, the screen 104i of the image display unit 104B is arranged so as to overlap the effective visual field EV of the user's eye facing the image display unit 104B (here, positioned within the effective visual field EV).
  • Further, the detection area SA of the proximity sensor 105 lies within the visual field of the user's eye facing the image display unit 104B.
  • It is desirable that the detection area SA be located within the stable fixation field of the user's eye or within the visual field (within about 90° horizontally and within about 70° vertically), and more preferably within the stable fixation field.
  • Still more preferably, the proximity sensor 105 is installed with its arrangement and orientation adjusted so that the detection area SA overlaps the effective visual field EV (within about 30° horizontally and within about 20° vertically).
  • Most preferably, the detection area SA overlaps the screen 104i.
  • By setting the detection area SA of the proximity sensor 105 within the visual field of the eye of the user US while the user US wears the frame 101, which is the mounting member, on the head, the user can, while observing the hand through the screen 104i, reliably recognize the approach and withdrawal of the hand to and from the detection area SA without accompanying eye movement.
  • The gesture operation can therefore be performed reliably while the detection area SA is recognized.
  • In particular, if the detection area SA overlaps the screen 104i, the gesture operation can be performed still more reliably.
  • Here, the entire set of detection units is regarded as one light receiving unit, and the maximum detection range of that light receiving unit is regarded as the detection area.
  • As shown in FIG. 9, when the detection area SA of the proximity sensor 105 is set so as to overlap the screen 104i, displaying an image indicating the detection area SA on the screen 104i (for example, marking the boundary of the area SA with a solid line) lets the user reliably recognize the detection area SA, so gesture operations can be performed still more reliably.
  • Gesture operation detection proceeds as follows. If there is nothing in front of the user US, the light receiving unit 105a receives no invisible light, so the processor 121 determines that no gesture operation is being performed. On the other hand, as shown in FIG. 8, when the user US brings the hand HD close to the eyes, the invisible light emitted from the hand HD is detected by the light receiving unit 105a, as indicated by the dotted line, and the processor 121 accordingly determines that a gesture operation has been performed.
  • As described above, the light receiving unit 105a has the light receiving regions RA to RD arranged in 2 rows and 2 columns (see FIG. 5). Therefore, when the user US moves the hand HD toward the front of the HMD 100 from the left, right, top, or bottom, the signals detected in the light receiving regions RA to RD rise at different times.
  • FIGS. 10 and 11 show examples of the signal waveforms of the light receiving regions RA to RD, where the vertical axis represents signal intensity and the horizontal axis represents time.
  • When the hand HD approaches from the right, the light receiving regions RA and RC receive invisible light first. Therefore, as shown in FIG. 10, the signals of regions RA and RC rise first, the signals of regions RB and RD rise later, then the signals of RA and RC fall, and finally the signals of RB and RD fall.
  • The processor 121 detects this timing and determines that the user US has performed a gesture operation by moving the hand HD from right to left. In accordance with the movement of the hand HD, the processor 121 can then control the display control unit 104DR to change the display, for example turning the page from image G1 to image G2 as shown in FIG. 9.
  • On the other hand, when the hand HD approaches from above, the signals of the light receiving regions RA and RB rise first, the signals of regions RC and RD rise after a delay, then the signals of RA and RB fall, and finally the signals of RC and RD fall.
  • The processor 121 detects this timing and determines that the user US has performed a gesture operation by moving the hand HD from top to bottom.
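  • The timing comparison just described can be condensed into a short sketch. The following Python fragment is illustrative only; the function name, the timestamp dictionary, and the debounce margin are assumptions of this description rather than part of the embodiment:

    # Map the rise order of regions RA-RD to an entry direction, following
    # the cases described above. rise[r] is the time (in seconds) at which
    # region r's signal rose.
    def entry_direction(rise, margin=0.02):
        ra_rc = min(rise["RA"], rise["RC"])   # rises first on right-to-left entry
        rb_rd = min(rise["RB"], rise["RD"])   # rises first on left-to-right entry
        ra_rb = min(rise["RA"], rise["RB"])   # rises first on top-to-bottom entry
        rc_rd = min(rise["RC"], rise["RD"])   # rises first on bottom-to-top entry
        dx = rb_rd - ra_rc                    # > 0 when RA/RC rose first
        dy = rc_rd - ra_rb                    # > 0 when RA/RB rose first
        if max(abs(dx), abs(dy)) < margin:
            return None                       # near-simultaneous rise: ambiguous
        if abs(dx) >= abs(dy):
            return "right_to_left" if dx > 0 else "left_to_right"
        return "top_to_bottom" if dy > 0 else "bottom_to_top"

  Applying the same comparison to the falling edges yields the withdrawal direction used in the flow described below.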
  • As described above, by using a proximity sensor and positioning its detection area within the visual field of the user's eye facing the image display unit, the presence and movement of the hand HD can be reliably detected. The user US sees the hand movement and the resulting operation together, so intuitive operation can be realized. Furthermore, since a proximity sensor is smaller and consumes less power than a camera, the continuous operating time of the HMD 100 can be extended.
  • While gesture detection is not needed, the execution of the gesture detection process by the processor 121, or the energization of the proximity sensor, may be temporarily interrupted to save power.
  • The operation for detecting gesture operations can be started and stopped by any means (for example, by the operation unit 122).
  • Alternatively, the processor 121 may start and stop the detecting operation in response to a gesture operation itself.
  • For example, suppose the proximity sensor 105 is operated intermittently and the user US brings the hand HD in front of the proximity sensor 105 and holds it there for a predetermined time (for example, about one second). The light receiving unit 105a then outputs a signal continuously at each intermittent sampling during that time, and the processor 121, detecting this, can start the control for detecting gesture operations or return the proximity sensor 105 to the normal detection operation.
  • The gesture operation described above can also be used to cancel a sleep mode.
  • Further, since the light receiving unit 105a of the proximity sensor 105 can detect the relative amount of invisible light received, moving the hand HD toward or away from the proximity sensor 105 changes the received amount, and from this change the processor 121 can distinguish different hand operations; it is conceivable to detect such a change and assign it to starting or stopping gesture detection.
  • Since the proximity sensor 105 may, owing to its detection characteristics, respond to an object other than the hand HD of the user US (for example, another person's hand), it is also preferable, from the viewpoint of reducing malfunctions, to use means other than hand detection for starting and stopping the gesture detection operation.
  • For example, by providing the processor 121 with a voice recognition function, gesture detection may be started or stopped when the microphone of the sound unit 111 picks up the user US saying “start” or “stop” and the processor 121 analyzes and recognizes the resulting output signal. When listening to music, the volume can likewise be adjusted by voice recognition.
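  • The intermittent-operation trigger described above (holding the hand still for about one second) might look like the following sketch; sensor.sample() is an assumed helper that returns True while the light receiving unit 105a is detecting invisible light:

    import time

    def wait_for_hold(sensor, hold_time=1.0, period=0.1):
        # Poll the proximity sensor intermittently (saving power) and return
        # once detection has continued uninterrupted for hold_time seconds.
        held = 0.0
        while held < hold_time:
            held = held + period if sensor.sample() else 0.0
            time.sleep(period)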
  • In step S102, the processor 121 waits until the light receiving unit 105a detects invisible light. When invisible light is detected, the processor 121 determines in step S103 the entry direction of the hand HD from the timing of the signals in the regions RA to RD, as follows. (1) Signals rise in regions RA and RC first, followed by regions RB and RD: the hand HD has entered from right to left. (2) Signals rise in regions RB and RD first, followed by regions RA and RC: the hand HD has entered from left to right. (3) Signals rise in regions RA and RB first, followed by regions RC and RD: the hand HD has entered from top to bottom. (4) Signals rise in regions RC and RD first, followed by regions RA and RB: the hand HD has entered from bottom to top.
  • In step S104, the processor 121 stores the determined entry direction of the hand HD.
  • In step S105, the processor 121 waits until the light receiving unit 105a no longer detects invisible light. When detection stops, the processor 121 determines in step S106 the withdrawal direction of the hand HD from the timing of the signals in the regions RA to RD, as follows. (5) Signals fall in regions RA and RC first, followed by regions RB and RD: the hand HD left from right to left. (6) Signals fall in regions RB and RD first, followed by regions RA and RC: the hand HD left from left to right. (7) Signals fall in regions RA and RB first, followed by regions RC and RD: the hand HD left from top to bottom. (8) Signals fall in regions RC and RD first, followed by regions RA and RB: the hand HD left from bottom to top.
  • In step S107, the processor 121 determines whether the entry direction and the withdrawal direction of the hand HD match. If they do not match, there is a risk that the gesture operation has been erroneously detected, so in this example the gesture is not accepted and the process proceeds to step S113. Alternatively, a case where the entry and withdrawal directions differ may itself be detected and used for another control. The time from entry to withdrawal of the hand HD may also be measured with a timer, and only a movement completed within the allotted time judged to be a correct gesture operation; for example, if no withdrawal signal is output within one second or more of the entry signal, the gesture may be ignored or judged to be a gesture of another pattern.
  • If it is determined in step S107 that the entry and withdrawal directions of the hand HD match, the processor 121 determines the gesture operation in the subsequent step S108. Specifically, when it determines that gesture (1) + (5) was performed continuously, in step S109 it controls the display unit 104 via the display control unit 104DR so that the screen moves (turns or scrolls) from right to left. When it determines that gesture (2) + (6) was performed continuously, in step S110 it controls the display unit 104 via the display control unit 104DR so that the screen moves from left to right.
  • When it determines that gesture (3) + (7) was performed continuously, in step S111 the processor 121 controls the display unit 104 via the display control unit 104DR so that the screen moves from top to bottom.
  • When it determines that gesture (4) + (8) was performed continuously, in step S112 the processor 121 controls the display unit 104 via the display control unit 104DR so that the screen moves from bottom to top.
  • In this way, the user US can turn pages and scroll images in the same direction as the hand movement while visually confirming, through the image display unit 104B, the hand HD entering and leaving the space in front.
  • The desired operation can thus be performed intuitively, which makes the HMD easy to use.
  • In step S113, unless a signal instructing the end of gesture detection has been input, the processor 121 returns the flow to step S102 and continues control; if such a signal has been input, the processor 121 ends the gesture detection control process.
  • Since gesture operations are likely to be performed in succession, the processor normally waits for the next gesture operation; however, if the user US forgets to stop detection, power is wasted during long periods of non-operation, so gesture detection may also be stopped by a timer set in the processor 121.
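  • Putting steps S102 to S113 together, the control flow might be sketched as follows. Here sensor.wait_rise() and sensor.wait_fall() are assumed to block until the light receiving unit 105a starts and stops detecting invisible light, returning per-region timestamps, and entry_direction is the helper sketched earlier:

    import time

    SCROLL = {  # gesture -> screen movement, per steps S109 to S112
        "right_to_left": "move_left",
        "left_to_right": "move_right",
        "top_to_bottom": "move_down",
        "bottom_to_top": "move_up",
    }

    def gesture_loop(sensor, display, stop_requested, max_dwell=1.0):
        while not stop_requested():              # S113: continue unless told to end
            rise = sensor.wait_rise()            # S102: hand enters the detection area
            entered = entry_direction(rise)      # S103: determine entry direction
            t_enter = time.monotonic()           # S104: remember the entry
            fall = sensor.wait_fall()            # S105: hand leaves the detection area
            left = entry_direction(fall)         # S106: same ordering logic on the falls
            too_slow = time.monotonic() - t_enter > max_dwell
            if entered is None or entered != left or too_slow:
                continue                         # S107: mismatch or timeout, ignore
            display.move(SCROLL[entered])        # S108 to S112: move the screen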
  • FIG. 13 is a diagram illustrating an example of an image that is switched by a gesture operation.
  • Up, down, left, and right movements of the hand are recognized as swipe operations, and the screens are switched with a sliding transition accordingly.
  • The following operations are controlled by the processor 121 in accordance with signals from the proximity sensor 105.
  • First, the home screen G11 is displayed on the display unit 104. In the home screen G11 shown in FIG. 13, the current date, temperature, humidity, and the like are displayed at the top of the screen.
  • When a gesture operation is performed by moving the hand HD from bottom to top, a non-display mode in which no screen display is performed is entered, and the display unit 104 switches to the non-display screen G01.
  • In this state the user US can observe only the external image through the display unit 104. While the non-display mode is being executed, performing a gesture operation by moving the hand HD from top to bottom cancels the non-display mode and returns the display unit 104 to the home screen G11.
  • When a gesture operation is performed by moving the hand HD from right to left while the home screen G11 is displayed, the music playback mode is entered and the music title display screen G10 is displayed on the display unit 104.
  • On this screen, the title, artist, and the like of the music playing from the speaker or earphone are displayed at the top, and a volume indicator VOL is displayed on the left side.
  • Moving the hand HD from bottom to top raises the playback volume one step at a time; conversely, moving the hand HD from top to bottom lowers the volume one step at a time, and the display changes accordingly. When the corresponding return gesture is performed, the display unit 104 switches back to the home screen G11.
  • When the corresponding gesture operation is performed from the home screen G11, the imaging mode is entered and the imaging angle-of-view display screen G12 is displayed on the display unit 104.
  • On this screen, the user US can use the camera 106 to capture, as a still image or a moving image, a subject framed within the rectangular frame displayed on the imaging angle-of-view display screen G12.
  • Imaging may be started and ended in accordance with the switching of the screen, or started and ended after a standby time has elapsed from that point; imaging may also be started and stopped by operation of the operation unit or by voice. From this screen, when a gesture operation is performed by moving the hand HD from right to left, the display unit 104 switches back to the home screen G11.
  • When the corresponding gesture operation is performed from the home screen G11, the setting mode is entered and the setting display screen G21 is displayed on the display unit 104.
  • From this screen, a further gesture operation switches the display unit 104 to another setting display screen G20, and moving the hand HD from left to right switches it to another setting display screen G22; on each screen a different setting can be made using, for example, the operation unit 122.
  • When the corresponding return gesture is performed, the display unit 104 switches back to the home screen G11. In addition, vertical or horizontal hand movements detected as gesture operations may be used to select a candidate item on a single display screen (menu screen) containing several selection items.
  • As described above, the user US can operate the screen by moving the hand HD in front of the display unit 104 while viewing the display screen of the display unit 104. Unlike a device with an operation panel, there is no need to shift the viewpoint to look at an operation unit, and the screen appears to move together with the hand. An easy-to-use user interface can therefore be provided; moreover, unlike gesture recognition by a camera, the user can accurately grasp the recognized area.
  • The screens described above are merely examples; arbitrary items may be added or exchanged among various screens, including images associated with other functions. Which screen is reached for each direction of hand movement, and how the switching animation is presented, may be set as appropriate according to the user's preference.
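  • The switching in FIG. 13 can be pictured as a small state machine keyed by the pair (current screen, swipe direction). The table below is a sketch; transitions marked "assumed" are this description's reading and are not confirmed by the text above:

    # (current screen, swipe direction) -> next screen
    TRANSITIONS = {
        ("G11", "bottom_to_top"): "G01",  # home -> non-display mode
        ("G01", "top_to_bottom"): "G11",  # cancel non-display mode
        ("G11", "right_to_left"): "G10",  # home -> music playback mode
        ("G10", "left_to_right"): "G11",  # music -> home (assumed symmetric)
        ("G12", "right_to_left"): "G11",  # imaging mode -> home
        ("G21", "left_to_right"): "G22",  # setting screen G21 -> G22
    }

    def next_screen(current, swipe):
        # Unlisted combinations leave the screen unchanged; some are handled
        # specially instead (e.g. volume steps while G10 is displayed).
        return TRANSITIONS.get((current, swipe), current)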
  • FIG. 14 is a diagram showing a part of the HMD according to the modification of the present embodiment.
  • In this modification, two proximity sensors 105U and 105D are provided, spaced apart vertically on the main body 103.
  • The arrangement and orientation of the proximity sensors 105U and 105D are set appropriately so that the upper proximity sensor 105U detects only a hand HD positioned above and the lower proximity sensor 105D detects only a hand HD positioned below.
  • Accordingly, different signals are input to the processor 121 depending on whether the hand HD is moved in the upper region or the lower region, and different operations can be assigned according to the position at which the hand is moved.
  • The proximity sensors may instead be spaced apart from side to side.
  • By the way, when the user US swings the head up and down or left and right while the hand HD is held stationary in front of the HMD 100, the same signals are detected from the proximity sensor 105 as when the hand itself moves. Therefore, when the processor 121 determines from the signal of the acceleration sensor 109, serving as the posture detection device, that the user US has swung the head up and down or left and right, it can, when a signal is output from the proximity sensor 105, perform image control different from the case where the hand HD is moved (for example, high-speed page turning), or some other control.
  • Further, since the processor 121 can detect from the signal of the gyro 110, serving as the posture detection device, that the user US is facing upward or downward, it can perform different image control (for example, using page turning and screen scrolling for different postures) according to the detected posture when a signal is output from the proximity sensor 105 on bringing the hand HD close.
  • In this way, control is also performed using the output of the posture detection device, so the variations of control increase.
  • When the acceleration sensor 109 (or a GPS device, not shown), serving as a movement detection device, detects that the user US wearing the HMD 100 is moving by walking, the processor 121 can ignore a signal even when it is input from the proximity sensor 105 and refrain from controlling the image display. This is because the user's safety can be secured by not switching images carelessly while walking. However, display that is not based on the proximity sensor signal, such as navigation, can be performed as desired.
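  • How the posture and movement outputs might gate the proximity signal is sketched below; imu.is_walking(), imu.head_swinging(), and imu.facing_up() are assumed helpers over the acceleration sensor 109 and gyro 110 outputs, and the assigned controls are only examples:

    def handle_proximity_signal(imu, display):
        # Called when the proximity sensor 105 reports an object.
        if imu.is_walking():
            return                      # walking: ignore gestures for safety
        if imu.head_swinging():
            display.fast_page_turn()    # head moved past a held, stationary hand
        elif imu.facing_up():
            display.page_turn()         # posture-dependent control (example)
        else:
            display.scroll()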
  • The control device may also control the screen display of the display unit based on both the signal from the proximity sensor and the electric signal from the microphone.
  • For example, switching between page turning and scrolling can be selected by voice recognition from analysis of the microphone's output signal, which increases the variations of control available for a given proximity sensor signal.
  • If voice recognition is performed on a short utterance input to the microphone and converted into an electric signal within a fixed time after a gesture operation has been detected, the command can be determined and the control made faster.
  • A short utterance on its own often has a low recognition rate as a command, but combining it with a gesture operation increases the possibility that an otherwise ambiguous utterance is accepted as a command.
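  • The gesture-plus-voice combination might be sketched as follows; the recognizer interface, the two-word vocabulary, and the 1.5-second window are assumptions of this description:

    def gesture_then_voice(sensor, mic, recognizer, display, window=1.5):
        gesture = sensor.wait_gesture()          # detected as described above
        audio = mic.record(max_seconds=window)   # accept only a short utterance
        word = recognizer.recognize(audio, vocabulary=["page", "scroll"])
        if word == "page":
            display.page_turn(gesture)
        elif word == "scroll":
            display.scroll(gesture)
        else:
            display.default_action(gesture)      # no clear command: default action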
  • In the embodiment described above, a pyroelectric sensor, which is a kind of passive sensor, is used as the proximity sensor.
  • However, an active sensor having a light emitting unit that projects invisible light such as infrared light and a light receiving unit that detects the invisible light reflected by an object may be used instead. An active sensor has the advantage that gesture operations can be detected even when the user is wearing gloves.
  • Reference signs: 100 HMD; 101 frame; 101a front part; 101b, 101c side parts; 101d, 101e elongated holes; 102 spectacle lens; 103 main body; 104 display unit; 104A image forming unit; 104B image display unit; 104DR display control unit; 104a light source; 104b unidirectional diffuser; 104c condenser lens; 104d display element; 104f eyepiece prism; 104g deflecting prism; 104h hologram optical element; 104i screen; 105, 105U, 105D proximity sensors; 105a light receiving unit; 106 camera; 106a lens; 107 right sub-main body; 107a protrusion; 108 left sub-main body; 108a protrusion; 109 acceleration sensor; 110 gyro; 111 sound unit; 121 processor; 122 operation unit; 123 communication unit; 124 ROM; 125 RAM; 126 battery; CD cord; CTU control unit; HD hand; HS wiring; PL1 base end face; PL2 inner surface; PL3 outer surface; PL4, PL5 inclined surfaces; RA to RD light receiving regions; SA detection area; EV effective visual field; US user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a head-mounted display having a user interface that can be operated intuitively and that is less prone to erroneous detection while consuming less power. The head-mounted display comprises: a mounting member worn on a user's head; a display unit that includes a see-through display member having a screen capable of displaying an image visible to the user, the display unit being supported by the mounting member so that the display member is positioned in front of at least one of the user's eyes; a proximity sensor that is supported by the mounting member, detects the presence of an object in a detection area within a forward proximity range, and generates an output; and a control device that controls the screen display of the display unit based on the output of the proximity sensor. When the mounting member is worn on the user's head, the detection area of the proximity sensor is positioned within the visual field of the user's eye facing the display member.
PCT/JP2015/074907 2014-09-30 2015-09-02 Visiocasque Ceased WO2016052061A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014199813 2014-09-30
JP2014-199813 2014-09-30

Publications (1)

Publication Number Publication Date
WO2016052061A1 true WO2016052061A1 (fr) 2016-04-07

Family

ID=55630105

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/074907 Ceased WO2016052061A1 (fr) 2014-09-30 2015-09-02 Visiocasque

Country Status (1)

Country Link
WO (1) WO2016052061A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0581503A * 1991-09-20 1993-04-02 Seikosha Co Ltd Device for determining the direction of human body movement
JP2009086729A * 2007-09-27 2009-04-23 Sogo Keibi Hosho Co Ltd Monitoring system, security device, sensor, and monitoring method
JP2012065027A * 2010-09-14 2012-03-29 Canon Inc Imaging apparatus and method for controlling imaging apparatus
JP2013025220A * 2011-07-25 2013-02-04 Nec Corp Safety assurance system, device, method, and program
JP2013137413A * 2011-12-28 2013-07-11 Brother Ind Ltd Head-mounted display
WO2014128789A1 * 2013-02-19 2014-08-28 Brilliant Service Co., Ltd. Shape recognition device, shape recognition program, and shape recognition method
JP2014174401A * 2013-03-11 2014-09-22 Seiko Epson Corp Image display system and head-mounted display device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ATSUSHI TOCHIO ET AL.: "Attempt of Virtual Reality User Interface Using Kinect and HMD", FIT2013 DAI 12 KAI FORUM ON INFORMATION TECHNOLOGY KOEN RONBUNSHU, vol. 3, 20 August 2013 (2013-08-20), pages 523 - 526 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109154656A * 2016-05-19 2019-01-04 Harman International Industries Gesture-enabled audio device with visible feedback
CN109154656B * 2016-05-19 2023-08-25 Harman International Industries Gesture-enabled audio device with visible feedback
WO2018092674A1 * 2016-11-18 2018-05-24 Konica Minolta Inc Display device and gesture input method
JP2021517763A * 2018-03-01 2021-07-26 Google LLC Active LCD shutters for virtual reality and augmented reality low persistence
JP7292294B2 2018-03-01 2023-06-16 Google LLC Active LCD shutters for virtual reality and augmented reality low persistence

Similar Documents

Publication Publication Date Title
  • JP6617974B2 (ja) Electronic device, method for controlling electronic device, and control program therefor
  • TWI498771B (zh) Glasses capable of recognizing gesture movements
  • JP5957875B2 (ja) Head-mounted display
  • US10191281B2 (en) Head-mounted display for visually recognizing input
  • JP6786792B2 (ja) Information processing device, display device, information processing method, and program
  • US20130088434A1 (en) Accessory to improve user experience with an electronic display
  • CN105511846A (zh) Electronic device and display control method
  • WO2015096335A1 (fr) Interaction recognition system and display device
  • JP2016218899A (ja) Wearable electronic device and gesture detection method for wearable electronic device
  • KR20130053377A (ko) Composite human interface device having a text input device and a pointer position information input device
  • JP2009104429A (ja) Head-mounted display device and portable device
  • KR101268209B1 (ko) Human interface device having a pointer position information input unit and a pointer execution command unit
  • WO2016052061A1 (fr) Head-mounted display
  • CN106155604B (zh) Display processing method and electronic device
  • CN106155605B (zh) Display processing method and electronic device
  • WO2016072271A1 (fr) Display device, display device control method, and control program therefor
  • JP6176368B2 (ja) Head-mounted display and information display device
  • WO2017104525A1 (fr) Input device, electronic device, and head-mounted display
  • TWI492099B (zh) Glasses capable of recognizing gesture movements
  • KR101891837B1 (ko) Wearable display device using augmented reality interface
  • WO2017094557A1 (fr) Electronic device and head-mounted display
  • JPWO2017065050A1 (ja) Input device, electronic device, input method for electronic device, and input program therefor
  • JPWO2017065051A1 (ja) Input device, electronic device, input method for electronic device, and input program therefor
  • JP2017228300A (ja) Image display device, head-mounted display, information display device, display processing method, and program
  • WO2018092674A1 (fr) Display device and gesture input method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15847831

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15847831

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP