WO2009142015A1 - Projector - Google Patents
- Publication number
- WO2009142015A1 (PCT/JP2009/002232; JP2009002232W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- correction
- light
- unit
- projection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/005—Projectors using an electronic spatial light modulator but not peculiar thereto
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/3433—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using light modulating elements actuated by an electric field and being other than liquid crystal devices and electrochromic devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3164—Modulator illumination systems using multiple light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3173—Constructional details thereof wherein the projection device is specially adapted for enhanced portability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
Definitions
- the present invention relates to a projector that projects an image onto an object to be projected.
- Pocket-sized small projectors can always be carried around, and their usage is significantly different from conventional stationary projectors.
- Such a small projector has high portability, and a user can watch an image by projecting it on a nearby wall or object while holding the projector in his hand.
- Such a pocket-sized small projector can be used even while the user is moving, for example while walking. For this reason, it must be assumed that the projection object serving as the screen changes from moment to moment.
- the screen to be the projection object is not limited to a planar shape and is not necessarily white. Even in such a case, in order to present a high-quality image, it is preferable to correct the projected image in accordance with the surface shape and color of the projection object.
- It is therefore essential for a pocket-sized small projector to have a correction function that obtains information on the shape and color of the target screen and corrects the projected image based on that information.
- a test pattern image as a correction image is projected onto a projection object such as a screen before projecting the image.
- Image correction is performed by capturing the test pattern image and recognizing the shape of the projection object.
- the image to be projected is generally a rectangle
- The four corners of the projected image are detected to determine whether the image is distorted into a trapezoid or the like, and the trapezoidal distortion is corrected based on the determination result.
- The conventional correction method described above assumes a stationary projector: once the projected image has been corrected, projection continues with the relative position of the projector and the screen essentially unchanged.
- With a highly portable small projector, however, the projection object serving as the screen changes from moment to moment, so the projected image must be corrected continuously as the object changes. A conventional correction method that assumes a fixed relative position between the projector and the screen therefore cannot correct the image appropriately.
- What is needed is the ability to correct the image from moment to moment while it is being projected.
- Moreover, a method that detects only the four corners of the image basically assumes projection onto a flat screen and cannot detect distortion inside the projected image.
- the object to be projected is not limited to a flat surface such as a wall as described above. For this reason, projection onto a curved surface or projection onto an object having a three-dimensional shape having concavities and convexities is assumed, which cannot be handled by a conventional correction method.
- As a correction method for the projected image of a small projector whose projection object, serving as the screen, changes from moment to moment, the correction method disclosed in Patent Document 1 has been proposed.
- In that method, a test pattern is superimposed on the image signal, and the test pattern image is detected while the image is being observed.
- Specifically, two images in which the test pattern image is superimposed on an entirely gray (rather than white) background are projected and photographed, and an arithmetic operation on the two captured images is used to emphasize and extract the test pattern image.
- In Patent Document 1, however, because two kinds of images (image A and image B) on which test pattern images are superimposed are displayed, the user sees images overlaid with the test patterns. Since the test pattern image overlaps the image that is actually meant to be observed, a significant deterioration in image quality is unavoidable. Furthermore, detecting the test pattern image requires an arithmetic processing function that extracts the test pattern from the images captured by the imaging device by operating on image A and image B, so the processing becomes complicated.
- A projector according to the present invention includes a light source, a light modulation unit that modulates light emitted from the light source based on an image signal, a display control unit that outputs an image signal including a periodic main image signal to the light modulation unit and controls its display, a projection unit that projects the light modulated by the light modulation unit, and an imaging unit that captures an image formed by the light projected from the projection unit.
- The display control unit inserts, between the periodic main image signals, a correction image signal for projecting a correction image that, when time-integrated, is visually recognized as a uniform white or gray screen.
- With this configuration, the correction image inserted between the periodic main image signals is perceived, after time integration apart from the main image, only as a uniform white or gray screen, so the user does not recognize the correction image itself. The correction image can therefore be inserted while suppressing degradation of the image quality of the main image.
- FIG. 2A is an explanatory diagram showing an example of the insertion timing of the main image signal of each color, the insertion timing of the correction image signal, and the timing of capturing the correction image according to an embodiment of the present invention.
- FIG. 2B is an explanatory diagram illustrating an example of the main image and the correction image in FIG. 2A.
- FIG. 3 is an explanatory diagram showing the pattern images shown in FIGS. 2A and 2B and the image obtained by superimposing these pattern images.
- FIG. 4A is an explanatory diagram showing an example of the main image signal insertion timing, the correction image signal insertion timing, and the correction image capturing timing according to the embodiment of the present invention.
- FIG. 4B is an explanatory diagram illustrating an example of the main image and the correction image in FIG. 4A.
- FIG. 5A is an explanatory diagram illustrating a state in which a pattern image of white laser light is projected.
- FIG. 5B is an explanatory diagram showing a state in which a pattern image of monochromatic laser light is projected.
- FIG. 6A is an explanatory diagram showing an example of the insertion timing of the main image signal of each color, the insertion timing of the correction image signal, and the timing of capturing the correction image according to an embodiment of the present invention.
- FIG. 6B is an explanatory diagram illustrating an example of the main image and the correction image in FIG. 6A.
- FIG. 7A is an explanatory diagram illustrating a state in which a pattern image is projected onto a planar projection target object.
- FIG. 7B is an explanatory diagram illustrating a captured pattern image.
- FIG. 8A is an explanatory diagram showing an example of the main image signal insertion timing, correction image signal insertion timing, and correction image capturing timing according to another embodiment of the present invention.
- FIG. 8B is an explanatory diagram illustrating an example of the main image and the correction image in FIG. 8A.
- FIG. 9 is an explanatory diagram showing a state in which a uniform image is projected onto a projection object whose reflectance differs partially.
- FIG. 10A is an explanatory diagram showing an example of the insertion timing of the main image signal of each color, the insertion timing of the correction image signal, and the timing of capturing the correction image according to another embodiment of the present invention.
- FIG. 10B is an explanatory diagram illustrating an example of the main image and the correction image in FIG. 10A.
- FIG. 11 is an explanatory diagram showing a comparison between the color reproduction range of laser light sources and the color reproduction range of LEDs.
- FIG. 12A is an explanatory diagram showing a case where the correction image to be projected is a two-dimensional code image.
- FIG. 12B is an explanatory diagram showing a case where the correction image to be projected is a two-dimensional code image.
- FIG. 15A is an explanatory diagram illustrating a case where a half mirror is used as an example of the configuration illustrated in FIG. 14.
- FIG. 15B is an explanatory diagram showing a case where a polarization beam splitter is used as an example of the configuration shown in FIG.
- FIG. 16 is an explanatory diagram showing an optical system using a reflective LCOS and a polarization beam splitter.
- FIG. 17 is an explanatory diagram for explaining a scanning projection system.
- FIG. 20A is an explanatory diagram illustrating a relative angle between the projector and the projection object and a pattern image to be captured.
- FIG. 20B is an explanatory diagram illustrating a relative angle between a projector and an object to be projected, which is different from FIG. 20A, and a pattern image to be captured.
- FIG. 22A is an explanatory diagram showing an application example of the projector according to the embodiment of the invention.
- FIG. 22B is an explanatory diagram illustrating an application example of the projector according to the embodiment of the invention.
- FIG. 23 is an explanatory diagram showing a schematic configuration of a scanning image display device according to an embodiment of the present invention.
- FIG. 24 is a schematic diagram showing a schematic configuration of a photodetector provided in the scanning image display device of FIG. 23.
- FIG. 25A is an explanatory diagram illustrating an example of a detection signal of the photodetector in FIG. 24.
- FIG. 25B is an explanatory diagram illustrating another example of the detection signal of the photodetector in FIG. 24.
- FIG. 26A is an explanatory diagram showing the scanning line on the projection surface and the amount of reflected light for each wavelength on the scanning line.
- FIG. 26B is an explanatory diagram illustrating the scanning line on the projection surface and the amount of reflected light for each wavelength on the scanning line.
- FIG. 27A is an explanatory diagram illustrating the scanning line on the projection surface and the amount of reflected light for each wavelength on the scanning line.
- FIG. 27B is an explanatory diagram illustrating the scanning line on the projection surface and the amount of reflected light for each wavelength on the scanning line.
- FIG. 28 is an explanatory diagram showing a schematic configuration of a scanning image display device according to an embodiment of the present invention.
- FIG. 29 is an explanatory diagram showing a schematic configuration of a scanning image display device according to another embodiment of the present invention.
- FIG. 30 is an explanatory diagram showing a schematic configuration of a scanning image display device according to another embodiment of the present invention.
- FIG. 31 is a top view showing the photodetector and the light source output detector provided in the scanning image display device of FIG. 30.
- FIG. 32 is an explanatory diagram showing a schematic configuration of a scanning image display device according to another embodiment of the present invention.
- Embodiment 1 A projector according to Embodiment 1 of the present invention will be described below with reference to FIGS.
- FIG. 1 shows a schematic configuration of the projector 22 according to the first embodiment of the present invention.
- the projector 22 includes a laser light source 1R, a laser light source 1G, a laser light source 1B, a collimator lens 4, a lenticular lens 5, a spatial modulation element 6, a projection lens 7, and a dichroic mirror 12 for each color.
- red, blue, and green laser beams are sequentially emitted.
- the green laser light becomes substantially parallel light by the collimating lens 4, is reflected by the mirror 17, and passes through the dichroic mirror 12.
- the blue laser light is made substantially parallel light by the collimating lens 4 and then combined with the green laser light by the dichroic mirror 12.
- the red laser light is made substantially parallel light by the collimating lens 4 and then combined with the green laser light and the blue laser light by the dichroic mirror 12.
- the combined laser light becomes diffused light by the lenticular lens 5 and enters the spatial modulation element 6.
- the spatial modulation element 6 modulates incident light based on a periodic main image signal.
- the projection lens (projection unit) 7 projects the light modulated by the spatial modulation element 6 onto a screen (not shown). Further, the image sensor 40 captures an image displayed by the light projected from the projection lens 7. An image captured by the image sensor 40 is processed by the image correction controller 41.
- the image correction controller (display control unit / correction unit) 41 outputs an image signal including a periodic main image signal to the spatial modulation element 6 and performs display control.
- FIGS. 2A and 2B show the main image signal insertion timings of the RGB colors under the control of the image correction controller 41, the correction image signal insertion timing, and the timing at which the pattern image is captured by the image sensor 40.
- As shown in FIGS. 2A and 2B, the image sensor (imaging unit) 40 according to the present embodiment captures an image in synchronization with each timing at which the pattern image A and the pattern image B are projected as correction images.
- the projector 22 uses a time division method as an image display method, and switches the laser light of each color at high speed. That is, the laser light sources 1R, 1G, and 1B of the respective colors are lit at regular intervals, and the R, G, and B color images constituting the main image are sequentially and periodically projected in accordance with the lighting timing.
- The drive frequency of each of the R, G, and B color images is at least 60 Hz, and preferably 180 Hz or more.
- If the driving frequency is 60 Hz or less, color afterimage noise known as color breaking noise becomes conspicuous and the image quality deteriorates greatly.
- If the driving frequency is 180 Hz or higher, the color breaking noise is hardly noticed by the user.
- After one set of main images composed of R, G, and B is projected, the pattern image A is inserted and projected before the next set of R, G, and B main images.
- The pattern image B is then inserted before the set of R, G, and B main images after that. Thereafter, the pattern image A and the pattern image B are similarly inserted alternately between successive sets of sequentially projected main images.
- In this way, a correction image signal for projecting a correction image that is visually recognized as a uniform white or gray screen is inserted between the periodic main image signals.
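- As an illustration of this interleaving, the following sketch is a hypothetical frame scheduler (not code from the patent): each full R, G, B set of main frames is followed alternately by correction pattern A or pattern B.

```python
# Minimal sketch of the frame ordering described above (assumed scheduler,
# not taken from the patent): each set of R, G, B main frames is followed
# alternately by correction pattern A or pattern B.
def build_frame_sequence(num_sets):
    sequence = []
    for i in range(num_sets):
        sequence += ["main_R", "main_G", "main_B"]
        sequence.append("pattern_A" if i % 2 == 0 else "pattern_B")
    return sequence

if __name__ == "__main__":
    # e.g. ['main_R', 'main_G', 'main_B', 'pattern_A', 'main_R', ...]
    print(build_frame_sequence(4))
```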
- The pattern image A and the pattern image B are displayed by turning on the laser light source 1R, the laser light source 1G, and the laser light source 1B at the same time. That is, the pattern images A and B of this embodiment are not created with monochromatic light but with light in which the red, green, and blue laser beams are superimposed. Furthermore, the output power ratio of the R, G, and B laser light sources 1R, 1G, and 1B is set so that the superimposed light becomes white light. White light is generally expressed using the blackbody radiation temperature.
- For the red laser light (wavelength 640 nm), green laser light (wavelength 532 nm), and blue laser light (wavelength 448 nm) emitted from the projection lens 7 of the projector 22, the power ratio red : green : blue is set to 1 : 0.83 : 0.78.
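- The ratio itself depends on the target white point and on the chromaticities of the three lasers. As a hedged sketch, it can be obtained by solving a small linear system that maps the three laser primaries onto the desired white; the per-watt tristimulus values below are illustrative placeholders, not the patent's data.

```python
import numpy as np

# Sketch of deriving laser power ratios for a target white point.
# The per-watt XYZ tristimulus vectors below are illustrative placeholders,
# NOT measured values for the 640/532/448 nm lasers.
XYZ_per_watt = np.array([
    [0.45, 0.23, 0.00],   # X contribution of the R, G, B lasers (assumed)
    [0.17, 0.72, 0.05],   # Y contribution (assumed)
    [0.00, 0.10, 1.10],   # Z contribution (assumed)
])
white_target = np.array([0.95, 1.00, 1.09])  # roughly D65-like white (assumed)

powers = np.linalg.solve(XYZ_per_watt, white_target)
ratio = powers / powers[0]
print("R : G : B power ratio = 1 : %.2f : %.2f" % (ratio[1], ratio[2]))
```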
- FIG. 3 shows a pattern image A, a pattern image B, and an image obtained by superimposing the pattern image A and the pattern image B shown in FIGS. 2A and 2B.
- The pattern image A and the pattern image B are both made of white light, so the image in which they are superimposed contains only white light. The pattern image A has white lattice portions, while the pattern image B has black lattice portions. When the pattern image A and the pattern image B are superimposed, the result is therefore a uniform gray image consisting only of white light. Because the user perceives the superimposition of the pattern image A and the pattern image B, the user merely recognizes that a gray image has been inserted into the main image.
- the user does not directly recognize the lattice pattern image A and the pattern image B, but recognizes a uniform gray image in which the pattern image A and the pattern image B are superimposed. Therefore, it is possible to suppress deterioration in image quality in the main image. Further, by using a lattice pattern image as the correction image, it is possible to easily know the unevenness of the surface of the projection object serving as a screen.
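- A minimal sketch of such a complementary pair (assuming 8-bit grayscale frames; not code from the patent) generates a lattice pattern A, its luminance-inverted counterpart B, and checks that their sum is a uniform gray field:

```python
import numpy as np

def make_lattice_pair(height=480, width=640, pitch=40, line=4, level=255):
    """Pattern A: bright lattice lines on black; pattern B: its luminance inverse."""
    y, x = np.mgrid[0:height, 0:width]
    on_line = (y % pitch < line) | (x % pitch < line)
    pattern_a = np.where(on_line, level, 0).astype(np.uint16)
    pattern_b = (level - pattern_a).astype(np.uint16)
    return pattern_a, pattern_b

a, b = make_lattice_pair()
# Time-integrated over the two frames, the pair is a flat gray field.
assert np.all(a + b == 255)
```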
- the projector 22 obtains the contrast of the projection image using the image captured by the image sensor 40, and adjusts the brightness of the pattern image according to the contrast.
- When the contrast of the image is high, the user can recognize the image even with weak light, so it is preferable to set the brightness of the pattern image low.
- Because the light used to project the correction image is then weak, the main image is prevented from being washed out, and an image with little degradation can be provided.
- In addition, since the power supplied to the laser light sources is small, a projector with low power consumption can be realized.
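- One way to realize such contrast-dependent dimming (a sketch under assumed thresholds, not the patent's algorithm) is to estimate the Michelson contrast of the captured frame and scale the pattern level down as the contrast rises:

```python
import numpy as np

def pattern_level(captured, max_level=255, min_level=32):
    """Scale the correction-pattern brightness down as the projected contrast rises."""
    lum = captured.astype(np.float64)
    # Michelson contrast of the captured projection (0 = flat, 1 = maximal)
    c = (lum.max() - lum.min()) / (lum.max() + lum.min() + 1e-9)
    # High contrast -> weak pattern light suffices; low contrast -> brighter pattern.
    return int(round(max_level - c * (max_level - min_level)))
```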
- Alternatively, the pattern image A and the pattern image B may each be composed of a set of sequentially projected (R), (G), and (B) images.
- In this case, the R, G, and B correction image signals for projecting the correction image are inserted sequentially after the corresponding R, G, and B main image signals that project the (R), (G), and (B) images constituting the main image.
- Under the control of the image correction controller (display control unit) 41 shown in FIG. 1, each correction image signal is inserted between the periodic main image signals so as to project a correction image that, when time-integrated apart from the main image, is visually recognized as a uniform white or gray screen.
- the timing at which the image sensor 40 captures the pattern image A and the pattern image B as correction images is synchronized with the timing at which these pattern images are projected.
- the spatial modulation element 6 can be modulated at 360 Hz or more. In this case, since the R, G, and B color images of the main image can be projected at 60 Hz or higher, flickering of the main image can be suppressed.
- FIGS. 5A and 5B compare the case where a pattern image made of white light is projected onto a projection object whose reflectance differs partially with the case where a pattern image made of monochromatic light is projected.
- FIG. 5A is a diagram illustrating a state in which a pattern image of white light is projected
- FIG. 5B is a diagram illustrating a state in which a pattern image of red (monochromatic light) laser light is projected.
- the projected object 81 having partially different reflectances has a smaller reflectance of green laser light in the region 82 surrounded by the dotted line than in other regions.
- As shown in FIG. 5A, the lines of the lattice pattern image appear as bright lines in the region 83, but within the region 82 the reflectance of the green laser light is low, so the lines there appear dark compared with the region 83.
- the reflectance of the red laser in the region 82 is the same as that in the other regions. Therefore, when a red pattern image is projected, the region 85 and the region 86 are bright lines having the same luminance. As a result, when the pattern image is captured, the brightness of the bright lines of the lattice pattern image becomes uniform as compared with the case of white light, and the state of the pattern image can be easily recognized. With this configuration, image processing using a pattern image can be facilitated.
- an image sensor that does not include a color image capturing unit may be used as the image sensor 40 according to the present embodiment.
- image correction may be performed using only the brightness of the captured pattern image.
- By using an image pickup device that does not have a color image pickup function, the cost of the projector can be reduced.
- As another arrangement, the correction image signals for sequentially projecting the (R), (G), and (B) images constituting the pattern image A are inserted sequentially after the corresponding color main image signals that project the (R), (G), and (B) images constituting the main image.
- On the other hand, the R, G, and B correction image signals for projecting the pattern image B are inserted all at the same time, after the main image signals that sequentially project the (R), (G), and (B) images constituting the main image. Thereby, since the frequency ...
- the projector 22 has parallax for projection and imaging because the projection lens 7 and the imaging lens of the imaging device 40 are separated from each other. For this reason, the pattern image projected on the projection object from the projection lens 7 and the pattern image captured by the image sensor 40 are different from each other by the amount of the parallax.
- FIGS. 7A and 7B are diagrams showing the projection and imaging of a pattern image.
- FIG. 7A shows a state in which the pattern image is projected in a state where the projection lens 7 and the planar projection object face each other.
- FIG. 7B shows an image obtained by imaging the projected pattern image shown in FIG. 7A by the image sensor 40.
- the projection lens 7 projects a pattern image 56 onto a planar projection object (not shown). In this case, the image sensor 40 captures a distorted pattern image 57 as shown in FIG. 7B due to the parallax with the projection lens 7.
- the pattern image to be captured changes depending on the relative angle between the projection lens 7 and the surface of the projection object and the unevenness of the surface of the projection object.
- The projector 22 according to the present embodiment is configured to compare the captured pattern image with, for example, the pattern image 57 and thereby determine the relative angle between the projection lens 7 and the projection object, the unevenness of the projection object surface, and the like.
- The image can then be corrected by pre-distorting the main image so that the captured pattern image matches the pattern image 57.
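- As a hedged sketch of that comparison for a flat projection surface (assuming OpenCV is available and that corresponding lattice intersections have already been detected in the reference and captured patterns; none of this is prescribed by the patent), a homography can be fitted and its inverse used to pre-distort the main image:

```python
import cv2
import numpy as np

def prewarp_main_image(main_img, ref_points, captured_points):
    """Fit a homography from reference pattern points to captured pattern points
    and pre-distort the main image with its inverse, so that the projected
    result appears undistorted to the camera (planar surface assumed)."""
    H, _ = cv2.findHomography(np.float32(ref_points), np.float32(captured_points),
                              cv2.RANSAC)
    h, w = main_img.shape[:2]
    return cv2.warpPerspective(main_img, np.linalg.inv(H), (w, h))
```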
- Alternatively, a correction image with which the surface of the projection object is visually recognized as uniformly white or gray may be generated from moment to moment, and this correction image may be inserted between the periodic main images. That is, during the non-projection periods of the main image, which is projected at a predetermined frequency, the image sensor 40 captures the projection object itself from moment to moment; a correction image signal is then generated that cancels the color or luminance of the projection object itself so that, when time-integrated, the surface of the projection object is visually recognized as uniformly white or gray, and this correction image signal is inserted between the periodic main image signals.
- In this case, the correction image effectively serves as a color correction or luminance correction of the main image, so a projector that projects the main image with even higher image quality can be realized.
- the pattern image is not limited to the lattice pattern, and may be a checkered pattern or a concentric pattern. Needless to say, a plurality of types of pattern images may be used in combination.
- A DMD (Digital Micromirror Device, a registered trademark of Texas Instruments) may also be used as the spatial modulation element 6.
- Since a DMD has a high driving frequency of about 720 Hz, color breaking can be further reduced.
- In the projector 22, the spatial modulation element 6 is small, the focal length of the projection lens 7 is short, and the aperture value is large.
- For example, when the pixel pitch of the spatial modulation element 6 is 5 µm, the focal length of the projection lens 7 is 6 mm, the aperture value of the projection lens 7 is 2, the distance between the projector 22 and the projection object is 500 mm, and the permissible circle of confusion is 1 pixel, the depth of field is 142 mm.
- The depth of field of the projection lens 7 becomes deeper as the focal length is shortened or the aperture is stopped down. If the depth of field is deep, a pattern image with little blur can be captured even when the projection object has irregularities.
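- For reference, a short sketch of the standard thin-lens depth-of-field calculation with the numbers quoted above (circle of confusion of one 5 µm pixel, f = 6 mm, F-number 2, distance 500 mm) gives roughly 140 mm, consistent with the approximately 142 mm stated; small differences arise from the exact formula variant used.

```python
def depth_of_field(f_mm, f_number, subject_mm, coc_mm):
    """Thin-lens near/far limits of acceptable sharpness (standard formulas)."""
    h = f_mm ** 2 / (f_number * coc_mm) + f_mm          # hyperfocal distance
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return far - near

# 5 um pixel pitch -> 0.005 mm circle of confusion, f = 6 mm, F/2, 500 mm away.
print("DOF = %.0f mm" % depth_of_field(6.0, 2.0, 500.0, 0.005))  # ~140 mm
```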
- the size of the projection lens 7 increases in proportion to the size of the spatial modulation element 6 when the aperture value and the angle of view of the projector 22 are the same. Therefore, if the size of the spatial modulation element 6 is small, the size of the projection lens 7 can be reduced, and a small projector 22 can be realized.
- When a laser is used as the light source, the aperture value of the illumination light can be increased even if the spatial modulation element 6 is small. This is because a laser is close to an ideal point light source, so its etendue (the product of the light source area and the solid angle of light spread) is small, making it easy to collect the light and render it substantially parallel. Because the aperture value of the illumination light can be increased, the projection lens 7 is easy to design and a low-cost projector 22 can be realized.
- It is therefore preferable to use a laser light source as the light source of the present invention.
- a time-division image display method can be realized by turning the light source on and off, and a low-cost projector can be realized. Further, since the light source itself is small, a small projector can be realized. Furthermore, since the laser light source has a small spectral width, it has excellent color reproducibility and facilitates color correction of the main image for the projection object.
- an LED may be used as the light source of the projector 22.
- When an LED is used as the light source, the unit price is lower than that of a laser, so the projector 22 can be realized at a lower cost than when a laser is used as the light source.
- a time-division image display method can be realized by turning the light source on and off, so that the color wheel necessary for the lamp light source can be removed, and a low-cost projector can be realized. Further, since the light source itself is small, a small projector can be realized.
- Embodiment 2 A projector according to Embodiment 2 of the present invention will be described below with reference to FIGS.
- a uniform image is used as the correction image instead of the lattice pattern image of the first embodiment.
- FIGS. 8A and 8B show the timing at which the uniform image A as the correction image of the present embodiment is projected and the timing at which the image sensor 40 captures the uniform image A.
- the uniform image A is an image having a uniform luminance created by light superimposed with red laser light, green laser light, and blue laser light.
- the laser output power ratios of the red laser light source 1R, the green laser light source 1G, and the blue laser light source 1B are set so that white light can be obtained by superimposing the laser beams of these colors.
- White light is generally expressed using the black body radiation temperature.
- As for the method of setting the laser output power ratio, the same method as in the first embodiment can be used, so the description is omitted here.
- Since the uniform image A is a single white-light image and is projected between one set of R, G, B main images and the next set of R, G, B main images, the user does not perceive any change in the hue of the main image. Furthermore, by capturing the projected uniform image A and examining the light and dark variations of its color and brightness, the wavelength-dependent reflectance of the surface of the projection object serving as the screen can be determined.
- the reflectance of the projection object to be the screen can be easily known.
- FIG. 9 is a diagram showing a state in which there are portions with different reflectivities on the object to be projected and a uniform image 54 is projected.
- the region 55 is a region where the reflectance of laser light having a wavelength of around 532 nm is low. That is, the region 55 is a region where the reflectance of green laser light is low.
- the image correction controller 41 (FIG. 1) corrects the color of the image and increases the green component of the main image projected onto the region 55. With this configuration, it is possible to obtain a high-quality main image in which deterioration of image quality due to a difference in reflectance on the surface of the projection object is suppressed.
- the brightness of the main image may be corrected according to the brightness of the surface of the projection object.
- the distribution of the amount of light returning from the surface of the projection object to the user's eyes can be known from the brightness distribution of the captured uniform image A.
- the main image is corrected according to the luminance distribution of the captured uniform image. For example, an area where the luminance of the captured uniform image is low is recognized as a dark area by the user because the amount of light returning to the user's eyes is small.
- the image correction controller 41 performs image correction so as to increase the brightness of the main image projected onto the area. Accordingly, the user can observe the main image with uniform brightness even when the reflection direction of light on the surface of the projection object and the degree of light scattering vary depending on the region.
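- A minimal sketch of this kind of correction (assumed linear gain model; the patent does not specify the computation, and the captured uniform image is assumed to be registered to the projector's pixel grid) derives per-pixel, per-channel gains from the captured uniform image and applies them to the main image:

```python
import numpy as np

def correct_main_image(main_img, captured_uniform, max_gain=3.0):
    """Boost channels/regions that come back dark in the captured uniform image."""
    cap = captured_uniform.astype(np.float64)
    # Per-pixel, per-channel gain relative to the mean response of the surface.
    gain = cap.mean(axis=(0, 1), keepdims=True) / (cap + 1e-6)
    gain = np.clip(gain, 1.0 / max_gain, max_gain)
    corrected = main_img.astype(np.float64) * gain
    return np.clip(corrected, 0, 255).astype(np.uint8)
```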
- the uniform image A may be configured to project images of each color in a time division manner.
- FIG. 10A shows a case where the uniform image A is composed of a red image, a green image, and a blue image that are sequentially projected in a time division manner.
- the correction image signals of R, G, and B colors forming the uniform image A are inserted.
- the image sensor 40 captures only the correction image in synchronization with the projection timing of the correction image.
- the image sensor 40 according to the present embodiment may not have the function of capturing a color image, as in the case of the first embodiment.
- the reflectance corresponding to the wavelength of the incident light may be specified using only the brightness of the captured uniform image A.
- FIG. 11 is a diagram showing the color reproduction range of the laser light source and the LED.
- FIG. 11 is a chromaticity diagram represented by the color coordinate x and the color coordinate y, and the region 85 is a visible region.
- the color coordinate 91R is the color coordinate of the red laser beam
- the color coordinate 91G is the color coordinate of the green laser beam
- the color coordinate 91B is the color coordinate of the blue laser beam.
- the color coordinate 92R is the color coordinate of the red LED light
- the color coordinate 92G is the color coordinate of the green LED light
- the color coordinate 92B is the color coordinate of the blue LED light.
- The region 86 is the color reproduction range when red, green, and blue laser light sources are used as the light source, and the region 87 is the color reproduction range when red, green, and blue LEDs are used as the light source.
- Suppose, for example, that the color coordinate of an image contained in the main image signal is the color coordinate 89 in FIG. 11, while the color of the captured image is the color coordinate 88 because of the reflectance characteristics of the projection object surface.
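- Whether such a shift can be compensated depends on the correction target staying inside the source gamut. The following sketch only illustrates that check; the primary and coordinate values are assumptions (the region and coordinate numbers refer to FIG. 11, but the numbers themselves are not from the patent).

```python
def inside_triangle(p, a, b, c):
    """Barycentric sign test: is chromaticity p inside the gamut triangle a-b-c?"""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

# Assumed laser primaries (x, y) -- placeholders, not measured values.
red, green, blue = (0.71, 0.29), (0.17, 0.80), (0.16, 0.02)
target = (0.33, 0.35)      # intended color coordinate (cf. coordinate 89)
measured = (0.36, 0.33)    # captured color coordinate (cf. coordinate 88)
# Simple illustrative pre-compensation: push the commanded color opposite to the shift.
compensated = (2 * target[0] - measured[0], 2 * target[1] - measured[1])
print(inside_triangle(compensated, red, green, blue))  # True for these values
```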
- Embodiment 3 A projector according to Embodiment 3 of the present invention will be described below with reference to FIGS.
- Each projector according to the present embodiment has the same basic configuration as the projector 22 according to the first embodiment shown in FIG. Therefore, the same reference numerals are used for the same components as in the first embodiment, and detailed description thereof is omitted.
- the correction image is a two-dimensional code image and can include various kinds of information. That is, the present embodiment is different from the above-described embodiments in that a pattern image composed of a two-dimensional code image A and a two-dimensional code image B is used as a correction image.
- FIG. 12A shows a two-dimensional code image A that serves both as a correction image and as an information image.
- FIG. 12B shows a two-dimensional code image B as a correction image in which the luminance of the two-dimensional code A is inverted.
- When time-integrated, the two-dimensional code image A and the two-dimensional code image B are visually recognized as a uniform white or gray screen, so the user can observe the main image without recognizing the two-dimensional code.
- the image sensor 40 captures an image in synchronization with each timing of projecting the two-dimensional code image A. Thereby, the two-dimensional code A inserted while the main image signal is projected can be acquired, and information included in the two-dimensional code image A can be obtained. As described above, when the two-dimensional code image A having information is used, it is possible to convey more information as compared with a method of transmitting information by temporally modulating illumination light.
- the image correction controller 41 (FIG. 1) has a function as an analysis unit that extracts information from the two-dimensional code imaged by the image sensor 40.
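- As one hedged illustration of how such a pair might be formed and read back (not the processing specified by the patent), the code image B is simply the luminance inverse of the code image A, and the difference of the two captured frames recovers the code robustly:

```python
import numpy as np

def make_code_pair(code_bits, level=255):
    """code_bits: 2-D array of 0/1 module values (from any 2-D code generator)."""
    image_a = (code_bits * level).astype(np.int16)
    image_b = (level - image_a).astype(np.int16)   # luminance-inverted complement
    return image_a, image_b

def recover_code(captured_a, captured_b):
    """Difference of the two captured frames: ambient light cancels, and the
    surface reflectance only scales the magnitude, so the sign encodes the bits."""
    diff = captured_a.astype(np.int16) - captured_b.astype(np.int16)
    return (diff > 0).astype(np.uint8)             # 1 where code image A was bright
```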
- the image sensor 40 may be configured to image in synchronization with each timing of projecting the two-dimensional code image A and the two-dimensional code image B, similarly to the configuration of FIG. 2A.
- In that case, information acquisition and correction of the main image can be performed with the two-dimensional code image A, and correction of the main image can be performed with the two-dimensional code image B.
- Each correction image signal of the two-dimensional code image A may be inserted, as in FIG. 4A, after each main image signal that sequentially projects the (R), (G), and (B) images constituting the main image, or, as in FIG. 2A, all at the same time after the main image signals that sequentially project the (R), (G), and (B) images constituting the main image.
- The two-dimensional code image A and the two-dimensional code image B may also be used in combination with other correction images, for example with the lattice pattern image A and pattern image B inserted and projected between the main images.
- When they are combined with another correction image such as a lattice pattern image or a uniform image, the insertion frequency of the two-dimensional code may be lower than that of the other correction images. A two-dimensional code intended mainly for information transmission fulfills its purpose even if it is inserted only once every few seconds, and it does not need to be inserted as frequently as the other correction images, which are inserted to follow the moment-to-moment changes of the projection object surface.
- Needless to say, when the two-dimensional code is composed of the two-dimensional code image A and the two-dimensional code image B as described above, the pair of code images A and B should be inserted between main images that are close together in time so that the two-dimensional code is not visually recognized by the user.
- the projector according to the present embodiment may further include a synchronization signal generation device, a synchronization signal transmission device, and a synchronization signal reception device.
- FIG. 13 shows a state where a plurality of users are using the projector according to the present embodiment.
- When several people use such projectors, it becomes possible to copy one another's main images.
- Because the projected image is created by an individual, information on who captured the image, and when and where, is required.
- The projectors 25 and 26 include signal transmission/reception units 29X and 29Y that transmit and receive a synchronization signal 27 indicating the timing at which the two-dimensional code image is projected, and the timing of projecting the two-dimensional code image is synchronized using the synchronization signal 27.
- For example, the signal transmission/reception unit 29X of the projector 25 of the user X receives the synchronization signal 27 from the signal transmission/reception unit 29Y of the projector 26 of the user Y, so that only the two-dimensional code correction image can be captured. The user X and the user Y can thus acquire data relating to the images projected by the other user.
- In this way, the timing at which each projector projects its correction image can be synchronized, and it becomes possible to capture and acquire only the correction images of the other projectors that are projecting images.
- Embodiment 4 A projector according to Embodiment 4 of the present invention will be described below with reference to FIG.
- the projector according to the present embodiment includes a lens actuator (not shown) that drives the projection lens 7 in addition to the projection optical system and the imaging optical system of the projector 22 according to the first embodiment.
- the projector includes a half mirror 15 and a projection imaging lens 16 that serves as both a projection lens and an imaging lens.
- The light 66 for one pixel output from the spatial modulation element 60 passes through the half mirror 15 and is projected onto the uneven projection object surface 62 by the projection imaging lens 16, which serves as both the projection lens and the imaging lens.
- reference numeral 63 indicates an image plane on which the spatial modulation element 60 forms an image with the projection imaging lens 16.
- the light 66 for one pixel output from the spatial modulation element 60 forms an image on the projection object surface 62.
- the light reflected from the projection object surface 62 is imaged on the image sensor 61.
- the projection object surface 62 has irregularities. For this reason, an image formed on a surface that does not coincide with the imaging surface 63 is blurred.
- The image correction controller 41 (FIG. 1) recognizes the unevenness of the projection object surface 62 from the displacement of the projection imaging lens 16 and the pixels imaged at each displacement, and performs image processing according to the unevenness of the projection object surface 62. For example, the projection imaging lens 16 is moved so that a convex portion of the projection object surface 62 coincides with the imaging plane of the projection imaging lens 16.
- In this case, the resolution of the main image projected onto the convex portion is deliberately lowered to match the blur of the pixels on the concave portions of the projection object surface 62.
- an image having a uniform resolution can be provided regardless of the unevenness of the projection object.
- the projection lens also serves as the imaging lens, it is possible to reduce the size and cost, and to provide a small projector with low power consumption.
- a polarization beam splitter 72 shown in FIG. 15B may be used instead of the half mirror 15.
- FIGS. 15A and 15B show schematic configurations when the half mirror 15 is used and when the polarization beam splitter 72 is used, respectively.
- FIG. 15A is a configuration diagram when the half mirror 15 is used
- FIG. 15B is a configuration diagram when the polarization beam splitter 72 is used.
- the half mirror 15 is a semi-transmissive mirror that transmits, for example, 50% light and reflects 50% light. For this reason, the light intensity decreases by 50% each time the half mirror 15 passes or reflects.
- 50% of the light 101 emitted from the spatial modulation element 60 passes through the half mirror 15 and becomes the light 102. The light 102 is projected onto the screen by the projection imaging lens 16. The light 103 reflected from the screen passes back through the projection imaging lens 16; 50% of it is reflected by the half mirror 15 to become the light 104, which is imaged by the imaging device 75.
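- The light budget implied by this arrangement can be written out directly (elementary arithmetic assuming an ideal 50/50 half mirror and lossless lenses): the projection path keeps 50% of the light and the projection-plus-imaging round trip keeps only 25%, whereas the polarization beam splitter arrangement described next can, in the ideal case, pass nearly all of a properly polarized beam in both directions.

```python
# Ideal-component light budget (assumed lossless lenses, ideal 50/50 half mirror).
half_mirror_T = 0.5          # transmission on the way out to the screen
half_mirror_R = 0.5          # reflection on the way back to the image sensor
projection_efficiency = half_mirror_T                    # 0.5
imaging_round_trip = half_mirror_T * half_mirror_R       # 0.25

pbs_pass = 1.0               # ideal polarization beam splitter, matched polarization
pbs_projection = pbs_pass
pbs_round_trip = pbs_pass * pbs_pass                     # ~1.0 in the ideal case

print(projection_efficiency, imaging_round_trip, pbs_projection, pbs_round_trip)
```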
- the configuration using the polarization beam splitter 72 shown in FIG. 15B includes a quarter-wave plate 76.
- the liquid crystal display element 8 that controls the polarization of light is preferably used as the spatial modulation element.
- the quarter-wave plate 76 generates a phase difference of 90 degrees, and circularly polarized light can be obtained when the slow axis is set to 45 degrees with respect to linearly polarized light.
- the liquid crystal display element 8 modulates light by controlling the polarization.
- the polarization beam splitter 72 separates transmission and reflection in accordance with the polarization. By adjusting the polarization beam splitter 72 and the liquid crystal display element 8 to transmit the polarized light, the light intensity is not reduced. It can be transmitted through the polarization beam splitter 72.
- the linearly polarized light 105 emitted from the liquid crystal display element 8 passes through the polarization beam splitter 72, becomes circularly polarized light 106 by the quarter wavelength plate 76, and is projected onto the screen by the projection imaging lens 16.
- the circularly polarized light 107 reflected from the screen passes through the projection imaging lens 16 and becomes linearly polarized light 108 by the quarter wavelength plate 76. Since the polarization direction of the linearly polarized light 108 is orthogonal to that of the linearly polarized light 105, the linearly polarized light 108 is reflected by the polarization beam splitter 72 and imaged by the imaging element 75.
- With the configuration of FIG. 15B, which uses the liquid crystal display element 8, the polarization beam splitter 72, and the quarter-wave plate 76, the light utilization efficiency of both the projected image and the captured image can be increased. Because the light use efficiency of the projected image is high, a projector with low power consumption can be provided. Because the light use efficiency of the captured image is high, the exposure time of the image sensor can be kept short and a fine image can be captured.
- the optical system can be miniaturized, so that the projector can be miniaturized. Furthermore, since polarized light can be used efficiently, light use efficiency can be increased, and a projector with low power consumption can be realized.
- a reflective LCOS (Liquid Crystal On Silicon) 71 may be used as the spatial modulation element.
- the reflective LCOS 71 can increase the light utilization efficiency compared to the transmissive liquid crystal display element. This is because the transmission type liquid crystal display element has a low aperture ratio because a matrix-like wiring called a black matrix is included in the light transmission region of the liquid crystal display element.
- In the reflective LCOS, by contrast, the wiring can be placed behind the reflective surface that serves as the display surface, so the aperture ratio can be increased.
- FIG. 16 is a diagram showing a configuration example when the reflective LCOS 71 is used.
- the reflective LCOS 71 uses the polarizing beam splitter 72 due to its configuration, it is not necessary to newly prepare the polarizing beam splitter 72.
- the polarization beam splitter 72 reflects the linearly polarized light 73 emitted from the light source 70, and the reflective LCOS 71 reflects the polarization direction of an arbitrary pixel by turning 90 degrees.
- The reflected light 74 whose polarization has been rotated by 90 degrees passes through the polarization beam splitter 72, is converted into circularly polarized light by the quarter-wave plate 76, and projects the image onto the screen. Furthermore, by arranging the image sensor 75 at the position shown in FIG. 16, the light reflected from the screen is converted into linearly polarized light by the quarter-wave plate 76, reflected by the polarization beam splitter 72, and made incident on the image sensor 75.
- a projector with high light utilization efficiency and low power consumption can be provided.
- the reflective LCOS 71 when used for the spatial modulation element, it is preferable to use a light source that emits single polarized light, such as a semiconductor laser, for example.
- the reflective LCOS 71 modulates the light by controlling the polarization direction of the light according to the image signal. Therefore, the light incident on the reflective LCOS 71 needs to be linearly polarized light with uniform polarization.
- With light of random polarization, the projection optical system must pass the light through a polarizing filter to align the polarization direction, cutting all but the appropriate polarization component. Since a laser emits linearly polarized light whose polarization is already aligned, no polarizing filter is needed and the light utilization efficiency can be increased.
- the projector according to the present embodiment is different from the above-described embodiments in that a scanning projection method is used.
- Other configurations are the same as those of the projector 22 according to the first embodiment.
- FIG. 17 schematically shows a scanning projection method.
- the projector includes a laser light source 1, a collimating lens 4, a first scanning mirror 42, a second scanning mirror 43, and an image control unit 44.
- the beam emitted from the laser light source 1 becomes substantially parallel light by the collimator lens 4 and is projected onto the screen 110 by the first scanning mirror 42 and the second scanning mirror 43.
- the image control unit 44 modulates the power of the laser light source 1 at each pixel in accordance with the main image signal. Since the scanning speed of the substantially parallel light is sufficiently faster than the temporal resolution of the human eye, the user recognizes the projected light as a two-dimensional image.
- FIG. 18 shows a state where a part of the scanned pattern image is captured.
- the dotted line indicates the correction pattern image 112 to be scanned, and the solid line indicates the captured pattern image 113.
- the pattern image 112 is a striped pattern, and the imaging unit 111 captures one striped pattern.
- FIG. 19 is a diagram showing a case where the projection object serving as the screen has irregularities. When the projection object (not shown) is uneven, the scanned pattern image 114 appears, for example, as the dotted line in FIG. 19.
- Because the captured pattern image 115 corresponds to only one stripe of the scanned pattern image 114, the unevenness of the projection object serving as the screen can easily be determined even when the stripe of the captured pattern image 115 is discontinuous as shown in FIG. 19. The main image can therefore be corrected with high accuracy, and a high-quality main image can be provided.
- Embodiment 6 A projector according to Embodiment 6 of the present invention will be described below with reference to FIGS. 20A and 20B.
- the projector 120 according to the present embodiment has the same basic configuration as the projector 22 according to the first embodiment shown in FIG. Therefore, the same reference numerals are used for the same components as in the first embodiment, and detailed description thereof is omitted.
- the projector according to the present embodiment is different from the projectors of the above-described embodiments in that the projector has an image correction function including predictive control.
- Image correction of the main image to be projected is performed using the frame data of the captured pattern images. More specifically, the correction is performed by predicting how the projection object will change, for example from the difference between the most recently captured pattern image and the one captured before it.
- FIGS. 20A and 20B show how the pattern image used as the correction image changes as the relative angle between the planar projection target 121 and the projector 120 changes.
- FIG. 20A shows a pattern image 123 captured in a state where the relative angle between the projector 120 and the projection target 121 forms an angle 122 (θ1).
- FIG. 20B shows a pattern image 133 captured in a state where the relative angle between the projector 120 and the projection target 121 forms an angle 132 (θ2).
- Suppose the relative angle between the projector 120 and the projection target 121 increases from the angle 122 (θ1) to the angle 132 (θ2).
- The relative angle 122 and the relative angle 132 can be obtained from the captured pattern image 123 and pattern image 133. For example, if the relative angle 122 is θ1 and the relative angle 132 is θ2, the angle θ3 is obtained by the following equation.
- The relative angle in the state following that of FIG. 20B is then predicted to be θ3, and the main image to be displayed next is corrected accordingly.
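- The equation itself is not reproduced in this excerpt; the sketch below therefore assumes the simplest predictor consistent with the description, a linear extrapolation from the last two measured relative angles. This is an assumption for illustration, not the patent's formula.

```python
def predict_next_angle(theta1, theta2):
    """Assumed linear extrapolation: predict theta3 from the last two measurements."""
    return theta2 + (theta2 - theta1)

# Example: the angle grew from 10 deg to 14 deg between frames -> predict 18 deg
# next, and pre-correct the next main image for an 18 deg relative angle.
print(predict_next_angle(10.0, 14.0))
```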
- The projector 77 differs from the projector 22 shown in FIG. 1 in that it includes a switching information input unit 47 for switching whether or not a correction image signal is inserted between the periodic main image signals and a motion sensor 37 that detects movement of the projector, and in that the image correction controller 41 (switching unit) switches whether or not the correction image signal is inserted based on the output of the switching information input unit 47 or the motion sensor 37.
- the main image need not be corrected every moment.
- inserting a correction image is equivalent to inserting a gray image for a user observing the main image, and therefore the contrast of the main image slightly decreases.
- the switching information input unit 47 may be, for example, a switch provided in the housing of the projector 77 or a remote controller that enables external control.
- The image correction controller 41 functioning as the switching unit may also insert correction images continuously for a predetermined period and determine, from a comparison of the correction images captured during that period, whether it is necessary to continue inserting correction images.
- The projector 77 includes the motion sensor 37 that detects the movement of the projector 77, and the image correction controller 41 functioning as the switching unit performs control so that the correction image signal is inserted when the motion sensor 37 detects movement and is not inserted when no movement is detected.
- With this configuration, the projector 77 can automatically determine whether or not to insert a correction image in accordance with the motion detection result of the motion sensor 37. Therefore, the projector 77 can accurately determine whether image correction is necessary and can insert correction images efficiently.
- For example, an angular velocity sensor can be used as the motion sensor 37.
- When projecting with the projector 77 held in the hand, hand tremor generates an angular velocity. For this reason, when an angular velocity sensor is provided as the motion sensor 37, it is sufficient to determine that the projector 77 is moving when an angular velocity is detected by the angular velocity sensor, insert the correction image signal, and begin correcting the main image moment by moment.
- In this way, an angular velocity sensor may be used as the motion sensor 37.
- The motion sensor 37 is not limited to an angular velocity sensor; any sensor that can detect the movement of the projector 77, such as an acceleration sensor or a geomagnetic sensor, can be used.
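- The switching behavior of Embodiment 7 can be sketched as simple per-frame control logic. The threshold value and the sensor interface below are assumptions made for illustration; the text only specifies that the correction image signal is inserted while movement is detected and omitted while it is not:

```python
from dataclasses import dataclass

@dataclass
class CorrectionSwitch:
    """Decides, frame by frame, whether the correction image signal is inserted."""
    angular_velocity_threshold: float = 0.05  # rad/s, illustrative value

    def insert_correction(self, angular_velocity: float, user_switch_on: bool) -> bool:
        # The switching information input unit (a housing switch or a remote
        # controller) can force insertion of the correction image.
        if user_switch_on:
            return True
        # Otherwise insert the correction image only while the motion sensor
        # (here an angular velocity sensor) reports movement of the projector.
        return abs(angular_velocity) > self.angular_velocity_threshold

switch = CorrectionSwitch()
print(switch.insert_correction(angular_velocity=0.2, user_switch_on=False))  # True: hand-held, correct every moment
print(switch.insert_correction(angular_velocity=0.0, user_switch_on=False))  # False: projector at rest
```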
- FIGS. 22A and 22B are diagrams showing examples of use of the projector according to Embodiment 8 of the present invention. Since the projector 22 according to the present embodiment includes an imaging element, it can be used not only as a projection device but also as an imaging device.
- In this usage example, the projector 22 projects a pattern image 51 onto the projection object to grasp the shape of the projection object, and also has a function of separating, in the captured image, the three-dimensional object 50 present on the projection object from the two-dimensional (planar) background 80.
- Specifically, the shape of the projection object is captured and stored while the pattern image 51 is projected, while the projection object alone is also imaged at a timing when the pattern image 51 is not projected. Since the shape of the projection object can be analyzed from the image captured while the pattern image 51 is projected, the three-dimensional object 50 present on the projection object and the two-dimensional background 80 can be separated, and only the object 50 can be cut out and stored. Therefore, processing (trimming processing) in which the background 80 is removed from the captured image and only the object 50 is automatically cut out can be performed.
- Furthermore, the projector 22 can project the image 52 of the cut-out object, with the background portion removed, onto another screen. It is also possible to combine the image 52 of the cut-out object with another image and project the result.
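- The trimming processing can be sketched as follows. The shape-analysis step is reduced to a placeholder `depth_from_pattern`, since the text does not fix a particular structured-light algorithm; the frames, threshold, and array shapes are illustrative assumptions:

```python
import numpy as np

def depth_from_pattern(pattern_frame: np.ndarray) -> np.ndarray:
    """Placeholder: estimate a per-pixel height map from the distortion of the
    projected pattern image 51 (any structured-light method could be used)."""
    return pattern_frame.astype(float) / 255.0  # stand-in result

def trim_object(pattern_frame: np.ndarray, plain_frame: np.ndarray,
                height_threshold: float = 0.5) -> np.ndarray:
    """Cut out only the three-dimensional object 50 from the frame captured
    while the pattern image is NOT projected."""
    height = depth_from_pattern(pattern_frame)
    mask = height > height_threshold   # True where the 3-D object stands out
    cutout = plain_frame.copy()
    cutout[~mask] = 0                  # remove the flat background 80
    return cutout

pattern_frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # captured with pattern 51
plain_frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)    # captured without pattern 51
object_only = trim_object(pattern_frame, plain_frame)
```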
- Among such devices, a scanning image display device, which forms an image by scanning laser light whose intensity is modulated in accordance with an image signal, can realize an image display device that is even smaller, consumes less power, and has excellent portability.
- In view of the environments in which such a highly portable image display device is used, Patent Document 2 discloses a scanning image display device that can correct for the positional relationship between the image display device and the video display object.
- According to Patent Document 2, by irradiating the video display object with measurement laser light and measuring the angle at which the reflected light is incident on a photodetector, the distance between the scanning image display device and the video display object and the shape of the video display object can be measured by the principle of triangulation, and the displayed image can be corrected.
- However, when the projection surface has color unevenness or a pattern, is made up of members of different materials, or is partially missing, the color unevenness or pattern must be canceled out based on the spectral reflectance of the projection surface.
- The conventional scanning image display apparatus has the problem that correction based on such spectral reflectance cannot be performed.
- In view of this, an object of the present embodiment is to provide a small scanning image display device, and an image correction method, that measure the spectral reflectance of the projection surface when displaying images on various projection surfaces and have a function of correcting the displayed image based on the spectral reflectance of the projection surface.
- FIG. 23 shows a schematic configuration of a scanning image display apparatus 200 according to Embodiment 9 of the present invention.
- As shown in FIG. 23, the scanning image display apparatus 200 includes laser light sources 201R, 202B, and 203G, a drive circuit 204, a control unit 205, a spatial modulation element 206, dichroic mirrors 207 to 209, a scanning unit 210, a photodetector 213, a correction unit 214, and a storage unit 215. Red, blue, and green laser beams are emitted from the laser light sources 201R, 202B, and 203G, respectively.
- the laser light source 201R and the laser light source 202B are semiconductor lasers.
- the laser light source 203G is a so-called second harmonic generation (SHG) green laser light source in which a solid-state laser emitting infrared laser light by semiconductor laser excitation and a second harmonic generation element are combined.
- the laser light sources 201R, 202B, and 203G for the respective colors are driven by the drive circuit 204.
- the intensity of the green laser light emitted from the laser light source 203G is modulated by the spatial modulation element 206 provided in the optical path.
- As the spatial modulation element 206, an acousto-optic element, an electro-optic element, or the like can be used.
- the dichroic mirrors 207 to 209 have a function of reflecting light of a specific wavelength and a function of transmitting light of a wavelength other than the specific wavelength, and match the optical axes of R, G, and B laser beams having different wavelengths.
- the scanning unit 210 scans the projection surface 212 with the laser light 211.
- a piezoelectric drive type or electrostatic drive type micromirror can be used as the scanning unit 210.
- the laser light reflected and scattered by the projection surface 212 is received by the photodetector 213.
- the control unit 205 controls the drive circuit 204, the scanning unit 210, and the modulator 206 based on the video signal.
- the output of the photodetector 213 is input to the control unit 205 and recorded in the storage unit 215 together with a value that specifies the scanning angle of the scanning unit 210.
- The correction unit 214 performs image correction based on the output of the photodetector 213, or on the values recorded in the storage unit 215 and the spectral reflectance for each scanning angle of the scanning unit 210, and inputs the corrected image signal to the control unit 205.
- FIG. 24 shows a schematic configuration diagram of the photodetector 213.
- the photodetector 213 includes a circuit board 221 and detection units 222a, 222b, and 222c formed on the circuit board 221.
- the detection units 222a, 222b, and 222c are equipped with color filters that transmit red light, green light, and blue light, respectively.
- The laser light reflected and scattered from the projection surface 212 enters the detection units 222a, 222b, and 222c and passes through the color filter corresponding to its wavelength, so that the amount of received light can be detected for each laser wavelength.
- the circuit board 221 sends the received light amount for each laser wavelength to the control unit 205 as an electrical signal.
- the circuit board 221 is adjusted in advance so that the red, green, and blue light quantity signals become the same when white light enters the photodetector 213.
- The detection units 222a, 222b, and 222c may be provided as three individual photodetectors, or a single photodetector may be divided into regions.
- FIGS. 25A and 25B show examples of detection signals of the photodetector 213 when no color filter is used.
- In this case, the laser light sources 201R, 202B, and 203G of the respective colors are assumed to operate in a color-sequential mode in which each emits light alone for a short time. When the emission intensity of each laser is plotted along the time axis, the result is as shown in FIG. 25A. If the output of the photodetector is sampled for a short time in synchronization with the laser emission timing, the intensity of the reflected and scattered light for each laser light source can be measured as shown in FIG. 25B. In this case, no color filter is necessary and a single-channel circuit suffices, so the configuration of the photodetector 213 can be simplified.
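- The color-sequential measurement amounts to gated sampling of a single-channel detector. The slot length, emission order, and signal values below are illustrative assumptions; the point is only that each laser's reflected intensity is read out during the slot in which that laser alone is emitting:

```python
import numpy as np

def sample_per_color(detector_signal: np.ndarray, slot_length: int) -> dict:
    """Average the single-channel detector output inside each emission slot.

    The lasers are assumed to be pulsed one at a time in the fixed order
    R, B, G, with the sampling synchronized to the drive timing."""
    result = {}
    for i, color in enumerate(("R", "B", "G")):
        slot = detector_signal[i * slot_length:(i + 1) * slot_length]
        result[color] = float(slot.mean())
    return result

# One cycle of detector output: three consecutive emission slots of 100 samples.
signal = np.concatenate([np.full(100, 0.8),   # reflected/scattered red
                         np.full(100, 0.5),   # reflected/scattered blue
                         np.full(100, 0.6)])  # reflected/scattered green
print(sample_per_color(signal, slot_length=100))  # {'R': 0.8, 'B': 0.5, 'G': 0.6}
```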
- the control unit 205 receives an image signal from an image input unit (not shown) and controls the drive circuit 204, the modulator 206, and the scanning unit 210.
- the drive circuit 204 drives the laser light source 201R, the laser light source 202B, and the laser light source 203G for each color, and directly modulates the laser light source 201R and the laser light source 202B, which are semiconductor lasers. Since the laser light source 203G, which is SHG green laser light, cannot be directly modulated at high speed, it is modulated by the modulator 206.
- the R, G, and B laser beams whose intensity is appropriately modulated by the image signal are combined into one beam by the dichroic mirrors 207 to 209. Then, the laser beam 211 is scanned by the scanning unit 210 driven by the control unit 205 according to the image signal to form an image on the projection surface 212.
- the laser light reflected and scattered by the projection surface 212 is detected for each wavelength of the laser light 211 by the photodetector 213.
- the function of separating the laser light 211 for each wavelength may be realized by the photodetector 213 with a color filter shown in FIG.
- a light emission method that gives a time difference for each wavelength shown in FIGS. 25A and 25B may be used.
- the intensity of each laser beam detected by the photodetector 213 is sent to the control unit 205.
- The control unit 205 records, in the storage unit 215, a value indicating the scanning angle of the scanning unit 210, that is, the driving amount of the biaxial mirror or the signal of a scanning angle detection unit (not shown), together with the signal of the photodetector 213. Since the spectral reflectance for each scanning angle is thus recorded in the storage unit 215, the spectral reflectance distribution of the projection surface 212 is obtained, and based on this information the correction unit 214 corrects the image signal input from an image input unit (not shown) to the control unit 205.
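- Recording the detector output against the scanning angle amounts to building a spectral reflectance map of the projection surface. A minimal sketch with an illustrative data structure (the text does not prescribe a storage format) might look like this:

```python
from collections import defaultdict

class ReflectanceMap:
    """Stores per-wavelength reflected-light amounts keyed by scanning angle."""

    def __init__(self):
        # (horizontal angle, vertical angle) -> {"R": .., "G": .., "B": ..}
        self._store = defaultdict(dict)

    def record(self, h_angle: float, v_angle: float, color: str, amount: float) -> None:
        self._store[(round(h_angle, 3), round(v_angle, 3))][color] = amount

    def reflectance(self, h_angle: float, v_angle: float) -> dict:
        return self._store.get((round(h_angle, 3), round(v_angle, 3)), {})

rmap = ReflectanceMap()
rmap.record(0.10, -0.05, "R", 0.82)   # values from the photodetector at one scan position
rmap.record(0.10, -0.05, "G", 0.79)
rmap.record(0.10, -0.05, "B", 0.80)
print(rmap.reflectance(0.10, -0.05))
```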
- A process for correcting an image according to the present embodiment will be described with reference to FIGS. 26A, 26B, 27A, and 27B.
- 26A and 26B are diagrams showing the scanning line on the projection surface 212 and the amount of reflected light for each wavelength on the scanning line.
- the projected surface 212 shown in FIGS. 26A and 26B is composed of two projected surfaces 212a and 212b having different reflectivities.
- Reference numeral 231 denotes a scanning line that is a scanning locus of the laser beam 211, and the scanning lines 231 gather to form a scanning region 232.
- the regions 233, 234, and 235 are regions where the spectral characteristics of reflectance are different from those of the surroundings on the projection surface 212, and are, for example, colored unevenness or colored patterns.
- the projection surface 212 is in a state where the upper right portion is missing from the scanning region 232.
- the signal of the photodetector 213 for the scanning line 231a is as shown in FIG. 26B.
- Red, green, and blue light quantity signals of equal intensity are detected where the scanning line 231a falls on the projection surface 212, and all three color signals disappear where the scanning line 231a falls on the missing portion of the projection surface 212. Therefore, by setting a predetermined threshold value, it can be determined that the projection surface 212 is missing wherever the light quantity signals of the three colors are equal to or less than the threshold value.
- The threshold value is used because ambient background light may enter the photodetector, so the light quantity signals may not fall exactly to zero. If the missing portion of the projection surface 212 can be detected in this way, power can be reduced by not irradiating the missing portion with laser light, which is beneficial for power saving in a portable device.
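- The missing-surface test reduces to a threshold comparison applied to all three light-amount signals; when it fires, the laser drive for that scanning position can be suppressed. The threshold value below is an illustrative assumption:

```python
def surface_missing(r: float, g: float, b: float, threshold: float = 0.05) -> bool:
    """The projection surface is judged missing at this scanning position when
    all three color light-amount signals are at or below the threshold (the
    threshold absorbs ambient background light reaching the photodetector)."""
    return r <= threshold and g <= threshold and b <= threshold

def drive_amplitude(requested: float, r: float, g: float, b: float) -> float:
    """Do not emit laser light toward positions where the surface is missing."""
    return 0.0 if surface_missing(r, g, b) else requested

print(drive_amplitude(1.0, r=0.02, g=0.01, b=0.03))  # 0.0 -> laser off, power saved
print(drive_amplitude(1.0, r=0.70, g=0.68, b=0.71))  # 1.0 -> normal drive
```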
- FIG. 27A is also a diagram showing the scanning line on the projection surface 212 and the amount of reflected light for each wavelength on the scanning line.
- the signal of the photodetector 213 for the scanning line 231b is as shown in FIG. 27B.
- Red, green, and blue signals of equal intensity are detected where the scanning line 231b falls on the projection surface 212, whereas the red, green, and blue light quantity signals are no longer equal where the scanning line 231b crosses the region 233 of the projection surface 212. Since the proportion of reflected light in the region 233 is higher for the red light quantity signal, it can be determined that the region 233 has color unevenness with a strong red component.
- the amount of green and blue light can be increased to correct color unevenness.
- the color unevenness may be corrected by reducing the amount of red light.
- If the amount of red light is reduced, a difference in brightness from the surrounding area remains, but the power consumption is reduced, which is beneficial for power saving in a portable device.
- When the signal of the photodetector 213 for the scanning line 231b is compared between the detection position 241b on the projection surface 212a and the detection position 241c on the projection surface 212b, the red, green, and blue light quantity ratios at the detection position 241b are equal to those at the detection position 241c, so it can be determined that there is no color difference between the detection position 241b and the detection position 241c but there is a difference in brightness. In this way, by comparing the red, green, and blue light quantity ratios within the scanning region 232, color unevenness and differences in brightness can be detected.
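- Distinguishing color unevenness from a simple difference in brightness comes down to comparing the normalized R:G:B ratios at two detection positions. A hedged sketch (the tolerance value is chosen only for illustration):

```python
def classify_difference(rgb_a: tuple, rgb_b: tuple, tol: float = 0.02) -> str:
    """Compare two detected (R, G, B) light amounts along the same scanning line."""
    sum_a, sum_b = sum(rgb_a), sum(rgb_b)
    ratio_a = [c / sum_a for c in rgb_a]
    ratio_b = [c / sum_b for c in rgb_b]
    same_ratio = all(abs(x - y) <= tol for x, y in zip(ratio_a, ratio_b))
    if same_ratio and abs(sum_a - sum_b) > tol:
        return "brightness difference only"   # e.g. detection positions 241b vs. 241c
    if not same_ratio:
        return "color unevenness"             # e.g. inside region 233
    return "no significant difference"

print(classify_difference((0.60, 0.60, 0.60), (0.40, 0.40, 0.40)))  # brightness difference only
print(classify_difference((0.60, 0.60, 0.60), (0.75, 0.55, 0.55)))  # color unevenness
```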
- By correcting the light amounts of the laser light sources 201R, 202B, and 203G of the respective colors accordingly, it is possible to display a high-quality image that is not affected by the color unevenness or brightness differences of the projection surface.
- In the above description, the projection surface is irradiated with white light for simplicity of explanation, but the spectral reflectance distribution can also be obtained from a normal display image rather than only from white light.
- In that case, the spectral reflectance distribution is obtained by comparing the displayed image with the light quantity ratios of the three primary-color laser beams measured by the photodetector. Further, the brightness (reflectance) distribution can also be obtained by comparing the total light amounts of the three primary-color laser beams measured by the photodetector.
- a configuration including a plurality of laser light sources has been described.
- a single laser light source may be used, and it is needless to say that a color filter is unnecessary in this case.
- FIG. 28 shows a schematic configuration of the scanning image display apparatus according to the tenth embodiment.
- this scanning image display apparatus is different from the scanning image display apparatus shown in FIG. 23 in that it includes a position stability detection unit 251.
- the position stability detection unit 251 detects a change in posture or a change in position of the scanning image display device and sends a detection signal to the control unit.
- the position stability detection unit 251 may be any unit that can detect acceleration, angular acceleration, inclination, and the like, such as an acceleration sensor, a geomagnetic sensor, and a gyro.
- the scanning image display apparatus includes the position stability detection unit 251 so that it can detect posture changes and position changes.
- When the posture or position of the scanning image display apparatus changes, the position of the displayed image also moves greatly and the observer pays little attention to image quality, so image correction may be stopped.
- the position of the display image is fixed when there is no change in posture or position, a high-quality display image can be obtained by performing image correction.
- When the control unit 205 determines from the detection signal of the position stability detection unit 251 that the change in posture or position has stopped, image correction data can be obtained by irradiating white light for a time short enough not to be noticed by the observer.
- Further, the frequency with which the spectral reflectance distribution is measured for image correction may be changed in accordance with how frequently the posture and position change. That is, if there is no change in posture or position, the spectral reflectance distribution may be measured only once at the beginning, and if posture or position changes are frequent, the measurement frequency may be increased.
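- The measurement-frequency rule can be written as a small scheduler. The interface and rate threshold below are illustrative assumptions; only the qualitative behavior (measure once when stable, re-measure often when the posture changes frequently) follows the text:

```python
class ReflectanceMeasurementScheduler:
    """Chooses when to re-measure the spectral reflectance distribution."""

    def __init__(self):
        self.measured_once = False

    def should_measure(self, posture_changes_per_second: float) -> bool:
        # posture_changes_per_second comes from the position stability
        # detection unit 251 (illustrative interface).
        if posture_changes_per_second == 0.0:
            if self.measured_once:
                return False        # stable: the initial measurement suffices
            self.measured_once = True
            return True
        # The more often the posture or position changes, the more often we re-measure.
        return posture_changes_per_second > 0.5

sched = ReflectanceMeasurementScheduler()
print(sched.should_measure(0.0))   # True  (initial measurement)
print(sched.should_measure(0.0))   # False (still stable)
print(sched.should_measure(2.0))   # True  (frequent posture changes)
```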
- As described above, by providing the position stability detection unit 251, a change in the position of the display image can be detected. Therefore, switching can be performed so that image correction is not carried out depending on whether or not the position is changing, and when the image is moving greatly and image correction is unnecessary, power can be reduced by stopping the image correction.
- FIG. 29 shows a schematic configuration of a scanning image display apparatus according to Embodiment 11 of the present invention.
- As shown in FIG. 29, the scanning image display apparatus includes a polarization beam splitter 261 corresponding to the R, G, and B laser beams, a light source output detector 262 that detects the optical output of the laser light source of each color, a pinhole 264, a right-angle prism 266, and a photodetector 267.
- the light source output detector 262 is configured with a photodiode or the like, similar to the photodetector 267.
- the pair of light source output detectors 262 and photodetectors 267 may be provided on the same substrate. It is also possible to divide the same photodiode into regions and use one as the light source output detector 262 and the other as the photodetector 267.
- the scanning image display apparatus further includes quarter wave plates 263 and 265.
- the quarter-wave plates 263 and 265 convert the polarization state of the transmitted laser light from linearly polarized light to circularly polarized light.
- In FIG. 29, the same reference numerals are used for the same components as those of the scanning image display apparatus according to Embodiment 10 shown in FIG. 28, and detailed description thereof is omitted.
- The laser light emitted from each laser light source is linearly polarized, with its polarization plane slightly rotated from the polarization plane that is transmitted through the polarization beam splitter 261. For this reason, most of the laser light incident on the polarization beam splitter 261 is transmitted, while part of it is reflected and enters the light source output detector 262, where its light amount is detected.
- The transmitted laser light passes through the quarter-wave plate 263, whereupon its polarization state becomes circular; it is then reflected by the dichroic mirror 209, passes through the pinhole 264, and is scanned over the projection surface 212 by the scanning unit 210.
- When the laser light reflected and scattered by the projection surface 212 returns and passes through the quarter-wave plate 263 again, its polarization state returns from circular to linear, but with a polarization plane orthogonal to that of the original linearly polarized light.
- the returned laser light is reflected by the polarization beam splitter 261, passes through the quarter-wave plate 265, is reflected twice by the right-angle prism 266, and passes through the quarter-wave plate 265 again. Since the plane of polarization of the laser light transmitted through the quarter-wave plate 265 and incident on the polarization beam splitter 261 is further rotated by 90 degrees, the laser light is transmitted through the polarization beam splitter 261 and incident on the photodetector 267.
- the light source output detector 262 by providing the light source output detector 262, it is possible to detect outputs from the laser light sources 201R, 202B, and 203G of the respective colors. As a result, even if the outputs of the laser light sources 201R, 202B, and 203G of the respective colors vary, the color unevenness and reflection / scattering rate of the projection surface 212 can be detected with high accuracy. In addition, by detecting the amount of laser light returning through the scanning unit 210 with the photodetector 267, the color unevenness and the reflection / scattering rate of the portion of the projection surface 212 irradiated with the laser light can be accurately measured. It is possible to detect well.
- Further, since the pinhole 264 is provided, light from portions of the projection surface 212 other than the portion irradiated with the laser light can be blocked, so the color unevenness and reflection/scattering rate of the irradiated portion can be detected with even higher accuracy.
- In general, the lower limit of the divergence angle of a laser beam is determined by its beam diameter and wavelength. For example, for laser light with a wavelength of 630 nm and a beam diameter of 1 mm, the lower limit of the divergence angle is 0.8 mrad. In practice, the divergence angle is often two to three times this lower limit, so the beam may be assumed to spread by up to about 2.4 mrad.
- Therefore, when the diameter of the laser beam at the scanning unit 210 is D (m) and the distance between the scanning unit 210 and the pinhole 264 is L (m), the diameter (m) of the pinhole 264 should be at least D + L × 0.0024. Further, since degradation of image quality is difficult to perceive even when return light from adjacent scanning positions is included, it can be seen that the diameter (m) of the pinhole 264 may be up to D + L × 0.0072.
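- As a worked example under stated assumptions: the 0.8 mrad figure is consistent with the Gaussian full divergence angle 4λ/(πD) for λ = 630 nm and D = 1 mm, and for an assumed scanning-unit-to-pinhole distance of 20 mm the pinhole bounds evaluate as follows:

```python
import math

wavelength = 630e-9      # m
beam_diameter = 1e-3     # m, beam diameter D at the scanning unit
distance = 20e-3         # m, scanning unit to pinhole, illustrative value of L

# Diffraction-limited full divergence angle of a Gaussian beam: 4*lambda / (pi*D)
divergence = 4 * wavelength / (math.pi * beam_diameter)
print(f"lower-limit divergence ~ {divergence * 1e3:.2f} mrad")   # ~0.80 mrad

d_min = beam_diameter + distance * 0.0024   # lower bound: pass the (spread) detection beam
d_max = beam_diameter + distance * 0.0072   # upper bound: adjacent-line return light still tolerable
print(f"pinhole diameter between {d_min * 1e3:.3f} mm and {d_max * 1e3:.3f} mm")
```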
- FIG. 30 shows a schematic configuration of a scanning image display apparatus according to Embodiment 12 of the present invention.
- the scanning image display apparatus includes a laser light source 201R, a laser light source 202B, a laser light source 203G, a drive circuit 204, a control unit 205, a spatial modulation element 206, dichroic mirrors 207 to 209, and a scanning unit 210.
- the output laser intensity of the laser light source 201R and the laser light source 202B is directly modulated by the drive circuit 204.
- the intensity of the green laser light emitted from the laser light source 203G is modulated by the drive circuit 204 by the spatial modulation element 206 provided in the optical path.
- As shown in FIG. 30, the scanning image display apparatus further includes a light branching element 271 composed of a polarization hologram and a diffraction grating, a photodetector/light source output detector 272, quarter-wave plates 273 and 274, and a mirror 275.
- the light branching element 271 is an optical element that switches diffraction / transmission depending on the polarization direction of the incident laser light and performs diffraction according to the wavelength of the laser light.
- The light branching element 271 according to the present embodiment is arranged so as to diffract the laser light in a direction perpendicular to the paper surface of FIG. 30 when the plane of polarization is parallel to the paper surface.
- FIG. 31 is a top view showing the photodetector / light source output detector 272.
- the detector / light source output detector 272 includes light source output detectors 272a, 272b, and 272c, and light detectors 272d, 272e, and 272f.
- the photodetector / light source output detector 272 is formed by dividing a photodiode into regions.
- the laser light emitted from the laser light source 201R, laser light source 202B, and laser light source 203G of each color is partially reflected by the polarization beam splitter 261 as in the eleventh embodiment, and further, the wavelength of the laser light by the light branching element 271. And is incident on the light source output detection units 272a, 272b, and 272c shown in FIG.
- the longest wavelength of laser light is incident on 272c, and the shortest wavelength is incident on 272a. That is, blue laser light, green laser light, and red laser light are incident on the light source output detection units 272a, 272b, and 272c in this order.
- The laser light returned from the projection surface 212 passes through the quarter-wave plate 273, the polarization beam splitter 261, the quarter-wave plate 274, the mirror 275, and the polarization beam splitter 261 in this order, and enters the light branching element 271. Since the polarization plane of the laser light incident on the light branching element 271 is parallel to the paper surface of FIG. 30, the light is diffracted by the polarization hologram in a direction perpendicular to the paper surface and further diffracted according to its wavelength, and enters the light detection units 272d, 272e, and 272f of the photodetector/light source output detector 272 shown in FIG. 31.
- Since spectral data of the reflection and scattering of the projection surface 212 is obtained from a comparison between the light amounts detected by the light source output detection units and the light amounts detected by the light detection units, correction is performed by comparing these data with the original image signal.
- In the present embodiment, detection for the laser light sources 201R, 202B, and 203G of the respective colors is integrated into the single photodetector/light source output detector 272, so the device can be made small and at low cost.
- FIG. 32 is a diagram showing a configuration of a scanning image display apparatus according to Embodiment 13 of the present invention. 32, the same reference numerals are used for the same components as those in the scanning image display device according to the twelfth embodiment shown in FIG. 30, and detailed description thereof is omitted.
- Reference numeral 281 denotes a multiplier that multiplies its input signals and outputs the product.
- Reference numeral 282 denotes a low-pass filter that removes the AC component of the input signal and outputs the DC component.
- The control unit 205 controls the drive circuit 204 using, in addition to the video signal, a modulation signal of, for example, 100 MHz.
- the laser light source 201R, laser light source 202B, and laser light source 203G for each color are modulated at 100 MHz in addition to the modulation by the video signal.
- Signals detected by the light detection units 272d, 272e, and 272f of the photodetector / light source output detector 272 are input to the multiplier 281.
- the multiplier 281 also receives a 100 MHz modulated signal from the control unit 205.
- The multiplier 281 therefore converts only the component modulated at 100 MHz, among the signals detected by the light detection units 272d, 272e, and 272f, into a DC component, while noise components appear as AC components.
- Since the low-pass filter 282 following the multiplier 281 extracts and outputs only the DC component, a detection signal from which the noise components have been removed is input to the control unit 205.
- detection signals with less noise are obtained by modulating the outputs of the laser light sources 201R, 202B, and 203G of the respective colors and synchronously detecting the detection signals detected by the light detection unit with the modulation signals. Therefore, highly accurate correction is possible.
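- The synchronous (lock-in) detection can be sketched numerically: multiply the detector signal by the same modulation reference and low-pass filter the product, which leaves only the component that was modulated at the reference frequency. The sample rate, duration, and noise level below are illustrative:

```python
import numpy as np

fs = 1e9                    # sample rate (Hz), illustrative
f_mod = 100e6               # modulation frequency supplied by the control unit
t = np.arange(0, 2e-6, 1 / fs)

reference = np.sin(2 * np.pi * f_mod * t)      # 100 MHz modulation signal
reflected = 0.4 * reference                    # detector sees the modulated laser light...
noise = 0.2 * np.random.randn(t.size)          # ...plus broadband noise and background light
detector = reflected + noise

mixed = detector * reference                   # role of the multiplier 281
dc = mixed.mean()                              # crude stand-in for the low-pass filter 282
print(f"recovered amplitude ~ {2 * dc:.3f} (true value 0.400)")
```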
- A projector according to the present invention includes a light source, a light modulation unit that modulates light emitted from the light source based on an image signal, a display control unit that outputs the image signal including a periodic main image signal to the light modulation unit and controls display, a projection unit that projects the light modulated by the light modulation unit, and an imaging unit that captures an image based on the light projected from the projection unit, and the display control unit inserts, between the periodic main image signals, a correction image signal for projecting a correction image that is visually recognized as a uniform white or gray screen when time-integrated.
- With the above configuration, the correction image inserted between the periodic main image signals is visually recognized as a uniform white or gray screen when time-integrated, so the user does not recognize any image other than the main image. Therefore, the correction image can be inserted while suppressing deterioration of the image quality of the main image. If this correction image is captured by the imaging unit and used for image correction, the image can be corrected with a simple configuration in accordance with the projection object, which changes from moment to moment, while suppressing deterioration of the image quality of the main image.
- the imaging unit captures only the correction image in synchronization with the projection timing of the correction image.
- the correction image is at least two or more images that cancel out the difference in color or luminance in the image plane when time integration is performed.
- the correction image can be a combination of at least two or more images that cancel out the difference in color or luminance in the image plane when time integration is performed. Therefore, each correction image can be configured with a high degree of freedom as a combination of various colors, brightness, and image patterns.
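- The time-integration property can be checked numerically: two pattern images with inverted luminance (or, more generally, a set whose pixel-wise sum is constant) integrate to a uniform gray frame for the observer. A minimal sketch with an illustrative checkerboard pattern:

```python
import numpy as np

def checkerboard(h: int, w: int, cell: int = 8) -> np.ndarray:
    yy, xx = np.mgrid[0:h, 0:w]
    return (((yy // cell) + (xx // cell)) % 2).astype(float)   # values 0.0 / 1.0

pattern = checkerboard(64, 64)           # first correction image
inverted = 1.0 - pattern                 # second correction image, luminance inverted

integrated = (pattern + inverted) / 2.0  # what the eye integrates over time
assert np.allclose(integrated, 0.5)      # uniform mid-gray everywhere
print("time-integrated correction images are uniform gray:", integrated.min(), integrated.max())
```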
- the projection unit projects the correction image with white light.
- the modulation frequency of the correction image can be reduced, and a low-cost and low power consumption projector can be realized.
- the projection unit projects the correction image including at least a red image, a green image, and a blue image.
- the projection unit projects a correction image including at least two images having a complementary color relationship.
- the modulation frequency of the correction image can be reduced, and a low-cost and low power consumption projector can be realized.
- It is preferable that the projection unit project a correction image that cancels the color or luminance of the projection object itself, as imaged by the imaging unit during a non-projection period of the main image projected at a predetermined frequency, and that is visually recognized, when time-integrated, as uniform white or gray over the plane of the projection object.
- With the above configuration, the correction image can substantially serve as a color correction or luminance correction of the main image, so a projector that projects a higher-quality main image can be realized.
- the correction image is at least two pattern images with inverted luminance.
- the pattern image can be inserted into the main image without being recognized by the user.
- At least one of the pattern images is a lattice pattern.
- Preferably, the projector further includes a correction unit that corrects the main image signal based on the correction image captured by the imaging unit, and the correction unit corrects distortion of the main image.
- At least one of the pattern images is a two-dimensional code.
- the two-dimensional code can be inserted into the main image without degrading the image quality of the main image and without being recognized by the user.
- an analysis unit that extracts information from the two-dimensional code imaged by the imaging unit is further included.
- the correction image is preferably one or more uniform images.
- Preferably, the projector further includes a correction unit that corrects the main image signal based on the correction image captured by the imaging unit, and the correction unit corrects the color of the main image.
- It is preferable to use an LED as the light source. With the above configuration, a time-division image display method can be realized by turning the light source on and off, the color wheel required for a lamp light source becomes unnecessary, and a low-cost projector can be realized. Further, since the light source itself is small, a small projector can be realized.
- It is also preferable to use a laser light source as the light source.
- With the above configuration, a time-division image display method can be realized by turning the light source on and off, and a low-cost projector can be realized. Further, since the light source itself is small, a small projector can be realized. Furthermore, since a laser light source has a narrow spectral width, it has excellent color reproducibility and facilitates color correction of the main image for the projection object.
- the projection unit includes a projection lens that projects the light modulated by the light modulation unit, and the imaging unit shares the projection lens as an imaging lens.
- the projection lens and the imaging lens can be shared by one lens, and a compact and low-cost projector can be realized.
- Preferably, the projection unit includes a polarization beam splitter that separates light according to its polarization, and a quarter-wave plate provided between the polarization beam splitter and the projection lens that sets the phase difference between the forward and return paths of light passing back and forth through the projection lens to π/2.
- With the above configuration, the optical system can be miniaturized, so the projector can be miniaturized. Furthermore, since polarized light can be used efficiently, the light use efficiency can be increased and a projector with low power consumption can be realized.
- the light modulation unit is preferably a spatial modulation element that spatially modulates light.
- With the above configuration, it is easy to increase the power of the light source, so a projector with high luminance can be realized.
- The spatial modulation element is preferably a reflective single-panel liquid crystal display element. With this configuration, the light use efficiency can be increased and a projector with low power consumption can be realized.
- It is also preferable to use, as the spatial modulation element, a large number of micromirrors whose angles can be changed.
- With this configuration, the driving frequency of the time-division method can be increased, so an image in which color breakup is further reduced can be obtained. In addition, the user's perception of the correction image inserted into the main image can be further suppressed, and the main image can be observed with high image quality.
- the light modulation unit preferably includes a scanning unit that scans light two-dimensionally and an intensity modulation unit that can modulate the intensity of light incident on the scanning unit from the light source.
- the driving frequency of the main image is 180 Hz or more.
- Preferably, the projector further includes an arithmetic processing unit that predicts a change in the projection object based on a plurality of the correction images captured in different frames by the imaging unit.
- With the above configuration, the main image can be corrected by predicting a change in the relative angle between the projector and the projection object or a change in the shape of the projection object, so the main image can be observed with high image quality.
- Preferably, the projector further includes a signal transmission/reception unit that transmits and receives a synchronization signal indicating the timing at which the correction image is projected.
- With the above configuration, the timing at which each projector projects its correction image can be synchronized, so a captured image free of the correction images of other projectors that are projecting images can be obtained.
- Preferably, the projector further includes a switching unit that switches whether or not the correction image signal is inserted between the periodic main image signals.
- The above configuration preferably further includes a motion sensor that detects the movement of the projector, and the switching unit inserts the correction image signal when the motion sensor detects movement and does not insert the correction image signal when the motion sensor does not detect movement.
- A projector according to another aspect of the present invention includes a light source, a light modulation unit that modulates light emitted from the light source based on an image signal, a display control unit that outputs the image signal including a periodic main image signal to the light modulation unit and controls display, a projection unit that projects the light modulated by the light modulation unit onto a projection object, an imaging unit that images the projection object, and an image generation unit that generates a correction image signal for projecting a correction image that cancels the color or luminance of the projection object itself, as imaged by the imaging unit during a non-projection period of the main image projected at a predetermined frequency, and that is visually recognized, when time-integrated, as uniform white or gray over the plane of the projection object, and the display control unit inserts the correction image signal between the periodic main image signals.
- With the above configuration, the imaging unit images the projection object itself moment by moment during the non-projection period of the main image projected at the predetermined frequency, a correction image that makes the plane of the projection object appear as uniform white or gray when time-integrated is generated moment by moment, and the correction image signal is periodically inserted between the main image signals.
- the correction image can be used substantially as color correction or luminance correction of the main image, so that a projector that projects the main image with higher image quality can be realized.
- A scanning image display apparatus according to the present invention includes a laser light source, a drive circuit that drives the laser light source in accordance with an image signal, a scanning unit that scans the laser light emitted from the laser light source over a projection surface, a control unit that controls the scanning unit in accordance with the image signal, a photodetector that detects the amount of the laser light reflected and scattered by the projection surface, and a correction unit that corrects the intensity of the laser light source based on an output signal of the photodetector and the image signal.
- the laser light source may be a plurality of laser light sources that emit light having different wavelengths.
- the correction unit may include a storage unit that maps and stores the amount of reflected / scattered light for each laser light source detected by the photodetector and the scanning angle of the scanning unit.
- the photodetector may include a filter that selectively transmits each wavelength of the plurality of laser light sources.
- the plurality of laser light sources may be pulse-driven in a time division manner, and the light detector may detect the amount of reflected and scattered light for each laser light source in synchronization with the drive timing of the laser light source.
- the amount of reflected / scattered light for each laser light source can be detected without using a wavelength selection filter, so that the configuration of the photodetector can be simplified.
- the photodetector may be configured to detect laser light incident through the scanning unit after being reflected and scattered on the projection surface.
- the scanning image display device may further include a pinhole provided between the scanning unit and the photodetector.
- the reflected / scattered light from the portion other than the portion where the laser light is scanned can be shielded, so that the reflected / scattered light of the laser light can be detected with higher accuracy.
- Preferably, when the diameter of the laser beam at the scanning unit is D and the distance between the scanning unit and the pinhole is L, the diameter of the pinhole is (D + L × 0.0024) or more and (D + L × 0.0072) or less.
- With the above configuration, the reflected and scattered laser light to be detected can reliably pass through the pinhole while return light from adjacent scanning positions is blocked, so the reflected and scattered laser light can be detected with higher accuracy.
- the scanning image display apparatus may further include a light source output detector that is provided on the same substrate as the light detector and detects the output of the laser light source.
- the scanning image display device can be miniaturized. Furthermore, by using the photodetector and the light source output detector by dividing the region on the same substrate, the cost can be reduced.
- the scanning image display device may further include a position stability detector that detects a change in the position of the device.
- the laser light source may be configured to be a laser light source of three colors of red, green, and blue.
- the light source output detector for detecting the three-color laser light sources of red, green, and blue and the photodetector may be provided on the same substrate.
- the scanning image display device may include an acousto-optic element or an electro-optic element that modulates the intensity of the laser beam.
- the output of the laser light source may be modulated based on a modulation signal, and the output from the photodetector may be synchronously detected based on the modulation signal.
- An image correction method according to the present invention includes a step of modulating laser light in accordance with an image signal and scanning it over a projection surface with a scanning unit, and a step of detecting, for each wavelength, the amount of the laser light reflected and scattered by the projection surface. A configuration further including a step of projecting a white image may be adopted.
- the laser beam may not be output when the detected light amount for each wavelength is equal to or less than a predetermined value at all wavelengths. By doing so, it is possible to reduce the power consumption by stopping the oscillation of the laser light in a region where there is no projection surface or the reflectance of the projection surface is extremely low.
- Further, the output intensity of the laser light may be controlled based on the detected light amount ratio for each wavelength. In this way, it becomes possible to correct color unevenness of the projection surface.
- Further, the output intensity of the laser light may be controlled based on the detected light amount and light amount ratio for each wavelength. In this way, it becomes possible to correct both color unevenness and differences in brightness of the projection surface.
- the projector of the present invention can be used as a small portable image display device or as an image display device built in a portable device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Geometry (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Projection Apparatus (AREA)
- Video Image Reproduction Devices For Color Tv Systems (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
Description
A projector according to Embodiment 1 of the present invention will be described below with reference to FIGS. 1 to 7.
A projector according to Embodiment 2 of the present invention will be described below with reference to FIGS. 8 to 11.
A projector according to Embodiment 3 of the present invention will be described below with reference to FIGS. 12 and 13.
A projector according to Embodiment 4 of the present invention will be described below with reference to FIG. 14.
Next, a projector according to Embodiment 5 of the present invention will be described below with reference to FIGS. 17 to 19.
A projector according to Embodiment 6 of the present invention will be described below with reference to FIGS. 20A and 20B.
Next, a projector according to Embodiment 7 of the present invention will be described.
FIGS. 22A and 22B are diagrams showing examples of use of the projector according to Embodiment 8 of the present invention. Since the projector 22 according to the present embodiment includes an imaging element, it can be used not only as a projection element but also as an imaging element.
In recent years, image display devices using laser light sources of the three primary colors have been actively developed. This is because the monochromaticity of lasers yields images with good color reproducibility, and because semiconductor lasers, which have high electro-optical conversion efficiency and allow high light use efficiency, make it possible to realize small, low-power devices. Among these, a scanning image display device, which forms an image by scanning laser light whose intensity is modulated in accordance with an image signal, can realize an image display device that is even smaller, consumes less power, and has excellent portability. In view of the environments in which such a highly portable image display device is used, Patent Document 2 discloses a scanning image display device that can correct for the positional relationship between the image display device and the video display object. According to Patent Document 2, by irradiating the video display object with measurement laser light and measuring the angle at which the reflected light is incident on a photodetector, the distance between the scanning image display device and the video display object and the shape of the video display object can be measured by the principle of triangulation, and the displayed image can be corrected.
FIG. 28 shows a schematic configuration of the scanning image display apparatus according to Embodiment 10. As shown in FIG. 28, this scanning image display apparatus differs from the scanning image display apparatus shown in FIG. 23 in that it includes a position stability detection unit 251. The position stability detection unit 251 detects changes in the posture or position of the scanning image display apparatus and sends a detection signal to the control unit. Specifically, the position stability detection unit 251 may be anything that can detect acceleration, angular acceleration, inclination, and the like, such as an acceleration sensor, a geomagnetic sensor, or a gyro.
FIG. 29 shows a schematic configuration of a scanning image display apparatus according to Embodiment 11 of the present invention. In FIG. 29, the same reference numerals are used for the same components as those of the scanning image display apparatus according to Embodiment 10 shown in FIG. 28, and detailed description thereof is omitted.
The diameter (m) of the pinhole 264 should be at least D + L × 0.0024. Further, since deterioration of image quality is difficult to perceive even when return light from adjacent scanning positions is included, the diameter (m) of the pinhole 264 may be up to D + L × 0.0072.
FIG. 30 shows a schematic configuration of a scanning image display apparatus according to Embodiment 12 of the present invention. In FIG. 30, the same reference numerals are used for the same components as those of the scanning image display apparatus according to Embodiment 11 shown in FIG. 29, and detailed description thereof is omitted.
FIG. 32 is a diagram showing the configuration of a scanning image display apparatus according to Embodiment 13 of the present invention. In FIG. 32, the same reference numerals are used for the same components as those of the scanning image display apparatus according to Embodiment 12 shown in FIG. 30, and detailed description thereof is omitted.
Claims (27)
- 1. A projector comprising: a light source; a light modulation unit that modulates light emitted from the light source based on an image signal; a display control unit that outputs the image signal including a periodic main image signal to the light modulation unit and controls display; a projection unit that projects the light modulated by the light modulation unit; and an imaging unit that captures an image based on the light projected from the projection unit, wherein the display control unit inserts, between the periodic main image signals, a correction image signal for projecting a correction image that is visually recognized as a uniform white or gray screen when time-integrated.
- 2. The projector according to claim 1, wherein the imaging unit captures only the correction image in synchronization with the projection timing of the correction image.
- 3. The projector according to claim 2, wherein the correction image is at least two or more images that cancel out differences in color or luminance within the image plane when time-integrated.
- 4. The projector according to any one of claims 1 to 3, wherein the projection unit projects the correction image with white light.
- 5. The projector according to any one of claims 1 to 3, wherein the projection unit projects the correction image including at least a red image, a green image, and a blue image.
- 6. The projector according to any one of claims 1 to 3, wherein the projection unit projects a correction image including at least two images in a complementary color relationship.
- 7. The projector according to claim 1, wherein the projection unit projects a correction image that cancels the color or luminance of the projection object itself imaged by the imaging unit during a non-projection period of the main image projected at a predetermined frequency, and that is visually recognized, when time-integrated, as uniform white or gray over the plane of the projection object.
- 8. The projector according to any one of claims 1 to 7, wherein the correction image is at least two or more pattern images whose luminance is inverted.
- 9. The projector according to claim 8, wherein at least one of the pattern images is a lattice pattern.
- 10. The projector according to any one of claims 1 to 9, further comprising a correction unit that corrects the main image signal based on the correction image captured by the imaging unit, wherein the correction unit corrects distortion of the main image.
- 11. The projector according to claim 8, wherein at least one of the pattern images is a two-dimensional code.
- 12. The projector according to claim 11, further comprising an analysis unit that extracts information from the two-dimensional code captured by the imaging unit.
- 13. The projector according to claim 1, wherein the correction image is one or more uniform images.
- 14. The projector according to claim 13, further comprising a correction unit that corrects the main image signal based on the correction image captured by the imaging unit, wherein the correction unit corrects the color of the main image.
- 15. The projector according to any one of claims 1 to 14, wherein the light source is an LED.
- 16. The projector according to any one of claims 1 to 14, wherein the light source is a laser.
- 17. The projector according to any one of claims 1 to 16, wherein the projection unit includes a projection lens that projects the light modulated by the light modulation unit, and the imaging unit shares the projection lens as an imaging lens.
- 18. The projector according to claim 17, wherein the projection unit includes a polarization beam splitter that separates light according to its polarization, and a quarter-wave plate provided between the polarization beam splitter and the projection lens that sets the phase difference between the forward and return paths of light passing back and forth through the projection lens to π/2.
- 19. The projector according to any one of claims 1 to 18, wherein the driving frequency of the main image is 180 Hz or higher.
- 20. The projector according to any one of claims 1 to 19, wherein the light modulation unit is a spatial modulation element that spatially modulates light.
- 21. The projector according to claim 20, wherein the spatial modulation element is a reflective single-panel liquid crystal display element.
- 22. The projector according to claim 20, wherein the spatial modulation element is a large number of micromirrors whose angles can be changed.
- 23. The projector according to any one of claims 1 to 16, wherein the light modulation unit comprises a scanning unit that scans light two-dimensionally and an intensity modulation unit capable of modulating the intensity of light incident on the scanning unit from the light source.
- 24. The projector according to any one of claims 1 to 23, further comprising an arithmetic processing unit that predicts a change in the projection object based on a plurality of the correction images captured in different frames by the imaging unit.
- 25. The projector according to any one of claims 1 to 24, further comprising a signal transmission/reception unit that transmits and receives a synchronization signal indicating the timing at which the correction image is projected.
- 26. The projector according to any one of claims 1 to 25, further comprising a switching unit that switches whether or not the correction image signal is inserted between the periodic main image signals.
- 27. The projector according to claim 26, further comprising a motion sensor that detects movement of the projector, wherein the switching unit inserts the correction image signal when the motion sensor detects movement and does not insert the correction image signal when the motion sensor does not detect movement.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010512944A JP5431312B2 (ja) | 2008-05-21 | 2009-05-20 | プロジェクタ |
| US12/669,874 US8235534B2 (en) | 2008-05-21 | 2009-05-20 | Projector that projects a correction image between cyclic main image signals |
| CN200980000576.5A CN101755300B (zh) | 2008-05-21 | 2009-05-20 | 投影仪 |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008132762 | 2008-05-21 | ||
| JP2008-132762 | 2008-05-21 | ||
| JP2008140346 | 2008-05-29 | ||
| JP2008-140346 | 2008-05-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009142015A1 true WO2009142015A1 (ja) | 2009-11-26 |
Family
ID=41339955
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/002232 Ceased WO2009142015A1 (ja) | 2008-05-21 | 2009-05-20 | プロジェクタ |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US8235534B2 (ja) |
| JP (1) | JP5431312B2 (ja) |
| CN (1) | CN101755300B (ja) |
| WO (1) | WO2009142015A1 (ja) |
Cited By (40)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102096293A (zh) * | 2011-01-30 | 2011-06-15 | 河南科技大学 | 三片式lcos激光投影显示用光学引擎 |
| CN102104760A (zh) * | 2009-12-21 | 2011-06-22 | 佳能株式会社 | 显示设备及其控制方法 |
| WO2011102360A1 (ja) * | 2010-02-18 | 2011-08-25 | コニカミノルタオプト株式会社 | 画像投影機能を備えた電子機器 |
| JP2011228457A (ja) * | 2010-04-19 | 2011-11-10 | Mitsubishi Electric Corp | 画像表示装置およびレーザ光源装置 |
| JP2012042878A (ja) * | 2010-08-23 | 2012-03-01 | Mitsumi Electric Co Ltd | 光走査装置 |
| JP2012058581A (ja) * | 2010-09-10 | 2012-03-22 | Sharp Corp | 電子機器 |
| JP2013122663A (ja) * | 2011-12-09 | 2013-06-20 | Sharp Corp | 表示システム、較正方法、コンピュータプログラム、及び記録媒体 |
| JP2013122662A (ja) * | 2011-12-09 | 2013-06-20 | Sharp Corp | 表示システム、較正方法、コンピュータプログラム、及び記録媒体 |
| JP2013160887A (ja) * | 2012-02-03 | 2013-08-19 | Funai Electric Co Ltd | Memsデバイスおよびプロジェクタ機能を有する電子機器 |
| JP2013164528A (ja) * | 2012-02-13 | 2013-08-22 | Seiko Epson Corp | プロジェクター及び投写画像調整方法 |
| JP2013182142A (ja) * | 2012-03-02 | 2013-09-12 | Mitsubishi Electric Corp | マルチ画面表示装置 |
| JP2014059522A (ja) * | 2012-09-19 | 2014-04-03 | Funai Electric Co Ltd | 画像表示装置 |
| JP2014509759A (ja) * | 2011-03-02 | 2014-04-21 | マイクロソフト コーポレーション | 没入型ディスプレイエクスペリエンス |
| JP2014130200A (ja) * | 2012-12-28 | 2014-07-10 | Asahi Glass Co Ltd | 投影装置 |
| JP2014150505A (ja) * | 2013-02-04 | 2014-08-21 | Mitsubishi Electric Corp | 映像信号処理装置、映像表示方法及び映像表示装置 |
| JP2014235295A (ja) * | 2013-05-31 | 2014-12-15 | 株式会社Jvcケンウッド | マルチプロジェクタシステム、投射装置、調整装置および画像調整方法、ならびに、画像調整プログラム |
| EP2378394A3 (en) * | 2010-04-15 | 2015-03-25 | Electronics and Telecommunications Research Institute | User interface device and method for recognizing user interaction using same |
| JP2015064550A (ja) * | 2013-08-26 | 2015-04-09 | ソニー株式会社 | 投射型表示装置 |
| JP2015096880A (ja) * | 2013-11-15 | 2015-05-21 | キヤノン株式会社 | 投射型画像表示装置及びその制御方法 |
| JP2015112953A (ja) * | 2013-12-10 | 2015-06-22 | 株式会社デンソー | 車両用投影装置 |
| KR101530370B1 (ko) * | 2013-08-14 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| KR101530369B1 (ko) * | 2013-07-24 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| KR101530371B1 (ko) * | 2013-08-27 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| JP2015179278A (ja) * | 2015-05-01 | 2015-10-08 | セイコーエプソン株式会社 | プロジェクター及びその制御方法 |
| KR101623469B1 (ko) * | 2014-11-17 | 2016-05-23 | 재단법인대구경북과학기술원 | 레이저 광원을 이용한 야간 장애물 검출 장치 및 방법 |
| JPWO2014115400A1 (ja) * | 2013-01-22 | 2017-01-26 | ソニー株式会社 | 投影型画像表示装置、画像処理装置及び画像処理方法、並びにコンピューター・プログラム |
| US9597587B2 (en) | 2011-06-08 | 2017-03-21 | Microsoft Technology Licensing, Llc | Locational node device |
| WO2017072842A1 (ja) * | 2015-10-27 | 2017-05-04 | 日立マクセル株式会社 | プロジェクタ、映像表示装置、及び映像表示方法 |
| US9772547B2 (en) | 2010-11-09 | 2017-09-26 | Seiko Epson Corporation | Projector |
| JP2017170982A (ja) * | 2016-03-22 | 2017-09-28 | 日本電気株式会社 | 無人飛行装置制御システム、無人飛行装置制御方法および画像投影装置 |
| WO2017187842A1 (ja) * | 2016-04-27 | 2017-11-02 | ソニー株式会社 | 画像投影装置、投影撮像システムおよび補正方法 |
| US9939561B2 (en) | 2012-12-28 | 2018-04-10 | Asahi Glass Company, Limited | Projector having diffuser |
| JPWO2017154628A1 (ja) * | 2016-03-11 | 2019-02-07 | ソニー株式会社 | 画像処理装置および方法 |
| JP2020169021A (ja) * | 2020-06-19 | 2020-10-15 | 日本電気株式会社 | 無人飛行装置制御システム、無人飛行装置制御方法および画像投影装置 |
| JPWO2021039977A1 (ja) * | 2019-08-29 | 2021-03-04 | ||
| JP2022138755A (ja) * | 2021-03-11 | 2022-09-26 | カシオ計算機株式会社 | 投影制御装置、空間投影装置、空間投影システム及び空間投影方法 |
| JP2022138754A (ja) * | 2021-03-11 | 2022-09-26 | カシオ計算機株式会社 | 結像媒体変更装置、空間投影装置、空間投影システム及び空間投影方法 |
| WO2022230884A1 (ja) * | 2021-04-27 | 2022-11-03 | 日本精機株式会社 | 表示装置 |
| JP2024505088A (ja) * | 2021-02-01 | 2024-02-02 | ドルビー ラボラトリーズ ライセンシング コーポレイション | 動的ターゲット・ジオメトリを有する投影システムおよび方法 |
| US12003897B2 (en) | 2021-05-14 | 2024-06-04 | Panasonic Intellectual Property Management Co., Ltd. | Projection display apparatus |
Families Citing this family (105)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8427424B2 (en) | 2008-09-30 | 2013-04-23 | Microsoft Corporation | Using physical objects in conjunction with an interactive surface |
| JP5493438B2 (ja) * | 2009-04-10 | 2014-05-14 | 株式会社ニコン | 投影装置および投影像補正プログラム |
| JP5532683B2 (ja) * | 2009-05-29 | 2014-06-25 | 日本精機株式会社 | 液晶表示装置 |
| CN101931826A (zh) * | 2009-06-26 | 2010-12-29 | 鸿富锦精密工业(深圳)有限公司 | 立体成像镜头模组 |
| EP2459960B1 (en) * | 2009-07-29 | 2019-11-13 | Canon Kabushiki Kaisha | Measuring apparatus, measuring method, and computer program |
| US8659585B2 (en) * | 2009-10-16 | 2014-02-25 | Nikon Corporation | Projector apparatus and projection image correcting program product |
| US8730309B2 (en) | 2010-02-23 | 2014-05-20 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction |
| CN102414614B (zh) * | 2010-03-09 | 2014-10-08 | 松下电器产业株式会社 | 投影仪 |
| US9057937B2 (en) * | 2010-09-10 | 2015-06-16 | Nec Display Solutions, Ltd. | Image projection device and color correction method |
| JP5538549B2 (ja) * | 2010-09-17 | 2014-07-02 | 富士フイルム株式会社 | ファインダ装置の表示制御方法及びその装置 |
| JP5803184B2 (ja) * | 2010-11-19 | 2015-11-04 | 株式会社リコー | 画像投影装置、メモリアクセス方法 |
| US8386339B2 (en) | 2010-11-23 | 2013-02-26 | Echostar Technologies L.L.C. | Ordering via dynamic matrix code generation |
| US9792612B2 (en) | 2010-11-23 | 2017-10-17 | Echostar Technologies L.L.C. | Facilitating user support of electronic devices using dynamic matrix code generation |
| US9329966B2 (en) | 2010-11-23 | 2016-05-03 | Echostar Technologies L.L.C. | Facilitating user support of electronic devices using matrix codes |
| CA2818757C (en) | 2010-11-24 | 2019-12-03 | Echostar Technologies Llc | Tracking user interaction from a receiving device |
| US8439257B2 (en) | 2010-12-01 | 2013-05-14 | Echostar Technologies L.L.C. | User control of the display of matrix codes |
| US9280515B2 (en) | 2010-12-03 | 2016-03-08 | Echostar Technologies L.L.C. | Provision of alternate content in response to QR code |
| US8886172B2 (en) | 2010-12-06 | 2014-11-11 | Echostar Technologies L.L.C. | Providing location information using matrix code |
| US8875173B2 (en) | 2010-12-10 | 2014-10-28 | Echostar Technologies L.L.C. | Mining of advertisement viewer information using matrix code |
| US9596500B2 (en) | 2010-12-17 | 2017-03-14 | Echostar Technologies L.L.C. | Accessing content via a matrix code |
| US8640956B2 (en) | 2010-12-17 | 2014-02-04 | Echostar Technologies L.L.C. | Accessing content via a matrix code |
| US9148686B2 (en) | 2010-12-20 | 2015-09-29 | Echostar Technologies, Llc | Matrix code-based user interface |
| US8856853B2 (en) | 2010-12-29 | 2014-10-07 | Echostar Technologies L.L.C. | Network media device with code recognition |
| US8408466B2 (en) | 2011-01-04 | 2013-04-02 | Echostar Technologies L.L.C. | Assisting matrix code capture by signaling matrix code readers |
| US8292166B2 (en) | 2011-01-07 | 2012-10-23 | Echostar Technologies L.L.C. | Performing social networking functions using matrix codes |
| US20120182320A1 (en) * | 2011-01-13 | 2012-07-19 | Echostar Technologies Llc | Utilizing Matrix Codes to Install a Display Device |
| US8534540B2 (en) | 2011-01-14 | 2013-09-17 | Echostar Technologies L.L.C. | 3-D matrix barcode presentation |
| US8786410B2 (en) | 2011-01-20 | 2014-07-22 | Echostar Technologies L.L.C. | Configuring remote control devices utilizing matrix codes |
| US8553146B2 (en) | 2011-01-26 | 2013-10-08 | Echostar Technologies L.L.C. | Visually imperceptible matrix codes utilizing interlacing |
| US8468610B2 (en) | 2011-01-27 | 2013-06-18 | Echostar Technologies L.L.C. | Determining fraudulent use of electronic devices utilizing matrix codes |
| US8430302B2 (en) | 2011-02-03 | 2013-04-30 | Echostar Technologies L.L.C. | Enabling interactive activities for content utilizing matrix codes |
| US9571888B2 (en) | 2011-02-15 | 2017-02-14 | Echostar Technologies L.L.C. | Selection graphics overlay of matrix code |
| US9329469B2 (en) | 2011-02-17 | 2016-05-03 | Microsoft Technology Licensing, Llc | Providing an interactive experience using a 3D depth camera and a 3D projector |
| US8511540B2 (en) | 2011-02-18 | 2013-08-20 | Echostar Technologies L.L.C. | Matrix code for use in verification of data card swap |
| US8931031B2 (en) | 2011-02-24 | 2015-01-06 | Echostar Technologies L.L.C. | Matrix code-based accessibility |
| US9367669B2 (en) | 2011-02-25 | 2016-06-14 | Echostar Technologies L.L.C. | Content source identification using matrix barcode |
| US8550334B2 (en) | 2011-02-28 | 2013-10-08 | Echostar Technologies L.L.C. | Synching one or more matrix codes to content related to a multimedia presentation |
| US8443407B2 (en) | 2011-02-28 | 2013-05-14 | Echostar Technologies L.L.C. | Facilitating placeshifting using matrix code |
| US9736469B2 (en) | 2011-02-28 | 2017-08-15 | Echostar Technologies L.L.C. | Set top box health and configuration |
| US8833640B2 (en) | 2011-02-28 | 2014-09-16 | Echostar Technologies L.L.C. | Utilizing matrix codes during installation of components of a distribution system |
| US9480907B2 (en) | 2011-03-02 | 2016-11-01 | Microsoft Technology Licensing, Llc | Immersive display with peripheral illusions |
| DE102011075147A1 (de) * | 2011-05-03 | 2012-11-08 | Osram Ag | Projektionsvorrichtung zum projizieren mindestens eines bildpunktes und verfahren zum betreiben einer projektionsvorrichtung |
| JP5168526B2 (ja) * | 2011-05-10 | 2013-03-21 | 大日本印刷株式会社 | 投射型映像表示装置 |
| EP2525281B1 (en) | 2011-05-20 | 2019-01-02 | EchoStar Technologies L.L.C. | Improved progress bar |
| JP5994301B2 (ja) * | 2011-06-20 | 2016-09-21 | 株式会社リコー | 画像処理装置、情報処理装置、方法、プログラムおよび記録媒体 |
| US20130002715A1 (en) * | 2011-06-28 | 2013-01-03 | Tidman James M | Image Sequence Reconstruction based on Overlapping Measurement Subsets |
| DE102011079059A1 (de) * | 2011-07-13 | 2013-01-17 | Osram Ag | Detektionseinrichtung für einen projektor |
| JP5924020B2 (ja) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
| US20130215394A1 (en) * | 2012-02-18 | 2013-08-22 | Rakesh Reddy | Underwater Image Projection Display System and Lighting Control System And Device |
| JP6016068B2 (ja) * | 2012-05-16 | 2016-10-26 | 株式会社リコー | 画像投影装置及びその制御方法並びにプログラム |
| DE102012212436B4 (de) * | 2012-07-16 | 2022-07-14 | Coretronic Corporation | Lichtmodul für eine Projektionsvorrichtung und Verfahren zur Generierung des Blauanteils in einem Lichtmodul für eine Projektionsvorrichtung |
| JP2014044244A (ja) * | 2012-08-24 | 2014-03-13 | Asahi Kasei E-Materials Corp | 映像表示装置 |
| US9674510B2 (en) * | 2012-11-21 | 2017-06-06 | Elwha Llc | Pulsed projection system for 3D video |
| US9769439B2 (en) * | 2013-01-04 | 2017-09-19 | Seiko Epson Corporation | Projector and method for controlling the same that adjusts light source output based on a corrected detected light brightness |
| CN104023217B (zh) * | 2013-03-01 | 2016-12-28 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
| US9164281B2 (en) * | 2013-03-15 | 2015-10-20 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
| US9251715B2 (en) | 2013-03-15 | 2016-02-02 | Honda Motor Co., Ltd. | Driver training system using heads-up display augmented reality graphics elements |
| US9378644B2 (en) | 2013-03-15 | 2016-06-28 | Honda Motor Co., Ltd. | System and method for warning a driver of a potential rear end collision |
| US10339711B2 (en) | 2013-03-15 | 2019-07-02 | Honda Motor Co., Ltd. | System and method for providing augmented reality based directions based on verbal and gestural cues |
| US9747898B2 (en) | 2013-03-15 | 2017-08-29 | Honda Motor Co., Ltd. | Interpretation of ambiguous vehicle instructions |
| US10215583B2 (en) | 2013-03-15 | 2019-02-26 | Honda Motor Co., Ltd. | Multi-level navigation monitoring and control |
| US9393870B2 (en) | 2013-03-15 | 2016-07-19 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
| KR101511523B1 (ko) * | 2013-08-26 | 2015-04-13 | 씨제이씨지브이 주식회사 | 영상 중첩 영역의 보정 방법, 기록 매체 및 실행 장치 |
| US10078258B2 (en) * | 2013-09-04 | 2018-09-18 | Nec Corporation | Projection device, projection device control method, projection device control apparatus, and computer program thereof |
| US9232201B2 (en) * | 2013-12-30 | 2016-01-05 | Lenovo (Singapore) Pte. Ltd. | Dynamic projected image color correction based on projected surface coloration |
| WO2015166910A1 (ja) | 2014-04-28 | 2015-11-05 | 株式会社ニコン | パターン描画装置、パターン描画方法、デバイス製造方法、レーザ光源装置、ビーム走査装置、および、ビーム走査方法 |
| CN104410805A (zh) * | 2014-11-18 | 2015-03-11 | 苏州佳世达光电有限公司 | 一种投影装置自动校正方法和使用该方法的投影装置 |
| CN104601917B (zh) * | 2015-01-26 | 2018-07-27 | 苏州佳世达光电有限公司 | 一种护眼投影方法及投影装置 |
| CN104605824B (zh) * | 2015-02-10 | 2017-08-08 | 安徽信美医学工程科技有限公司 | 一种病变部位显像投影导航装置 |
| US10848242B2 (en) * | 2015-02-10 | 2020-11-24 | Brightcodes Technologies Ltd. | System and method for providing optically coding of information combining color and luminosity |
| JP6295981B2 (ja) * | 2015-02-25 | 2018-03-20 | 株式会社Jvcケンウッド | 画像描画装置、ヘッドアップディスプレイ及び画像輝度調整方法 |
| CN105204169A (zh) * | 2015-09-29 | 2015-12-30 | 北京为世联合科技有限公司 | 自发光激光匀光管 |
| CN105186288A (zh) * | 2015-09-29 | 2015-12-23 | 北京为世联合科技有限公司 | 白光半导体激光器阵列 |
| CN105186289A (zh) * | 2015-09-29 | 2015-12-23 | 北京为世联合科技有限公司 | 集成激光芯片 |
| JP6569440B2 (ja) * | 2015-09-30 | 2019-09-04 | 株式会社Jvcケンウッド | 検出方法、検出装置および投射装置 |
| JP6700731B2 (ja) * | 2015-11-13 | 2020-05-27 | キヤノン株式会社 | 投影装置および投影システム |
| WO2017085802A1 (ja) * | 2015-11-18 | 2017-05-26 | 日立マクセル株式会社 | 画像投射装置 |
| EP3273681B1 (en) * | 2016-06-08 | 2020-06-17 | Panasonic Intellectual Property Management Co., Ltd. | Projection system |
| JP6705600B2 (ja) * | 2016-10-18 | 2020-06-03 | Necディスプレイソリューションズ株式会社 | プロジェクター及び画像表示方法 |
| US9992464B1 (en) * | 2016-11-11 | 2018-06-05 | Christie Digital Systems Usa, Inc. | Method and system for screen correction |
| CN108071989A (zh) * | 2016-11-16 | 2018-05-25 | 艾丽西亚(天津)文化交流有限公司 | 一种具有摄像功能的led会议展示控制器 |
| US10527916B2 (en) | 2016-12-01 | 2020-01-07 | Coretronic Corporation | Light source module and projection device including the same |
| CN108132576B (zh) | 2016-12-01 | 2021-04-23 | 中强光电股份有限公司 | 光源模块、投影装置及其驱动方法 |
| CN106773073A (zh) * | 2017-01-10 | 2017-05-31 | 中国科学院半导体研究所 | 三基色激光器实现均光照明的系统 |
| CN106791747A (zh) * | 2017-01-25 | 2017-05-31 | 触景无限科技(北京)有限公司 | 台灯互动展示的分时处理方法、装置以及台灯 |
| US10437073B2 (en) * | 2017-01-25 | 2019-10-08 | North Inc. | Systems, devices, and methods for beam combining in laser projectors |
| WO2018159287A1 (ja) * | 2017-02-28 | 2018-09-07 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
| CN109100907A (zh) * | 2017-06-20 | 2018-12-28 | 扬明光学股份有限公司 | 发光系统 |
| US20190121135A1 (en) | 2017-10-23 | 2019-04-25 | North Inc. | Free space multiple laser diode modules |
| JP6834937B2 (ja) * | 2017-12-27 | 2021-02-24 | 株式会社Jvcケンウッド | プロジェクタシステム及びカメラ評価システム |
| CN109194953B (zh) * | 2018-08-15 | 2021-03-02 | 瑞声光学解决方案私人有限公司 | 空间颜色和分辨率测量装置及测量方法 |
| TWI703509B (zh) * | 2018-12-13 | 2020-09-01 | 致茂電子股份有限公司 | 光學檢測裝置以及校正方法 |
| US11233980B2 (en) * | 2019-06-13 | 2022-01-25 | Microsoft Technology Licensing, LLC | Monitoring and correction system for improved laser display systems |
| CN110441915B (zh) * | 2019-07-28 | 2024-05-28 | 北京龙翼风科技有限公司 | 基于矩形针孔阵列的集成成像3d显示装置 |
| CN110418124B (zh) * | 2019-08-05 | 2021-11-30 | 歌尔光学科技有限公司 | 投影图像检测方法、装置、设备及计算机可读存储介质 |
| CN113497923B (zh) * | 2020-03-18 | 2023-10-31 | 中强光电股份有限公司 | 投影系统以及用于投影系统的定位方法 |
| CN111562714B (zh) * | 2020-05-22 | 2021-09-21 | 复旦大学 | 一种基于多波长led光源的多光谱调制成像装置 |
| CN112235554B (zh) * | 2020-10-15 | 2022-05-13 | 上海交通大学医学院附属第九人民医院 | 一种实现投影仪和摄像机软件同步的方法 |
| EP3985433A1 (en) * | 2020-10-16 | 2022-04-20 | Barco N.V. | Light source device and image projection device having a light source device |
| CN114222100A (zh) * | 2021-12-20 | 2022-03-22 | 青岛海信激光显示股份有限公司 | 投影图像的校正方法及激光投影设备 |
| CN116068759A (zh) * | 2021-10-29 | 2023-05-05 | 华为技术有限公司 | 一种光学显示装置、显示系统、交通工具及色彩调节方法 |
| CN113974832B (zh) * | 2021-12-01 | 2023-01-24 | 辽宁北镜医疗科技有限公司 | 具有投影导航功能的近红外荧光手术导航系统及方法 |
| CN114052909B (zh) * | 2021-12-01 | 2025-03-14 | 辽宁北镜医疗科技有限公司 | 一种多功能近红外荧光术中导航系统及导航方法 |
| CN116634126B (zh) * | 2023-05-10 | 2024-08-30 | 深圳市石代科技集团有限公司 | 基于宴会厅的5d全息投影控制系统 |
| WO2025039705A1 (zh) * | 2023-08-21 | 2025-02-27 | 青岛海信激光显示股份有限公司 | 激光投影设备及其图像显示方法 |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4684996A (en) | 1986-08-25 | 1987-08-04 | Eastman Kodak Company | Video projector with optical feedback |
| JP2510353B2 (ja) * | 1990-11-07 | 1996-06-26 | 村田機械株式会社 | 画像デ―タのスム―ジング処理方法 |
| JPH11205648A (ja) * | 1998-01-09 | 1999-07-30 | Olympus Optical Co Ltd | 画像合成装置 |
| US7006707B2 (en) * | 2001-05-03 | 2006-02-28 | Adobe Systems Incorporated | Projecting images onto a surface |
| JP2003029201A (ja) | 2001-07-11 | 2003-01-29 | Canon Inc | 画像投射装置及び画像補正方法 |
| JP3871061B2 (ja) * | 2003-03-25 | 2007-01-24 | セイコーエプソン株式会社 | 画像処理システム、プロジェクタ、プログラム、情報記憶媒体および画像処理方法 |
| JP4371749B2 (ja) | 2003-09-19 | 2009-11-25 | Necディスプレイソリューションズ株式会社 | プロジェクタおよびそのテストパターン検出方法 |
| CN100596206C (zh) | 2003-12-10 | 2010-03-24 | 日本电气株式会社 | 投影仪颜色校正方法 |
| JP3888465B2 (ja) * | 2004-05-26 | 2007-03-07 | セイコーエプソン株式会社 | 画像処理システム、プロジェクタおよび画像処理方法 |
| JP2006109380A (ja) | 2004-10-08 | 2006-04-20 | Sharp Corp | 投射画像色調整方法及びプロジェクタ |
| JP4854579B2 (ja) * | 2007-04-20 | 2012-01-18 | 三洋電機株式会社 | ぶれ補正装置及びぶれ補正方法ならびにぶれ補正装置を備えた電子機器や、画像ファイル及び画像ファイル作成装置 |
- 2009
  - 2009-05-20 US US12/669,874 patent/US8235534B2/en not_active Expired - Fee Related
  - 2009-05-20 WO PCT/JP2009/002232 patent/WO2009142015A1/ja not_active Ceased
  - 2009-05-20 JP JP2010512944A patent/JP5431312B2/ja not_active Expired - Fee Related
  - 2009-05-20 CN CN200980000576.5A patent/CN101755300B/zh not_active Expired - Fee Related
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06110403A (ja) * | 1992-09-30 | 1994-04-22 | Hitachi Ltd | 表示装置および表示パネル |
| JPH07162790A (ja) * | 1993-12-13 | 1995-06-23 | Mitsubishi Electric Corp | 投写型表示装置および投写画像改善方法 |
| JPH10161255A (ja) * | 1996-12-02 | 1998-06-19 | Hitachi Ltd | 投写形液晶表示装置および液晶パネル |
| JP2003076494A (ja) * | 2001-08-20 | 2003-03-14 | Lg Electronics Inc | プロジェクタの座標入力装置 |
| WO2005015904A1 (en) * | 2003-08-06 | 2005-02-17 | Mitsubishi Denki Kabushiki Kaisha | Method and system for determining correspondence between locations on display surface having arbitrary shape and pixels in output image of projector |
| JP2005204043A (ja) * | 2004-01-15 | 2005-07-28 | Seiko Epson Corp | 画像表示装置、画像表示装置の情報取得機器および画像表示装置の情報取得・設定方法ならびに画像表示装置の設定情報が埋め込まれたコンテンツ |
| JP2007065099A (ja) * | 2005-08-30 | 2007-03-15 | Fuji Electric Holdings Co Ltd | 映像表示システムおよびその表示調整方法 |
| JP2007295375A (ja) * | 2006-04-26 | 2007-11-08 | Nippon Telegr & Teleph Corp <Ntt> | 投影映像補正装置及び投影映像補正プログラム |
| WO2009028262A1 (ja) * | 2007-08-24 | 2009-03-05 | Nec Corporation | 画像表示装置及び画像表示方法 |
Cited By (59)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102104760B (zh) * | 2009-12-21 | 2014-09-03 | 佳能株式会社 | 显示设备及其控制方法 |
| CN102104760A (zh) * | 2009-12-21 | 2011-06-22 | 佳能株式会社 | 显示设备及其控制方法 |
| JP2011128564A (ja) * | 2009-12-21 | 2011-06-30 | Canon Inc | 投射装置、プログラム、及び投射装置の制御方法 |
| US8669997B2 (en) | 2009-12-21 | 2014-03-11 | Canon Kabushiki Kaisha | Display apparatus and method of controlling the same |
| WO2011102360A1 (ja) * | 2010-02-18 | 2011-08-25 | コニカミノルタオプト株式会社 | 画像投影機能を備えた電子機器 |
| EP2378394A3 (en) * | 2010-04-15 | 2015-03-25 | Electronics and Telecommunications Research Institute | User interface device and method for recognizing user interaction using same |
| JP2011228457A (ja) * | 2010-04-19 | 2011-11-10 | Mitsubishi Electric Corp | 画像表示装置およびレーザ光源装置 |
| US8801193B2 (en) | 2010-04-19 | 2014-08-12 | Mitsubishi Electric Corporation | Image display device and laser light source device including multiple laser elements whose light amounts are individually measured |
| JP2012042878A (ja) * | 2010-08-23 | 2012-03-01 | Mitsumi Electric Co Ltd | 光走査装置 |
| JP2012058581A (ja) * | 2010-09-10 | 2012-03-22 | Sharp Corp | 電子機器 |
| US9772547B2 (en) | 2010-11-09 | 2017-09-26 | Seiko Epson Corporation | Projector |
| CN102096293A (zh) * | 2011-01-30 | 2011-06-15 | 河南科技大学 | 三片式lcos激光投影显示用光学引擎 |
| CN102096293B (zh) * | 2011-01-30 | 2012-06-27 | 河南科技大学 | 三片式lcos激光投影显示用光学引擎 |
| JP2014509759A (ja) * | 2011-03-02 | 2014-04-21 | マイクロソフト コーポレーション | 没入型ディスプレイエクスペリエンス |
| US9597587B2 (en) | 2011-06-08 | 2017-03-21 | Microsoft Technology Licensing, Llc | Locational node device |
| JP2013122662A (ja) * | 2011-12-09 | 2013-06-20 | Sharp Corp | 表示システム、較正方法、コンピュータプログラム、及び記録媒体 |
| JP2013122663A (ja) * | 2011-12-09 | 2013-06-20 | Sharp Corp | 表示システム、較正方法、コンピュータプログラム、及び記録媒体 |
| US10018833B2 (en) | 2012-02-03 | 2018-07-10 | Funai Electric Co., Ltd. | MEMS device and electronic device having projector function |
| JP2013160887A (ja) * | 2012-02-03 | 2013-08-19 | Funai Electric Co Ltd | Memsデバイスおよびプロジェクタ機能を有する電子機器 |
| JP2013164528A (ja) * | 2012-02-13 | 2013-08-22 | Seiko Epson Corp | プロジェクター及び投写画像調整方法 |
| JP2013182142A (ja) * | 2012-03-02 | 2013-09-12 | Mitsubishi Electric Corp | マルチ画面表示装置 |
| JP2014059522A (ja) * | 2012-09-19 | 2014-04-03 | Funai Electric Co Ltd | 画像表示装置 |
| JP2014130200A (ja) * | 2012-12-28 | 2014-07-10 | Asahi Glass Co Ltd | 投影装置 |
| US9939561B2 (en) | 2012-12-28 | 2018-04-10 | Asahi Glass Company, Limited | Projector having diffuser |
| JPWO2014115400A1 (ja) * | 2013-01-22 | 2017-01-26 | ソニー株式会社 | 投影型画像表示装置、画像処理装置及び画像処理方法、並びにコンピューター・プログラム |
| JP2014150505A (ja) * | 2013-02-04 | 2014-08-21 | Mitsubishi Electric Corp | 映像信号処理装置、映像表示方法及び映像表示装置 |
| JP2014235295A (ja) * | 2013-05-31 | 2014-12-15 | 株式会社Jvcケンウッド | マルチプロジェクタシステム、投射装置、調整装置および画像調整方法、ならびに、画像調整プログラム |
| KR101530369B1 (ko) * | 2013-07-24 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| KR101530370B1 (ko) * | 2013-08-14 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| JP2015064550A (ja) * | 2013-08-26 | 2015-04-09 | ソニー株式会社 | 投射型表示装置 |
| US9696854B2 (en) | 2013-08-26 | 2017-07-04 | Sony Corporation | Projection display |
| KR101530371B1 (ko) * | 2013-08-27 | 2015-06-23 | (주)아이엠 | 피코 프로젝터 |
| JP2015096880A (ja) * | 2013-11-15 | 2015-05-21 | キヤノン株式会社 | 投射型画像表示装置及びその制御方法 |
| JP2015112953A (ja) * | 2013-12-10 | 2015-06-22 | 株式会社デンソー | 車両用投影装置 |
| KR101623469B1 (ko) * | 2014-11-17 | 2016-05-23 | 재단법인대구경북과학기술원 | 레이저 광원을 이용한 야간 장애물 검출 장치 및 방법 |
| JP2015179278A (ja) * | 2015-05-01 | 2015-10-08 | セイコーエプソン株式会社 | プロジェクター及びその制御方法 |
| US10244216B2 (en) | 2015-10-27 | 2019-03-26 | Maxell, Ltd. | Projector, video display device, and video display method |
| WO2017072842A1 (ja) * | 2015-10-27 | 2017-05-04 | 日立マクセル株式会社 | プロジェクタ、映像表示装置、及び映像表示方法 |
| JPWO2017072842A1 (ja) * | 2015-10-27 | 2018-07-26 | マクセル株式会社 | プロジェクタ、映像表示装置、及び映像表示方法 |
| JP7074052B2 (ja) | 2016-03-11 | 2022-05-24 | ソニーグループ株式会社 | 画像処理装置および方法 |
| JPWO2017154628A1 (ja) * | 2016-03-11 | 2019-02-07 | ソニー株式会社 | 画像処理装置および方法 |
| JP2017170982A (ja) * | 2016-03-22 | 2017-09-28 | 日本電気株式会社 | 無人飛行装置制御システム、無人飛行装置制御方法および画像投影装置 |
| JP7003913B2 (ja) | 2016-04-27 | 2022-01-21 | ソニーグループ株式会社 | 画像投影装置、投影撮像システムおよび補正方法 |
| JPWO2017187842A1 (ja) * | 2016-04-27 | 2019-02-28 | ソニー株式会社 | 画像投影装置、投影撮像システムおよび補正方法 |
| WO2017187842A1 (ja) * | 2016-04-27 | 2017-11-02 | ソニー株式会社 | 画像投影装置、投影撮像システムおよび補正方法 |
| US10574952B2 (en) | 2016-04-27 | 2020-02-25 | Sony Corporation | Image projection apparatus, projection imaging system, and correction method |
| JP2024150737A (ja) * | 2019-08-29 | 2024-10-23 | 国立大学法人東北大学 | 投影システム、投影システム制御装置及び投影方法 |
| JPWO2021039977A1 (ja) * | 2019-08-29 | 2021-03-04 | ||
| WO2021039977A1 (ja) * | 2019-08-29 | 2021-03-04 | 国立大学法人東北大学 | 投影システム、投影システム制御装置、投影方法及びプログラム |
| JP7761299B2 (ja) | 2019-08-29 | 2025-10-28 | 国立大学法人東北大学 | 投影システム、投影システム制御装置及び投影方法 |
| JP7599717B2 (ja) | 2019-08-29 | 2024-12-16 | 国立大学法人東北大学 | 投影システム、投影システム制御装置、投影方法及びプログラム |
| US11838696B2 (en) | 2019-08-29 | 2023-12-05 | Tohoku University | Projection system, projection system control device, projection method, and program |
| JP2020169021A (ja) * | 2020-06-19 | 2020-10-15 | 日本電気株式会社 | 無人飛行装置制御システム、無人飛行装置制御方法および画像投影装置 |
| JP7031698B2 (ja) | 2020-06-19 | 2022-03-08 | 日本電気株式会社 | 無人飛行装置制御システムおよび無人飛行装置制御方法 |
| JP2024505088A (ja) * | 2021-02-01 | 2024-02-02 | ドルビー ラボラトリーズ ライセンシング コーポレイション | 動的ターゲット・ジオメトリを有する投影システムおよび方法 |
| JP2022138754A (ja) * | 2021-03-11 | 2022-09-26 | カシオ計算機株式会社 | 結像媒体変更装置、空間投影装置、空間投影システム及び空間投影方法 |
| JP2022138755A (ja) * | 2021-03-11 | 2022-09-26 | カシオ計算機株式会社 | 投影制御装置、空間投影装置、空間投影システム及び空間投影方法 |
| WO2022230884A1 (ja) * | 2021-04-27 | 2022-11-03 | 日本精機株式会社 | 表示装置 |
| US12003897B2 (en) | 2021-05-14 | 2024-06-04 | Panasonic Intellectual Property Management Co., Ltd. | Projection display apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| CN101755300B (zh) | 2014-02-05 |
| JP5431312B2 (ja) | 2014-03-05 |
| JPWO2009142015A1 (ja) | 2011-09-29 |
| US20100201894A1 (en) | 2010-08-12 |
| CN101755300A (zh) | 2010-06-23 |
| US8235534B2 (en) | 2012-08-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5431312B2 (ja) | プロジェクタ | |
| US10051209B2 (en) | Combined visible and non-visible projection system | |
| JP4898121B2 (ja) | 画像投影装置 | |
| US8556431B2 (en) | Apparatus and method for reducing speckle in display of images | |
| CN109240028B (zh) | 投影型影像显示装置 | |
| US12130547B2 (en) | Image display apparatus | |
| US20130194644A1 (en) | Image display systems | |
| US20120200832A1 (en) | Image projection device, image protection method, distance measuring device and distance measuring method | |
| US11950026B2 (en) | Methods and apparatus employing angular and spatial modulation of light | |
| CN102472956A (zh) | 通过改变扫描幅度来修正扫描的投影仪失真 | |
| CN102759847A (zh) | 扫描型图像显示装置 | |
| KR101185297B1 (ko) | 영상 프로젝터 | |
| JP2012018214A (ja) | 投写型映像表示装置 | |
| JP2016180979A (ja) | 投写型表示装置 | |
| JP2020187165A (ja) | 画像投射装置 | |
| JP5633570B2 (ja) | レーザ投射装置および画像投影システム | |
| JP2010128414A (ja) | 画像表示装置 | |
| JP6969606B2 (ja) | 検出機能付きプロジェクタ | |
| CN112104793A (zh) | 基于光同步的激光电视投影装置 | |
| KR101713337B1 (ko) | 레이저 디스플레이 장치 | |
| JP2014059522A (ja) | 画像表示装置 | |
| WO2022181156A1 (ja) | 投射システム | |
| JP2011174994A (ja) | 画像表示装置 | |
| JP2006267161A (ja) | プロジェクタ |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980000576.5; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010512944; Country of ref document: JP |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09750377; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 12669874; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 09750377; Country of ref document: EP; Kind code of ref document: A1 |