WO2024219280A1 - Imaging Device and Information Processing Method - Google Patents
- Publication number
- WO2024219280A1 (PCT/JP2024/014328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- optical element
- light receiving
- light
- receiving area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B19/00—Cameras
- G03B19/02—Still-picture cameras
- G03B19/04—Roll-film cameras
- G03B19/06—Roll-film cameras adapted to be loaded with more than one film, e.g. with exposure of one or the other at will
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- This disclosure relates to an imaging device and an information processing method.
- A system is known that can capture both an image for distance calculation and an image for display with a single imaging device (see, for example, Patent Document 1).
- An imaging device includes: a first optical element that forms a first image on a first light receiving area of an imaging element based on light coming from a subject; a second optical element that forms a second image of the light coming from the subject on a second light receiving area of the imaging element adjacent to the first light receiving area; a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area and causes the light to travel toward the inside of the first light receiving area; and a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area and causes the light to travel toward the inside of the second light receiving area.
- The imaging device further includes a controller that performs a conversion process, including at least inversion and combination, on the superimposed image of the first light receiving area and the superimposed image of the second light receiving area, and that calculates the parallax of the subject by comparing the superimposed images after the conversion process.
- The imaging device may include at least one of a fifth optical element and a sixth optical element. The fifth optical element is disposed opposite the third optical element and blocks or reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area, causing the light to travel toward the inside of the first light receiving area.
- The sixth optical element is disposed opposite the fourth optical element and blocks or reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area, causing the light to travel toward the inside of the second light receiving area.
- The controller separates, based on the calculated direction of parallax, a subject image captured only through the first optical element or the second optical element from a subject image also captured through optical elements other than the first optical element and the second optical element.
- An information processing method is executed by an imaging device including: a first optical element that forms a first image of light coming from a subject on a first light receiving area of an imaging element; a second optical element that forms a second image of the light coming from the subject on a second light receiving area of the imaging element adjacent to the first light receiving area; a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area and causes the light to travel toward the inside of the first light receiving area; a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area and causes the light to travel toward the inside of the second light receiving area; and a controller. The controller performs a conversion process including at least inversion and combination on the superimposed image of the first light receiving area and the superimposed image of the second light receiving area, and calculates the parallax of the subject by comparing the superimposed images after the conversion process.
- FIG. 1 is a side view illustrating a schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 2 is a diagram for explaining a first image and a second image of the imaging device of FIG. 1.
- FIG. 3 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 1.
- FIG. 4 is a diagram for explaining the conversion process executed by the imaging device of FIG. 1.
- FIG. 5 is a diagram showing an example of the converted superimposed image.
- FIG. 6 is a flowchart showing an example of the information processing method executed by the imaging apparatus.
- FIG. 7 is a side view illustrating another schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 8 is a diagram for explaining the first and second images of the imaging device of FIG. 7.
- FIG. 9 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 7.
- FIG. 10 is a diagram for explaining the conversion process executed by the imaging device of FIG. 7.
- FIG. 11 is a side view illustrating another schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 12 is a diagram for explaining the first image and the second image of the imaging device of FIG. 11.
- FIG. 13 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 11.
- FIG. 14 is a diagram for explaining the conversion process executed by the imaging device of FIG. 11.
- FIG. 15 is a side view illustrating another schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 16 is a diagram for explaining the first image and the second image of the imaging device of FIG. 15.
- FIG. 17 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 15.
- FIG. 18 is a diagram for explaining the conversion process executed by the imaging device of FIG. 15.
- FIG. 19 is a side view illustrating another schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 20 is a diagram for explaining the first image and the second image of the imaging device of FIG. 19.
- FIG. 21 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 19.
- FIG. 22 is a diagram for explaining the conversion process executed by the imaging device of FIG. 19.
- FIG. 23 is a side view illustrating another schematic configuration example of an imaging device according to an embodiment of the present disclosure.
- FIG. 24 is a diagram for explaining the first image and the second image of the imaging device of FIG. 23.
- FIG. 25 is a diagram for explaining a superimposed image generated by the imaging device of FIG. 23.
- FIG. 26 is a diagram for explaining the conversion process executed by the imaging device of FIG. 23.
- FIG. 27 is a diagram showing a modification of the mirror of the imaging device according to an embodiment of the present disclosure.
- Hereinafter, an imaging device 1 (see FIG. 1) and an information processing method according to one embodiment of the present disclosure will be described with reference to the drawings.
- In the drawings, the same components are given the same reference numerals.
- The figures explaining the embodiment are schematic.
- The dimensional ratios in the drawings do not necessarily match those in reality.
- The imaging device 1 captures a parallax image of a subject 40 (see FIG. 1). For example, the distance to each point on the subject 40 that is the target of distance measurement can be calculated based on the parallax image of the subject 40. In distance measurement based on a parallax image, the greater the baseline length, the higher the resolution and accuracy of the distance data. The baseline length corresponds to the distance between the devices that capture the two images that make up the parallax image.
- A stereo camera performs triangulation using two cameras arranged in parallel.
- The distance between the two cameras corresponds to the baseline length. Therefore, the resolution and accuracy of the distance data can be improved by increasing the baseline length of the stereo camera.
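The relation behind these statements can be sketched as follows (a minimal illustration; the focal length, baseline, and disparity values below are hypothetical, not taken from the patent). For a parallel stereo pair, triangulation gives the distance Z = f·B/d, so for a fixed matching error of one pixel, a longer baseline B yields a smaller distance error:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulation for a parallel stereo pair: Z = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- baseline length (distance between the two viewpoints) in metres
    disparity_px -- horizontal shift of the same subject point between the
                    two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# For the same 10 m target, doubling the baseline doubles the disparity,
# so a fixed 1-pixel matching error changes the distance estimate less:
# this is why a longer baseline improves resolution and accuracy.
z = depth_from_disparity(focal_px=1000.0, baseline_m=0.1, disparity_px=10.0)
err_short = z - depth_from_disparity(1000.0, 0.1, 11.0)  # 1-px error, B = 0.1 m
err_long = z - depth_from_disparity(1000.0, 0.2, 21.0)   # 1-px error, B = 0.2 m
```

Comparing `err_short` and `err_long` shows the distance error shrinking as the baseline grows, which is the trade-off the following paragraphs discuss.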
- However, increasing the baseline length of a stereo camera requires placing the two cameras farther apart, which increases the size of the device.
- Alternatively, increasing the focal length can be considered as a way to improve the resolution and accuracy of the distance data.
- However, increasing the focal length makes it difficult to widen the angle of view of the captured image. In other words, a stereo camera of a conventional configuration cannot easily achieve both a wide angle of view and high resolution and accuracy of the distance data.
- The imaging device 1 according to this embodiment can achieve a wider angle of view for captured images and improved resolution and accuracy of distance data without increasing the size of the device.
- Thanks to the configuration described below, the imaging device 1 according to this embodiment can obtain images that are wide-angle yet have the resolution of a long-focal-length optical system.
- A specific configuration example of the imaging device 1 is described below.
- The imaging device 1 includes a first optical system 10, a second optical system 20, and a controller 14.
- The imaging device 1 may further include an imaging element 30.
- An optical system is also called an optical device.
- The imaging device 1 forms an image of the subject 40 with each of the first optical system 10 and the second optical system 20 and captures both images with the imaging element 30, so that the image formed by the first optical system 10 and the image formed by the second optical system 20 can be captured as a parallax image.
- The image formed by the first optical system 10 is also called a first image 41.
- The image formed by the second optical system 20 is also called a second image 42.
- FIG. 2 is a diagram for explaining the first image 41 and the second image 42 of the imaging device 1 of FIG. 1. For the purpose of explanation, FIG. 2 shows the first optical system 10 and the second optical system 20 virtually separated, with light virtually traveling in a straight line to form a subject image at the position of the imaging element 30.
- The image obtained by capturing the first image 41 is also referred to as the first image.
- The image obtained by capturing the second image 42 is also referred to as the second image.
- The parallax image is composed of the first image and the second image.
- The imaging element 30 has a light receiving area 30A.
- The imaging element 30 captures light incident on the light receiving area 30A.
- The light receiving area 30A is also referred to as an imaging area.
- The imaging element 30 may be capable of capturing an image formed by visible light, or by invisible light such as infrared light or ultraviolet light.
- The imaging element 30 may be configured to include, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
- The imaging element 30 may be a color image sensor.
- The light receiving area 30A of the imaging element 30 may have a plurality of pixels.
- The imaging element 30 generates an image signal based on the electrical signal output from each pixel according to the amount of light incident on that pixel.
- The imaging element 30 may generate image signals at a predetermined frame rate, such as 30 fps (frames per second).
- The light receiving area 30A includes a first light receiving area 31 and a second light receiving area 32.
- The first light receiving area 31 and the second light receiving area 32 do not overlap with each other on the light receiving area 30A.
- The area of the first light receiving area 31 and the area of the second light receiving area 32 may be the same.
- The first light receiving area 31 and the second light receiving area 32 may be adjacent to each other.
- The first light receiving area 31 and the second light receiving area 32 are distinguished for convenience of explanation; in the actual light receiving area 30A, they do not have to be configured to be distinguishable.
- The first optical system 10 includes a first optical element 11.
- The first optical element 11 has an optical axis 11A.
- The first optical element 11 forms an image of the light or light beam coming from the subject 40 on the first light receiving area 31 of the light receiving area 30A of the imaging element 30.
- The first optical element 11 may be configured to include at least one lens.
- The first optical element 11 may be configured to include various lenses, such as a convex lens or a concave lens.
- The first optical element 11 may be configured to include various mirrors, such as a convex mirror, a concave mirror, or a plane mirror.
- The first optical element 11 may be configured to include a diaphragm.
- The first optical element 11 is not limited to these, and may be configured to include various other elements.
- The first optical element 11 forms an image of the subject 40 as a first image 41 in the light receiving area 30A.
- The first optical system 10 does not have to be image-side telecentric. In other words, the angle of the chief ray of any light beam passing through the first optical system 10 with respect to the optical axis 11A of the first optical element 11 may be greater than 0 degrees.
- Alternatively, the first optical system 10 may be image-side telecentric.
- The second optical system 20 includes a second optical element 21.
- The second optical element 21 has an optical axis 21A.
- The second optical element 21 forms an image of the light or light beam coming from the subject 40 on the second light receiving area 32 of the light receiving area 30A of the imaging element 30.
- The second optical element 21 may be configured to include at least one lens.
- The second optical element 21 may be configured to include various lenses, such as a convex lens or a concave lens.
- The second optical element 21 may be configured to include various mirrors, such as a convex mirror, a concave mirror, or a plane mirror.
- The second optical element 21 may be configured to include a diaphragm.
- The second optical element 21 is not limited to these, and may be configured to include various other elements.
- The second optical element 21 forms an image of the subject 40 as a second image 42 in the light receiving area 30A.
- The second optical system 20 does not have to be image-side telecentric. In other words, the angle of the chief ray of any light beam passing through the second optical system 20 with respect to the optical axis 21A of the second optical element 21 may be greater than 0 degrees.
- Alternatively, the second optical system 20 may be image-side telecentric.
- The first optical system 10 further includes a third optical element 12.
- The third optical element 12 is configured as a mirror whose surface closer to the first light receiving area 31 serves as the reflective surface. In the configuration example of FIG. 1, the third optical element 12 is a plane mirror.
- The third optical element 12 reflects at least a portion of the light traveling from the first optical element 11 toward the outside of the first light receiving area 31 and causes it to travel toward the inside of the first light receiving area 31.
- The light rays reflected by the third optical element 12 form an image inside the first light receiving area 31.
- The image formed by the light rays reflected by the third optical element 12 corresponds to the image that would have formed outside the first light receiving area 31 (L1 in FIG. 2) folded back to the inside of the first light receiving area 31.
- The image folded back to the inside of the first light receiving area 31 is represented as [L1], with the brackets indicating the folding (inversion).
- The second optical system 20 further includes a fourth optical element 22.
- The fourth optical element 22 is configured as a mirror whose surface closer to the second light receiving area 32 serves as the reflective surface. In the configuration example of FIG. 1, the fourth optical element 22 is a plane mirror.
- The fourth optical element 22 reflects at least a portion of the light traveling from the second optical element 21 toward the outside of the second light receiving area 32 and causes it to travel toward the inside of the second light receiving area 32.
- The light rays reflected by the fourth optical element 22 form an image inside the second light receiving area 32.
- The image formed by the light rays reflected by the fourth optical element 22 corresponds to the image that would have formed outside the second light receiving area 32 (R4 in FIG. 2) folded back to the inside of the second light receiving area 32.
- The image folded back to the inside of the second light receiving area 32 is represented as [R4], with the brackets indicating the folding (inversion).
- The imaging device 1 may include at least one of a fifth optical element 13 and a sixth optical element 23.
- In the configuration example of FIG. 1, the imaging device 1 includes the fifth optical element 13 and the sixth optical element 23.
- The fifth optical element 13 and the sixth optical element 23 may be configured as mirrors having reflective surfaces.
- The fifth optical element 13 and the sixth optical element 23 may be configured as one unit, for example, as a single double-sided mirror.
- The fifth optical element 13 is configured as a mirror whose surface closer to the first light receiving area 31 serves as the reflecting surface.
- The fifth optical element 13 is a plane mirror.
- The fifth optical element 13 is disposed to face the third optical element 12, reflects at least a portion of the light traveling from the first optical element 11 toward the outside of the first light receiving area 31, and causes the light to travel toward the inside of the first light receiving area 31.
- The light reflected by the fifth optical element 13 forms an image inside the first light receiving area 31.
- The image formed by the light reflected by the fifth optical element 13 corresponds to the image that would have formed outside the first light receiving area 31 (L4 in FIG. 2) folded back to the inside of the first light receiving area 31.
- The image folded back to the inside of the first light receiving area 31 is represented as [L4], with the brackets indicating the folding (inversion).
- The sixth optical element 23 is configured as a mirror whose surface closer to the second light receiving area 32 serves as the reflecting surface.
- The sixth optical element 23 is a plane mirror.
- The sixth optical element 23 is disposed to face the fourth optical element 22, reflects at least a portion of the light traveling from the second optical element 21 toward the outside of the second light receiving area 32, and causes the light to travel toward the inside of the second light receiving area 32.
- The light reflected by the sixth optical element 23 forms an image inside the second light receiving area 32.
- The image formed by the light reflected by the sixth optical element 23 corresponds to the image that would have formed outside the second light receiving area 32 (R1 in FIG. 2) folded back to the inside of the second light receiving area 32.
- The image folded back to the inside of the second light receiving area 32 is represented as [R1], with the brackets indicating the folding (inversion).
- FIG. 3 is a diagram for explaining a superimposed image 50 generated by the imaging device 1 of FIG. 1.
- The captured image includes a superimposed image 50L, in which an image formed on the first light receiving area 31 via other optical elements is superimposed on an image formed on the first light receiving area 31 by light that comes from the subject 40 through the first optical element 11 without passing through any other optical element.
- The captured image also includes a superimposed image 50R, in which an image formed on the second light receiving area 32 via other optical elements is superimposed on an image formed on the second light receiving area 32 by light that comes from the subject 40 through the second optical element 21 without passing through any other optical element.
- When the superimposed image 50L and the superimposed image 50R are not distinguished from each other, they may be collectively referred to as the superimposed image 50.
- The first image 41 includes images L1, L2, L3, and L4.
- L1 is folded back by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L1].
- [L1] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- L4 is folded back by the fifth optical element 13 and formed in the first light receiving area 31 as an inverted image [L4].
- [L4] is superimposed on L3, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- The imaging element 30 generates a superimposed image 50L including the images [L1], L2, L3, and [L4].
- The second image 42 includes images R1, R2, R3, and R4.
- R4 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R4].
- [R4] is superimposed on R3, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- R1 is folded back by the sixth optical element 23 and formed in the second light receiving area 32 as an inverted image [R1].
- [R1] is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- The imaging element 30 generates a superimposed image 50R including the images [R1], R2, R3, and [R4].
- In FIG. 1, the third optical element 12, the fourth optical element 22, the fifth optical element 13, and the sixth optical element 23 are arranged so as to fold the image in the left-right direction of the drawing, but they may instead be arranged so as to fold the image in the up-down direction of the drawing.
- The light receiving area 30A of the imaging element 30 captures the image that the first optical system 10 and the second optical system 20 form by focusing the light or light beams coming from the subject 40 onto the light receiving area 30A.
- The range over which the first optical system 10 and the second optical system 20 can focus light so that it is captured on the light receiving area 30A corresponds to the angle of view of the imaging device 1.
- The range over which an image is focused directly on the light receiving area 30A from the first optical element 11 and the second optical element 21, without passing through any optical element other than the first optical element 11 and the second optical element 21, is also referred to as the direct angle of view.
- The imaging device 1 can also capture light that enters the first optical system 10 and the second optical system 20 from outside the direct angle of view, because optical elements other than the first optical element 11 and the second optical element 21 redirect that light toward the light receiving area 30A to form an image.
- As a result, the angle of view can be widened without changing the focal length.
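The widening effect can be sketched numerically (the focal length and sensor dimensions below are hypothetical and serve only to illustrate the geometry). If the mirrors fold an image band of the same width back onto each side of the receiving area, as in FIG. 1, the effective image half-width doubles while the focal length stays fixed, so the half angle of view grows from atan(w/2f) to atan(w/f):

```python
import math

def half_angle_deg(image_half_width_mm: float, focal_mm: float) -> float:
    """Half angle of view subtended by an image of the given half-width
    formed at the given focal length (thin-lens approximation)."""
    return math.degrees(math.atan(image_half_width_mm / focal_mm))

# Hypothetical numbers: focal length 8 mm, receiving area 4 mm wide.
# Without the folding mirrors, only the 4 mm area is usable (half-width 2 mm).
direct = half_angle_deg(2.0, 8.0)
# With mirrors folding an equally wide band back onto each side of the area,
# the effective image half-width doubles at the same focal length.
folded = half_angle_deg(4.0, 8.0)
```

With these assumed numbers the half angle grows from about 14 degrees to about 26.6 degrees without any change in focal length, which is the trade-off the passage above describes.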
- The imaging device 1 includes a controller 14 that controls the imaging device 1.
- The controller 14 includes at least one processor, at least one dedicated circuit, or a combination thereof.
- The processor is a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a dedicated processor specialized for specific processing.
- The dedicated circuit may be, for example, an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
- The controller 14 performs information processing (image processing) on the captured image obtained from the imaging element 30.
- The controller 14 performs a conversion process including at least inversion and combination on the superimposed image 50L of the first light receiving area 31 and the superimposed image 50R of the second light receiving area 32, and calculates the parallax of the subject 40 by comparing the superimposed images 50 on which the conversion process has been performed.
- FIG. 4 is a diagram for explaining the conversion process performed by the imaging device 1 of FIG. 1.
- The controller 14 inverts (folds) the portion of the superimposed image 50L that includes [L1] in the outward direction and combines it.
- The controller 14 also inverts (folds) the portion of the superimposed image 50L that includes [L4] in the outward direction and combines it.
- The converted superimposed image 51L is the superimposed image 50L on which the conversion process has been performed.
- The converted superimposed image 51L includes the images L1, L2, L3, and L4 in the same arrangement as the first image 41.
- The converted superimposed image 51L is an image on which an inverted image is superimposed, but which includes the correct position information of the subject 40.
- The controller 14 likewise inverts (folds) the portion of the superimposed image 50R that includes [R1] in the outward direction and combines it.
- The controller 14 also inverts (folds) the portion of the superimposed image 50R that includes [R4] in the outward direction and combines it.
- The converted superimposed image 51R is the superimposed image 50R on which the conversion process has been performed.
- The converted superimposed image 51R includes the images R1, R2, R3, and R4 in the same arrangement as the second image 42.
- The converted superimposed image 51R is an image on which an inverted image is superimposed, but which includes the correct position information of the subject 40.
- When the converted superimposed image 51L and the converted superimposed image 51R are not distinguished from each other, they may be collectively referred to as the converted superimposed image 51.
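The inversion-and-combination step can be sketched with NumPy (a minimal model, not the patent's implementation: it assumes single-channel rows and the left-right folding of FIG. 1, with the segment layout [L1]|L2 and L3|[L4] described above):

```python
import numpy as np

def unfold(superimposed: np.ndarray) -> np.ndarray:
    """Conversion process: mirror each half of the captured superimposed
    image outward and concatenate, restoring the arrangement of the
    original (wider) image.

    The left half of `superimposed` holds the direct image L2 with the
    folded ghost [L1]; the right half holds L3 with [L4]. Mirroring the
    left half outward places L1 back at its correct position (with L2
    now appearing as the ghost), and likewise on the right.
    """
    h, w = superimposed.shape
    left, right = superimposed[:, : w // 2], superimposed[:, w // 2 :]
    return np.concatenate([left[:, ::-1], left, right, right[:, ::-1]], axis=1)

# Toy 1 x 8 "scene" row split into four 2-pixel segments L1|L2|L3|L4.
scene = np.array([[1.0, 0.0, 2.0, 0.0, 0.0, 3.0, 0.0, 4.0]])
seg = 2
L1, L2, L3, L4 = (scene[:, i * seg : (i + 1) * seg] for i in range(4))
# The mirrors fold L1 onto L2 and L4 onto L3, so the sensor records:
captured = np.concatenate([L2 + L1[:, ::-1], L3 + L4[:, ::-1]], axis=1)
converted = unfold(captured)  # same width and arrangement as `scene`
```

After unfolding, each segment's true content sits at its original position, with the neighbouring segment superimposed as an inverted ghost, matching the description of the converted superimposed image 51.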
- The controller 14 calculates the parallax by performing, for example, a known stereo matching process. In the stereo matching process, the controller 14 may use one of the converted superimposed image 51L and the converted superimposed image 51R as a reference image and the other as a comparison image. The controller 14 can calculate the difference (parallax) between the positions of corresponding parts in the reference image and the comparison image. The controller 14 can also use the calculated parallax to calculate distance information to the subject based on the principle of triangulation. Here, it is known which of the converted superimposed image 51L and the converted superimposed image 51R corresponds to the right-eye side and which to the left-eye side of the imaging device 1 configured as a stereo camera.
- Therefore, the controller 14 knows in advance in which direction the subject 40 will appear to move due to parallax when the reference image is compared with the comparison image.
- The converted superimposed image 51 contains an inverted image, but the direction in which the inverted image of the subject 40 moves due to parallax is opposite to the expected direction of parallax (the correct direction in which the image of the subject 40 moves). Therefore, by checking against the expected direction of parallax, the controller 14 can avoid erroneously calculating the parallax for the inverted image of the subject 40.
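This direction check can be sketched in one dimension (a hypothetical example, not the patent's implementation): the true subject shifts in the known parallax direction between the two converted images, while its inverted ghost shifts the opposite way, so the sign of the best-match offset distinguishes them.

```python
import numpy as np

def match_offset(reference: np.ndarray, comparison: np.ndarray, pos: int, win: int) -> int:
    """Offset at which reference[pos:pos+win] best matches `comparison`,
    found by minimising the sum of absolute differences (a 1-D version
    of block matching)."""
    patch = reference[pos : pos + win]
    best_off, best_err = 0, float("inf")
    for off in range(-pos, len(comparison) - win - pos + 1):
        err = float(np.abs(comparison[pos + off : pos + off + win] - patch).sum())
        if err < best_err:
            best_off, best_err = off, err
    return best_off

# Hypothetical signals: the subject peak shifts by +3 pixels between the
# two converted images; its inverted ghost shifts by -3.
ref = np.zeros(32); ref[10] = 1.0; ref[20] = 0.5    # subject at 10, ghost at 20
cmp_ = np.zeros(32); cmp_[13] = 1.0; cmp_[17] = 0.5  # subject +3, ghost -3

subject_off = match_offset(ref, cmp_, pos=9, win=3)  # moves in the expected direction
ghost_off = match_offset(ref, cmp_, pos=19, win=3)   # opposite sign: reject as ghost
```

Here a positive offset is the assumed (pre-known) parallax direction; a match whose offset has the opposite sign is attributed to the inverted image and excluded from the parallax calculation.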
- the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21.
- the controller 14 may separate the subject image by applying an image processing method such as an independent component analysis, a wavelet method, or an image separation model to the converted superimposed image 51L or the converted superimposed image 51R, that is, perform processing to remove the inverted subject image.
- the image separation model is, for example, a model constructed by creating an image in which multiple images are superimposed in advance and learning multiple correct images to be separated from the previously created image.
- the image separation model may be a model that applies Pix-to-Pix, which makes a generator that generates an image compete with a discriminator that determines whether the generated image is a fake image, and generates a pair image that reflects the relationship.
- controller 14 may also use the calculated direction of parallax (the correct direction in which the image of subject 40 moves) in the process of separating the subject images. As described above, by comparing transformed superimposed image 51L and transformed superimposed image 51R, the image of subject 40 moves in the direction of the calculated parallax, and the inverted image of subject 40 moves in the opposite direction. Therefore, controller 14 can identify the inverted image of subject 40 and remove it from transformed superimposed image 51 based on the calculated direction of parallax.
- the controller 14 may separate the superimposed components from the converted superimposed image 51 and output at least one of the first image 41 without superimposition and the second image 42 without superimposition.
- the controller 14 may calculate distance information to the target using the principle of triangulation as described above.
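The triangulation step can be written out directly: with focal length f (in pixels), baseline B between the two optical systems, and disparity d, the distance is Z = f·B/d. The numbers below are illustrative only, not values from the patent.

```python
def distance_mm(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Distance by triangulation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# e.g. f = 1000 px, B = 50 mm, d = 10 px  ->  Z = 5000 mm (5 m)
```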
- FIG. 6 is a flowchart showing an example of the information processing method executed by the imaging device 1.
- the controller 14 may execute the process shown in the flowchart of FIG. 6 to obtain a restored image from which the inverted image of the subject 40 has been removed.
- the controller 14 acquires the superimposed image 50L of the first light receiving area 31 and the superimposed image 50R of the second light receiving area 32 (step S1).
- the controller 14 executes a conversion process including at least inversion and combination as described above (step S2).
- the conversion process results in a converted superimposed image 51L and a converted superimposed image 51R.
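Step S2 can be sketched as an array operation, assuming the folded (mirror-inverted) content occupies a known strip of outer columns of the superimposed image; the strip is flipped back and re-attached on the outside, restoring the left-to-right arrangement of the subject. The `fold_width` parameter is an assumption for illustration, not something the patent specifies.

```python
import numpy as np

def convert_superimposed(superimposed: np.ndarray, fold_width: int) -> np.ndarray:
    """Inversion-and-combination sketch: unfold the mirror-inverted outer strip."""
    outer = superimposed[:, :fold_width]   # strip containing the inverted image
    unfolded = np.flip(outer, axis=1)      # undo the horizontal inversion
    # re-attach the unfolded strip on the outward side of the image
    return np.concatenate([unfolded, superimposed], axis=1)
```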
- the controller 14 calculates the parallax of the subject 40 by comparing the converted superimposed image 51L and the converted superimposed image 51R (step S3).
- a stereo matching process or the like may be executed.
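One common choice for this step is block matching by sum of absolute differences (SAD). The sketch below matches a single 1-D window along one scanline and is illustrative only; a real implementation would use 2-D windows and subpixel refinement.

```python
import numpy as np

def disparity_sad(left: np.ndarray, right: np.ndarray, x: int,
                  win: int = 3, max_d: int = 8) -> int:
    """Return the disparity d minimizing SAD between left[x:x+win]
    and right[x-d:x-d+win] along one scanline pair."""
    ref = left[x:x + win]
    best_d, best_cost = 0, np.inf
    for d in range(max_d + 1):
        if x - d < 0:
            break
        cost = float(np.abs(ref - right[x - d:x - d + win]).sum())
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```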
- the controller 14 separates the subject image (the image of the subject 40) based on the calculated direction of parallax to obtain a restored image (step S4).
- an image processing method such as an independent component analysis, a wavelet method, or an image separation model may be applied.
- the imaging device 1 does not need to include the fifth optical element 13 and the sixth optical element 23.
- a part of the first image 41 is formed in the first light receiving area 31.
- a part of the first image 41 is formed in a part of the second light receiving area 32.
- the first optical element 11 is configured to image the light or light beam coming from the subject 40 as the first image 41 in a range wider than the first light receiving area 31.
- a part of the second image 42 is formed in the second light receiving area 32.
- a part of the second image 42 is formed in a part of the first light receiving area 31.
- the second optical element 21 is configured to image the light or light beam coming from the subject 40 as the second image 42 in a range wider than the second light receiving area 32.
- the first image 41 includes images L1, L2, L3, and L4.
- L1 is folded back by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L1].
- [L1] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- R1, which is a part of the second image 42, is formed in the first light receiving area 31.
- R1 is superimposed on L3, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- the imaging element 30 generates a superimposed image 50L that includes the images [L1], L2, L3, and R1.
- the second image 42 includes images of R1, R2, R3, and R4.
- R4 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R4].
- [R4] is superimposed on R3, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- L4, which is a part of the first image 41, is formed in the second light receiving area 32.
- L4 is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- the imaging element 30 generates a superimposed image 50R including the images of L4, R2, R3, and [R4].
- FIG. 10 is a diagram for explaining the conversion process executed by the imaging device 1 of FIG. 7.
- the controller 14 inverts and combines the portion of the superimposed image 50L that includes [L1] in an outward direction.
- the controller 14 also extracts the portion of the superimposed image 50R that includes L4 and combines it in an outward direction of the portion of the superimposed image 50L that includes L3.
- the converted superimposed image 51L includes images of L1, L2, L3, and L4 in the same arrangement as the first image 41.
- the converted superimposed image 51L is an image in which an inverted image is superimposed, but which includes correct position information of the subject 40.
- the controller 14 also inverts and combines the portion of the superimposed image 50R that includes [R4] in an outward direction.
- the controller 14 also extracts the portion of the superimposed image 50L that includes R1 and combines it in an outward direction of the portion of the superimposed image 50R that includes R2.
- the transformed superimposed image 51R includes images of R1, R2, R3, and R4 in the same arrangement as the second image 42. In other words, the transformed superimposed image 51R is an image in which an inverted image is superimposed, but which includes correct position information for the subject 40.
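In this configuration the conversion also pulls the strip carrying L4 out of the other superimposed image. Assuming known strip widths (`fold_w` and `cross_w` are illustrative parameters derived from the optical geometry, not values given in the patent), building 51L could look like:

```python
import numpy as np

def build_51l(sup_l: np.ndarray, sup_r: np.ndarray,
              fold_w: int, cross_w: int) -> np.ndarray:
    """Sketch of building the converted superimposed image 51L:
    unfold [L1] from the outer strip of 50L, keep 50L, and append the
    strip of 50R that carries L4 on the opposite outward side."""
    unfolded = np.flip(sup_l[:, :fold_w], axis=1)  # [L1] -> L1
    crossed = sup_r[:, :cross_w]                   # portion of 50R containing L4
    return np.concatenate([unfolded, sup_l, crossed], axis=1)
```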
- the controller 14 calculates the parallax of the subject 40 by comparing the converted superimposed images 51, as in the first configuration example. Also, the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21, as in the first configuration example.
- FIGS. 11 to 14 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
- the imaging device 1 may include a fifth optical element 13 and a sixth optical element 23, which are light-blocking members.
- the fifth optical element 13 is disposed opposite the third optical element 12, and blocks at least a portion of the light traveling from the first optical element 11 toward the outside of the first light-receiving region 31.
- the sixth optical element 23 is disposed opposite the fourth optical element 22, and blocks at least a portion of the light traveling from the second optical element 21 toward the outside of the second light-receiving region 32.
- the first image 41 includes images L1 and L2.
- L1 is folded back by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L1].
- [L1] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through any other optical elements.
- the imaging element 30 generates a superimposed image 50L that includes the images [L1] and L2.
- the second image 42 includes images of R2 and R3.
- R3 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R3].
- [R3] is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through any other optical element.
- the imaging element 30 generates a superimposed image 50R that includes the images of R2 and [R3].
- FIG. 14 is a diagram for explaining the conversion process executed by the imaging device 1 of FIG. 11.
- the controller 14 inverts and combines the portion of the superimposed image 50L that includes [L1] (the entire superimposed image 50L in the example of FIG. 14) in the outward direction.
- the converted superimposed image 51L includes the images of L1 and L2 in the same arrangement as the first image 41.
- the converted superimposed image 51L is an image in which an inverted image is superimposed, but which includes correct position information of the subject 40.
- the controller 14 also inverts and combines the portion of the superimposed image 50R that includes [R3] (the entire superimposed image 50R in the example of FIG. 14) in the outward direction.
- the transformed superimposed image 51R includes the images of R2 and R3 in the same arrangement as the second image 42.
- the transformed superimposed image 51R is an image in which an inverted image is superimposed, but which includes correct position information for the subject 40.
- the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21. This makes it possible to expand the parallax calculation (distance measurement) range on the close-range side.
- the imaging device 1 may not include the fifth optical element 13 and the sixth optical element 23, as in the second configuration example.
- a part of the first image 41 is formed in the first light receiving area 31.
- a part of the first image 41 is formed in a part of the second light receiving area 32.
- the first optical element 11 is configured to image the light or light beam coming from the subject 40 as the first image 41 in a range wider than the first light receiving area 31.
- a part of the second image 42 is formed in the second light receiving area 32.
- a part of the second image 42 is formed in a part of the first light receiving area 31.
- the second optical element 21 is configured to image the light or light beam coming from the subject 40 as the second image 42 in a range wider than the second light receiving area 32.
- the first image 41 includes images L1, L2, and L4.
- L1 is folded back by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L1].
- [L1] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- R1, which is a part of the second image 42, is formed in the first light receiving area 31.
- R1 is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through other optical elements.
- the imaging element 30 generates a superimposed image 50L that includes the images [L1], L2, and R1.
- the second image 42 includes images of R1, R2, and R4.
- R4 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R4].
- [R4] is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- L4, which is a part of the first image 41, is formed in the second light receiving area 32.
- L4 is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through other optical elements.
- the imaging element 30 generates a superimposed image 50R including the images of L4, R2, and [R4].
- FIG. 18 is a diagram for explaining the conversion process executed by the imaging device 1 of FIG. 15.
- the controller 14 inverts and combines the portion of the superimposed image 50L that includes [L1] (the entire superimposed image 50L in the example of FIG. 18) in an outward direction.
- the controller 14 also extracts the portion of the superimposed image 50R that includes L4 (the entire superimposed image 50R in the example of FIG. 18) and combines it in the opposite outward direction.
- the converted superimposed image 51L includes images of L1, L2, and L4 in the same arrangement as the first image 41. In other words, the converted superimposed image 51L is an image in which an inverted image is superimposed, but which includes correct position information of the subject 40.
- the controller 14 also inverts and combines the portion of the superimposed image 50R that includes [R4] (the entire superimposed image 50R in the example of FIG. 18) in an outward direction.
- the controller 14 also extracts the portion of the superimposed image 50L that includes R1 (the entire superimposed image 50L in the example of FIG. 18) and combines it in the opposite outward direction.
- the transformed superimposed image 51R includes images of R1, R2, and R4 in the same arrangement as the second image 42. In other words, the transformed superimposed image 51R is an image in which an inverted image is superimposed, but which includes correct position information for the subject 40.
- the controller 14 calculates the parallax of the subject 40 by comparing the converted superimposed images 51, as in the first configuration example. Also, the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21, as in the first configuration example.
- the imaging device 1 may be configured such that the third optical element 12 is moved to the position of the fifth optical element 13 in the first configuration example, and the fifth optical element 13 is not included.
- the third optical element 12 is configured as a mirror with a surface close to the first light receiving area 31 as a reflective surface.
- the imaging device 1 may also be configured with a sixth optical element 23 that is a light blocking member.
- the first image 41 includes images of L2 and L4.
- L4 is folded by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L4].
- [L4] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through any other optical elements.
- the imaging element 30 generates a superimposed image 50L that includes the images of L2 and [L4].
- the second image 42 includes images of R2 and R4.
- R4 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R4].
- [R4] is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through any other optical element.
- the imaging element 30 generates a superimposed image 50R that includes the images of R2 and [R4].
- FIG. 22 is a diagram for explaining the conversion process executed by the imaging device 1 of FIG. 19.
- the controller 14 inverts and combines the portion of the superimposed image 50L that includes [L4] (the entire superimposed image 50L in the example of FIG. 22) in the outward direction.
- the converted superimposed image 51L includes the images of L2 and L4 in the same arrangement as the first image 41.
- the converted superimposed image 51L is an image in which an inverted image is superimposed, but which includes correct position information of the subject 40.
- the controller 14 also inverts and combines the portion of the superimposed image 50R that includes [R4] (the entire superimposed image 50R in the example of FIG. 22) in the outward direction.
- the transformed superimposed image 51R includes the images of R2 and R4 in the same arrangement as the second image 42.
- the transformed superimposed image 51R is an image in which an inverted image is superimposed, but which includes correct position information for the subject 40.
- the controller 14 calculates the parallax of the subject 40 by comparing the converted superimposed images 51, as in the first configuration example. Also, the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21, as in the first configuration example.
- the imaging device 1 may be configured without the fifth optical element 13 by moving the third optical element 12 to the position of the fifth optical element 13 in the first configuration example.
- the third optical element 12 is configured as a mirror with a surface close to the first light receiving area 31 as a reflective surface.
- the imaging device 1 may be configured without the sixth optical element 23 by moving the fourth optical element 22 to the position of the sixth optical element 23 in the first configuration example.
- the fourth optical element 22 is configured as a mirror with a surface close to the second light receiving area 32 as a reflective surface.
- the first image 41 includes images of L2 and L4.
- L4 is folded back by the third optical element 12 and formed in the first light receiving area 31 as an inverted image [L4].
- [L4] is superimposed on L2, which is an image formed in the first light receiving area 31 from the first optical element 11 without passing through any other optical elements.
- the imaging element 30 generates a superimposed image 50L that includes the images of L2 and [L4].
- the second image 42 includes images of R1 and R2.
- R1 is folded back by the fourth optical element 22 and formed in the second light receiving area 32 as an inverted image [R1].
- [R1] is superimposed on R2, which is an image formed in the second light receiving area 32 from the second optical element 21 without passing through any other optical element.
- the imaging element 30 generates a superimposed image 50R that includes the images of [R1] and R2.
- FIG. 26 is a diagram for explaining the conversion process executed by the imaging device 1 of FIG. 23.
- the controller 14 inverts and combines the portion of the superimposed image 50L that includes [L4] (the entire superimposed image 50L in the example of FIG. 26) in the outward direction.
- the converted superimposed image 51L includes the images of L2 and L4 in the same arrangement as the first image 41.
- the converted superimposed image 51L is an image in which an inverted image is superimposed, but which includes correct position information of the subject 40.
- the controller 14 also inverts and combines the portion of the superimposed image 50R that includes [R1] (the entire superimposed image 50R in the example of FIG. 26) in an outward direction.
- the transformed superimposed image 51R includes the images of R1 and R2 in the same arrangement as the second image 42.
- the transformed superimposed image 51R is an image in which an inverted image is superimposed, but which includes correct position information for the subject 40.
- the controller 14 may separate the subject image captured only through the first optical element 11 or the second optical element 21 from the subject image captured also through optical elements other than the first optical element 11 and the second optical element 21.
- the reflecting surface of the third optical element 12, which is a mirror, may be inclined with respect to the optical axis 11A so as to be in an inwardly inclined position facing the image forming surface side of the first optical element 11.
- the reflecting surfaces of the fourth optical element 22, the fifth optical element 13, and the sixth optical element 23, which are mirrors, may likewise be in an inwardly inclined position.
- the inwardly inclined position allows the entire optical device to be made smaller than a configuration in which the reflecting surface of the mirror is parallel to the optical axis 11A or 21A.
- the mirror may be, for example, a flat mirror, a curved mirror, a DMD (Digital Mirror Device), or a Fresnel mirror.
- the reflecting surface of the third optical element 12, which is a mirror, may be inclined with respect to the optical axis 11A so as to be in an outwardly inclined position facing the first optical element 11.
- the reflecting surfaces of the fourth optical element 22, the fifth optical element 13, and the sixth optical element 23, which are mirrors, may likewise be in an outwardly inclined position.
- the outwardly inclined position allows the angle of view of the entire optical device to be made wider than in a configuration in which the reflective surface of the mirror is parallel to the optical axis 11A or 21A.
- when the third optical element 12 is tilted inward, the image [L4] reflected by the third optical element 12 and formed on the first light receiving area 31 is reduced compared to L2, which is an image formed without passing through the third optical element 12.
- when the image inverted by the mirror is reduced or enlarged, the controller 14 generates the transformed superimposed image 51 by including an enlargement process or a reduction process in the conversion process. That is, the transformed superimposed image 51 is generated so that the reduction or enlargement of the image caused by the tilt of the optical element is canceled.
- when the reflective surface of the mirror is curved or the like and the position of the image inverted by the mirror shifts, the controller 14 generates the transformed superimposed image 51 by including a coordinate conversion process in the conversion process. That is, the transformed superimposed image 51 is generated so that the shift of the image position is canceled. Furthermore, while [L4] in FIG. 27 becomes a correct image in the transformed superimposed image 51 as a result of such enlargement, reduction, or coordinate transformation, the portion of L2 that undergoes the same processing becomes an incorrect image due to the deformation. By exploiting this deformation, in addition to the calculated direction of parallax, the controller 14 can perform the parallax calculation and the subject image separation even more efficiently.
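The enlargement/reduction correction can be sketched as a horizontal rescale of the folded strip before it is combined. Nearest-neighbour sampling is used here for brevity (a production version would interpolate, e.g. with an affine warp); the scale factor is assumed to be known from the mirror tilt and is not specified by the patent.

```python
import numpy as np

def rescale_columns(strip: np.ndarray, scale: float) -> np.ndarray:
    """Horizontally rescale `strip` by `scale` (nearest-neighbour), cancelling
    the reduction or enlargement a tilted mirror introduces in the folded image."""
    w = strip.shape[1]
    new_w = max(1, int(round(w * scale)))
    # map each output column back to its source column
    cols = np.clip((np.arange(new_w) / scale).astype(int), 0, w - 1)
    return strip[:, cols]
```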
- the imaging device 1 and information processing method according to this embodiment are capable of obtaining an image that has the resolution of a long focal length optical system while still being wide-angle.
- the converted superimposed image 51 can be used to calculate the parallax of the subject 40 before the subject image is separated.
- conventionally, a process for separating the subject image would be performed on each of the two images, superimposed image 50L and superimposed image 50R, and the parallax of the subject 40 would then be calculated using the two separated images. Compared to such a method, the present approach can reduce the amount of calculation.
- embodiments of the present disclosure are not limited to the specific configurations of any of the embodiments described above.
- the embodiments of the present disclosure may extend to any novel feature or combination of features described herein or any novel method or process step or combination of features described herein.
- references such as “first” and “second” are identifiers for distinguishing the configuration.
- Configurations distinguished by descriptions such as “first” and “second” in this disclosure may have their numbers exchanged.
- the first optical system 10 may exchange identifiers “first” and “second” with the second optical system 20.
- the exchange of identifiers is performed simultaneously.
- the configurations remain distinguished even after the exchange of identifiers.
- Identifiers may be deleted.
- Configurations from which identifiers have been deleted are distinguished by reference signs. The identifiers such as "first" and "second" in this disclosure alone should not be used to interpret the order of the configurations or as grounds for the existence of an identifier with a smaller number.
- Storage media include, for example, optical disks, magneto-optical disks, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, hard disks, or memory cards.
- the implementation form of the program may be, but is not limited to, an application program such as object code compiled by a compiler or program code executed by an interpreter.
- the implementation form of the program may include a program module incorporated into an operating system.
- the program may or may not be configured so that all processing is performed only by the CPU on the control board.
- the program may be configured so that part or all of it is executed by another processing unit implemented on an expansion board or expansion unit added to the board as necessary.
- Imaging device 10 First optical system 11 First optical element 11A Optical axis 12 Third optical element 13 Fifth optical element 20 Second optical system 21 Second optical element 21A Optical axis 22 Fourth optical element 23 Sixth optical element 30 Imaging element 30A Light receiving area 31 First light receiving area 32 Second light receiving area 40 Subject 41 First image 42 Second image 50, 50L, 50R Superimposed image 51, 51L, 51R Transformed superimposed image
Abstract
Description
a first optical element that forms light arriving from a subject as a first image on a first light receiving area of an imaging element;
a second optical element that forms the light arriving from the subject as a second image on a second light receiving area of the imaging element adjacent to the first light receiving area;
a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area so that the light travels toward the inside of the first light receiving area;
a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area so that the light travels toward the inside of the second light receiving area; and
a controller that executes a conversion process including at least inversion and combination on a superimposed image of the first light receiving area and a superimposed image of the second light receiving area, and calculates the parallax of the subject by comparing the superimposed images on which the conversion process has been executed.
The imaging device includes at least one of a fifth optical element and a sixth optical element,
the fifth optical element is disposed so as to face the third optical element, and blocks at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area, or reflects it so that it travels toward the inside of the first light receiving area, and
the sixth optical element is disposed so as to face the fourth optical element, and blocks at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area, or reflects it so that it travels toward the inside of the second light receiving area.
The controller separates, based on the calculated direction of parallax, a subject image captured only through the first optical element or the second optical element from a subject image captured also through an optical element other than the first optical element and the second optical element.
An information processing method executed by an imaging device comprising: a first optical element that forms light arriving from a subject as a first image on a first light receiving area of an imaging element; a second optical element that forms the light arriving from the subject as a second image on a second light receiving area of the imaging element adjacent to the first light receiving area; a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area so that the light travels toward the inside of the first light receiving area; a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area so that the light travels toward the inside of the second light receiving area; and a controller, wherein
the controller
executes a conversion process including at least inversion and combination on a superimposed image of the first light receiving area and a superimposed image of the second light receiving area, and
calculates the parallax of the subject by comparing the superimposed images on which the conversion process has been executed.
As shown in FIG. 1, the imaging device 1 according to this embodiment includes a first optical system 10, a second optical system 20, and a controller 14. The imaging device 1 may further include an imaging element 30. An optical system is also referred to as an optical device. By forming images of the subject 40 with the first optical system 10 and the second optical system 20 and capturing them with the imaging element 30, the imaging device 1 can capture, as parallax images, an image that captures the image formed by the first optical system 10 and an image that captures the image formed by the second optical system 20. The image formed by the first optical system 10 is also referred to as the first image 41. The image formed by the second optical system 20 is also referred to as the second image 42. FIG. 2 is a diagram for explaining the first image 41 and the second image 42 of the imaging device 1 of FIG. 1. In FIG. 2, for the purpose of explaining the first image 41 and the second image 42, the first optical system 10 and the second optical system 20 are shown virtually separated from each other, and the light is shown virtually traveling in a straight line so that the subject images are generated at the position of the imaging element 30. The image that captures the first image 41 is also referred to as the first captured image. The image that captures the second image 42 is also referred to as the second captured image. The parallax images are composed of the first captured image and the second captured image.
The imaging device 1 of the second configuration example is described with reference to FIGS. 7 to 10. FIGS. 7 to 10 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
The imaging device 1 of the third configuration example is described with reference to FIGS. 11 to 14. FIGS. 11 to 14 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
The imaging device 1 of the fourth configuration example is described with reference to FIGS. 15 to 18. FIGS. 15 to 18 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
The imaging device 1 of the fifth configuration example is described with reference to FIGS. 19 to 22. FIGS. 19 to 22 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
The imaging device 1 of the sixth configuration example is described with reference to FIGS. 23 to 26. FIGS. 23 to 26 are explanatory diagrams similar to FIGS. 1 to 4, respectively, and the same elements are given the same reference numerals. To avoid redundant explanation, descriptions of the elements that are the same as in the first configuration example are omitted or simplified, and the differences from the first configuration example are mainly described.
As shown in FIG. 27, the reflecting surface of the third optical element 12, which is a mirror, may be inclined with respect to the optical axis 11A so as to be in an inwardly inclined position facing the image forming surface side of the first optical element 11. In this case, the reflecting surfaces of the fourth optical element 22, the fifth optical element 13, and the sixth optical element 23, which are mirrors, may also be in an inwardly inclined position. The inwardly inclined position allows the entire optical device to be made smaller than a configuration in which the reflecting surface of the mirror is parallel to the optical axis 11A or 21A. The mirror may be, for example, a flat mirror, a curved mirror, a DMD (Digital Mirror Device), or a Fresnel mirror. As another example, the reflecting surface of the third optical element 12, which is a mirror, may be inclined with respect to the optical axis 11A so as to be in an outwardly inclined position facing the first optical element 11. In this case, the reflecting surfaces of the fourth optical element 22, the fifth optical element 13, and the sixth optical element 23, which are mirrors, may be in an outwardly inclined position. The outwardly inclined position allows the angle of view of the entire optical device to be made wider than in a configuration in which the reflecting surface of the mirror is parallel to the optical axis 11A or 21A.
10 First optical system
11 First optical element
11A Optical axis
12 Third optical element
13 Fifth optical element
20 Second optical system
21 Second optical element
21A Optical axis
22 Fourth optical element
23 Sixth optical element
30 Imaging element
30A Light receiving area
31 First light receiving area
32 Second light receiving area
40 Subject
41 First image
42 Second image
50, 50L, 50R Superimposed image
51, 51L, 51R Converted superimposed image
Claims (4)
- An imaging device comprising: a first optical element that forms light arriving from a subject as a first image on a first light receiving area of an imaging element;
a second optical element that forms the light arriving from the subject as a second image on a second light receiving area of the imaging element adjacent to the first light receiving area;
a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area so that the light travels toward the inside of the first light receiving area;
a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area so that the light travels toward the inside of the second light receiving area; and
a controller that executes a conversion process including at least inversion and combination on a superimposed image of the first light receiving area and a superimposed image of the second light receiving area, and calculates the parallax of the subject by comparing the superimposed images on which the conversion process has been executed.
- The imaging device according to claim 1, comprising at least one of a fifth optical element and a sixth optical element,
wherein the fifth optical element is disposed so as to face the third optical element, and blocks at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area, or reflects it so that it travels toward the inside of the first light receiving area, and
the sixth optical element is disposed so as to face the fourth optical element, and blocks at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area, or reflects it so that it travels toward the inside of the second light receiving area.
- The imaging device according to claim 1 or 2, wherein the controller separates, based on the calculated direction of parallax, a subject image captured only through the first optical element or the second optical element from a subject image captured also through an optical element other than the first optical element and the second optical element.
- An information processing method executed by an imaging device comprising: a first optical element that forms light arriving from a subject as a first image on a first light receiving area of an imaging element; a second optical element that forms the light arriving from the subject as a second image on a second light receiving area of the imaging element adjacent to the first light receiving area; a third optical element that reflects at least a portion of the light traveling from the first optical element toward the outside of the first light receiving area so that the light travels toward the inside of the first light receiving area; a fourth optical element that reflects at least a portion of the light traveling from the second optical element toward the outside of the second light receiving area so that the light travels toward the inside of the second light receiving area; and a controller, wherein
the controller
executes a conversion process including at least inversion and combination on a superimposed image of the first light receiving area and a superimposed image of the second light receiving area, and
calculates the parallax of the subject by comparing the superimposed images on which the conversion process has been executed.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-069639 | 2023-04-20 | ||
| JP2023069639A JP7802032B2 (ja) | 2023-04-20 | 2023-04-20 | 撮像装置及び情報処理方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024219280A1 true WO2024219280A1 (ja) | 2024-10-24 |
Family
ID=93152452
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/014328 Ceased WO2024219280A1 (ja) | 2023-04-20 | 2024-04-08 | 撮像装置及び情報処理方法 |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7802032B2 (ja) |
| WO (1) | WO2024219280A1 (ja) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004289305A (ja) | 2003-03-19 | 2004-10-14 | Sumitomo Electric Ind Ltd | 車載撮像システム及び撮像装置 |
| JP2011064566A (ja) * | 2009-09-17 | 2011-03-31 | Fujitsu Ltd | 距離推定装置 |
| JP2019178871A (ja) * | 2018-03-30 | 2019-10-17 | 株式会社リコー | ステレオカメラ装置 |
- 2023-04-20: priority application JP2023069639A filed in Japan (patent JP7802032B2, Active)
- 2024-04-08: international application PCT/JP2024/014328 filed as WO2024219280A1 (Ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| JP7802032B2 (ja) | 2026-01-19 |
| JP2024155175A (ja) | 2024-10-31 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24792555 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024792555 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2024792555 Country of ref document: EP Effective date: 20251120 |
|