
WO2018088016A1 - Image processing device, image processing method, imaging system, and image processing program - Google Patents


Info

Publication number
WO2018088016A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
images
generated
alignment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/032496
Other languages
English (en)
Japanese (ja)
Inventor
宜邦 野村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Publication of WO2018088016A1 publication Critical patent/WO2018088016A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Definitions

  • the present disclosure relates to an image processing device, an image processing method, an imaging system, and an image processing program.
  • Ideally, the shift between a plurality of captured images is only a translational shift in a one-dimensional direction.
  • In practice, however, each image is distorted by factors of the optical system, such as the lens, so it is difficult to make the shift between images purely translational.
  • Accordingly, processing such as lens distortion correction and parallelization (rectification) is performed to correct each image so that the shift remaining between the images is only a translational shift. If the images differ only by a one-dimensional translational displacement, only one-dimensional alignment is required, so alignment can be performed at high speed.
  • However, lens distortion correction and parallelization involve processing such as pixel interpolation. The resolution of an image corrected by these processes is therefore lowered, and as a result the resolution of the image obtained by image synthesis is also lowered.
  • On the other hand, the amount of calculation for two-dimensional block matching is of order O(n^4) in Landau (big-O) notation, so the calculation cost required for alignment becomes excessive. With image-sensor resolutions continuing to increase, it is difficult to perform two-dimensional block matching within a time that does not hinder operation.
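As an illustration of why reducing alignment to one dimension matters, the sketch below (our own minimal example, not code from the patent) performs block matching restricted to a single row, as rectification permits. Each block then has O(w) candidate positions instead of the O(w*h) candidates of a full two-dimensional search, which is the difference between the O(n^3) and O(n^4) totals discussed in this document.

```python
# Illustrative sketch (not from the patent): 1-D block matching along a
# single row, as enabled by rectification.

def sad(block_a, block_b):
    """Sum of absolute differences between two equal-length pixel blocks."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

def match_1d(row_ref, row_tgt, x, block=5, max_disp=8):
    """Find the horizontal disparity of the block at row_ref[x:x+block]
    inside row_tgt, searching only along the same (epipolar) row."""
    ref = row_ref[x:x + block]
    best_d, best_cost = 0, float("inf")
    for d in range(-max_disp, max_disp + 1):
        start = x + d
        if start < 0 or start + block > len(row_tgt):
            continue
        cost = sad(ref, row_tgt[start:start + block])
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# A row, and the same row shifted right by 3 pixels:
row_a = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0, 0, 0]
row_b = [0, 0, 0, 0, 0, 10, 50, 90, 50, 10, 0, 0]
print(match_1d(row_a, row_b, x=2))  # → 3
```

Replacing the inner loop with a search over all (dx, dy) candidates would square the candidate count per block, which is what drives the O(n^4) cost cited above.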
  • The present disclosure therefore aims to provide an image processing apparatus, an image processing method, an imaging system, and an image processing program capable of reducing the loss of resolution in processing such as image synthesis using a plurality of images, while completing the processing within a time that does not hinder operation.
  • To achieve the above object, an image processing apparatus according to the present disclosure uses one of a plurality of images obtained by capturing the same subject as a reference image, generates from each of the plurality of images an image that can be one-dimensionally aligned, and then deforms the result to generate an image aligned with the reference image; image synthesis is performed using information based on the reference image and information based on the generated image.
  • An image processing method according to the present disclosure uses one of a plurality of images obtained by capturing the same subject as a reference image, and includes: an image deformation processing step of generating from each of the plurality of images an image that can be one-dimensionally aligned and then deforming the result to generate an image aligned with the reference image; and a step of performing image synthesis using information based on the reference image and information based on the image generated by the image deformation processing.
  • An imaging system according to the present disclosure includes: an imaging unit that captures a plurality of images of the same subject; and an image processing device that uses one of the plurality of images captured by the imaging unit as a reference image, generates from each of the plurality of images an image that can be one-dimensionally aligned, then deforms the result, and performs alignment with respect to the reference image.
  • An image processing program for achieving the above object causes a computer to execute: an image deformation processing step of using one of a plurality of images obtained by capturing the same subject as a reference image, generating from each of the plurality of images an image that can be one-dimensionally aligned, and then deforming the result to generate an image aligned with the reference image; and a step of performing image synthesis using information based on the reference image and information based on the image generated by the image deformation processing.
  • In the present disclosure, image synthesis is performed using information based on the reference image and information based on the image generated by the image deformation processing. Because the reference image has not undergone the resolution degradation that accompanies alignment, the loss of resolution in the image obtained by synthesis is reduced. In addition, because an image that can be one-dimensionally aligned is generated and only then deformed to produce an image aligned with the reference image, the processing can be completed within a time that does not hinder operation.
  • FIG. 1 is a schematic diagram for describing a configuration of an imaging system including an image processing device according to the first embodiment of the present disclosure.
  • FIG. 2 is a schematic diagram for explaining a configuration of an imaging system including an image processing apparatus according to a reference example.
  • FIG. 3 is a schematic diagram for explaining image processing in the image processing apparatus of the reference example.
  • FIG. 4 is a block diagram for explaining the operation of the image processing apparatus of the reference example.
  • FIG. 5 is a block diagram for explaining the operation of the image processing apparatus according to the first embodiment.
  • FIGS. 6A, 6B, and 6C are photographs substituted for drawings, illustrating an example of alignment results in the image processing apparatus according to the first embodiment.
  • FIG. 7 is a schematic diagram for explaining image processing in the image processing apparatus according to the first embodiment.
  • FIG. 8 is a block diagram for explaining an operation in the image processing apparatus according to the second embodiment of the present disclosure.
  • FIG. 9 is a diagram for explaining details of a portion related to alignment in FIG.
  • FIG. 10 is a diagram for explaining the details of the part related to the synthesis process in FIG.
  • FIG. 11 is a schematic diagram for explaining image processing in the image processing apparatus according to the second embodiment.
  • FIG. 12 is a block diagram for explaining an operation in the image processing apparatus according to the third embodiment of the present disclosure.
  • FIG. 13 is a diagram showing, in extracted form, the part of FIG. 12 related to alignment.
  • FIG. 14 is a diagram for explaining details of a portion related to the synthesis process in FIG.
  • FIG. 15 is a schematic diagram for explaining image processing in the image processing apparatus according to the third embodiment.
  • FIG. 16 is a block diagram for explaining an operation in the image processing apparatus according to the fourth embodiment of the present disclosure.
  • FIG. 17 is a diagram for explaining details of a portion related to the synthesis process in FIG. 16.
  • FIG. 18 is a schematic diagram for explaining image processing in the image processing apparatus according to the fourth embodiment.
  • FIG. 19 is a block diagram illustrating an example of a schematic configuration of the vehicle control system.
  • FIG. 20 is an explanatory diagram illustrating an example of the installation positions of the vehicle exterior information detection unit and the imaging unit.
  • The image deformation processing unit can be configured to include: a first image correction unit that generates an image capable of one-dimensional alignment from each of a plurality of images obtained by capturing the same subject; an alignment unit that generates, from the images produced by the first image correction unit, images one-dimensionally aligned with the image generated from the reference image; and a second image correction unit that applies, to the images generated by the alignment unit other than the one generated from the reference image, a calculation process opposite to that of the first image correction unit, thereby generating an image aligned with the reference image.
  • The image deformation processing can be configured to include: a first image correction step that generates an image capable of one-dimensional alignment from each of a plurality of images obtained by capturing the same subject; an alignment step that generates, from the images produced by the first image correction step, images one-dimensionally aligned with the image generated from the reference image; and a second image correction step that applies, to the images generated by the alignment step other than the one generated from the reference image, a calculation process opposite to that of the first image correction step, thereby generating an image aligned with the reference image.
  • The image deformation processing step can likewise be configured to include: a first image correction step that generates an image capable of one-dimensional alignment from each of a plurality of images obtained by capturing the same subject; an alignment step that generates, from the images produced by the first image correction step, images one-dimensionally aligned with the image generated from the reference image; and a second image correction step that applies, to the images generated by the alignment step other than the one generated from the reference image, a calculation process opposite to that of the first image correction step, thereby generating an image aligned with the reference image.
  • The first image correction step can be configured to include a lens distortion correction step and a parallelization step, and the second image correction step to include an anti-parallelization step and an inverse lens distortion correction step.
  • The lens distortion correction step can be configured to consist of a calculation process using predetermined lens coefficients, and the inverse lens distortion correction step of a calculation process using coefficients different from those predetermined lens coefficients.
  • The parallelization step can be configured to consist of a calculation process using a predetermined matrix, and the anti-parallelization step of a calculation process using a matrix different from that predetermined matrix.
  • The matrix used in the parallelization step and the matrix used in the anti-parallelization step can be configured so that each is the inverse matrix of the other.
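The inverse-matrix relationship stated above can be sketched in a few lines. This is our own illustration with made-up matrix values, not values from the patent: if the parallelization (rectification) step maps homogeneous pixel coordinates with a matrix H, the anti-parallelization step uses H's inverse, so the round trip returns the original coordinates.

```python
import numpy as np

# Hypothetical rectification homography (values chosen for illustration only).
H = np.array([[1.02, 0.01, -3.0],
              [0.00, 0.98,  2.5],
              [0.00, 0.00,  1.0]])
H_inv = np.linalg.inv(H)  # matrix used by the anti-parallelization step

def apply_h(M, x, y):
    """Apply a 3x3 homography to a pixel coordinate (x, y)."""
    p = M @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

xr, yr = apply_h(H, 100.0, 200.0)    # rectified coordinates
x0, y0 = apply_h(H_inv, xr, yr)      # de-rectified: back to the original
print(round(x0, 6), round(y0, 6))    # → 100.0 200.0
```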
  • The reference image can be a monochrome image and the other images color images, with image synthesis configured on that basis.
  • The image deformation processing can be configured to include: a step of generating, from the images produced in the alignment step, a parallax map storing parallax information referenced to the image generated from the reference image; and a step of applying to the generated parallax map a calculation process opposite to that of the first image correction step, thereby generating a parallax map aligned with the reference image.
  • The depth of field in image synthesis can be configured to be controlled based on the parallax map aligned with the reference image.
  • The reference image can be a monochrome image and the other images color images, with image synthesis configured to be performed based on the luminance information of the reference image and the color information of the images generated by the second image correction step.
  • The image deformation processing unit and the synthesis unit may be configured to operate as physically connected hardware, or to operate based on a program.
  • the program may be provided by being stored in a computer-readable storage medium, or may be provided by distribution via wired or wireless communication means.
  • The imaging unit used in the imaging system can be configured from a plurality of imaging elements, such as CMOS or CCD sensors, in which pixels comprising photoelectric conversion elements and various pixel transistors are arranged in a two-dimensional matrix in the row and column directions, with the elements spaced apart from one another. Although it depends on the electronic device in which the imaging unit is mounted, the separation between elements should basically be as narrow as possible.
  • The number of cameras constituted by the imaging elements is not limited to two and may be three or more. The present disclosure is applicable even if each camera captures multiple colors, the cameras capture different colors, the cameras have different numbers of pixels, or the cameras have different angles of view.
  • It is preferable to select, as the reference image, the image with the highest resolution among the plurality of images obtained by capturing the same subject.
  • In an imaging element for color photography, a single output pixel is usually composed of a plurality of sub-pixels, so its resolution is lower than that of an imaging element for monochrome imaging. Therefore, when color images and a monochrome image are mixed, it is preferable to select the monochrome image as the reference image.
  • The present disclosure can be applied as long as the same subject is captured. Furthermore, even when the number of pixels differs between images, the present disclosure can be applied by, for example, equalizing the pixel counts through image resizing or similar processing.
  • the configuration of the combining unit that combines the images is not particularly limited as long as the operation of the present disclosure is not hindered.
  • Examples include a configuration in which a plurality of images are combined to improve S/N, a configuration in which color information is added to a monochrome image, and a configuration in which the depth of field is adjusted using disparity information between the images, for example by inserting another image so that it corresponds to a predetermined depth location.
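Two of the synthesis configurations mentioned above can be sketched briefly. The functions and weights below are our own illustrations, not the patent's implementation: frame averaging for S/N improvement, and attaching color to a monochrome reference by taking luminance from the reference and scaling the aligned color pixels to match it (BT.601 luma weights are assumed).

```python
# Minimal sketches of two synthesis configurations (illustrative only).

def average_frames(frames):
    """Pixel-wise mean of equally sized single-channel frames (S/N gain)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def colorize(luma_ref, rgb_aligned):
    """Scale each aligned RGB pixel so its luminance matches the reference,
    keeping the reference's (higher-resolution) luminance detail."""
    out = []
    for y_ref, (r, g, b) in zip(luma_ref, rgb_aligned):
        y_src = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 luma
        s = y_ref / y_src if y_src > 0 else 0.0
        out.append((r * s, g * s, b * s))
    return out

print(average_frames([[10, 20], [30, 40]]))  # → [20.0, 30.0]
```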
  • the first embodiment relates to an image processing apparatus, an image processing method, an imaging system, and an image processing program according to the present disclosure.
  • FIG. 1 is a schematic diagram for explaining a configuration of an imaging system according to the first embodiment of the present disclosure.
  • The imaging system 1 includes: an imaging unit 10 that captures a plurality of images of the same subject; and an image processing apparatus that uses one of the plurality of images captured by the imaging unit 10 as a reference image, generates from each of the plurality of images an image that can be one-dimensionally aligned, deforms the result, and aligns it with the reference image.
  • The imaging unit 10 includes a camera A and a camera B arranged at a predetermined interval, each composed of an optical system, such as a lens, and a CMOS sensor. For convenience of explanation, the first embodiment assumes that both are monochrome cameras.
  • An image captured by the camera A is denoted by reference numeral 10A
  • an image captured by the camera B is denoted by reference numeral 10B.
  • the image deformation processing unit 110 and the synthesis unit 120 are formed on a semiconductor substrate made of silicon, for example.
  • the image deformation processing unit 110 and the composition unit 120 constitute an image processing apparatus 100.
  • the operation of the entire imaging system 1 is controlled by a control circuit (not shown).
  • the imaging system 1 may be configured as an integral unit, for example, so as to be suitable for device incorporation, or may be configured as a separate body.
  • a configuration in which the image deformation processing unit 110 and the combining unit 120 are provided on a control board of a portable electronic device may be employed.
  • The image deformation processing unit 110 includes: a first image correction unit 111 that generates an image capable of one-dimensional alignment from each of the plurality of images obtained by capturing the same subject; an alignment unit 112 that generates, from the images produced by the first image correction unit 111, images one-dimensionally aligned with the image generated from the reference image; and a second image correction unit 113 that applies, to the images generated by the alignment unit 112 other than the one generated from the reference image, a calculation process opposite to that of the first image correction unit 111, thereby generating an image aligned with the reference image.
  • In the imaging system 1, one of the plurality of images obtained by capturing the same subject is used as a reference image, and an image that can be one-dimensionally aligned is generated from each of the plurality of images and then deformed.
  • This image deformation processing generates an image that is aligned with the reference image, and image synthesis is performed using information based on the reference image and information based on the image generated by the image deformation processing.
  • The image deformation processing includes: a first image correction step of generating an image capable of one-dimensional alignment from each of a plurality of images obtained by capturing the same subject;
  • an alignment step of generating, from the images produced by the first image correction step, images one-dimensionally aligned with the image generated from the reference image; and
  • a second image correction step of applying, to the aligned images other than the one generated from the reference image, a calculation process opposite to that of the first image correction step, thereby generating an image aligned with the reference image.
  • An image processing apparatus 100 shown in FIG. 1 is configured to operate by executing an image processing program stored in a storage device (not shown).
  • The image processing program uses one of a plurality of images obtained by capturing the same subject as a reference image and generates, from each of the plurality of images, an image that can be one-dimensionally aligned.
  • The program then executes processing including an image deformation processing step of deforming the result to generate an image aligned with the reference image, and a step of performing image synthesis using information based on the reference image and information based on the image generated by the image deformation processing.
  • The image deformation processing step includes: a first image correction step of generating an image capable of one-dimensional alignment from each of the plurality of images;
  • an alignment step of generating, from the images produced by the first image correction step, images one-dimensionally aligned with the image generated from the reference image; and
  • a second image correction step of applying, to the aligned images other than the one generated from the reference image, a calculation process opposite to that of the first image correction step, thereby generating an image aligned with the reference image.
  • The basic configuration of the imaging system 1 has been described above. Next, to aid understanding of the present disclosure, a reference example in which the second image correction unit is omitted, and the problems with that configuration, will be described.
  • FIG. 2 is a schematic diagram for explaining a configuration of an imaging system including an image processing apparatus of a reference example.
  • the image processing apparatus 900 used in the imaging system 9 of the reference example has a configuration in which the second image correction unit 113 is omitted from the image processing apparatus 100 illustrated in FIG.
  • An image deformation processing unit in the image processing apparatus 900 is denoted by reference numeral 910.
  • the image deformation processing unit 910 includes a first image correction unit 111 and an alignment unit 112.
  • FIG. 3 is a schematic diagram for explaining image processing in the image processing apparatus of the reference example.
  • FIG. 4 is a block diagram for explaining the operation of the image processing apparatus of the reference example.
  • If the image of camera A and the image of camera B differ only by a one-dimensional translational deviation, high-speed alignment can be performed. However, the images are distorted by factors of the optical system, such as the lens.
  • a process such as a lens distortion correction process or a parallelization process (Rectification) is performed to correct the image, and the shift between the images is corrected only to the translational shift.
  • the image processing methods of camera A and camera B are basically the same except that the lens distortion correction coefficient and the parallelization matrix are different.
  • the image processing of the camera A will be described with reference to FIG.
  • the first image correction unit 111 generates an image capable of one-dimensional alignment based on each of a plurality of images obtained by capturing the same subject. Specifically, steps including a lens distortion correction process and a parallelization process are executed.
  • the lens distortion correction step includes arithmetic processing using a predetermined lens coefficient
  • the parallelization processing step includes arithmetic processing using a predetermined matrix.
  • lens distortion correction (step S11A) is performed on the image 10A of the camera A.
  • a predetermined calculation process is performed with reference to the lens distortion correction coefficient 11A for the camera A.
  • a parallelization process (step S12A) is performed on the image that has undergone step S11A.
  • a predetermined calculation process is performed with reference to the parallelization matrix 12A for the camera A.
  • The calculation performed in the above steps can be expressed as follows. Let (u, v) be the image coordinates after lens distortion correction and parallelization, (c_x', c_y') the image center after lens distortion correction and parallelization, (x''', y''') the image coordinates before lens distortion correction and parallelization, (c_x, c_y) the image center before those processes, and (f_x, f_y) the f values before those processes. The calculation is then expressed as equations (1-1) to (1-6).
  • This processing takes into account the change in the number of pixels before and after deformation of the image: starting from the deformed image, the fractional coordinate position in the pre-deformation image is calculated from each integer coordinate position of the deformed image, and the pixel value at that fractional position is obtained by interpolation. This is an example of Backward Mapping.
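The Backward Mapping just described can be sketched as follows. This is a minimal illustration of ours: an artificial half-pixel shift stands in for the actual lens-distortion/rectification mapping, and bilinear interpolation supplies the pixel value at each fractional source coordinate.

```python
# Backward Mapping sketch: for each integer pixel of the OUTPUT (deformed)
# image, compute the fractional source coordinate and interpolate.

def bilinear(img, x, y):
    """Sample img (list of rows) at fractional (x, y) by bilinear interpolation."""
    h, w = len(img), len(img[0])
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def backward_map(src, mapping, out_w, out_h):
    """Build the output image by pulling each output pixel from mapping(x, y)."""
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            sx, sy = mapping(x, y)  # fractional source coordinate
            sx = min(max(sx, 0), len(src[0]) - 1)  # clamp to the image
            sy = min(max(sy, 0), len(src) - 1)
            row.append(bilinear(src, sx, sy))
        out.append(row)
    return out

src = [[0, 10, 20],
       [30, 40, 50],
       [60, 70, 80]]
# Stand-in mapping: shift the image by half a pixel, output(x, y) = src(x + 0.5, y)
shifted = backward_map(src, lambda x, y: (x + 0.5, y), 3, 3)
print(shifted[0])  # → [5.0, 15.0, 20.0]
```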
  • Similarly, the image 10B of camera B is subjected to lens distortion correction (step S11B) and parallelization processing (step S12B), in each case referring to the lens distortion correction coefficient and the parallelization matrix for camera B.
  • The amount of calculation associated with the correction processing described above is of order O(n^2) in Landau notation.
  • The alignment unit 112 then executes an alignment step that generates, from the images produced by the first image correction unit 111, an image one-dimensionally aligned with the image generated from the reference image.
  • In step S13B, alignment processing is performed using the image 10Amd1 and the image 10Bmd1.
  • The description here assumes that the alignment is performed with the image 10Amd1 as the reference.
  • Step S13B yields the corrected image 10Bmd2 of camera B, corrected to the camera A viewpoint.
  • The amount of calculation associated with the alignment processing described above is of order O(n^3).
  • In the imaging system 9 of the reference example, the total calculation is therefore dominated by the O(n^3) alignment. If lens distortion correction and parallelization were omitted and two-dimensional alignment performed instead, the calculation for the alignment processing would be of order O(n^4), and the calculation cost required for alignment would be excessive.
  • After alignment, the image 10Amd1 of camera A and the image 10Bmd2 of camera B substantially match.
  • The synthesizing unit 120 performs the synthesis process using the image 10Amd1 of camera A and the image 10Bmd2 of camera B.
  • However, the resolution of the image 10Amd1 of camera A has been degraded by the lens distortion correction, and degraded further by the parallelization processing.
  • In contrast, the imaging system 1 first performs the resolution-degrading correction processing on both the camera A and camera B images and executes high-speed one-dimensional alignment; it then deforms the corrected image of camera B to match the original image of camera A.
  • In the following, the image of camera A, one of the images of cameras A and B capturing the same subject, is the reference image. The same applies to the other embodiments.
  • FIG. 5 is a block diagram for explaining the operation of the image processing apparatus according to the first embodiment.
  • The image deformation processing unit 110 in the image processing apparatus 100 according to the first embodiment adds a second image correction unit 113 after the image deformation processing unit 910 described in the reference example. The other elements in FIG. 5 are the same as those in the reference example. For convenience of illustration, part of the processing within the portion denoted by reference numeral 910 is omitted in FIG. 5.
  • The second image correction unit 113 executes steps including a second image correction step that applies, to the images generated in the alignment step other than the one generated from the reference image, a calculation process opposite to that of the first image correction step, thereby generating an image aligned with the reference image. Specifically, it executes steps including an anti-parallelization step and an inverse lens distortion correction step.
  • The anti-parallelization step consists of a calculation process using a matrix different from the matrix used in the parallelization step, and the inverse lens distortion correction step consists of a calculation process using coefficients different from the lens coefficients used in the lens distortion correction.
  • The second image correction unit 113 generates an image aligned with the original image 10A of camera A.
  • First, anti-parallelization processing (step S21B) is performed on the image 10Bmd2 of camera B described in the reference example, as a predetermined calculation that refers to the matrix 21A for camera A.
  • This matrix and the matrix used in the parallelization processing are inverses of each other.
  • Next, inverse lens distortion correction (step S22B) is performed, as a predetermined calculation that refers to the inverse lens distortion correction coefficient 22A for camera A.
  • This yields the corrected image 10Bmd3 of camera B, obtained by applying the anti-parallelization processing and the inverse lens distortion correction to the image 10Bmd2 of camera B.
  • the reverse lens distortion correction coefficient a coefficient that causes the lens distortion generated by the camera A is selected in advance.
  • the inverse lens distortion correction coefficient will be described in detail in the description of Expression (2-3) described later.
  • The amount of calculation associated with the above processing is of order O(n^2) in Landau notation. Therefore, the total calculation in the imaging system 1 is likewise dominated by the O(n^3) alignment; in other words, the calculation amount of the imaging system 1 is approximately the same as that of the imaging system 9 of the reference example.
  • Equation (2-3) has a form similar to equation (1-5) above and expresses radial distortion and tangential distortion, the main components of lens distortion.
  • This type of formula is generally used for lens distortion correction. However, it can describe either deformation: correcting (deforming) an image that has lens distortion so that it has none, or deforming an image without lens distortion so that it acquires lens distortion. This type of formula can therefore be used for either purpose.
  • The coefficients (k_1, k_2, ..., k_6, p_1, p_2) used in equation (1-5) can be calculated using a general lens calibration method.
  • The coefficients calculated in this way are used as the lens distortion correction coefficient 11A for camera A.
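As an illustration of the radial-plus-tangential model with coefficients (k_1, ..., k_6, p_1, p_2) referred to above, the sketch below implements the commonly used rational form of that model. The coefficient values are arbitrary, chosen only for demonstration; they are not calibrated values from any camera.

```python
# Radial (k1..k6, rational form) plus tangential (p1, p2) distortion model.
# Coefficient values are illustrative, not calibrated.

def distort(x, y, k=(0.1, 0.01, 0.0, 0.0, 0.0, 0.0), p=(0.001, 0.001)):
    """Map normalized undistorted coordinates (x, y) to distorted ones."""
    k1, k2, k3, k4, k5, k6 = k
    p1, p2 = p
    r2 = x * x + y * y
    radial = (1 + k1 * r2 + k2 * r2**2 + k3 * r2**3) / \
             (1 + k4 * r2 + k5 * r2**2 + k6 * r2**3)
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# Displacement grows with distance from the optical axis (r^2 weighting):
print(distort(0.1, 0.0))  # small displacement near the center
print(distort(0.5, 0.0))  # larger displacement toward the edge
```

As the surrounding text notes, the same formula can run in either direction: with appropriate coefficients it removes distortion, and with coefficients chosen to reproduce camera A's distortion it reintroduces it, which is how the inverse lens distortion correction step is realized.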
  • Equation (3-1) is the third-order polynomial in the radial direction
  • Equation (3-2) is the fifth-order polynomial in the radial direction
  • Equation (3-3) is the seventh-order polynomial in the radial direction
  • Equation (3-4) is the third-order polynomial in the radial direction.
  • equation (3-5) is tangential distortion (Conrady, 1919)
  • equation (3-6) is tangential distortion (Brown, 1966)
  • equation (3-7) is a radial seventh-order polynomial with tangential distortion (Brown, 1971)
  • equation (3-8) is a radial seventh-order polynomial and tangential distortion (Murai, 1971).
  • FIG. 6A, FIG. 6B, and FIG. 6C are drawing-substituting photographs for explaining an example of the alignment execution result in the image processing apparatus according to the first embodiment.
  • FIG. 6A is an example of an image 10A captured by the camera A
  • FIG. 6B is an example of an image 10B captured by the camera B. Since the camera A and the camera B are at different positions, there is also a deviation in the position of the captured image.
  • FIG. 6C is an example of the image of the camera B transformed to the viewpoint of the camera A, corresponding to the image 10B md3 shown in FIG.
  • FIG. 7 is a schematic diagram for explaining image processing in the image processing apparatus according to the first embodiment.
  • the combining unit 120 performs a combining process using the original image 10A of the camera A and the image 10B md3 of the camera B.
  • in the imaging system 1, after correction processing that causes resolution degradation is performed on both the camera A image and the camera B image, high-speed alignment is executed by one-dimensional alignment. Then, the corrected image of the camera B is deformed to match the original image of the camera A.
  • the calculation cost required to transform the image of the camera B in accordance with the original image of the camera A can be kept in the order of O(n³).
  • an original image can be used for the image of camera A. Therefore, a decrease in resolution can be reduced as compared with the reference example in which the synthesis is performed in a state where both the camera A and the camera B have reduced resolution.
  • with the imaging system 1, it is possible to reduce the decrease in the resolution of the composite image and to perform the processing within a time that does not hinder operation.
  • the imaging unit 10 further includes a camera C, a camera D, and the like
  • the same processing as that for the image of the camera B may be performed on the images of the camera C and the camera D. Therefore, the first embodiment can be applied even when the imaging unit 10 includes three or more cameras. The same applies to other embodiments described later.
  • the second embodiment also relates to an image processing device, an image processing method, an imaging system, and an image processing program according to the present disclosure.
  • the second embodiment mainly differs from the first embodiment in that the camera B captures a Bayer-array color image and, as a result, in that the operation of the image processing apparatus is partially different.
  • the reference image is a monochrome image, and the other images are color images.
  • the Bayer array camera performs color imaging using a group of photoelectric conversion elements corresponding to [R, G, G, B]. Therefore, the resolution is lower in principle than monochrome imaging.
  • luminance information is obtained from the black-and-white image of camera A
  • color information is obtained from the image of camera B
  • the images are synthesized to synthesize a high-resolution color image.
  • in the second embodiment, the imaging system 1 may be read as the imaging system 2, the image processing device 100 as the image processing device 200, the first image correction unit 111 as the first image correction unit 211, the alignment unit 112 as the alignment unit 212, the second image correction unit 113 as the second image correction unit 213, and the synthesis unit 120 as the synthesis unit 220.
  • An image captured by the camera A is denoted by reference numeral 10A_BW, and an image captured by the camera B is denoted by reference numeral 10B_Bayer.
  • FIG. 8 is a block diagram for explaining the operation of the image processing apparatus according to the second embodiment of the present disclosure.
  • FIG. 9 is a diagram for explaining details of a portion related to alignment in FIG.
  • the portion composed of the first image correction unit 211 and the alignment unit 212 is represented by reference numeral 910A in consideration of the relationship with FIG. 2, FIG. 4, and FIG.
  • in steps S11A and S12A, the same processing as that described for the image 10A in the first embodiment is performed to obtain a parallelized image 10A_BW md1 .
  • the image 10B_Bayer is first subjected to an interpolation process (step S201B) and converted into an RGB image in which all of the R, G, and B colors are present at each pixel.
  • the converted image is denoted by reference numeral 10B_RGB.
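The interpolation of step S201B can be sketched as below. For brevity, each 2x2 RGGB quad is expanded using only its own samples; this nearest-quad scheme is an illustrative stand-in, since the specification does not fix a particular interpolation method and real pipelines use bilinear or better filters.

```python
# Minimal sketch of converting a Bayer (RGGB) mosaic into an image with
# R, G, and B values at every pixel. The quad-expansion scheme is an
# illustrative simplification, not the method mandated by the text.

def demosaic_rggb(bayer):
    """bayer: 2D list with even rows R,G,... and odd rows G,B,..."""
    h, w = len(bayer), len(bayer[0])
    rgb = [[None] * w for _ in range(h)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = bayer[y][x]
            g = (bayer[y][x + 1] + bayer[y + 1][x]) / 2  # average both greens
            b = bayer[y + 1][x + 1]
            for dy in (0, 1):
                for dx in (0, 1):
                    rgb[y + dy][x + dx] = (r, g, b)
    return rgb

quad = [[10, 20],
        [30, 40]]                  # one RGGB quad
print(demosaic_rggb(quad)[0][0])   # (10, 25.0, 40)
```

Because the two green samples per quad are averaged and the red and blue samples are shared across four output pixels, the result has full color at every pixel but, as noted above, lower effective resolution than a monochrome sensor.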
  • in steps S11B and S12B, the image 10B_RGB is subjected to the same processing as that described for the image 10B in the first embodiment to obtain a parallelized image 10B_RGB md1 .
  • luminance information is generated based on the color image 10B_RGB md1 (step S202B).
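The luminance generation of step S202B can be sketched as a weighted sum of the R, G, and B values. The BT.601 weights below are one common choice and are an assumption here; the specification does not fix particular coefficients.

```python
# Sketch of generating luminance from an RGB pixel so that the color image
# can be aligned against the monochrome image. BT.601 weights are assumed.

def luminance(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(luminance(255, 255, 255)))  # 255: white maps to full luminance
```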
  • in step S213B, alignment processing is performed using the image 10A_BW md1 and the luminance information generated based on the image 10B_RGB md1 , and the resulting deformation is applied to the image 10B_RGB md1 .
  • the alignment is performed with the image 10A_BW md1 as a reference.
  • in step S213B, the corrected image 10B_RGB md2 of the camera B, corrected to the camera A viewpoint, is obtained.
  • FIG. 10 is a diagram for explaining the details of the part related to the synthesis process in FIG.
  • the second image correction unit 213 generates an image that is aligned with the original image 10A_BW of the camera A. Specifically, an anti-parallelization process (step S21B) is performed on the image 10B_RGB md2 . Next, reverse lens distortion correction (step S22B) is performed on the image that has undergone step S21B. As a result, a post-correction image 10B_RGB md3 of the camera B, obtained by performing the anti-parallelization process and the reverse lens distortion correction on the image 10B_RGB md2 of the camera B, is obtained. Since this process is the same as the process described in the first embodiment, a description thereof will be omitted.
  • FIG. 11 is a schematic diagram for explaining image processing in the image processing apparatus according to the second embodiment.
  • image composition is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction step.
  • the combining unit 220 performs combining processing using the original image 10A_BW of the camera A and the image 10B_RGB md3 of the camera B (step S241A shown in FIG. 10). Specifically, a color image is generated using the luminance information of the original image 10A_BW of the camera A and the color information of the corrected image 10B_RGB md3 of the camera B.
  • This process is a process of superimposing the color information of the camera B on the luminance information of the original image 10A_BW of the camera A.
  • the human eye is highly sensitive to changes in luminance but less sensitive to changes in color. Therefore, the synthesized image is perceived as a color image with excellent resolution.
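The combining process described above can be sketched per pixel as a luminance replacement: the low-resolution luminance of the aligned color image is discarded, its chrominance is kept, and the sharp monochrome luminance is substituted. BT.601 YCbCr is an assumed working color space; the specification does not fix one.

```python
# Sketch of fusing the monochrome luminance with the color image's
# chrominance. The BT.601 YCbCr conversion is an illustrative assumption.

def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564
    cr = (r - y) * 0.713
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + cr / 0.713
    b = y + cb / 0.564
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

def fuse(mono_y, color_rgb):
    _, cb, cr = rgb_to_ycbcr(*color_rgb)  # discard low-resolution luminance
    return ycbcr_to_rgb(mono_y, cb, cr)   # keep chroma, use sharp luminance

# A gray color pixel stays gray; its brightness comes from the mono camera.
print(fuse(200, (100, 100, 100)))
```

Because the eye tolerates lower-resolution chrominance, borrowing only the chroma from camera B preserves the perceived sharpness of camera A's image.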
  • the case of the Bayer array has been described.
  • one-dimensional alignment can be performed by generating luminance information. Therefore, the present disclosure can be applied regardless of what color information the camera constituting the imaging unit has.
  • the third embodiment also relates to an image processing device, an image processing method, an imaging system, and an image processing program according to the present disclosure.
  • in the third embodiment, a parallax map storing parallax information relative to the reference image is generated based on the images generated in the alignment step. A calculation process opposite to the calculation process in the first image correction step is then performed on the generated parallax map to obtain a parallax map that is aligned with the reference image. Then, based on the parallax map aligned with the reference image, the depth of field is controlled in image synthesis.
  • in the third embodiment, the imaging system 1 may be read as the imaging system 3, the image processing device 100 as the image processing device 300, the alignment unit 112 as the alignment unit 312, the second image correction unit 113 as the second image correction unit 313, and the combining unit 120 as the combining unit 320.
  • an image captured by the camera A is denoted by reference numeral 10A
  • an image captured by the camera B is denoted by reference numeral 10B.
  • FIG. 12 is a block diagram for explaining the operation of the image processing apparatus according to the third embodiment of the present disclosure.
  • FIG. 13 is a diagram for explaining details of a portion related to alignment in FIG.
  • the portion composed of the first image correction unit 111 and the alignment unit 312 is represented by reference numeral 910B in consideration of the relationship with FIG. 2, FIG. 4, and FIG.
  • a parallax map DP_MAP storing the parallax amount for each pixel is also generated. Since the parallax amount is obtained after the one-dimensional alignment, the parallax map DP_MAP is a map (disparity map) in which the amount of horizontal pixel displacement is stored.
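The one-dimensional search that yields such a disparity map can be sketched as follows. A window size of one pixel and a sum-of-absolute-differences cost are illustrative simplifications; real matchers use block windows and more robust costs.

```python
# Sketch of per-pixel 1D disparity search along a rectified row: for each
# reference pixel, find the horizontal shift with the smallest absolute
# difference. Window size 1 and the toy data are illustrative assumptions.

def disparity_row(ref_row, other_row, max_disp):
    disp = []
    for x, v in enumerate(ref_row):
        best_d, best_cost = 0, float("inf")
        for d in range(0, max_disp + 1):
            if x - d < 0:
                break
            cost = abs(v - other_row[x - d])
            if cost < best_cost:
                best_d, best_cost = d, cost
        disp.append(best_d)
    return disp

ref   = [10, 20, 30, 40]
other = [20, 30, 40, 0]    # scene content shifted one pixel to the left
print(disparity_row(ref, other, max_disp=2))  # [0, 1, 1, 1]
```

Because the search runs only along the row, each pixel costs O(max_disp) comparisons rather than a full two-dimensional window.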
  • FIG. 14 is a diagram for explaining the details of the part related to the synthesis process in FIG.
  • the second image correction unit 313 generates a parallax map that is aligned with the original image 10A of the camera A. Specifically, an anti-parallelization process (step S331B) is performed on the parallax map DP_MAP. Next, reverse lens distortion correction (step S332B) is performed on the map that has undergone step S331B. As a result, the post-correction parallax map DP_MAP md1 of the camera B, obtained by performing the anti-parallelization process and the reverse lens distortion correction on the parallax map DP_MAP of the camera B, is obtained. Since this process is the same as the process of transforming the image 10B md2 of the camera B into the image 10B md3 described in the first embodiment, here applied to the parallax map DP_MAP, a description thereof will be omitted.
  • FIG. 15 is a schematic diagram for explaining image processing in the image processing apparatus according to the third embodiment.
  • the composition unit 320 performs composition processing using the original image 10A of the camera A and the corrected parallax map DP_MAP md1 of the camera B (step S341A illustrated in FIG. 14). Specifically, an image in which the depth of field of the image 10A is controlled (the image 10A_fmd shown in FIG. 14) is generated using the luminance information of the original image 10A of the camera A and the information of the corrected parallax map DP_MAP md1 of the camera B.
  • This process is a process of adjusting the focus of the original image 10A of the camera A based on the value of the parallax map. Since the parallax map DP_MAP md1 is the depth information of the image 10A, for example, when finishing in a portrait style, a process of blurring the background may be performed.
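The portrait-style finishing mentioned above can be sketched per pixel: where the disparity indicates a distant point, the pixel is blurred; where it indicates a near point, the sharp original value is kept. The 1D row, 3-tap box blur, and threshold are illustrative assumptions.

```python
# Sketch of depth-of-field control from a disparity map: small disparity
# (far background) is blurred, large disparity (near subject) stays sharp.
# Row data, blur kernel, and threshold are illustrative assumptions.

def portrait_row(row, disp_row, near_threshold):
    out = []
    for x, v in enumerate(row):
        if disp_row[x] >= near_threshold:
            out.append(v)                      # foreground: keep sharp pixel
        else:                                  # background: 3-tap box blur
            lo, hi = max(0, x - 1), min(len(row), x + 2)
            out.append(sum(row[lo:hi]) / (hi - lo))
    return out

row  = [100, 0, 100, 0]
disp = [5, 5, 0, 0]                            # right half is background
print(portrait_row(row, disp, near_threshold=3))
```

The left (near) pixels pass through unchanged while the right (far) pixels are averaged with their neighbors, producing the blurred-background effect.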
  • a disparity map can be calculated without deforming an image of a camera that is desired to avoid resolution degradation.
  • in addition, a resolution restoration process such as a cross bilateral filter may be applied using the image of the camera that avoids resolution degradation, thereby compensating for the reduced resolution of the disparity map.
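A cross (joint) bilateral filter of the kind mentioned above can be sketched as below: the disparity map is smoothed, but the weights are computed from the sharp camera image so that disparity edges follow image edges. The 1D signals, window radius, and sigma values are illustrative assumptions.

```python
# Sketch of a 1D cross (joint) bilateral filter: the disparity values are
# averaged with weights taken from spatial distance and from intensity
# differences in the sharp guide image. Parameters are illustrative.

import math

def cross_bilateral_1d(disp, guide, radius=2, sigma_s=1.0, sigma_r=10.0):
    out = []
    for x in range(len(disp)):
        num = den = 0.0
        for dx in range(-radius, radius + 1):
            xx = x + dx
            if 0 <= xx < len(disp):
                # Spatial weight times range weight from the guide image.
                w = math.exp(-dx * dx / (2 * sigma_s ** 2)) * \
                    math.exp(-(guide[x] - guide[xx]) ** 2 / (2 * sigma_r ** 2))
                num += w * disp[xx]
                den += w
        out.append(num / den)
    return out

guide = [0, 0, 0, 255, 255, 255]   # sharp edge in the camera image
disp  = [2, 2, 3, 7, 8, 8]         # noisy disparity across the same edge
print(cross_bilateral_1d(disp, guide))
```

Pixels on opposite sides of the guide edge receive near-zero weight for each other, so the disparity is denoised within each region without bleeding across the depth boundary.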
  • the fourth embodiment also relates to an image processing device, an image processing method, an imaging system, and an image processing program according to the present disclosure.
  • the fourth embodiment has a configuration in which the third embodiment is further applied to the second embodiment.
  • in the fourth embodiment, the imaging system 1 may be read as the imaging system 4, the image processing device 100 as the image processing device 400, the first image correction unit 111 as the first image correction unit 411, the alignment unit 112 as the alignment unit 412, the second image correction unit 113 as the second image correction unit 413, and the synthesis unit 120 as the synthesis unit 420.
  • Camera A and camera B constituting the imaging unit 10 are the same as those in the second embodiment. Camera A captures a monochrome image, and camera B captures a Bayer color image.
  • An image captured by the camera A is denoted by reference numeral 10A_BW, and an image captured by the camera B is denoted by reference numeral 10B_Bayer.
  • FIG. 16 is a block diagram for explaining the operation of the image processing apparatus according to the fourth embodiment of the present disclosure.
  • the portion including the first image correction unit 411 and the alignment unit 412 is represented by reference numeral 910C in consideration of the relationship with FIG. 2, FIG. 4, and FIG.
  • the image 10B_RGB md2 is generated in step S213B shown in FIG.
  • a parallax map DP_MAP is also generated.
  • the generation process of the image 10B_RGB md2 is the same as the process described in the second embodiment.
  • the process of generating the parallax map DP_MAP is the same as the process described in the third embodiment. Therefore, detailed description is omitted.
  • FIG. 17 is a diagram for explaining details of a portion related to the synthesis process in FIG. 16.
  • FIG. 18 is a schematic diagram for explaining image processing in the image processing apparatus according to the fourth embodiment.
  • the depth of field is controlled in the image composition based on the parallax map that is aligned with the reference image. Further, image composition is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction process.
  • the combining unit 420 performs combining processing using the corrected parallax map DP_MAP md1 in addition to the original image 10A_BW of the camera A and the corrected image 10B_RGB md2 of the camera B (step S441A shown in FIG. 17). Specifically, the image 10A_RGB fmd is generated by colorizing the image 10A_BW and controlling its depth of field using the luminance information of the original image 10A_BW of the camera A, the color information of the image 10B_RGB md2 , and the information of the parallax map DP_MAP md1 .
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system 7000 that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting the plurality of control units may be an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or parameters used for various calculations, and a drive circuit that drives various devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In FIG. 19, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • other control units include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • a vehicle state detection unit 7110 is connected to the drive system control unit 7100.
  • the vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the rotational movement of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, or the like.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, or the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 7200.
  • the body system control unit 7200 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310 that is a power supply source of the drive motor according to various programs. For example, information such as battery temperature, battery output voltage, or remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature adjustment of the secondary battery 7310 or the cooling device provided in the battery device.
  • the outside information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted.
  • the outside information detection unit 7400 is connected to at least one of the imaging unit 7410 and the outside information detection unit 7420.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the outside information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions, and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the imaging unit 7410 and the outside information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 20 shows an example of installation positions of the imaging unit 7410 and the vehicle outside information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirror, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 7900.
  • Imaging units 7912 and 7914 provided in the side mirror mainly acquire an image of the side of the vehicle 7900.
  • An imaging unit 7916 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 7900.
  • the imaging unit 7918 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 20 shows an example of shooting ranges of the respective imaging units 7910, 7912, 7914, and 7916.
  • the imaging range a indicates the imaging range of the imaging unit 7910 provided in the front nose
  • the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided in the side mirrors, respectively
  • the imaging range d indicates the imaging range of the imaging unit 7916 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above is obtained.
  • the vehicle outside information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners of the vehicle 7900 and the upper part of the windshield in the vehicle interior may be, for example, an ultrasonic sensor or a radar device.
  • the vehicle outside information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These outside information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.
  • the vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the vehicle exterior information detection unit 7420 connected thereto. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • the outside information detection unit 7400 may perform object detection processing or distance detection processing for a person, a car, an obstacle, a sign, characters on a road surface, or the like based on the received information.
  • the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, or the like based on the received information.
  • the vehicle outside information detection unit 7400 may calculate a distance to an object outside the vehicle based on the received information.
  • the outside information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
  • the vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may combine the image data captured by different imaging units 7410 to generate an overhead image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • the vehicle interior information detection unit 7500 detects vehicle interior information.
  • a driver state detection unit 7510 that detects the driver's state is connected to the in-vehicle information detection unit 7500.
  • Driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects biometric information of the driver, a microphone that collects sound in the passenger compartment, and the like.
  • the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of an occupant sitting on the seat or a driver holding the steering wheel.
  • the vehicle interior information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • the vehicle interior information detection unit 7500 may perform a process such as a noise canceling process on the collected audio signal.
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
  • data obtained by performing voice recognition on speech input through the microphone may be input to the integrated control unit 7600.
  • the input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an external connection device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera.
  • in that case, the passenger can input information by gesture
  • alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input.
  • the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600.
  • by operating the input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 or instructs it to perform a processing operation.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like.
  • the storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • General-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
  • the general-purpose communication I / F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX, LTE (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • the general-purpose communication I / F 7620 may be connected to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point.
  • the general-purpose communication I / F 7620 may also connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol formulated for use in vehicles.
  • for example, the dedicated communication I / F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • the dedicated communication I / F 7630 typically performs V2X communication, a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • the positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station installed on the road, and acquires information such as the current position, traffic jam, closed road, or required time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I / F 7630 described above.
  • the in-vehicle device I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I / F 7660 may establish a wireless connection using a wireless communication protocol such as a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • the in-vehicle device I / F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable if necessary).
  • the in-vehicle device 7760 may include, for example, at least one of a mobile device or a wearable device that a passenger has, or an information device that is carried into or attached to the vehicle.
  • In-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination.
  • In-vehicle device I / F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I / F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs, based on information acquired via at least one of the general-purpose communication I / F 7620, the dedicated communication I / F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I / F 7660, and the in-vehicle network I / F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
  • the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, or lane departure warning. Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, or the like based on the acquired information on the surroundings of the vehicle.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures or persons, based on information acquired via at least one of the general-purpose communication I / F 7620, the dedicated communication I / F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I / F 7660, and the in-vehicle network I / F 7680, and may create local map information including peripheral information of the current position of the vehicle.
  • the microcomputer 7610 may predict a danger such as a vehicle collision, the approach of a pedestrian or the like, or entry onto a closed road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
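  • The danger-prediction and warning behavior above can be illustrated with a minimal sketch. This is not the claimed implementation: the time-to-collision rule, the 2-second threshold, and the scalar distance and closing-speed inputs (taken, notionally, from the three-dimensional distance information) are all assumptions for illustration.

```python
# Hedged sketch of a danger-prediction rule: generate the warning signal
# (warning sound or warning lamp) when the estimated time-to-collision
# (TTC) falls below a threshold. All names and the 2.0 s default are
# illustrative assumptions, not the patent's method.

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until collision; infinite if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def should_warn(distance_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 2.0) -> bool:
    """True when the warning signal should be generated."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s
```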
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • Display unit 7720 may include at least one of an on-board display and a head-up display, for example.
  • the display portion 7720 may have an AR (Augmented Reality) display function.
  • the output device may be another device such as headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, or a lamp.
  • when the output device is a display device, it visually displays the results obtained by the various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, it converts an audio signal composed of reproduced audio data or acoustic data into an analog signal and outputs it audibly.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be configured by a plurality of control units.
  • the vehicle control system 7000 may include another control unit not shown.
  • some or all of the functions of any of the control units may be given to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
  • a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.
  • the technology according to the present disclosure can be applied to, for example, the imaging unit of the vehicle-exterior information detection unit among the configurations described above. A high-resolution image can thereby be acquired, and parallax information from a parallax map can be obtained, so that more detailed information is available.
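  • The reason a parallax map yields more detailed exterior information is that disparity converts directly to metric depth. A minimal sketch under an assumed pinhole stereo geometry; the focal length in pixels and the baseline in meters are hypothetical calibration values, not figures from the patent:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
# focal_px and baseline_m are assumed calibration parameters.

def depth_from_disparity(d_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulated depth in meters; larger disparity means a closer object."""
    if d_px <= 0.0:
        return float("inf")
    return focal_px * baseline_m / d_px
```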
  • [A1] An image processing apparatus including: an image deformation processing unit that, using one of a plurality of images obtained by imaging the same subject as a reference image, generates an image capable of one-dimensional alignment based on each of the plurality of images, and deforms the generated images so as to generate an image aligned with the reference image; and a synthesis unit that performs image synthesis using information based on the reference image and information based on the image generated by the image deformation processing unit.
  • [A2] The image deformation processing unit includes: a first image correction unit that generates an image capable of one-dimensional alignment based on each of a plurality of images obtained by imaging the same subject; an alignment unit that, for the images generated by the first image correction unit, generates images one-dimensionally aligned with the image generated based on the reference image; and a second image correction unit that generates an image aligned with the reference image by applying, to the images generated by the alignment unit other than the image generated based on the reference image, arithmetic processing that is the reverse of the arithmetic processing in the first image correction unit. The image processing apparatus according to [A1].
  • [A3] The first image correction unit performs processing including a lens distortion correction step and a parallelization processing step, and the second image correction unit performs processing including an anti-parallelization processing step and an inverse lens distortion correction step. The image processing apparatus according to [A2].
  • [A4] The lens distortion correction step includes arithmetic processing using a predetermined lens coefficient, and the inverse lens distortion correction step includes arithmetic processing using a coefficient different from the predetermined lens coefficient. The image processing apparatus according to [A3].
  • [A5] The parallelization processing step includes arithmetic processing using a predetermined matrix, and the anti-parallelization processing step includes arithmetic processing using a matrix different from the predetermined matrix. The image processing apparatus according to [A3] or [A4].
  • [A6] One of the matrix used in the parallelization processing step and the matrix used in the anti-parallelization processing step is the inverse matrix of the other. The image processing apparatus according to [A5].
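  • The correction steps of [A3] through [A6] can be sketched as follows. This is a minimal illustration, not the claimed implementation: a single radial coefficient k1 stands in for the "predetermined lens coefficient", a 3x3 matrix R for the "predetermined matrix", and the inverse radial mapping (which has no closed form) is approximated by fixed-point iteration, one way a reverse step ends up using different coefficients than the forward step.

```python
import numpy as np

# Forward steps: one-coefficient radial lens distortion and a 3x3
# parallelization (rectification) matrix R. Reverse steps use the inverse
# mapping, matching the inverse-matrix relation of [A6]. All parameter
# choices here are assumptions for illustration.

def distort(p, k1):
    """Apply a one-coefficient radial distortion to a normalized point."""
    x, y = p
    s = 1.0 + k1 * (x * x + y * y)
    return (x * s, y * s)

def undistort(p, k1, iters=10):
    """Invert the radial model by fixed-point iteration (no closed form)."""
    xd, yd = p
    x, y = xd, yd
    for _ in range(iters):
        s = 1.0 + k1 * (x * x + y * y)
        x, y = xd / s, yd / s
    return (x, y)

def rectify(p, R):
    """Parallelization: map a homogeneous point through the matrix R."""
    v = R @ np.array([p[0], p[1], 1.0])
    return (v[0] / v[2], v[1] / v[2])

def unrectify(p, R):
    """Anti-parallelization: apply the inverse matrix of R."""
    return rectify(p, np.linalg.inv(R))
```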
  • [A7] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images. The image processing apparatus according to any one of [A1] to [A6].
  • [A8] Image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction unit. The image processing apparatus according to [A7].
  • [A9] The image deformation processing unit further performs: a step of generating, based on the images generated by the alignment unit, a parallax map storing parallax information referenced to the image generated based on the reference image; and a step of generating a parallax map aligned with the reference image by applying, to the generated parallax map, arithmetic processing that is the reverse of the arithmetic processing in the first image correction unit. The image processing apparatus according to any one of [A2] to [A8].
  • [A10] The depth of field in the image synthesis is controlled based on the parallax map aligned with the reference image. The image processing apparatus according to [A9].
  • [A11] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images, and image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction unit. The image processing apparatus according to [A10].
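  • The parallax-map generation of [A9] exploits the fact that, after the first image correction, corresponding points lie on the same row, so the correspondence search is one-dimensional. A minimal block-matching sketch; the window size, disparity range, and SAD cost are assumed choices, not the patent's implementation:

```python
import numpy as np

# 1-D block matching for a rectified pair: for each pixel in `left`,
# search along the same row of `right` for the best-matching window
# (sum of absolute differences) and record the offset as disparity.

def disparity_map(left, right, max_disp=8, win=2):
    """Per-pixel disparity of `left` against `right` by 1-D SAD search."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1].astype(np.int64)
            best_cost, best_d = None, 0
            for d in range(max_disp + 1):
                cand = right[y - win:y + win + 1,
                             x - d - win:x - d + win + 1].astype(np.int64)
                cost = np.abs(patch - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```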
  • [B1] An image processing method including: image deformation processing in which one of a plurality of images obtained by imaging the same subject is used as a reference image, an image capable of one-dimensional alignment is generated based on each of the plurality of images, and the generated images are deformed so as to generate an image aligned with the reference image; and performing image synthesis using information based on the reference image and information based on the image generated by the image deformation processing.
  • [B2] The image deformation processing includes: a first image correction step of generating an image capable of one-dimensional alignment based on each of a plurality of images obtained by imaging the same subject; an alignment step of generating, for the images generated by the first image correction step, images one-dimensionally aligned with the image generated based on the reference image; and a second image correction step of generating an image aligned with the reference image by applying, to the images generated by the alignment step other than the image generated based on the reference image, arithmetic processing that is the reverse of the arithmetic processing in the first image correction step. The image processing method according to [B1].
  • [B3] The first image correction step includes a lens distortion correction process and a parallelization process, and the second image correction step includes an anti-parallelization process and an inverse lens distortion correction process. The image processing method according to [B2].
  • [B4] The lens distortion correction process includes arithmetic processing using a predetermined lens coefficient, and the inverse lens distortion correction process includes arithmetic processing using a coefficient different from the predetermined lens coefficient. The image processing method according to [B3].
  • [B5] The parallelization process includes arithmetic processing using a predetermined matrix, and the anti-parallelization process includes arithmetic processing using a matrix different from the predetermined matrix. The image processing method according to [B3] or [B4].
  • [B6] One of the matrix used in the parallelization process and the matrix used in the anti-parallelization process is the inverse matrix of the other. The image processing method according to [B5].
  • [B7] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images. The image processing method according to any one of [B1] to [B6].
  • [B8] Image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction step. The image processing method according to [B7].
  • [B9] The image deformation processing includes: generating, based on the images generated in the alignment step, a parallax map storing parallax information referenced to the image generated based on the reference image; and generating a parallax map aligned with the reference image by applying, to the generated parallax map, arithmetic processing that is the reverse of the arithmetic processing in the first image correction step. The image processing method according to any one of [B2] to [B8].
  • [B10] The depth of field in the image synthesis is controlled based on the parallax map aligned with the reference image. The image processing method according to [B9].
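  • The depth-of-field control of [B10] can be illustrated as follows: with a parallax map aligned to the reference image, pixels whose disparity differs from a chosen in-focus disparity receive a stronger blur. The variable box blur and the strength and radius parameters are assumptions for illustration, not the claimed method.

```python
import numpy as np

# Synthetic shallow depth of field driven by an aligned parallax map:
# blur radius grows with distance (in disparity) from the focus plane.

def blur_radius(disp, focus_disp, strength=1.0, max_radius=5):
    """Per-pixel blur radius grows with |disparity - focus disparity|."""
    r = np.abs(disp.astype(np.float64) - focus_disp) * strength
    return np.clip(np.rint(r), 0, max_radius).astype(np.int32)

def synthetic_dof(image, disp, focus_disp):
    """Apply a variable box blur driven by the aligned parallax map."""
    radii = blur_radius(disp, focus_disp)
    h, w = image.shape
    out = np.empty((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            r = int(radii[y, x])
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()  # box-blur window
    return out
```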
  • [B11] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images, and image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction step. The image processing method according to [B10].
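  • The monochrome-plus-color synthesis described in [B11] can be sketched as a luma swap: luminance is taken from the monochrome reference image, chrominance from the aligned color image. The BT.601-style luma weights below are an assumed choice; the patent does not specify this particular formula.

```python
import numpy as np

# Fusion sketch: keep the color image's chrominance offsets, but
# substitute the monochrome image's luminance. Weights are assumptions.

def fuse_mono_color(mono, color_rgb):
    """Replace the color image's own luma with the monochrome luminance."""
    rgb = color_rgb.astype(np.float64)
    # Luma of the color image (BT.601-style weights, an assumed choice)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Chrominance = color minus its own luma; add the mono luminance back
    out = rgb - luma[..., None] + mono.astype(np.float64)[..., None]
    return np.clip(out, 0.0, 255.0)
```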
  • [C1] An imaging system including: an imaging unit that captures a plurality of images of the same subject; an image deformation processing unit that, using one of the plurality of images as a reference image, generates an image capable of one-dimensional alignment based on each of the plurality of images, and deforms the generated images so as to generate an image aligned with the reference image; and a synthesis unit that performs image synthesis using information based on the reference image and information based on the image generated by the image deformation processing unit.
  • [C2] The image deformation processing unit includes: a first image correction unit that generates an image capable of one-dimensional alignment based on each of a plurality of images obtained by imaging the same subject; an alignment unit that, for the images generated by the first image correction unit, generates images one-dimensionally aligned with the image generated based on the reference image; and a second image correction unit that generates an image aligned with the reference image by applying, to the images generated by the alignment unit other than the image generated based on the reference image, arithmetic processing that is the reverse of the arithmetic processing in the first image correction unit. The imaging system according to [C1].
  • [C3] The first image correction unit performs processing including a lens distortion correction step and a parallelization processing step, and the second image correction unit performs processing including an anti-parallelization processing step and an inverse lens distortion correction step. The imaging system according to [C2].
  • [C4] The lens distortion correction step includes arithmetic processing using a predetermined lens coefficient, and the inverse lens distortion correction step includes arithmetic processing using a coefficient different from the predetermined lens coefficient. The imaging system according to [C3].
  • [C5] The parallelization processing step includes arithmetic processing using a predetermined matrix, and the anti-parallelization processing step includes arithmetic processing using a matrix different from the predetermined matrix. The imaging system according to [C3] or [C4].
  • [C6] One of the matrix used in the parallelization processing step and the matrix used in the anti-parallelization processing step is the inverse matrix of the other. The imaging system according to [C5].
  • [C7] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images. The imaging system according to any one of [C1] to [C6].
  • [C8] Image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction unit. The imaging system according to [C7].
  • [C9] The image deformation processing unit further performs: a step of generating, based on the images generated by the alignment unit, a parallax map storing parallax information referenced to the image generated based on the reference image; and a step of generating a parallax map aligned with the reference image by applying, to the generated parallax map, arithmetic processing that is the reverse of the arithmetic processing in the first image correction unit. The imaging system according to any one of [C2] to [C8].
  • [C10] The depth of field in the image synthesis is controlled based on the parallax map aligned with the reference image. The imaging system according to [C9].
  • [C11] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images, and image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction unit. The imaging system according to [C10].
  • [D1] An image processing program for causing a computer to execute processing including: an image deformation processing step of using one of a plurality of images obtained by imaging the same subject as a reference image, generating an image capable of one-dimensional alignment based on each of the plurality of images, and deforming the generated images so as to generate an image aligned with the reference image; and performing image synthesis using information based on the reference image and information based on the image generated by the image deformation processing step.
  • [D2] The image deformation processing step includes: a first image correction step of generating an image capable of one-dimensional alignment based on each of a plurality of images obtained by imaging the same subject; an alignment step of generating, for the images generated by the first image correction step, images one-dimensionally aligned with the image generated based on the reference image; and a second image correction step of generating an image aligned with the reference image by applying, to the images generated by the alignment step other than the image generated based on the reference image, arithmetic processing that is the reverse of the arithmetic processing in the first image correction step. The image processing program according to [D1].
  • [D3] The first image correction step includes a lens distortion correction step and a parallelization processing step, and the second image correction step includes an anti-parallelization processing step and an inverse lens distortion correction step. The image processing program according to [D2].
  • [D4] The lens distortion correction step includes arithmetic processing using a predetermined lens coefficient, and the inverse lens distortion correction step includes arithmetic processing using a coefficient different from the predetermined lens coefficient. The image processing program according to [D3].
  • [D5] The parallelization processing step includes arithmetic processing using a predetermined matrix, and the anti-parallelization processing step includes arithmetic processing using a matrix different from the predetermined matrix. The image processing program according to [D3] or [D4].
  • [D6] One of the matrix used in the parallelization processing step and the matrix used in the anti-parallelization processing step is the inverse matrix of the other. The image processing program according to [D5].
  • [D7] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images. The image processing program according to any one of [D1] to [D6].
  • [D8] Image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction step. The image processing program according to [D7].
  • [D9] The image deformation processing step includes: generating, based on the images generated in the alignment step, a parallax map storing parallax information referenced to the image generated based on the reference image; and generating a parallax map aligned with the reference image by applying, to the generated parallax map, arithmetic processing that is the reverse of the arithmetic processing in the first image correction step. The image processing program according to any one of [D2] to [D8].
  • [D10] The depth of field in the image synthesis is controlled based on the parallax map aligned with the reference image. The image processing program according to [D9].
  • [D11] Among the plurality of images of the same subject, the reference image is a monochrome image and the other images are color images, and image synthesis is performed based on the luminance information of the reference image and the color information of the image generated by the second image correction step. The image processing program according to [D10].

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

According to the invention, an image processing device comprises: an image deformation processing unit which, taking one of a plurality of images obtained by capturing images of the same subject as a reference image, generates an image that can be aligned one-dimensionally on the basis of each of the plurality of images, and deforms the generated image so as to generate an image aligned with the reference image; and a synthesis unit which performs image synthesis using information based on the reference image and information based on the image generated by the image deformation processing unit.
PCT/JP2017/032496 2016-11-08 2017-09-08 Dispositif de traitement d'images, procédé de traitement d'images, système d'imagerie, et programme de traitement d'images Ceased WO2018088016A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-217846 2016-11-08
JP2016217846A JP2018078404A (ja) 2016-11-08 2016-11-08 画像処理装置、画像処理方法、撮像システム、及び、画像処理プログラム

Publications (1)

Publication Number Publication Date
WO2018088016A1 true WO2018088016A1 (fr) 2018-05-17

Family

ID=62109643

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/032496 Ceased WO2018088016A1 (fr) 2016-11-08 2017-09-08 Dispositif de traitement d'images, procédé de traitement d'images, système d'imagerie, et programme de traitement d'images

Country Status (2)

Country Link
JP (1) JP2018078404A (fr)
WO (1) WO2018088016A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10929982B2 (en) * 2019-01-25 2021-02-23 Google Llc Face pose correction based on depth information
US12309500B2 (en) * 2019-12-13 2025-05-20 Sony Group Corporation Trans-spectral feature detection for volumetric image alignment and colorization

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004072677A (ja) * 2002-08-09 2004-03-04 Sharp Corp 画像合成装置、画像合成方法、画像合成プログラム、および画像合成プログラムを記録した記録媒体
JP2013093836A (ja) * 2011-10-03 2013-05-16 Canon Inc 撮像装置、画像処理装置およびその方法
JP2016122947A (ja) * 2014-12-25 2016-07-07 キヤノン株式会社 画像処理装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004072677A (ja) * 2002-08-09 2004-03-04 Sharp Corp 画像合成装置、画像合成方法、画像合成プログラム、および画像合成プログラムを記録した記録媒体
JP2013093836A (ja) * 2011-10-03 2013-05-16 Canon Inc 撮像装置、画像処理装置およびその方法
JP2016122947A (ja) * 2014-12-25 2016-07-07 キヤノン株式会社 画像処理装置

Also Published As

Publication number Publication date
JP2018078404A (ja) 2018-05-17

Similar Documents

Publication Publication Date Title
US10957029B2 (en) Image processing device and image processing method
JP6977821B2 (ja) 画像処理装置と画像処理方法
US10704957B2 (en) Imaging device and imaging method
WO2018163725A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, et programme
TW202038088A (zh) 感測器裝置、電子機器、感測器系統及控制方法
WO2018074252A1 (fr) Dispositif et procédé de traitement d'image
JP6977722B2 (ja) 撮像装置、および画像処理システム
WO2018179671A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images et dispositif de capture d'images
JP6981416B2 (ja) 画像処理装置と画像処理方法
JP7038050B2 (ja) 固体撮像装置、補正方法、および電子装置
WO2018150683A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et appareil d'imagerie
WO2018016151A1 (fr) Dispositif de traitement d'images et procédé de traitement d'images
WO2018016150A1 (fr) Dispositif de traitement d'images et procédé de traitement d'images
WO2019142660A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images, et programme
WO2018088016A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images, système d'imagerie, et programme de traitement d'images
US20230412923A1 (en) Signal processing device, imaging device, and signal processing method
WO2018034157A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images et dispositif de capture d'images
WO2024150690A1 (fr) Dispositif d'imagerie à semi-conducteurs
WO2019111651A1 (fr) Système d'imagerie, dispositif de traitement d'image, et procédé de traitement d'image
WO2024237021A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2022181265A1 (fr) Dispositif de traitement d'images, procédé de traitement d'images et système de traitement d'images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17870451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17870451

Country of ref document: EP

Kind code of ref document: A1