WO2016203989A1 - Image processing device and image processing method
- Publication number
- WO2016203989A1 (PCT/JP2016/066569)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image
- difference
- yaw angle
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
- G01B11/27—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
Definitions
- The present disclosure relates to an image processing device and an image processing method, and more particularly, to an image processing device and an image processing method capable of accurately estimating a difference in yaw angle between cameras caused by a secular shift of a stereo camera.
- As a calibration method, there is, for example, a method of imaging a chart pattern, on which a plurality of feature points with a known positional relationship are printed, a plurality of times while changing the viewpoint (see, for example, Non-Patent Document 1).
- With the technique of Non-Patent Document 2, it is relatively easy to estimate the difference in pitch angle between the cameras, which causes the vertical positional shift that is most critical for stereo matching in the horizontal direction.
- However, the difference in yaw angle between the cameras, which is the main factor causing horizontal displacement of corresponding points in the cameras' images, does not greatly affect the vertical positional displacement. The influence of noise contained in each camera's image therefore becomes relatively large, and it is difficult to estimate the yaw angle difference with high accuracy from the vertical positional deviation.
- The present disclosure has been made in view of such a situation, and makes it possible to accurately estimate the difference in yaw angle between cameras caused by the secular shift of a stereo camera.
- An image processing apparatus according to one aspect of the present disclosure includes a yaw angle estimation unit that estimates the difference in yaw angle between a first imaging unit and a second imaging unit based on a measured parallax value of a known region, calculated from the position of the known region, whose ideal parallax value is known, in a first image captured by the first imaging unit and the position of the region corresponding to the known region in a second image captured by the second imaging unit, and on the ideal value of the parallax.
- the image processing method according to one aspect of the present disclosure corresponds to the image processing apparatus according to one aspect of the present disclosure.
- In the image processing method according to one aspect of the present disclosure, the difference in yaw angle between the first imaging unit and the second imaging unit is estimated based on a measured parallax value of a known region, calculated from the position of the known region, whose ideal parallax value is known, in the first image captured by the first imaging unit and the position of the region corresponding to the known region in the second image captured by the second imaging unit, and on the ideal value of the parallax.
- the image processing apparatus can be realized by causing a computer to execute a program.
- a program to be executed by a computer can be provided by being transmitted via a transmission medium or by being recorded on a recording medium.
- According to one aspect of the present disclosure, it is possible to perform processing on an image. Further, according to one aspect of the present disclosure, it is possible to estimate, with high accuracy, the difference in yaw angle between cameras caused by the secular shift of the stereo camera.
- The figures include a flowchart describing the image processing of the imaging apparatus of FIG. 1, a flowchart explaining the details of the φ estimation process, a flowchart explaining the details of the estimation parameter determination process, and a figure explaining the secular shift parameters.
- They further include a block diagram showing a hardware configuration example of a computer, a block diagram showing an example of a schematic configuration of a vehicle control system, and an explanatory drawing showing an example of the installation positions of the vehicle exterior information detection units and the imaging units.
- FIG. 1 is a block diagram illustrating a configuration example of a first embodiment of an imaging apparatus to which the present disclosure is applied.
- The imaging apparatus 10 of FIG. 1 includes an imaging module unit 11, a warping unit 12, and a secular shift estimation unit 13, and captures images from two viewpoints.
- the imaging module unit 11 of the imaging apparatus 10 includes a stereo camera 21 and an initial parameter storage unit 22.
- the stereo camera 21 includes a left camera 21A (second imaging unit) disposed on the left side of the subject and a right camera 21B (first imaging unit) disposed on the right side.
- the left camera 21A and the right camera 21B capture the same subject from different viewpoints and obtain captured images.
- a captured image captured by the left camera 21A (hereinafter referred to as a left image) and a captured image captured by the right camera 21B (hereinafter referred to as a right image) are supplied to the warping unit 12.
- the initial parameter storage unit 22 stores the parameters of the stereo camera 21 acquired by calibration as initial parameters.
- the initial parameters are composed of internal parameters representing the lens distortion shapes of the left camera 21A and the right camera 21B, and external parameters representing the geometric positional relationship between the left camera 21A and the right camera 21B.
- the calibration method and initial parameters are not particularly limited, but can be those described in Non-Patent Document 1, for example.
- the warping unit 12 includes a left warping unit 31, a right warping unit 32, and a generation unit 33.
- The left warping unit 31 performs rectification on the left image. Specifically, the left warping unit 31 sequentially sets each pixel of the left image as the target pixel, and warps the target pixel by sampling, from the left image supplied from the left camera 21A, the pixel at the pre-warping coordinates of the target pixel supplied from the generation unit 33. The left warping unit 31 outputs the image obtained by warping all pixels of the left image to the secular shift estimation unit 13 as the left image after rectification.
- Similarly, the right warping unit 32 performs rectification on the right image supplied from the right camera 21B based on the pre-warping coordinates of the target pixel of the right image supplied from the generation unit 33, and outputs the right image after rectification to the secular shift estimation unit 13.
- The generation unit 33 reads the initial parameters from the initial parameter storage unit 22. In addition, the generation unit 33 reads, from the secular shift estimation unit 13, the secular shift parameters used in a model expression (hereinafter, the secular shift model expression) that represents the deviation between the left image and the right image caused by the secular shift of the stereo camera 21.
- The secular shift parameters are the dominant factors in the deviation between the left and right images caused by the secular shift: the pitch angle difference between the left camera 21A and the right camera 21B, the yaw angle difference, the roll angle difference, and the scale ratio between the left image and the right image.
- In the following, when it is not necessary to distinguish the pitch angle difference, the yaw angle difference, the roll angle difference, and the scale ratio from one another, they are simply referred to as parameters.
- The generation unit 33 calculates the pre-warping coordinates of the target pixel of the left image and of the target pixel of the right image based on the initial parameters and the secular shift parameters. The generation unit 33 then supplies the pre-warping coordinates of the target pixel of the left image to the left warping unit 31 and those of the target pixel of the right image to the right warping unit 32. As a result, the horizontal and vertical shifts between the left image and the right image caused by lens distortion of the left camera 21A and the right camera 21B and by the geometric positional shift between the two cameras are corrected (rectification).
- The secular shift estimation unit 13 includes a known area detection unit 41, a left/right pair detection unit 42, a yaw angle estimation unit 43, a non-yaw-angle estimation unit 44, and a secular shift parameter storage unit 45.
- The known area detection unit 41 detects, from the right image after rectification supplied from the right warping unit 32, a region whose ideal parallax value is known (including a region at infinity) as a known region.
- For the detection in the known area detection unit 41, a general object recognition method can be used in which, for example, feature points are extracted from a model image registered in advance and from the input image, and an object is recognized by matching the feature amounts of those feature points.
- the general object recognition technique is described in, for example, Japanese Patent Application Laid-Open No. 2004-326693.
- the known area detection unit 41 supplies the coordinates of the known area detected from the right image after rectification and the ideal parallax value of the known area to the left and right pair detection unit 42.
- Note that the known area detection unit 41 may instead detect the known region from the left image after rectification supplied from the left warping unit 31 and supply its coordinates and ideal parallax value to the left/right pair detection unit 42.
- Based on the coordinates of the known region supplied from the known area detection unit 41, the left/right pair detection unit 42 performs block matching between the right image after rectification from the right warping unit 32 and the left image after rectification from the left warping unit 31. The left/right pair detection unit 42 thereby detects the coordinates of the region of the left image after rectification that corresponds to the known region of the right image after rectification.
- the accuracy of the detected coordinates is, for example, the same subpixel accuracy as that of stereo matching performed using the left image and the right image.
- the left-right pair detection unit 42 generates a pair of coordinates of the known area of the right image after rectification and the coordinates of the area of the left image after rectification corresponding to the known area as left-right pair coordinates.
- the left / right pair detection unit 42 supplies the left / right pair coordinates of the known area and the ideal parallax value to the yaw angle estimation unit 43.
- the yaw angle estimator 43 obtains the difference between the horizontal coordinates of the left and right pair coordinates of the known area supplied from the left and right pair detector 42 as an actual value of parallax.
- Based on the ideal parallax value and the measured parallax value supplied from the left/right pair detection unit 42, the yaw angle estimation unit 43 estimates the difference in yaw angle between the left camera 21A and the right camera 21B caused by the secular shift of the stereo camera 21.
- Let b [mm] be the baseline length between the left camera 21A and the right camera 21B, and let f [pixel] be the focal length in pixel units. The ideal parallax value d_ideal [pixel] corresponding to the ideal depth value z [mm] (the distance in the depth direction) of the known region is then expressed by formula (1) below.
- The amount of parallax (disparity) error near the center of the image caused by a small yaw angle difference φ [rad] between the left camera 21A and the right camera 21B due to the secular shift of the stereo camera 21, that is, the difference between the measured parallax value d_real [pixel] and the ideal value d_ideal [pixel], can be approximated by equation (2) below.
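The bodies of formulas (1) and (2) do not appear in this text extract. As a sketch consistent with the definitions above (baseline b [mm], focal length f [pixel], depth z [mm], small yaw angle difference φ [rad]), standard pinhole stereo geometry gives:

```latex
d_{\mathrm{ideal}} = \frac{b\,f}{z} \qquad \text{(1)}

d_{\mathrm{real}} - d_{\mathrm{ideal}} \approx f\,\varphi \qquad \text{(2)}
```

Solving the approximation (2) for φ is what lets the yaw angle estimation unit 43 recover the yaw angle difference from a known region near the image center.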
- When the known region exists near the center of the image, the yaw angle estimation unit 43 therefore estimates the yaw angle difference φ based on the measured parallax value d_real [pixel] and the ideal value d_ideal [pixel].
- Specifically, the yaw angle estimation unit 43 estimates the difference φ using an estimation formula for the difference φ derived from this model.
- The yaw angle estimation unit 43 supplies the estimated yaw angle difference φ to the non-yaw-angle estimation unit 44. Further, the yaw angle estimation unit 43 updates the stored yaw angle difference φ by supplying the estimated value to the secular shift parameter storage unit 45.
- The non-yaw-angle estimation unit 44 estimates, as estimation parameters, the parameters other than the difference φ, based on the left image from the left warping unit 31, the right image from the right warping unit 32, and the difference φ from the yaw angle estimation unit 43.
- The non-yaw-angle estimation unit 44 updates the stored estimation parameters by supplying the estimated values to the secular shift parameter storage unit 45.
- The secular shift parameter storage unit 45 stores the yaw angle difference φ supplied from the yaw angle estimation unit 43 and the estimation parameters supplied from the non-yaw-angle estimation unit 44.
- FIG. 2 is a perspective view showing an external configuration example of the stereo camera 21 of FIG.
- the left camera 21A and the right camera 21B of the stereo camera 21 are arranged side by side in the horizontal direction (X direction).
- the base line length (baseline) b between the left camera 21A and the right camera 21B of the stereo camera 21 is 80 [mm].
- FIG. 3 is a diagram for explaining the secular shift parameters.
- The pitch angle difference [rad], the yaw angle difference φ [rad], the roll angle difference [rad], and the scale ratio [times] that constitute the secular shift parameters are expressed by equation (4) below.
- In equation (4), the pitch angles of the left camera 21A and the right camera 21B are their respective rotation angles around the X axis, which is the horizontal axis.
- The yaw angles of the left camera 21A and the right camera 21B are their respective rotation angles around the Y axis, which is the vertical axis.
- The roll angles of the left camera 21A and the right camera 21B are their respective rotation angles around the Z axis, which is the axis in the optical axis direction.
- The scale ratio is defined from the horizontal sizes of the left image 61 and the right image 62; the vertical sizes may be used instead.
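Equation (4) itself is likewise not reproduced here. Assuming that each difference is defined as the right-camera value minus the left-camera value and the scale ratio as the ratio of the image sizes, and using θ, φ, α, and λ as assumed symbols for the pitch angle difference, the yaw angle difference, the roll angle difference, and the scale ratio (only φ is named in the text), equation (4) would take a form such as:

```latex
\theta = \theta_R - \theta_L, \qquad
\varphi = \varphi_R - \varphi_L, \qquad
\alpha = \alpha_R - \alpha_L, \qquad
\lambda = \frac{\sigma_R}{\sigma_L} \qquad \text{(4)}
```

where θ_L, θ_R, φ_L, φ_R, α_L, α_R are the per-camera pitch, yaw, and roll angles and σ_L, σ_R are the image sizes just described.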
- The pitch angle difference [rad], the yaw angle difference φ [rad], and the roll angle difference [rad] are caused by shifts in the viewing directions of the left camera 21A and the right camera 21B.
- The scale ratio [times] is caused by a shift in focal length between the left camera 21A and the right camera 21B.
- FIG. 4 is a diagram for explaining a method of calculating coordinates before warping by the generation unit 33 in FIG. 1.
- The imaging apparatus 10 employs equation (5) as the secular shift model expression.
- Here, the coordinate system has the center of the image at (0, 0), and the left camera 21A and the right camera 21B are assumed to be pinhole cameras with a focal length f of 1.0.
- When a horizontal shift amount ΔX and a vertical shift amount ΔY occur with respect to the coordinates (X, Y), the resulting coordinates (X_L_real, Y_L_real) in the left image and (X_R_real, Y_R_real) in the right image are represented by formula (6) below.
- The generation unit 33 first calculates, from the coordinates (X, Y) of the target pixel of the left image and the right image, the coordinates (X_L_real, Y_L_real) in the left image and (X_R_real, Y_R_real) in the right image after the positions have shifted due to the secular shift of the stereo camera 21.
- Specifically, the generation unit 33 calculates the horizontal shift amount ΔX and the vertical shift amount ΔY for the target pixel by equation (5) above. At this time, any secular shift parameter other than the difference φ that is not held in the secular shift parameter storage unit 45 is set to 0. The generation unit 33 then calculates the coordinates (X_L_real, Y_L_real) and (X_R_real, Y_R_real) from the shift amounts ΔX and ΔY according to equation (6) above.
- Finally, for each of the coordinates (X_L_real, Y_L_real) and (X_R_real, Y_R_real), the generation unit 33 calculates the pre-warping coordinates (X″_L, Y″_L) and (X″_R, Y″_R) based on the initial parameters, using the method described in Non-Patent Document 1 or the like.
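As an illustration of this two-stage computation by the generation unit 33, the sketch below chains a secular shift model step (equation (5)) with an initial-parameter rectification step. The callables `secular_shift` and `rectify_map` are hypothetical stand-ins, and splitting the shift evenly between the two images in place of the unreproduced equation (6) is an assumption.

```python
def prewarp_coordinates(X, Y, shift_params, initial_params,
                        secular_shift, rectify_map):
    """Sketch of generation unit 33 (hypothetical helpers, assumed conventions).

    secular_shift(X, Y, params) -> (dX, dY): the secular shift model, equation (5).
    rectify_map(x, y, initial_params, side) -> (x2, y2): undistortion /
        rectification mapping derived from the initial parameters
        (e.g. by the method of Non-Patent Document 1).
    """
    dX, dY = secular_shift(X, Y, shift_params)
    # Equation (6) is not reproduced in the text; as an assumption, the shift is
    # split evenly between the left image and the right image.
    XL_real, YL_real = X + dX / 2.0, Y + dY / 2.0
    XR_real, YR_real = X - dX / 2.0, Y - dY / 2.0
    # Map through the calibration-time (initial parameter) rectification.
    left_prewarp = rectify_map(XL_real, YL_real, initial_params, side="left")
    right_prewarp = rectify_map(XR_real, YR_real, initial_params, side="right")
    return left_prewarp, right_prewarp
```

The warping units 31 and 32 would then sample the captured images at the returned coordinates to produce the rectified left and right images.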
- FIG. 5 is a diagram illustrating a first example of a known area detection method.
- The known area detection unit 41 stores in advance, as model images of known regions whose ideal parallax value is zero, images of artificial or natural objects that exist at infinity, that is, that can be regarded as having zero parallax (for example, a celestial body such as the sun, a cloud, or a mountain).
- When the right image 70 of FIG. 5 is input from the right warping unit 32, the known area detection unit 41 performs general object recognition on the right image 70 based on the model images. The known area detection unit 41 thereby detects the sun region 71, the cloud region 72, and the mountain region 73 in the right image 70 as known regions, and supplies the coordinates of the detected regions 71 to 73 together with the ideal parallax value of 0 to the left/right pair detection unit 42.
- FIG. 6 is a diagram illustrating a second example of a known area detection method.
- the known region detection unit 41 stores, for example, a planar image having a known size as a model image in advance. Then, when the right image 80 in FIG. 6 is input from the right warping unit 32, the known area detection unit 41 performs general object recognition on the right image 80 based on the model image.
- the known area detecting unit 41 detects the area 81 of the image similar to the model image in the right image 80 as the known area.
- the known area detection unit 41 calculates the depth of the area 81 as an ideal value of the depth based on the size of the area 81 and the size of the plane corresponding to the area 81.
- the known area detection unit 41 supplies the coordinates of the detected area 81 and the ideal parallax value corresponding to the ideal depth value to the left and right pair detection unit 42.
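A minimal sketch of the depth and ideal-parallax computation for a known-size planar object follows, assuming the pinhole relation above; the function name and the example numbers are illustrative only.

```python
def ideal_parallax_from_known_size(width_px, real_width_mm, f_px, baseline_mm):
    """Depth of a plane of known physical size from its apparent width in the
    image, then the ideal parallax d_ideal = b * f / z (illustrative sketch)."""
    z_mm = f_px * real_width_mm / width_px        # pinhole projection
    d_ideal_px = baseline_mm * f_px / z_mm        # formula (1)
    return z_mm, d_ideal_px

# Example: an A4 sheet (210 mm wide) imaged 100 px wide with f = 1000 px and
# b = 80 mm gives z = 2100 mm and d_ideal ≈ 38.1 px.
```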
- FIG. 7 is a diagram illustrating a third example of a known area detection method.
- When vanishing point detection (perspective analysis) is adopted as the known area detection method and the right image 91 of FIG. 7 is input from the right warping unit 32, the known area detection unit 41 first performs edge detection processing on the right image 91 to generate an edge image 92.
- the known area detection unit 41 extends all the edges in the edge image 92 and obtains an intersection of the edges for each combination of two edges.
- The known area detection unit 41 detects the most frequently obtained intersection 92A as the vanishing point of the right image 91, which can be regarded as existing at infinity. Details of vanishing point detection are described in, for example, S. T. Barnard, "Interpreting perspective images", Artificial Intelligence, Vol. 21, pp. 435-462, 1983.
- the known area detection unit 41 supplies the coordinates of the detected intersection 92A as the coordinates of the known area to the left and right pair detection unit 42, and supplies 0, which is an ideal value of the parallax of the intersection 92A, to the left and right pair detection unit 42.
- FIG. 8 is a diagram illustrating an example of left and right pair coordinates.
- As shown in FIG. 8, the left/right pair detection unit 42 performs block matching between the known region 101A, whose center coordinates in the right image 101 are (X_R, Y_R), and the left image 102. The left/right pair detection unit 42 then uses, as the left/right pair coordinates, the pair consisting of the center coordinates (X_L, Y_L) of the region 102A in the left image 102 having the highest correlation with the known region 101A and the center coordinates (X_R, Y_R) of the known region 101A.
- Here the left/right pair coordinates are a pair of center coordinates, but they need not be; any pair of coordinates at a predetermined position within the regions may be used.
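The left/right pair detection could, for example, be implemented as block matching with parabola-based sub-pixel refinement along the horizontal (epipolar) direction of the rectified pair. The sketch below uses a SAD cost; the window size, search range, and search direction are assumptions not stated in the text.

```python
import numpy as np

def match_left_right(right_img, left_img, xr, yr, block=8, search=64):
    """Find the sub-pixel x-coordinate in the left image that matches the block
    centred at (xr, yr) in the right image (sketch; assumes the block and the
    search range stay inside both images)."""
    h = block // 2
    template = right_img[yr - h:yr + h + 1, xr - h:xr + h + 1].astype(np.float64)
    costs = []
    for x in range(xr, xr + search):              # corresponding point lies at larger x
        patch = left_img[yr - h:yr + h + 1, x - h:x + h + 1].astype(np.float64)
        costs.append(np.abs(template - patch).sum())          # SAD cost
    costs = np.array(costs)
    i = int(np.argmin(costs))
    # Parabola fit over the three costs around the minimum for sub-pixel accuracy.
    if 0 < i < len(costs) - 1:
        c0, c1, c2 = costs[i - 1], costs[i], costs[i + 1]
        denom = c0 - 2.0 * c1 + c2
        offset = 0.5 * (c0 - c2) / denom if denom != 0 else 0.0
    else:
        offset = 0.0
    xl = xr + i + offset
    return (xl, float(yr)), (float(xr), float(yr))   # left / right pair coordinates
```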
- FIG. 9 is a block diagram illustrating a configuration example of the non-yaw-angle estimation unit 44 of FIG. 1.
- The non-yaw-angle estimation unit 44 of FIG. 9 includes a feature point detection unit 111, a left/right pair detection unit 112, a left/right pair buffer 113, a distribution analysis unit 114, an estimation unit 115, a determination unit 116, and an update unit 117.
- The feature point detection unit 111 detects, as feature points, corners in the picture that make it easy to associate the right image with the left image, from the right image after rectification supplied from the right warping unit 32.
- Harris corner detection can be used as the detection in the feature point detection unit 111. Harris corner detection is described, for example, in C. Harris, M.J. Stephens, "A combined corner and edge detector", In Alvey Vision Conference, pp. 147-152, 1988.
- The feature point detection unit 111 supplies the coordinates of each detected feature point to the left/right pair detection unit 112. Note that the feature point detection unit 111 may instead detect feature points from the left image after rectification supplied from the left warping unit 31 and supply their coordinates to the left/right pair detection unit 112.
- Based on the coordinates of each feature point supplied from the feature point detection unit 111, the left/right pair detection unit 112 performs block matching between the right image after rectification from the right warping unit 32 and the left image after rectification from the left warping unit 31. The left/right pair detection unit 112 thereby detects the coordinates of the point of the left image after rectification that corresponds to each feature point of the right image after rectification.
- the accuracy of the detected coordinates is, for example, the same subpixel accuracy as that of stereo matching performed using the left image and the right image.
- The left/right pair detection unit 112 takes, as the left/right pair coordinates of each feature point, the pair consisting of the coordinates of the feature point in the right image after rectification and the coordinates of the corresponding point in the left image after rectification, and supplies them to the left/right pair buffer 113.
- the left / right pair buffer 113 holds the left / right pair coordinates of each feature point supplied from the left / right pair detection unit 112.
- the distribution analysis unit 114 reads the left and right pair coordinates of each feature point from the left and right pair buffer 113.
- the distribution analysis unit 114 calculates the coordinates of the midpoint of the left and right pair coordinates of each feature point based on the left and right pair coordinates of each feature point.
- Based on the distribution of these midpoint coordinates, the distribution analysis unit 114 determines (selects), as estimation parameters, the parameters to be estimated from among the secular shift parameters other than the difference φ, and supplies them to the estimation unit 115.
- the estimation unit 115 reads the left and right pair coordinates of each feature point from the left and right pair buffer 113.
- The estimation unit 115 substitutes the difference φ supplied from the yaw angle estimation unit 43 into the vertical secular shift model expression.
- The estimation unit 115 then obtains an estimation formula for the vertical deviation by deleting, from the model expression into which the difference φ has been substituted, the terms of the parameters other than the estimation parameters supplied from the distribution analysis unit 114 and the difference φ.
- The estimation unit 115 estimates the estimation parameters used in the estimation formula so that the difference between the measured vertical deviation, that is, the vertical difference of the left/right pair coordinates of each feature point, and the vertical deviation estimated by the estimation formula is minimized.
- The estimation unit 115 supplies, to the determination unit 116, the estimation formula with the estimated parameters and the difference φ substituted in, and supplies the estimated parameters to the update unit 117.
- The determination unit 116 reads the left/right pair coordinates of each feature point from the left/right pair buffer 113. Based on those coordinates and the estimation formula supplied from the estimation unit 115, the determination unit 116 calculates a statistic of the residual between the measured vertical deviation of each feature point's left/right pair coordinates and the vertical deviation estimated by the estimation formula. The determination unit 116 performs verification based on the calculated statistic to determine whether the estimation parameters are valid, and supplies the verification result to the update unit 117.
- When the verification result indicates that the estimation parameters are valid, the update unit 117 updates the stored estimation parameters by supplying the values estimated by the estimation unit 115 to the secular shift parameter storage unit 45.
- Note that the non-yaw-angle estimation unit 44 may omit the determination unit 116 and update the estimation parameters whenever they are estimated. The non-yaw-angle estimation unit 44 may also estimate the secular shift parameters other than the difference φ by the technique described in Non-Patent Document 2.
- FIG. 10 illustrates the method by which the distribution analysis unit 114 of FIG. 9 determines the estimation parameters.
- As described above, the imaging apparatus 10 employs equation (5) as the model expression. The estimation unit 115 therefore basically uses, as the estimation formula, the part of equation (5) that defines the shift amount ΔY, which is the vertical secular shift model expression, and estimates the secular shift parameters other than the difference φ used in that formula so that the difference between the shift amount ΔY estimated by the formula and the measured value of ΔY is minimized.
- As the measured value of the shift amount ΔY, the estimation unit 115 uses the vertical difference between the left image coordinates (X_L, Y_L) and the right image coordinates (X_R, Y_R) that constitute the left/right pair coordinates of each feature point. The estimation unit 115 also obtains the coordinates (X_M, Y_M) of the midpoint between (X_L, Y_L) and (X_R, Y_R) by formula (7) below.
- The estimation unit 115 defines, as an evaluation function, the sum E of the squared errors between the shift amount ΔY estimated by the estimation formula at the midpoint coordinates (X_M, Y_M) and the measured value of the shift amount ΔY, as in equation (8) below.
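Formulas (7) and (8) are not reproduced in this extract; consistent with the description above (and with an assumed sign convention for the measured vertical deviation), they would read:

```latex
(X_M, Y_M) = \left(\frac{X_L + X_R}{2},\ \frac{Y_L + Y_R}{2}\right) \qquad \text{(7)}

E = \sum_i \Bigl(\Delta Y(X_{M,i}, Y_{M,i}) - \bigl(Y_{L,i} - Y_{R,i}\bigr)\Bigr)^{2} \qquad \text{(8)}
```

where the sum runs over the feature points held in the left/right pair buffer 113.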
- The estimation unit 115 estimates the secular shift parameters other than the difference φ so that the evaluation function E is minimized, using a common nonlinear minimization method such as the Levenberg-Marquardt method.
- In the formula that defines the shift amount ΔY, the coefficient of the pitch angle difference is -(Y_M^2 + 1), the coefficient of the yaw angle difference φ is X_M·Y_M, the coefficient of the roll angle difference is -X_M, and the coefficient of the scale ratio term is Y_M.
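As a sketch of how the estimation unit 115 might minimize the evaluation function E with a Levenberg-Marquardt solver, the code below uses the ΔY coefficients just described. The coordinate normalization (image centre at (0, 0), focal length 1), the sign of the measured deviation, and the treatment of the scale term as a deviation of the scale ratio from 1 are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def estimate_non_yaw_params(pairs, phi, estimate_mask):
    """Sketch of estimation unit 115.

    pairs: array of left/right pair coordinates [[XL, YL, XR, YR], ...] in the
           normalized coordinate system (image centre at (0, 0)).
    phi:   yaw angle difference already estimated by yaw angle estimation unit 43.
    estimate_mask: dict saying which of 'pitch', 'roll', 'scale' to estimate
                   (as decided by distribution analysis unit 114); the rest stay 0.
    """
    pairs = np.asarray(pairs, dtype=np.float64)
    XL, YL, XR, YR = pairs.T
    XM, YM = (XL + XR) / 2.0, (YL + YR) / 2.0      # formula (7)
    dY_meas = YL - YR                              # measured vertical deviation (sign assumed)

    names = [n for n in ("pitch", "roll", "scale") if estimate_mask.get(n)]
    if not names:
        return {}

    def residuals(p):
        vals = dict(zip(names, p))
        theta = vals.get("pitch", 0.0)    # pitch angle difference
        alpha = vals.get("roll", 0.0)     # roll angle difference
        scale = vals.get("scale", 0.0)    # scale term (deviation of the ratio from 1, assumed)
        dY_est = -(YM ** 2 + 1.0) * theta + XM * YM * phi - XM * alpha + YM * scale
        return dY_est - dY_meas           # E of equation (8) is the sum of these squared

    sol = least_squares(residuals, x0=np.zeros(len(names)), method="lm")
    return dict(zip(names, sol.x))
```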
- Only when a predetermined number or more of the midpoint coordinates (X_M, Y_M) are distributed over the region a1 covering the entire screen, that is, when the number of feature points is at least the predetermined number, does the distribution analysis unit 114 determine that the pitch angle difference can be estimated with sufficient accuracy and select the pitch angle difference as an estimation parameter.
- Likewise, only when a predetermined number or more of the midpoint coordinates (X_M, Y_M) are distributed in each of the regions a3 to c3, which divide the screen into a plurality of regions (three in the illustrated example) in the horizontal direction, does the distribution analysis unit 114 determine that the roll angle difference can be estimated with sufficient accuracy and select the roll angle difference as an estimation parameter.
- Similarly, only when a predetermined number or more of the midpoint coordinates (X_M, Y_M) are distributed in each of the regions a4 to c4, which divide the screen into a plurality of regions (three in the illustrated example) in the vertical direction, does the distribution analysis unit 114 determine that the scale ratio can be estimated with sufficient accuracy and select the scale ratio as an estimation parameter.
- The secular shift parameters other than the estimation parameters and the difference φ are deleted from the estimation formula, and are therefore not estimated.
- FIG. 11 is a diagram for explaining the verification performed by the determination unit 116 of FIG. 9.
- the graph of FIG. 11 is a histogram in which the horizontal axis is the residual Yerr [Pixel] and the vertical axis is the number of feature points [pieces].
- the residual Yerr is expressed by the following equation (9).
- The determination unit 116 obtains the residual Yerr of each feature point and counts, as the statistic, the number of feature points for each value of the residual Yerr, thereby generating the histogram of FIG. 11. The determination unit 116 then obtains count_valid, the number of feature points whose residual Yerr has an absolute value of at most y_thr (for example, 0.5), the threshold set in advance as the effective range.
- When count_valid is sufficiently large, the determination unit 116 generates a verification result indicating that the estimation parameters are valid; otherwise, it generates a verification result indicating that the estimation parameters are not valid.
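A sketch of this verification by the determination unit 116 follows. The text specifies only the counting of residuals with |Yerr| ≤ y_thr; the residual definition (equation (9)) and the acceptance criterion based on the ratio count_valid / N are assumptions.

```python
import numpy as np

def verify_estimation(dY_meas, dY_est, y_thr=0.5, valid_ratio=0.8):
    """Residual-histogram based verification (sketch).

    dY_meas: measured vertical deviations of the feature points.
    dY_est:  vertical deviations estimated by the estimation formula.
    The acceptance rule count_valid / N >= valid_ratio is an assumed criterion.
    """
    yerr = np.asarray(dY_meas) - np.asarray(dY_est)   # equation (9), sign assumed
    count_valid = int(np.sum(np.abs(yerr) <= y_thr))
    return count_valid >= valid_ratio * len(yerr)
```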
- FIG. 12 is a flowchart for describing image processing of the imaging apparatus 10 of FIG.
- In step S11, the imaging apparatus 10 initializes the left/right pair buffer 113. Specifically, the left/right pair buffer 113 deletes the left/right pair coordinates held therein.
- In step S12, the left camera 21A of the stereo camera 21 captures the left image, and the right camera 21B captures the right image. The left image is supplied to the left warping unit 31, and the right image is supplied to the right warping unit 32.
- In step S13, the left warping unit 31 determines, as the target pixel of the left image, a pixel that has not yet been set as the target pixel among the pixels constituting the left image. The right warping unit 32 determines, as the target pixel of the right image, the pixel at the same position as the target pixel of the left image among the pixels constituting the right image.
- In step S14, the generation unit 33 calculates the pre-warping coordinates of the target pixels of the left image and the right image based on the initial parameters read from the initial parameter storage unit 22 and the secular shift parameters read from the secular shift parameter storage unit 45. The generation unit 33 then supplies the pre-warping coordinates of the target pixel of the left image to the left warping unit 31 and those of the target pixel of the right image to the right warping unit 32.
- In step S15, the left warping unit 31 warps the target pixel of the left image by sampling, from the left image supplied from the left camera 21A, the pixel at the pre-warping coordinates supplied from the generation unit 33. Similarly, the right warping unit 32 warps the target pixel of the right image by sampling, from the right image supplied from the right camera 21B, the pixel at the pre-warping coordinates supplied from the generation unit 33.
- In step S16, the left warping unit 31 determines whether all pixels of the left image have been set as the target pixel. If not, the process returns to step S13, and steps S13 to S16 are repeated until every pixel of the left image has been processed as the target pixel.
- When all pixels of the left image have been set as the target pixel, the left warping unit 31 supplies the image obtained by warping all pixels of the left image, as the left image after rectification, to the left/right pair detection unit 42 and the non-yaw-angle estimation unit 44. Likewise, the right warping unit 32 supplies the image obtained by warping all pixels of the right image, as the right image after rectification, to the known area detection unit 41, the left/right pair detection unit 42, and the non-yaw-angle estimation unit 44.
- In step S17, the yaw angle estimation unit 43 performs a φ estimation process that estimates the yaw angle difference φ based on the left/right pair coordinates of the known region supplied from the left/right pair detection unit 42 and the ideal parallax value. The details of this φ estimation process will be described with reference to FIG. 13.
- In step S18, the feature point detection unit 111 detects feature points from the right image after rectification supplied from the right warping unit 32 and supplies the coordinates of each detected feature point to the left/right pair detection unit 112.
- In step S19, the left/right pair detection unit 112 generates the left/right pair coordinates of each feature point by performing block matching between the right image and the left image after rectification based on the coordinates of each feature point. The left/right pair coordinates of each feature point are supplied to and held in the left/right pair buffer 113.
- In step S20, the distribution analysis unit 114 performs an estimation parameter determination process that determines the estimation parameters based on the distribution of the left/right pair coordinates of the feature points held in the left/right pair buffer 113. The details of this estimation parameter determination process will be described with reference to FIG. 14.
- In step S21, the estimation unit 115 substitutes the difference φ supplied from the yaw angle estimation unit 43 into the vertical secular shift model expression, deletes the terms of the parameters other than the estimation parameters and the difference φ, and estimates the estimation parameters based on the left/right pair coordinates of the feature points held in the left/right pair buffer 113. The estimation unit 115 supplies the resulting estimation formula, with the estimated parameters and the difference φ substituted in, to the determination unit 116, and supplies the estimated parameters to the update unit 117.
- In step S22, the determination unit 116 generates a residual histogram based on the left/right pair coordinates of the feature points held in the left/right pair buffer 113 and the estimation formula supplied from the estimation unit 115, and performs verification based on the histogram. The determination unit 116 supplies the verification result to the update unit 117.
- In step S23, the update unit 117 determines, based on the verification result supplied from the determination unit 116, whether the estimation parameters are valid, that is, whether the verification result indicates that the estimation parameters are valid.
- If so, in step S24, the update unit 117 updates the stored estimation parameters by supplying the values estimated by the estimation unit 115 to the secular shift parameter storage unit 45. The process then ends.
- Otherwise, in step S25, the imaging apparatus 10 notifies the user that rectification has not been performed normally, requests a retry, and ends the process.
- FIG. 13 is a flowchart explaining the details of the φ estimation process in step S17 of FIG. 12.
- In step S31, the known area detection unit 41 detects a known region from the right image after rectification supplied from the right warping unit 32 and supplies the coordinates of the detected known region and its ideal parallax value to the left/right pair detection unit 42.
- In step S32, the left/right pair detection unit 42 generates the left/right pair coordinates by performing block matching between the right image from the right warping unit 32 and the left image from the left warping unit 31 based on the coordinates of the known region from the known area detection unit 41. The left/right pair detection unit 42 supplies the left/right pair coordinates of the known region and the ideal parallax value to the yaw angle estimation unit 43.
- In step S33, the yaw angle estimation unit 43 obtains the horizontal coordinate difference of the left/right pair coordinates of the known region supplied from the left/right pair detection unit 42 as the measured value of the parallax.
- In step S34, the yaw angle estimation unit 43 estimates the yaw angle difference φ based on the measured parallax value and the ideal value.
- The yaw angle estimation unit 43 supplies the estimated yaw angle difference φ to the non-yaw-angle estimation unit 44, and updates the stored yaw angle difference φ by supplying the estimated value to the secular shift parameter storage unit 45. The process then returns to step S17 of FIG. 12 and proceeds to step S18.
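A sketch of this φ estimation process, assuming the approximation d_real − d_ideal ≈ f·φ from above and simple averaging over the available known regions; the function name and the sign convention are assumptions.

```python
import numpy as np

def estimate_yaw_difference(known_regions, f_px):
    """Sketch of the phi estimation process (FIG. 13).

    known_regions: list of (xl, xr, d_ideal), where xl and xr are the horizontal
        coordinates of the left/right pair for one known region [pixel] and
        d_ideal is its ideal parallax [pixel].
    f_px: focal length in pixel units.
    """
    estimates = []
    for xl, xr, d_ideal in known_regions:
        d_real = xl - xr                              # measured parallax (step S33)
        estimates.append((d_real - d_ideal) / f_px)   # invert d_real - d_ideal ≈ f*phi (step S34)
    return float(np.mean(estimates))

# Example: a vanishing point (d_ideal = 0) measured with d_real = 2.0 px at
# f = 1000 px gives phi ≈ 0.002 rad (about 0.11 degrees).
```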
- FIG. 14 is a flowchart explaining the details of the estimation parameter determination process in step S20 of FIG. 12.
- In step S41, the distribution analysis unit 114 sets N to the number of feature points corresponding to the left/right pair coordinates held in the left/right pair buffer 113. In step S42, the distribution analysis unit 114 determines whether the number N of feature points is greater than a threshold value k1 (for example, 100 points).
- When it is determined in step S42 that the number of feature points is greater than the threshold value k1, the distribution analysis unit 114 selects the pitch angle difference as an estimation parameter and supplies it to the estimation unit 115 in step S43.
- In step S44, the distribution analysis unit 114 calculates the numbers N1 to N3 of feature points whose midpoint coordinates (X_M, Y_M) are distributed in the regions a4 to c4 of FIG. 10, and the numbers N4 to N6 of feature points whose midpoint coordinates (X_M, Y_M) are distributed in the regions a3 to c3 of FIG. 10.
- Specifically, the distribution analysis unit 114 reads the left/right pair coordinates of each feature point from the left/right pair buffer 113 and calculates the midpoint coordinates (X_M, Y_M) of the left/right pair coordinates of each feature point.
- The distribution analysis unit 114 processes the midpoint coordinates (X_M, Y_M) in order. Letting H be the vertical size (height) of the screen, if Y_M of the coordinates being processed is less than or equal to -H/6, the count N1 of feature points is incremented by 1; if Y_M is greater than -H/6 and less than H/6, the count N2 is incremented by 1; and if Y_M is greater than or equal to H/6, the count N3 is incremented by 1. The values of N1 to N3 after all midpoint coordinates have been processed are taken as the final counts N1 to N3.
- Likewise, letting W be the horizontal size (width) of the screen, if X_M of the coordinates being processed is less than or equal to -W/6, the count N4 of feature points is incremented by 1; if X_M is greater than -W/6 and less than W/6, the count N5 is incremented by 1; and if X_M is greater than or equal to W/6, the count N6 is incremented by 1. The values of N4 to N6 after all midpoint coordinates have been processed are taken as the final counts N4 to N6.
- In step S45, the distribution analysis unit 114 determines whether all of the counts N1 to N3 are greater than a threshold value k2 (for example, 50 points). If all of N1 to N3 are greater than k2, the distribution analysis unit 114 selects the scale ratio as an estimation parameter and supplies it to the estimation unit 115 in step S46, and the process proceeds to step S47.
- If it is determined in step S45 that at least one of N1 to N3 is less than or equal to the threshold value k2, the process skips step S46 and proceeds to step S47.
- In step S47, the distribution analysis unit 114 determines whether all of the counts N4 to N6 are greater than the threshold value k2. If so, the distribution analysis unit 114 selects the roll angle difference as an estimation parameter and supplies it to the estimation unit 115 in step S48. The process then returns to step S20 of FIG. 12 and proceeds to step S21.
- If it is determined in step S47 that at least one of N4 to N6 is less than or equal to the threshold value k2, the process returns to step S20 of FIG. 12 and proceeds to step S21.
- On the other hand, if it is determined in step S42 that the number N of feature points is less than or equal to the threshold value k1, the process ends.
- the imaging device 10 may capture new left and right images until the number N of feature points becomes larger than the threshold value k1.
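A sketch of this determination process, using the thresholds k1 = 100 and k2 = 50 and the ±W/6, ±H/6 band boundaries described above; the midpoint coordinates are assumed to be in the centered coordinate system (image centre at (0, 0)).

```python
def determine_estimation_params(midpoints, W, H, k1=100, k2=50):
    """Sketch of the estimation parameter determination process (FIG. 14).

    midpoints: list of midpoint coordinates (X_M, Y_M) of the left/right pairs.
    Returns the set of parameters (besides the yaw difference phi) judged
    estimable with sufficient accuracy.
    """
    params = set()
    N = len(midpoints)
    if N <= k1:                                  # step S42: too few feature points
        return params
    params.add("pitch_difference")               # step S43

    # Step S44: count midpoints per horizontal band (N1-N3) and per vertical strip (N4-N6).
    N1 = sum(1 for _, ym in midpoints if ym <= -H / 6)
    N3 = sum(1 for _, ym in midpoints if ym >= H / 6)
    N2 = N - N1 - N3
    N4 = sum(1 for xm, _ in midpoints if xm <= -W / 6)
    N6 = sum(1 for xm, _ in midpoints if xm >= W / 6)
    N5 = N - N4 - N6

    if min(N1, N2, N3) > k2:                     # steps S45-S46: spread in the vertical direction
        params.add("scale_ratio")
    if min(N4, N5, N6) > k2:                     # steps S47-S48: spread in the horizontal direction
        params.add("roll_difference")
    return params
```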
- As described above, because the imaging apparatus 10 estimates the yaw angle difference φ based on the left/right pair coordinates of the known region and the ideal parallax value, the yaw angle difference φ can be estimated with high accuracy. Furthermore, the robustness of the estimation accuracy of the yaw angle difference φ can be ensured.
- By warping the left image and the right image based on the accurately estimated yaw angle difference φ, the imaging apparatus 10 can correct, with high accuracy, the shift between the left image and the right image caused by the difference φ arising from the secular shift of the stereo camera 21.
- In addition, the imaging apparatus 10 determines, for each secular shift parameter other than the difference φ, whether it can be estimated with sufficient accuracy, and estimates only the parameters so determined, which ensures the robustness of the estimation results.
- FIG. 15 is a diagram for explaining the secular shift parameters.
- Among the secular shift parameters, the main cause of the pitch angle difference is torsion of the chassis and the substrate due to stress applied from outside the casing of the imaging apparatus 10.
- When a pitch angle difference occurs, the vertical displacement between corresponding pixels of the left image and the right image (hereinafter, the Y displacement; a horizontal displacement is referred to as the X displacement) increases over the entire screen.
- As a result, the effective area, that is, the area of the region in the screen where the depth generated by stereo matching is valid, is reduced. When image recognition is performed using depth, the recognition accuracy is therefore greatly reduced, so estimating the pitch angle difference is highly important.
- Since the pitch angle difference can be estimated with sufficient accuracy as long as a certain number of feature points exist over the entire screen, its estimation is relatively easy.
- The main cause of the yaw angle difference φ is heat generated by components mounted on the substrate surface of the imaging apparatus 10, and the resulting warping (deflection) of the chassis and the substrate due to the temperature difference between the substrate surface and its back surface.
- The difference φ can be estimated with sufficient accuracy as long as a known region exists in the left image and the right image.
- The main cause of the roll angle difference is rotation of the left camera 21A and the right camera 21B around the Z axis.
- The roll angle difference can be estimated with sufficient accuracy as long as the midpoints of the feature points are distributed in the horizontal direction of the screen.
- The main cause of the scale ratio is a fluctuation in focal length between the left camera 21A and the right camera 21B, which arises when a temperature difference occurs between their built-in lenses, whose focal lengths are temperature dependent.
- The scale ratio can be estimated with sufficient accuracy as long as the midpoints of the feature points are distributed in the vertical direction of the screen.
- The generation factors of the parameters thus differ, so whether each parameter actually arises depends on the mechanical structure of the imaging apparatus 10 and its use conditions. The imaging apparatus 10 may therefore estimate only the parameters that can arise given its mechanical structure and use conditions.
- the imaging apparatus 10 may estimate only parameters that have a large degree of influence on the application in accordance with the application (process) to be executed.
- ⁇ Second Embodiment> (Description of computer to which the present disclosure is applied)
- the series of processes described above can be executed by hardware or can be executed by software.
- a program constituting the software is installed in the computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 16 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- In the computer 200, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, and a RAM (Random Access Memory) 203 are connected to one another by a bus 204.
- An input / output interface 205 is further connected to the bus 204.
- An imaging unit 206, an input unit 207, an output unit 208, a storage unit 209, a communication unit 210, and a drive 211 are connected to the input / output interface 205.
- the imaging unit 206 includes the stereo camera 21 shown in FIG.
- the input unit 207 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 208 includes a display, a speaker, and the like.
- the storage unit 209 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 210 includes a network interface or the like.
- the drive 211 drives a removable medium 212 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer 200 configured as described above, the CPU 201 loads the program stored in the storage unit 209 into the RAM 203 via the input/output interface 205 and the bus 204 and executes it, whereby the series of processes described above is performed.
- the program executed by the computer 200 can be provided by being recorded on the removable medium 212 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 209 via the input / output interface 205 by attaching the removable medium 212 to the drive 211.
- the program can be received by the communication unit 210 via a wired or wireless transmission medium and installed in the storage unit 209.
- the program can be installed in advance in the ROM 202 or the storage unit 209.
- The program executed by the computer 200 may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as an apparatus mounted on any type of vehicle such as an automobile, an electric vehicle, a hybrid electric vehicle, and a motorcycle.
- FIG. 17 is a block diagram illustrating an example of a schematic configuration of a vehicle control system 2000 to which the technology according to the present disclosure can be applied.
- the vehicle control system 2000 includes a plurality of electronic control units connected via a communication network 2010.
- the vehicle control system 2000 includes a drive system control unit 2100, a body system control unit 2200, a battery control unit 2300, a vehicle exterior information detection unit 2400, a vehicle interior information detection unit 2500, and an integrated control unit 2600.
- The communication network 2010 connecting these control units may be an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
- Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the devices to be controlled.
- Each control unit also includes a network I/F for communicating with other control units via the communication network 2010, and a communication I/F for wired or wireless communication with devices or sensors inside and outside the vehicle.
- In FIG. 17, as the functional configuration of the integrated control unit 2600, a microcomputer 2610, a general-purpose communication I/F 2620, a dedicated communication I/F 2630, a positioning unit 2640, a beacon receiving unit 2650, an in-vehicle device I/F 2660, an audio/image output unit 2670, an in-vehicle network I/F 2680, and a storage unit 2690 are illustrated.
- other control units include a microcomputer, a communication I / F, a storage unit, and the like.
- the drive system control unit 2100 controls the operation of devices related to the drive system of the vehicle according to various programs.
- For example, the drive system control unit 2100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
- the drive system control unit 2100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- a vehicle state detection unit 2110 is connected to the drive system control unit 2100.
- The vehicle state detection unit 2110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotation of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like.
- the drive system control unit 2100 performs arithmetic processing using a signal input from the vehicle state detection unit 2110, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, or the like.
- the body system control unit 2200 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 2200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 2200.
- the body system control unit 2200 receives the input of these radio waves or signals, and controls the vehicle door lock device, power window device, lamp, and the like.
- the battery control unit 2300 controls the secondary battery 2310 that is a power supply source of the drive motor according to various programs. For example, information such as battery temperature, battery output voltage, or remaining battery capacity is input to the battery control unit 2300 from a battery device including the secondary battery 2310. The battery control unit 2300 performs arithmetic processing using these signals, and controls the temperature adjustment control of the secondary battery 2310 or the cooling device provided in the battery device.
- the outside information detection unit 2400 detects information outside the vehicle on which the vehicle control system 2000 is mounted.
- the vehicle exterior information detection unit 2400 is connected to at least one of the imaging unit 2410 and the vehicle exterior information detection unit 2420.
- the imaging unit 2410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
- The outside information detection unit 2420 includes, for example, an environmental sensor for detecting the current weather or meteorological conditions, or a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle on which the vehicle control system 2000 is mounted.
- the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, and a snow sensor that detects snowfall.
- the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
- the imaging unit 2410 and the outside information detection unit 2420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
- FIG. 18 shows an example of installation positions of the imaging unit 2410 and the vehicle outside information detection unit 2420.
- the imaging units 2910, 2912, 2914, 2916, and 2918 are provided at, for example, at least one position among a front nose, a side mirror, a rear bumper, a back door, and an upper portion of a windshield in the vehicle interior of the vehicle 2900.
- An imaging unit 2910 provided in the front nose and an imaging unit 2918 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 2900.
- the imaging units 2912 and 2914 provided in the side mirror mainly acquire an image on the side of the vehicle 2900.
- An imaging unit 2916 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 2900.
- An imaging unit 2918 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 18 shows an example of shooting ranges of the respective imaging units 2910, 2912, 2914, and 2916.
- The imaging range a indicates the imaging range of the imaging unit 2910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 2912 and 2914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 2916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 2910, 2912, 2914, and 2916, an overhead image of the vehicle 2900 as viewed from above is obtained, as sketched below.
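- A minimal sketch of how such an overhead image could be composed, assuming each camera's 3x3 homography onto the ground plane is already known from extrinsic calibration (the function and variable names below are illustrative, not part of this disclosure):

```python
import cv2
import numpy as np

# Hypothetical sketch: compose a bird's-eye ("overhead") view from several cameras.
# Each 3x3 homography mapping a camera image onto the ground plane is assumed to
# come from a prior extrinsic calibration; all names are illustrative.
def make_overhead_view(images, homographies, out_size=(800, 800)):
    canvas = np.zeros((out_size[1], out_size[0], 3), np.float32)
    weight = np.zeros((out_size[1], out_size[0], 1), np.float32)
    for img, H in zip(images, homographies):
        warped = cv2.warpPerspective(img.astype(np.float32), H, out_size)
        mask = cv2.warpPerspective(np.ones(img.shape[:2], np.float32), H, out_size)
        canvas += warped
        weight += mask[..., None]
    # Average overlapping regions so seams between cameras are blended.
    return (canvas / np.maximum(weight, 1e-6)).astype(np.uint8)
```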
- The vehicle outside information detection units 2920, 2922, 2924, 2926, 2928, and 2930 provided on the front, rear, sides, and corners of the vehicle 2900 and on the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices.
- the vehicle outside information detection units 2920, 2926, and 2930 provided on the front nose, the rear bumper, the back door, and the windshield in the vehicle interior of the vehicle 2900 may be, for example, LIDAR devices.
- These vehicle outside information detection units 2920 to 2930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.
- the vehicle outside information detection unit 2400 causes the imaging unit 2410 to capture an image outside the vehicle and receives the captured image data.
- the vehicle exterior information detection unit 2400 receives detection information from the vehicle exterior information detection unit 2420 connected thereto.
- When the vehicle outside information detection unit 2420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle outside information detection unit 2400 transmits ultrasonic waves, electromagnetic waves, or the like, and receives information on the reflected waves that are received.
- The vehicle outside information detection unit 2400 may perform object detection processing or distance detection processing for a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received information.
- the vehicle outside information detection unit 2400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, or the like based on the received information.
- the vehicle outside information detection unit 2400 may calculate a distance to an object outside the vehicle based on the received information.
- the outside information detection unit 2400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
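- For the stereo camera included in the imaging unit 2410, one standard way such a distance can be computed (a sketch under the usual rectified pinhole-stereo assumptions, not a requirement of this disclosure) is from the horizontal disparity of matched points:

```python
import numpy as np

# Sketch under the usual rectified pinhole-stereo assumptions (not specific to this
# disclosure): depth Z = f * B / d, with focal length f and disparity d in pixels
# and baseline B in meters.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    d = np.asarray(disparity_px, dtype=np.float64)
    with np.errstate(divide="ignore", invalid="ignore"):
        # Pixels with zero or negative disparity are marked invalid (depth 0).
        return np.where(d > 0, focal_px * baseline_m / d, 0.0)
```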
- The vehicle exterior information detection unit 2400 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 2410 to generate an overhead image or a panoramic image.
- the vehicle exterior information detection unit 2400 may perform viewpoint conversion processing using image data captured by different imaging units 2410.
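- A minimal sketch of such distortion correction followed by a viewpoint (perspective) conversion, assuming the camera matrix, distortion coefficients, and the four point correspondences defining the new viewpoint come from a prior calibration (names are illustrative):

```python
import cv2
import numpy as np

# Illustrative sketch of distortion correction followed by a viewpoint (perspective)
# conversion. The camera matrix, distortion coefficients, and the four point
# correspondences defining the new viewpoint are assumed to come from calibration.
def undistort_and_change_viewpoint(image, camera_matrix, dist_coeffs,
                                   src_pts, dst_pts, out_size):
    undistorted = cv2.undistort(image, camera_matrix, dist_coeffs)
    H = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(undistorted, H, out_size)
```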
- the in-vehicle information detection unit 2500 detects in-vehicle information.
- a driver state detection unit 2510 that detects the driver's state is connected to the in-vehicle information detection unit 2500.
- the driver state detection unit 2510 may include a camera that captures images of the driver, a biological sensor that detects biological information of the driver, a microphone that collects sound in the passenger compartment, and the like.
- the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of an occupant sitting on the seat or a driver holding the steering wheel.
- The vehicle interior information detection unit 2500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 2510, and may determine whether the driver is dozing off.
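- One hypothetical way such a dozing determination could be made from a driver-facing camera (not specified by this disclosure) is the PERCLOS heuristic, i.e., the fraction of recent frames in which the eyes are closed; the per-frame eye-openness estimate and the thresholds below are assumptions:

```python
from collections import deque

# Hypothetical PERCLOS-style dozing check: the fraction of recent frames in which
# the eyes are closed. The per-frame eye-openness value is assumed to come from
# the driver state detection unit 2510; the thresholds are illustrative.
class DrowsinessEstimator:
    def __init__(self, window_frames=900, closed_below=0.2, alarm_ratio=0.3):
        self.history = deque(maxlen=window_frames)  # ~30 s of frames at 30 fps
        self.closed_below = closed_below            # openness below this counts as closed
        self.alarm_ratio = alarm_ratio              # alarm if closed in >30% of the window

    def update(self, eye_openness):
        self.history.append(eye_openness < self.closed_below)
        perclos = sum(self.history) / len(self.history)
        return perclos >= self.alarm_ratio          # True -> driver may be dozing
```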
- the vehicle interior information detection unit 2500 may perform a process such as a noise canceling process on the collected audio signal.
- the integrated control unit 2600 controls the overall operation in the vehicle control system 2000 according to various programs.
- An input unit 2800 is connected to the integrated control unit 2600.
- the input unit 2800 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
- The input unit 2800 may be, for example, a remote control device using infrared rays or other radio waves, or may be an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 2000.
- the input unit 2800 may be, for example, a camera. In this case, the passenger can input information using a gesture.
- the input unit 2800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 2800 and outputs the input signal to the integrated control unit 2600.
- a passenger or the like operates the input unit 2800 to input various data or instruct a processing operation to the vehicle control system 2000.
- the storage unit 2690 may include a RAM (Random Access Memory) that stores various programs executed by the microcomputer, and a ROM (Read Only Memory) that stores various parameters, calculation results, sensor values, and the like.
- the storage unit 2690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- General-purpose communication I / F 2620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 2750.
- The general-purpose communication I / F 2620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX, LTE (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)).
- The general-purpose communication I / F 2620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. Further, the general-purpose communication I / F 2620 may connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a pedestrian or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
- a terminal for example, a pedestrian or a store terminal, or an MTC (Machine Type Communication) terminal
- P2P Peer To Peer
- the dedicated communication I / F 2630 is a communication I / F that supports a communication protocol formulated for use in a vehicle.
- The dedicated communication I / F 2630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, or DSRC (Dedicated Short Range Communications).
- The dedicated communication I / F 2630 typically performs V2X communication, which is a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, and vehicle-to-pedestrian communication.
- The positioning unit 2640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
- the positioning unit 2640 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
- the beacon receiving unit 2650 receives, for example, radio waves or electromagnetic waves transmitted from radio stations installed on the road, and acquires information such as the current position, traffic jams, closed roads, or required time. Note that the function of the beacon receiving unit 2650 may be included in the dedicated communication I / F 2630 described above.
- the in-vehicle device I / F 2660 is a communication interface that mediates connections between the microcomputer 2610 and various devices existing in the vehicle.
- the in-vehicle device I / F 2660 may establish a wireless connection using a wireless communication protocol such as a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
- the in-vehicle device I / F 2660 may establish a wired connection via a connection terminal (and a cable if necessary).
- the in-vehicle device I / F 2660 exchanges a control signal or a data signal with, for example, a mobile device or wearable device that a passenger has, or an information device that is carried in or attached to the vehicle.
- the in-vehicle network I / F 2680 is an interface that mediates communication between the microcomputer 2610 and the communication network 2010.
- the in-vehicle network I / F 2680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 2010.
- The microcomputer 2610 of the integrated control unit 2600 controls the vehicle control system 2000 according to various programs based on information acquired via at least one of the general-purpose communication I / F 2620, the dedicated communication I / F 2630, the positioning unit 2640, the beacon receiving unit 2650, the in-vehicle device I / F 2660, and the in-vehicle network I / F 2680.
- For example, the microcomputer 2610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 2100.
- For example, the microcomputer 2610 may perform cooperative control for the purpose of collision avoidance or mitigation for the vehicle, follow-up traveling based on the inter-vehicle distance, constant-speed traveling, automated driving, and the like.
- The microcomputer 2610 may create local map information including peripheral information on the current position of the vehicle based on information acquired via at least one of the general-purpose communication I / F 2620, the dedicated communication I / F 2630, the positioning unit 2640, the beacon receiving unit 2650, the in-vehicle device I / F 2660, and the in-vehicle network I / F 2680. Further, the microcomputer 2610 may predict a danger such as a collision of the vehicle, the approach of a pedestrian or the like, or entry into a closed road based on the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
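- As a hypothetical illustration of such a danger prediction (this disclosure does not prescribe a particular method), a warning signal could be derived from a simple time-to-collision check:

```python
# Hypothetical sketch of a danger check that could drive such a warning signal:
# compare the time-to-collision (TTC) with a threshold. The range and closing
# speed are assumed to come from the stereo camera, radar, or LIDAR measurements.
def collision_warning(range_m, closing_speed_mps, ttc_threshold_s=2.0):
    if closing_speed_mps <= 0.0:         # not closing in on the obstacle
        return False
    ttc_s = range_m / closing_speed_mps  # seconds until predicted contact
    return ttc_s < ttc_threshold_s       # True -> sound the warning / light the lamp
```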
- the sound image output unit 2670 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or outside the vehicle.
- an audio speaker 2710, a display unit 2720, and an instrument panel 2730 are illustrated as output devices.
- the display unit 2720 may include at least one of an on-board display and a head-up display, for example.
- the display unit 2720 may have an AR (Augmented Reality) display function.
- the output device may be another device such as a headphone, a projector, or a lamp other than these devices.
- When the output device is a display device, the display device visually displays results obtained by the various processes performed by the microcomputer 2610 or information received from other control units in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
- At least two control units connected via the communication network 2010 may be integrated as one control unit.
- each control unit may be configured by a plurality of control units.
- the vehicle control system 2000 may include another control unit not shown.
- some or all of the functions of any of the control units may be given to other control units.
- the predetermined arithmetic processing may be performed by any one of the control units.
- A sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 2010.
- the imaging device 10 in FIG. 1 can be applied to, for example, the imaging unit 2410 in FIG.
- Thereby, the imaging unit 2410 can accurately estimate the difference in yaw angle between the cameras caused by the secular shift of the stereo camera and correct the resulting deviation between the left image and the right image.
- the vehicle exterior information detection unit 2400 can detect the position of the subject in the depth direction with high accuracy using the corrected left image and right image.
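- To make concrete why this matters for distance detection, the following simplified small-angle sketch (not the exact model formula of this disclosure) shows how a residual yaw-angle difference shifts every measured disparity and thereby biases the recovered depth:

```python
import numpy as np

# Simplified small-angle illustration (not the exact model formula of this
# disclosure): a residual yaw-angle difference of yaw_diff_rad between the two
# cameras shifts every measured disparity by roughly focal_px * yaw_diff_rad
# pixels, which directly biases the recovered depth.
def corrected_depth(measured_disparity_px, focal_px, baseline_m, yaw_diff_rad):
    bias_px = focal_px * yaw_diff_rad
    corrected = np.asarray(measured_disparity_px, dtype=np.float64) - bias_px
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(corrected > 0, focal_px * baseline_m / corrected, 0.0)
```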
- the stereo camera may be configured by two cameras arranged in the vertical direction instead of the horizontal direction.
- In this case, the difference in the vertical direction between the coordinates of the corresponding points of the two images in the known area is used as the measured parallax value.
- Note that the present disclosure can also have the following configurations.
- The image processing apparatus according to any one of (1) to (3), further including a warping unit that warps the first image and the second image based on a model formula using the difference in yaw angle estimated by the yaw angle estimation unit as a parameter.
- The image processing apparatus according to any one of (1) to (4), further including an estimation unit that estimates, other than the yaw angle, at least one of a difference in pitch angle and a difference in roll angle between the first imaging unit and the second imaging unit and a scale ratio between the first image and the second image.
- The image processing apparatus further including a warping unit that warps the first image and the second image based on a model formula using, as parameters, the difference in yaw angle estimated by the yaw angle estimation unit and at least one of the difference in pitch angle, the difference in roll angle, and the scale ratio estimated by the estimation unit other than the yaw angle.
- The image processing apparatus in which the warping unit is configured to warp the first image and the second image based on the model formula and an initial parameter measured by calibration.
- An image processing method including a yaw angle estimation step in which an image processing device estimates the difference in yaw angle between a first imaging unit and a second imaging unit based on the position of a known region where the ideal value of parallax in a first image captured by the first imaging unit is known, a measured value of the parallax of the known region calculated based on the position of a region corresponding to the known region in a second image captured by the second imaging unit, and the ideal value of the parallax.
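- As a non-authoritative sketch of the kind of computation such a yaw angle estimation step could perform (the disclosure's actual model formula is not reproduced here), the yaw-angle difference can be fitted, under a small-angle approximation, from the offset between the measured and ideal parallax over the known region:

```python
import numpy as np

# Hedged sketch of the kind of computation a yaw angle estimation step could
# perform, assuming a rectified pair and the same small-angle approximation as
# above (disparity offset ~ focal_px * yaw difference). This is NOT the
# disclosure's model formula; all names are illustrative.
def estimate_yaw_difference(x_in_first_image, x_in_second_image,
                            ideal_disparity_px, focal_px):
    measured = (np.asarray(x_in_first_image, dtype=np.float64)
                - np.asarray(x_in_second_image, dtype=np.float64))
    residual = measured - np.asarray(ideal_disparity_px, dtype=np.float64)
    # Average the per-point disparity offset over the known region and convert
    # it to a yaw-angle difference in radians.
    return float(np.mean(residual)) / focal_px
```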
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Measurement Of Optical Distance (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to an image processing device and method that make it possible to estimate with high accuracy the difference in yaw angle between cameras resulting from the secular shift of a stereo camera. A yaw angle estimation unit estimates the difference between the yaw angles of a left camera and a right camera based on the position of a known area in which the ideal value of the parallax in a right image captured by the right camera is known, a measured value of the parallax of the known area calculated based on the position of an area corresponding to the known area in a left image captured by the left camera, and the ideal value of the parallax. The invention can be applied, for example, to an imaging device or the like including a stereo camera.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015-122069 | 2015-06-17 | | |
| JP2015122069 | 2015-06-17 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016203989A1 true WO2016203989A1 (fr) | 2016-12-22 |
Family
ID=57545577
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/066569 Ceased WO2016203989A1 (fr) | 2015-06-17 | 2016-06-03 | Dispositif et procédé de traitement d'images |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016203989A1 (fr) |
- 2016
  - 2016-06-03 WO PCT/JP2016/066569 patent/WO2016203989A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09133525A (ja) * | 1995-11-10 | 1997-05-20 | Nippon Soken Inc | 距離計測装置 |
| JP2003083742A (ja) * | 2001-09-13 | 2003-03-19 | Fuji Heavy Ind Ltd | 監視システムの距離補正装置および距離補正方法 |
| WO2013145025A1 (fr) * | 2012-03-30 | 2013-10-03 | 株式会社日立製作所 | Système de camera stéréo et objet mobile |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018112522A (ja) * | 2017-01-13 | 2018-07-19 | 株式会社東芝 | 画像処理装置及び画像処理方法 |
| US10510163B2 (en) | 2017-01-13 | 2019-12-17 | Kabushiki Kaisha Toshiba | Image processing apparatus and image processing method |
| US11880993B2 (en) | 2018-09-03 | 2024-01-23 | Kabushiki Kaisha Toshiba | Image processing device, driving assistance system, image processing method, and program |
| CN113112412A (zh) * | 2020-01-13 | 2021-07-13 | 株式会社理光 | 垂直校正矩阵的生成方法、装置及计算机可读存储介质 |
| CN113112412B (zh) * | 2020-01-13 | 2024-03-19 | 株式会社理光 | 垂直校正矩阵的生成方法、装置及计算机可读存储介质 |
| JP2022182335A (ja) * | 2021-05-28 | 2022-12-08 | 株式会社Subaru | 車両の車外撮像装置 |
| JP7759739B2 (ja) | 2021-05-28 | 2025-10-24 | 株式会社Subaru | 車両の車外撮像装置 |
| WO2023068034A1 (fr) * | 2021-10-20 | 2023-04-27 | 日立Astemo株式会社 | Dispositif de traitement d'images |
| JP2023061621A (ja) * | 2021-10-20 | 2023-05-02 | 日立Astemo株式会社 | 画像処理装置 |
| US12525034B2 (en) | 2021-10-20 | 2026-01-13 | Hitachi Astemo, Ltd. | Image processing device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6834964B2 (ja) | 画像処理装置、画像処理方法、およびプログラム | |
| US11076141B2 (en) | Image processing device, image processing method, and vehicle | |
| JP6701532B2 (ja) | 画像処理装置および画像処理方法 | |
| JP6764573B2 (ja) | 画像処理装置、画像処理方法、およびプログラム | |
| JP6645492B2 (ja) | 撮像装置および撮像方法 | |
| JPWO2017159382A1 (ja) | 信号処理装置および信号処理方法 | |
| CN108139211B (zh) | 用于测量的装置和方法以及程序 | |
| WO2017057044A1 (fr) | Dispositif de traitement d'informations et procédé de traitement d'informations | |
| WO2017122552A1 (fr) | Dispositif et procédé de traitement d'image, programme, et système de traitement d'image | |
| JPWO2018037678A1 (ja) | 画像処理装置および情報生成装置と情報生成方法 | |
| WO2019163315A1 (fr) | Dispositif de traitement de l'information, dispositif d'imagerie et système d'imagerie | |
| JP6922169B2 (ja) | 情報処理装置および方法、車両、並びに情報処理システム | |
| WO2016203989A1 (fr) | Dispositif et procédé de traitement d'images | |
| JPWO2018042815A1 (ja) | 画像処理装置と画像処理方法 | |
| WO2019093136A1 (fr) | Dispositif de traitement d'images, procédé de traitement d'images et programme | |
| CN111868778B (zh) | 图像处理装置、图像处理方法以及存储介质 | |
| WO2022196316A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme associé | |
| JP7059185B2 (ja) | 画像処理装置、画像処理方法、および撮像装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16811464; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | NENP | Non-entry into the national phase | Ref country code: JP |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16811464; Country of ref document: EP; Kind code of ref document: A1 |