WO2008065707A1 - Image data recognizing method, image processing device, and image data recognizing program - Google Patents
Image data recognizing method, image processing device, and image data recognizing program
- Publication number
- WO2008065707A1 (PCT/JP2006/323685)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- vehicle
- pixel
- region
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- Image data recognition method, image processing apparatus, and image data recognition program
- The present invention relates to image data recognition.
- In particular, it relates to an image data recognition method for detecting a subject in image data.
- In systems related to traffic, there are techniques for detecting a vehicle or a pedestrian from image data acquired by a fixed camera and from image data acquired by a camera mounted on a moving body.
- Such detection technology is used, for example, to acquire image data of an intersection, detect the vehicles in the intersection, and present the detected vehicles and pedestrians to approaching vehicles.
- Methods for extracting a vehicle to be detected in image data include the background difference method and the edge detection method.
- The background difference method acquires image data of the background without objects in advance and extracts the object in the image data by comparison with the image data to be analyzed.
- The edge detection method detects the edges in the image data and identifies the target vehicle by comparing model data based on edge data stored in advance with the detected edge data.
- Because the background subtraction method detects a region that includes the shadow of the vehicle, it cannot accurately detect the position of the vehicle alone in the image data (Patent Document 1).
- The edge detection method also detects shadows.
- In addition, because the edge detection method identifies the object by comparing the shape data of the edge-detected part with shape data stored in advance in an object shape database, the detection process takes time (Patent Document 2).
- Patent Document 1: JP-A-2005-190142
- Patent Document 2: JP-A-2000-148977
- The present invention has the following configuration in order to provide a method for detecting a subject in image data that solves the above problems. First, a first pixel, a first region centered on the first pixel, a plurality of second pixels to be compared with the first pixel, and a plurality of second regions centered on the respective second pixels are defined.
- Next, a similarity value calculated from the first region and each second region is associated with the corresponding second pixel.
- Next, among the similarity values associated with the plurality of second pixels, the similarity value indicating the feature of the first region is associated with the first pixel.
- Then, an area in which the shape determined by the distribution of the similarity values indicating the features of the first regions matches a shape defined in advance as a feature of the subject is detected as the portion of the image data containing the subject.
- The shape defined as the feature of the subject is a condition that determines the shape of a vehicle.
- For a vehicle, the center position of the vehicle in the horizontal direction in the image data is obtained from the symmetry of the distribution of the vehicle's similarity values, and the position of the vehicle is specified from the shape of the distribution and the conditions that determine the shape of a vehicle.
- FIG. 1 is a configuration diagram of an image processing apparatus 1 of the present embodiment.
- FIG. 2 shows image data taken by the camera 2 of the present embodiment.
- FIG. 3 is a configuration example of the database 20 in which the coordinate in the vertical direction 13 of a vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated for each type of vehicle 16.
- FIG. 4 is a conceptual diagram for detecting a similarity value 38 from acquired image data 10.
- FIG. 5 is a flowchart of a method for calculating a similarity value 38.
- FIG. 6 shows an example of a pedestrian crossing 41.
- FIG. 7 is a flowchart of processing for removing the pedestrian crossing 41 area from the image data 10, executed by the road mark deletion unit 5.
- FIG. 8 is a diagram showing an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50.
- FIG. 9 is a flowchart of processing for detecting an area where the vehicle 16 exists from the distribution map 50, executed by the pattern feature quantity calculation unit 6.
- FIG. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16.
- FIG. 11 is a flowchart of processing for detecting an area where the vehicle 16 exists from the distribution map 50, executed by the symmetry feature quantity calculation unit 7.
- FIG. 12 is an example of vehicle detection according to the present embodiment.
- FIG. 13 is a hardware configuration example of the image processing apparatus 1 of the present embodiment.
- Reference numerals: 17, point indicating the boundary between the roadway and the roadside belt; 20, database.
- FIG. 1 is a configuration diagram of an image processing apparatus 1 according to the present embodiment.
- The image processing apparatus 1 of the present invention is composed of a camera 2, an image memory 3, an autocorrelation difference calculation unit 4, a road mark deletion unit 5, a pattern feature quantity calculation unit 6, a symmetry feature quantity calculation unit 7, and a vehicle position determination unit 8.
- The image data of the intersection imaged by the camera 2 is stored in the image memory 3.
- The autocorrelation difference calculation unit 4 creates a distribution map of similarity-value differences from the image data stored in the image memory 3.
- The autocorrelation difference calculation unit 4 obtains the difference of the similarity values for each pixel and creates the distribution map based on these differences.
- The road mark deletion unit 5 deletes road mark areas in the image data stored in the image memory 3. In this way, the image processing apparatus 1 calculates the differences of the similarity values from the original image data and obtains a distribution map from which the road marks have been removed.
- The pattern feature quantity calculation unit 6 detects the vehicle pattern from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4.
- The pattern feature quantity calculation unit 6 acquires, from the pattern of the vehicle, the coordinate in the vertical direction in the image data of the foremost or rearmost part of the vehicle. When the pattern feature quantity calculation unit 6 detects a vehicle pattern, it also takes into account the area information deleted by the road mark deletion unit 5.
- The symmetry feature quantity calculation unit 7 calculates the symmetry of the vehicle from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4.
- The symmetry feature quantity calculation unit 7 obtains, from the symmetry, the center line on which the vehicle exists in the image data.
- The vehicle position determination unit 8 specifies the position of the vehicle in the image from the coordinates obtained by the pattern feature quantity calculation unit 6 and the symmetry feature quantity calculation unit 7. Further, the vehicle position determination unit 8 acquires the vehicle width determined from the position of the vehicle, and specifies the vehicle type by searching the database with the acquired vehicle width and the position in the vertical direction in the image data.
- FIG. 2 shows image data 10 taken by the camera 2 of this embodiment.
- Image data 10 is stored in image memory 3 in digital format.
- Image data 10 is composed of a set of pixels 11.
- Pixel 11 is the smallest unit constituting image data 10.
- The vertical direction of the image data 10 is the direction of the vertical direction 13, and the horizontal direction is the direction of the horizontal direction 14.
- The ordinate is the coordinate that specifies a position in the vertical direction 13 in the image data 10, and the abscissa is the coordinate that specifies a position in the horizontal direction 14 in the image data 10.
- The vanishing point 12 is the deepest coordinate of the original three-dimensional space as seen in the image data 10, onto which the space is projected in two dimensions.
- When the camera 2 is installed at a location looking down from above, the vanishing point 12 is generally located toward the top of the image data 10 in the vertical direction 13.
- Arranging the vanishing point 12 at an upper position in the image data 10 makes it possible to enlarge the area in which the roadway 15 appears. The image processing apparatus 1 can therefore resolve the roadway 15 along the vertical direction 13 with more ordinate values than when the vanishing point 12 is at a lower position. With finer resolution in the vertical direction 13, the image processing apparatus 1 can accurately acquire the position of the vehicle 16 on the roadway 15.
- When the camera 2 is mounted on a vehicle 16, the camera 2 is generally arranged so as to capture images horizontally with respect to the ground.
- In that case, the camera 2 is attached at a low position, such as the front portion of the vehicle. If the camera 2 were pointed at a downward angle, it could not capture the highest part of a vehicle traveling ahead.
- The width of the roadway 15 is set so that the width of a vehicle 16 can be acquired at a position low in the vertical direction 13 in the image data 10. At this time, the points 17 indicating the boundary between the roadway 15 and the roadside belt in the foreground are at the lowermost part of the image data 10.
- With this arrangement, the camera 2 can capture the width of a vehicle 16 in the foreground of the image data 10 as a large number of pixels.
- The image processing apparatus 1 can thus resolve the width of the vehicle 16 with more pixels and accurately acquire the width information of the vehicle 16.
- The angle of the camera 2 with respect to the vanishing point 12 and the width of the roadway 15 can be changed according to the road conditions and the object to be acquired.
- The image processing apparatus 1 holds the coordinates of the vanishing point 12 in the image data 10. It also holds the coordinates of the points 17 that indicate the boundary between the roadway 15 and the roadside belt in the foreground; there are two such points, one on each side of the road. The image processing apparatus 1 obtains each boundary line between the roadway 15 and the roadside belt as the straight line connecting the coordinates of the vanishing point 12 and the coordinates of a point 17. Furthermore, the image processing apparatus 1 has a database 20 in which the coordinate in the vertical direction 13 of a vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated for each type of vehicle 16.
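- As a minimal sketch of the boundary-line computation described above (the pixel coordinates below are hypothetical; the patent gives no numeric values), each boundary line follows from the vanishing point 12 and a point 17:

```python
def boundary_line(vanishing_point, boundary_point):
    """Slope and intercept of the straight line through the vanishing
    point 12 and a point 17 on the roadway/roadside-belt boundary."""
    (x0, y0), (x1, y1) = vanishing_point, boundary_point
    slope = (y1 - y0) / (x1 - x0)  # assumes the boundary is not vertical in the image
    intercept = y0 - slope * x0
    return slope, intercept

# Hypothetical values: vanishing point near the top of a 640x480 image,
# boundary points 17 on the bottom edge on both sides of the road.
vp = (320, 60)
left_boundary = boundary_line(vp, (40, 479))
right_boundary = boundary_line(vp, (600, 479))
```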
- FIG. 3 is a configuration example of the database 20, in which the coordinate value 21 in the vertical direction 13 of a vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated for each type of vehicle 16.
- For image data 10 composed of 480 pixels 11 in the vertical direction 13, the database holds width information of the vehicle 16 for each of the 480 coordinate values 21 in the vertical direction 13.
- The width of a vehicle 16 is largely determined by its class: light car 22, normal car 23, or large car 24.
- The width of a typical normal car 23 is 1,700 mm.
- The image processing apparatus 1 can therefore obtain in advance the number of pixels spanned in the horizontal direction 14 by the width of a vehicle 16 at each coordinate value 21 in the vertical direction 13 of the image data 10.
- The database 20 of the present embodiment stores, as the width information of the vehicle 16, the number of pixels in the horizontal direction 14 of the width of a light car 22, a normal car 23, and a large car 24, associated with each position of the coordinate value 21 in the vertical direction 13.
- The length of the bottom side of the quadrilateral indicating the vehicle 16 on the roadway 15 in the image data 10 is the number of pixels of the width of the vehicle 16.
- This number of pixels changes according to the coordinate in the vertical direction 13 at which the quadrilateral is detected. A sketch of such a lookup is shown below.
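- A minimal sketch of database 20 and a vehicle-type lookup; the pixel widths below are hypothetical placeholders, since the real values depend on the camera geometry:

```python
# Hypothetical database 20: for each vertical coordinate 21 (sampled here
# every 120 rows of a 480-row image), the expected pixel width of a
# light car 22, normal car 23, and large car 24 at that row.
DATABASE_20 = {
    #  y: (light, normal, large)
    120: (12, 14, 20),
    240: (26, 30, 44),
    360: (44, 50, 74),
    479: (62, 72, 104),
}

def classify_vehicle(y, width_px, tolerance=0.15):
    """Return the vehicle type whose expected width at row y is closest
    to the measured width, or None if none matches within tolerance."""
    row = min(DATABASE_20, key=lambda k: abs(k - y))  # nearest stored row
    best, best_err = None, tolerance
    for name, expected in zip(("light", "normal", "large"), DATABASE_20[row]):
        err = abs(width_px - expected) / expected
        if err < best_err:
            best, best_err = name, err
    return best
```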
- The autocorrelation difference calculation unit 4 calculates the difference 39 of the similarity values 38 associated with each pixel 11 of the image data 10 stored in the image memory 3.
- The similarity value 38 is a numerical value indicating how closely the whole of a first area matches the whole of a second area set in the image data 10.
- FIG. 4 is a conceptual diagram of how the autocorrelation difference calculation unit 4 detects the similarity values 38 from the acquired image data 10.
- The calculation target pixel 33 (first pixel) is the pixel 11 in the image data 10 that is the current target of the calculation for obtaining a similarity value 38.
- In principle, the calculation target pixels 33 are all the pixels 11 in the image data 10; however, the calculation of the similarity value 38 can be omitted for pixels 11 located in areas of the image data 10 where the vehicle 16 cannot exist.
- For example, the autocorrelation difference calculation unit 4 can omit the calculation for pixels 11 beyond the roadside belt, in ranges where the vehicle 16 clearly cannot be present.
- The areas for which the autocorrelation difference calculation unit 4 can omit the calculation are set in advance.
- The autocorrelation difference calculation unit 4 calculates the similarity value 38 for all the pixels 11 outside the areas where the calculation can be omitted. Because calculations are omitted for pixels in ranges where the vehicle 16 is clearly assumed not to exist, the amount of computation performed by the image processing apparatus 1 is reduced, and the time until the result is obtained can be shortened.
- The comparison target pixel 35 (second pixel) is the pixel 11 currently used, together with the calculation target pixel 33, in the calculation of a similarity value 38 in the image data 10.
- The calculation area 32 (first area) is the area, centered on the calculation target pixel 33, that is the current subject of the similarity calculation.
- The comparison area 34 (second area) is the area, centered on the comparison target pixel 35, that is compared with the calculation area 32.
- The search range 31 is the range within which comparison target pixels 35 are set for the calculation target pixel 33.
- The search range 31 is set because the similarity value 38 need not be calculated when the calculation target pixel 33 and the comparison target pixel 35 are far apart in the image data 10: when the distance between the coordinates is large, the calculation area 32 and the comparison area 34 cannot be regarded as belonging to a single vehicle 16.
- Calculating similarity values 38 between completely different objects is a cause of false detection of the vehicle 16 in the image data 10.
- The search range 31 is therefore the range around the calculation target pixel 33 in which the vehicle 16 can exist.
- The comparison target pixels 35 are all the pixels 11 within the search range 31. If the search range 31 overlaps a region where the vehicle 16 cannot exist in the image data 10, the autocorrelation difference calculation unit 4 can also omit the calculation of the similarity values 38 for the comparison target pixels 35 in the overlapping range. Omitting these calculations reduces the amount of computation performed by the image processing apparatus 1 and shortens the time needed to obtain the result.
- FIG. 5 is a flowchart of a method for calculating the similarity value 38.
- The autocorrelation difference calculation unit 4 performs the calculation of the similarity value 38 for each calculation target pixel 33 in the image data 10 according to the following procedure.
- The autocorrelation difference calculation unit 4 specifies the coordinates (x, y) of the calculation target pixel 33 (S01).
- The coordinates of the calculation target pixel 33 in FIG. 4 are (x, y).
- "x" is the coordinate of the calculation target pixel 33 in the horizontal direction 14 in the image data 10.
- "y" is the coordinate of the calculation target pixel 33 in the vertical direction 13 in the image data 10.
- The area around the coordinates (x, y) of the calculation target pixel 33 is the calculation area 32.
- Each pixel 11 in the calculation area 32 is denoted (x+i, y+j).
- "i" is a variable that specifies the coordinate in the horizontal direction 14 of a pixel 11 in the area around the coordinates (x, y) of the calculation target pixel 33.
- "j" is a variable that specifies the coordinate in the vertical direction 13 of a pixel 11 in the area around the coordinates (x, y) of the calculation target pixel 33.
- The autocorrelation difference calculation unit 4 specifies the search range 31 (S02).
- The search range 31 has a preset size and is set around the coordinates (x, y) of the calculation target pixel 33.
- The autocorrelation difference calculation unit 4 specifies the coordinates (x+u, y+v) of a comparison target pixel 35 in the search range 31 (S03).
- "u" is a variable that specifies the coordinate in the horizontal direction 14 within the search range 31.
- "v" is a variable that specifies the coordinate in the vertical direction 13 within the search range 31.
- The area around the comparison target pixel 35 is the comparison area 34, and each pixel in the comparison area 34 is denoted (x+u+i, y+v+j).
- "i" is a variable that specifies the coordinate in the horizontal direction 14 of a pixel 11 in the area around the comparison target pixel (x+u, y+v).
- "j" is a variable that specifies the coordinate in the vertical direction 13 of a pixel 11 in the area around the comparison target pixel (x+u, y+v).
- The autocorrelation difference calculation unit 4 calculates the similarity value 38 between the calculation area 32 and the comparison area 34.
- In this calculation, "i" and "j" designate the coordinates (x+i, y+j) of each pixel 11 in the area around the calculation target pixel 33 and the coordinates (x+u+i, y+v+j) of each pixel in the area around the comparison target pixel 35; the values of "i" and "j" are varied, and the pixels at each corresponding pair of coordinates are compared (S04).
- The autocorrelation difference calculation unit 4 calculates the similarity value 38 between the calculation region 32 and the comparison region 34 by one of the following methods: 1. SAD (Sum of Absolute Differences), calculated from the sum of the absolute values of the differences between the luminance of the pixels in the calculation area 32 and the luminance of the pixels in the comparison area 34; 2. SSD (Sum of Squared Differences), calculated from the sum of the squares of those luminance differences; and 3. NCOR (Normalized Correlation), calculated by normalization.
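- As a rough sketch (not the patent's reference implementation), the three measures could be computed on grayscale patches as follows; the patch size and data-type handling are assumptions:

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences: smaller means more similar."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.abs(a - b).sum()

def ssd(a, b):
    """Sum of squared differences: smaller means more similar."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return ((a - b) ** 2).sum()

def ncor(a, b):
    """Normalized correlation: closer to 1 means more similar."""
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0
```

- Note that SAD and SSD decrease as the regions become more similar, while NCOR increases toward 1; this is why the procedure below takes the minimum value for SAD and SSD but "1" minus the maximum value for NCOR.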
- For SAD, the autocorrelation difference calculation unit 4 obtains the luminance difference for each pixel pair between the calculation region 32 and the comparison region 34, and calculates the similarity value 38 of the comparison region 34 as a whole from the sum of these differences.
- Specifically, it acquires the luminance P1(x+i, y+j) of the pixel 11 at each coordinate (x+i, y+j) in the calculation area 32 and the luminance P2(x+u+i, y+v+j) of the pixel 11 at the corresponding coordinate (x+u+i, y+v+j) in the comparison area 34, and calculates the absolute value of the difference between them.
- Equation (1) is used to calculate the sum of the absolute values of the luminance differences, where "t" indicates that the currently acquired image data 10 is the target: SAD_t(x+u, y+v) = Σ_{i, j} | P1_t(x+i, y+j) − P2_t(x+u+i, y+v+j) |
- When calculating the similarity value 38 using SSD, the autocorrelation difference calculation unit 4 calculates the square of the difference between the luminance P1(x+i, y+j) and the luminance P2(x+u+i, y+v+j).
- The autocorrelation difference calculation unit 4 calculates the sum of these squared differences over all the pixels of the calculation region 32 and the comparison region 34.
- The autocorrelation difference calculation unit 4 sets the calculated value as the similarity value 38 corresponding to the comparison target pixel 35. Equation (2) is used to calculate the sum of the squares of the luminance differences; "t" indicates that the currently acquired image data 10 is the target.
- When calculating the similarity value 38 by NCOR, the autocorrelation difference calculation unit 4 performs the calculation of Equation (3); "t" indicates that the currently acquired image data 10 is the target.
- The autocorrelation difference calculation unit 4 repeats this process until the luminance difference has been calculated for every pixel pair between the comparison area 34 and the calculation area 32 (S05: no).
- When the sum of the luminance differences has been calculated (S05: yes), the resulting similarity value 38 (the similarity value calculated from the first region and the second region) is associated with the comparison target pixel 35 and stored (S06).
- The autocorrelation difference calculation unit 4 repeats the process until the similarity value 38 has been calculated between the calculation target pixel 33 and every comparison target pixel 35 within the search range 31 (S07: no).
- When the autocorrelation difference calculation unit 4 has calculated the similarity value 38 for every comparison target pixel 35 in the search range 31 (S07: yes), the difference 39 of the similarity values 38 indicating the feature (the similarity value indicating the feature of the first region), taken from the similarity values 38 between the calculation target pixel 33 and the comparison target pixels 35 in the search range 31, is associated with the calculation target pixel 33 (S08).
- The difference 39 of the similarity value 38 indicating the feature is large when the image of the calculation region 32 includes a corner, and small when the image of the calculation region 32 contains few changes, such as a straight line or a flat surface.
- The difference 39 of the similarity value 38 associated in S08 is used in the vehicle detection processing described later.
- The autocorrelation difference calculation unit 4 obtains the difference 39 of the similarity value 38 to be associated with the calculation target pixel 33 according to the following procedure.
- The comparison target pixels 35 processed within the search range 31 include the calculation target pixel 33 itself.
- The similarity value 38 calculated between the calculation target pixel 33 and itself is the value indicating the greatest similarity among the similarity values 38 of the comparison target pixels 35 in the search range 31, so it must be excluded.
- The autocorrelation difference calculation unit 4 therefore detects the comparison target pixel 35 having the second most similar value 38 in the search range 31. For example, it removes the similarity value 38 of the calculation target pixel 33 with itself and obtains the minimum similarity value 38 among the similarity values 38 stored for the comparison target pixels 35 in the search range 31.
- The autocorrelation difference calculation unit 4 sets the obtained value as the difference 39 of the similarity value 38 associated with the calculation target pixel 33.
- The calculation based on SAD is shown in Equation (4).
- The calculation based on SSD is shown in Equation (5).
- "t" indicates that the currently acquired image data 10 is the target.
- For NCOR, the autocorrelation difference calculation unit 4 likewise detects the comparison target pixel 35 having the second most similar value 38 in the search range 31.
- It excludes the similarity value 38 of the calculation target pixel 33 with itself and obtains the maximum value among the similarity values 38 stored for the comparison target pixels 35 in the search range 31.
- The autocorrelation difference calculation unit 4 sets the value obtained by subtracting this second most similar value 38 from "1" as the difference 39 of the similarity value 38 associated with the calculation target pixel 33.
- This calculation is shown in Equation (6). "t" indicates that the currently acquired image data 10 is the target.
- The difference 39 of the similarity value 38 calculated by the autocorrelation difference calculation unit 4 becomes large when the calculation area 32 contains a corner.
- The vehicle 16 has many corners when viewed from the front: for example, the frame portions on both sides of the windshield, the fender mirrors, the front grille, and the number plate at the front of the vehicle 16. These parts appear in the image data 10 as corners, and the differences 39 calculated there are larger than in regions where no corners appear.
- The value becomes large because the calculation area 32 and the comparison area 34 rarely contain corners of exactly the same shape, so many pixels differ in luminance between the two areas.
- Conversely, when the calculation area 32 contains no corner, the difference 39 of the similarity values 38 calculated by the autocorrelation difference calculation unit 4 is small. Cases without corners include, for example, a region along a straight edge such as the roof of the vehicle 16, or a region such as asphalt composed of similar colors.
- The autocorrelation difference calculation unit 4 performs the above procedure for obtaining the difference 39 of the similarity value 38, taking every pixel 11 in the image data 10 in turn as the calculation target pixel 33 (S09).
- If a difference 39 of the similarity value 38 has not yet been associated with every calculation target pixel 33 in the image data 10 (S09: no), the calculation target pixel 33 is moved to the next pixel 11 in the image data 10 and the difference 39 is calculated in the same way.
- The autocorrelation difference calculation unit 4 then creates a distribution map 50 based on the differences 39 of the similarity values 38 associated with each calculation target pixel 33 in the image data 10 (S10).
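- A minimal sketch of this loop (S01 to S10) using the SAD measure; the window radius and search radius are assumed values, and a real implementation would also skip pixels in regions where no vehicle 16 can exist:

```python
import numpy as np

def distribution_map(img, win=2, search=6):
    """For each calculation target pixel (S01), compare its calculation
    area against every comparison area in the search range (S02-S07),
    exclude the self-match, and keep the smallest remaining SAD (the
    second most similar value) as the difference 39 (S08-S10)."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.zeros((h, w))
    margin = win + search
    for y in range(margin, h - margin):
        for x in range(margin, w - margin):
            ref = img[y - win:y + win + 1, x - win:x + win + 1]
            best = np.inf
            for v in range(-search, search + 1):
                for u in range(-search, search + 1):
                    if u == 0 and v == 0:
                        continue  # exclude the similarity of the pixel with itself
                    cand = img[y + v - win:y + v + win + 1,
                               x + u - win:x + u + win + 1]
                    best = min(best, np.abs(ref - cand).sum())
            out[y, x] = best  # large at corners, small on flat areas and straight edges
    return out
```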
- The distribution map 50 is composed of the differences 39 of the similarity values 38. Pixels 11 with large difference values 39 are distributed over the areas of the image data 10 where corners appear. However, the difference 39 of the similarity value 38 is also large in regions showing the corners of road marks, so road marks cause false detections. The excess background is therefore deleted from the image data 10 stored in the image memory 3.
- The excess background is, for example, a road mark.
- Road marks include pedestrian crossings, the center line of a road, and the lines indicating the boundary of a sidewalk.
- Processing for deleting information related to the pedestrian crossing in the image data 10 is performed as follows.
- A pedestrian crossing drawn in the image data 10 is parallel to the horizontal direction 14.
- The road mark deletion process is performed every time image data 10 is acquired. If road mark areas were deleted once in advance, a vehicle 16 located on a road mark would be deleted together with the area and might not be detected.
- FIG. 6 shows an example of a pedestrian crossing 41.
- The pedestrian crossing 41 is an area in which a white striped pattern 42 is painted on the roadway 15.
- The road mark deletion unit 5 detects the area of the pedestrian crossing 41 from the image data 10 by the following procedure.
- The area of the pedestrian crossing 41 detected by the road mark deletion unit 5 is excluded from the calculation when the vehicle 16 is detected.
- FIG. 7 is a flowchart of the process, executed by the road mark deletion unit 5, for excluding the pedestrian crossing 41 area from the image data 10.
- The road mark deletion unit 5 deletes the edges 47 along line segments directed toward the vanishing point 12 in the image data 10 (S21).
- The road mark deletion unit 5 deletes the edges 48 that appear periodically in the horizontal direction 14 in the image data 10 (S22).
- This is because the striped pattern 42 of the pedestrian crossing 41 runs along the horizontal direction 14 in the image data 10.
- The road mark deletion unit 5 then detects a repeating pattern whose period in the horizontal direction 14 matches the pedestrian crossing 41 (S23). The width 49 of the stripes 42 of the pedestrian crossing 41 is prescribed, so the road mark deletion unit 5 can store in advance the width 49 of the striped pattern 42 in the horizontal direction 14 for each coordinate in the vertical direction 13 in the image data 10.
- Ap(x, y) is the average of the luminance P3(x, y) at the coordinate 43 (x, y) currently being calculated and the luminance P4(x+L, y) at the coordinate 44 (x+L, y), which is moved from coordinate 43 by a predetermined amount L in the horizontal direction 14.
- The predetermined amount L is the width in the horizontal direction 14 of the striped pattern 42 of the pedestrian crossing 41 at the vertical coordinate "y" in the image data 10.
- Because Ap(x, y) is the average of the luminance of P3 and P4, it becomes large when both points lie on the white parts of the pedestrian crossing.
- The road mark deletion unit 5 calculates Ap from Equation (7): Ap_t(x, y) = ( P3_t(x, y) + P4_t(x+L, y) ) / 2
- Am(x, y) is the average of the luminance P5(x−L/2, y) at the coordinate 45 (x−L/2, y), which is moved by −L/2 in the horizontal direction 14 from the coordinate 43 (x, y) currently being calculated, and the luminance P6(x+L/2, y) at the coordinate (x+L/2, y). This average also becomes large when both points lie on the white parts of the pedestrian crossing.
- The road mark deletion unit 5 calculates Am from Equation (8).
- Because P5 and P6 are offset by L/2 in the horizontal direction 14 relative to P3 and P4, Ap and Am are in an inverted luminance relationship: when P3 and P4 lie on the white painted parts of the striped pattern 42 of the pedestrian crossing 41, P5 and P6 lie on the asphalt parts. That is, Am(x, y) is at a minimum when Ap(x, y) is at a maximum, and at a maximum when Ap(x, y) is at a minimum.
- Cp(x, y) is the absolute difference between the luminance P3(x, y) and the luminance P4(x+L, y). It is therefore small when P3 and P4 are in the same state: for example, when both lie on the white parts of the striped pattern 42 of the pedestrian crossing 41, or when both lie on the asphalt parts.
- The road mark deletion unit 5 calculates Cp from Equation (9).
- Cm(x, y) is the absolute difference between the luminance P5(x−L/2, y) and the luminance P6(x+L/2, y). It is likewise small when P5 and P6 are in the same state, for example when both lie on the white parts or both lie on the asphalt parts of the striped pattern 42.
- The road mark deletion unit 5 calculates Cm from Equation (10).
- The road mark deletion unit 5 calculates the feature value Ecw of the pedestrian crossing 41 from Equation (11).
- Ecw takes a large value when Ap and Am are in an inverted relationship and when the absolute differences Cp and Cm are small. The feature value is therefore high in the area of the pedestrian crossing 41.
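- A sketch of this per-pixel feature; the text does not reproduce the body of Equation (11), so the combination |Ap − Am| − (Cp + Cm) used below is an assumption consistent with the description (large when Ap and Am are inverted and the differences Cp and Cm are small):

```python
import numpy as np

def crosswalk_feature(img, x, y, L):
    """Ecw for pixel (x, y); L is the stripe width 49 in pixels at row y.
    Assumes x - L // 2 >= 0 and x + L < image width."""
    img = np.asarray(img, dtype=float)
    p3 = img[y, x]
    p4 = img[y, x + L]
    p5 = img[y, x - L // 2]
    p6 = img[y, x + L // 2]
    ap = (p3 + p4) / 2.0  # equation (7): large on the white stripes
    am = (p5 + p6) / 2.0  # equation (8): inverted relative to Ap
    cp = abs(p3 - p4)     # equation (9): small when P3 and P4 match
    cm = abs(p5 - p6)     # equation (10): small when P5 and P6 match
    # Assumed form of equation (11): large where Ap and Am are inverted
    # and the pairwise differences are small, i.e. on a periodic stripe.
    return abs(ap - am) - (cp + cm)
```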
- The road mark deletion unit 5 associates the calculated Ecw value with the corresponding pixel 11 (S24).
- The road mark deletion unit 5 performs the above calculation for every pixel 11 in the image data 10 (S25).
- The road mark deletion unit 5 excludes the areas determined to be a pedestrian crossing 41 from the processing that identifies the vehicle 16 (S26).
- The road mark deletion unit 5 maps the pixels detected as the pedestrian crossing 41 onto the corresponding pixels in the distribution map 50 created by the autocorrelation difference calculation unit 4, and these pixels are excluded from the subsequent vehicle 16 detection processing.
- Alternatively, the autocorrelation difference calculation unit 4 can create the distribution map 50 from the image data 10 after the pedestrian crossing 41 has been removed by the road mark deletion unit 5.
- In either case, the image processing apparatus 1 can reduce false detections in the processing that identifies the vehicle 16.
- For the original image data 10 shown in FIG. 6(B), the road mark deletion unit 5 obtains the pedestrian crossing 41 area to be deleted as shown in FIG. 6(C).
- The image processing apparatus 1 detects, as the vehicle portion of the image data, an area in which the shape determined by the distribution of the similarity values indicating the features of the calculation areas 32 matches the shape defined in advance as the feature of the vehicle (the subject).
- The image processing apparatus 1 detects the vehicle 16 from the distribution map 50 created by the autocorrelation difference calculation unit 4 and the road mark deletion unit 5.
- The image processing apparatus 1 detects the vehicle in the distribution map 50 using the left-right symmetric shape of the vehicle 16.
- FIG. 8 shows an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50.
- The camera 2 installed in this embodiment photographs the vehicle 16 from the front or the rear.
- The front and the back of the vehicle 16 are left-right symmetric in shape.
- The differences 39 of the similarity values 38 of the pixel group 51 located at the left and right sides of the vehicle 16 are large, because the left and right sides of the vehicle 16 have many corners.
- The differences 39 of the similarity values 38 of the pixel group located at the front part of the vehicle 16 are also large, because the front part has many corners; the front part is moreover symmetric about the center of the vehicle 16.
- The differences 39 of the similarity values 38 of the pixel group 52 located outside the left and right sides of the vehicle 16 are small, because the area outside the vehicle 16 is asphalt.
- The differences 39 of the similarity values 38 of the pixel group 53 located ahead of the front part of the vehicle 16 are likewise small, because that area is also asphalt. Note that, for a vehicle 16 facing forward, the area ahead of the front part corresponds to coordinates in the vertical direction 13 below the vehicle 16 in the two-dimensional image data 10.
- A pattern that characterizes the vehicle is defined in advance.
- In this embodiment, the pattern that is a feature of the vehicle 16 is defined by conditions on the distribution of the differences 39 of the similarity values 38: large values at the left and right sides and at the front of the vehicle, and small values outside the sides and below the front, as expressed by expressions (12) to (16) below.
- The pattern feature quantity calculation unit 6 of the image processing apparatus 1 detects regions satisfying these conditions in the distribution map 50.
- FIG. 9 is a flowchart of the processing, executed by the pattern feature quantity calculation unit 6, for detecting the area where the vehicle 16 exists from the distribution map 50.
- The pattern feature quantity calculation unit 6 performs the following calculation for each pixel 54 in the distribution map 50 of FIG. 8(B). First, it sets the pixel 54 currently being calculated as the detection target pixel 55 (S31). It then calculates the feature quantity using the detection target pixel 55 as the reference position (S32). Specifically, it obtains the results of expressions (12) to (16) below for the distribution of the differences 39 of the similarity values 38. "t" indicates that the currently acquired image data 10 is the target.
- Eleft 56, obtained by expression (12), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by "k1" and "l1" with the detection target pixel 55 as the reference coordinate.
- Eright 57, obtained by expression (13), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by "k2" and "l2" with the detection target pixel 55 as the reference position.
- These regions correspond to the range from the center of the vehicle 16 to the left and right side portions.
- The side portions and the front portion of the vehicle 16 have many corners, so many large values of the difference 39 of the similarity values 38 are distributed there.
- "t" indicates that the currently acquired image data 10 is the target.
- Eside 58, obtained by expression (14), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by "k3" and "l3" on the left side in the distribution map 50, with the detection target pixel 55 as the reference position.
- Eside 58 corresponds to the area outside the side of the vehicle 16.
- The area outside the side of the vehicle 16 is asphalt, so small values of the difference 39 of the similarity values 38 predominate there.
- Ebottom 59, obtained by expression (15), is the sum of the differences 39 of the similarity values 38 of the pixels 54 distributed below the detection target pixel 55.
- Ebottom 59 corresponds to the area under the vehicle 16.
- When the reference position of the detection target pixel 55 is the lower end of the front part of the vehicle 16, the Ebottom 59 area is road, so the value of Ebottom 59 is small.
- The pattern feature quantity calculation unit 6 substitutes the values calculated by expressions (12) to (15) into expression (16) to obtain the value 60 E(x, y), which indicates how strongly the detection target pixel 55 exhibits the features of the vehicle 16. "t" indicates that the currently acquired image data 10 is the target.
- E(x, y) becomes large when the values of Eleft 56 and Eright 57 are large.
- E(x, y) becomes small when Eleft 56 and Eright 57 are out of balance.
- E(x, y) becomes small when the value of Eside 58 is large.
- E(x, y) becomes large when the value of Ebottom 59 is small.
- The values Cside and Cbottom are weights set by the designer according to the road conditions photographed by the camera 2.
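- The bodies of expressions (12) to (16) are not reproduced in the text, so the sketch below uses illustrative region sizes and an assumed combination in which large, balanced left/right responses raise the score while Eside and Ebottom, weighted by Cside and Cbottom, lower it:

```python
import numpy as np

def pattern_feature(dmap, x, y, k=10, l=6, cside=1.0, cbottom=1.0):
    """E(x, y) over the distribution map 50, with (x, y) taken as the
    lower end of the vehicle front; assumes (x, y) is at least 2*k from
    the left border and l from the top and bottom borders."""
    d = np.asarray(dmap, dtype=float)
    eleft = d[y - l:y, x - k:x].sum()          # expr. (12): left half of the front
    eright = d[y - l:y, x:x + k].sum()         # expr. (13): right half of the front
    eside = d[y - l:y, x - 2 * k:x - k].sum()  # expr. (14): outside the left side
    ebottom = d[y:y + l, x - k:x + k].sum()    # expr. (15): road below the front
    balance = abs(eleft - eright)              # symmetric vehicles keep this small
    # Assumed form of expr. (16): strong balanced corner responses score
    # high; responses outside the side and below the front are penalized.
    return eleft + eright - balance - cside * eside - cbottom * ebottom
```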
- The pattern feature quantity calculation unit 6 associates the calculated E(x, y) with the detection target pixel 55 (S33).
- The pattern feature quantity calculation unit 6 performs the above processing for all the pixels 54 in the distribution map 50 (S34).
- From the calculation results, the pattern feature quantity calculation unit 6 can acquire the coordinates in the vertical direction 13 and the horizontal direction 14 of areas in the distribution map 50 where the vehicle 16 is considered to exist.
- Because the autocorrelation difference calculation unit 4 obtains the difference 39 of the similarity value 38 for each calculation target pixel 33 of the original image data 10 and builds the distribution map 50 from these differences 39, the influence of shadows and the like appearing in the image data 10 is reduced.
- The symmetry feature quantity calculation unit 7 of the image processing apparatus 1 performs an operation for obtaining the left-right center coordinate of the vehicle 16 from the distribution map 50.
- FIG. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16. The shape of the vehicle 16 is left-right symmetric, so the differences 39 of the similarity values 38 are distributed symmetrically about the center of the vehicle 16. The symmetry feature quantity calculation unit 7 therefore specifies the position of the vehicle 16 using this left-right symmetry, which is evaluated by the following procedure.
- FIG. 11 is a flowchart of the processing, executed by the symmetry feature quantity calculation unit 7, for detecting the area where the vehicle 16 exists from the distribution map 50.
- The symmetry feature quantity calculation unit 7 sets the calculation center coordinate 61 (x, y) in FIG. 10 as the coordinate currently being calculated (S41).
- The symmetry feature quantity calculation unit 7 calculates the feature quantity of the calculation center coordinate 61 (x, y) (S42).
- The symmetry feature quantity calculation unit 7 acquires the luminance at each coordinate 63 "x+i, y", shifted in the positive horizontal direction 14 from the horizontal coordinate "x" of the calculation center coordinate 61 (x, y), and at each coordinate 62 "x−i, y", shifted in the negative horizontal direction 14. "i" is determined appropriately within the range over which the symmetry should be evaluated.
- The range of "i" is set to half the width of the vehicle 16 corresponding to the coordinate in the vertical direction 13 in the image data 10.
- The symmetry feature quantity calculation unit 7 denotes the luminance at the coordinate 63 (x+i, y) as P7(x+i, y), and the luminance at the coordinate 62 (x−i, y) as P8(x−i, y).
- The symmetry feature quantity calculation unit 7 calculates the sum of the luminance P7 and the luminance P8 as S(i) in Equation (17).
- S(i) takes a large value when the luminance P7 and the luminance P8 are both large.
- "t" indicates that the currently acquired image data 10 is the target.
- The symmetry feature quantity calculation unit 7 calculates the difference between the luminance P7 and the luminance P8 as D(i) in Equation (18).
- D(i) is small when P7 and P8 are close in value, so D(i) represents the balance between the luminance P7 and the luminance P8. "t" indicates that the currently acquired image data 10 is the target.
- The symmetry feature quantity calculation unit 7 finds a line passing through the center of the vehicle using Esym(x, y) in Equation (19).
- Esym(x, y) is the value obtained by subtracting the sum of the absolute values of D(i) from the sum of the absolute values of S(i) as "i" is varied over the specified range around the calculation center coordinate 61 (x, y). Esym(x, y) therefore takes a large value where the left and right luminances in the evaluated region are equal and both large.
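- A sketch of equations (17) to (19) as just described; the half-width of the evaluation range would come from the database 20 and is passed in directly here (assumes x ± half_width stays inside the image):

```python
import numpy as np

def symmetry_feature(img, x, y, half_width):
    """Esym(x, y): large when the luminance to the left and right of
    column x is both balanced and bright (equations (17)-(19))."""
    img = np.asarray(img, dtype=float)
    esym = 0.0
    for i in range(1, half_width + 1):
        p7 = img[y, x + i]          # right of the candidate center line
        p8 = img[y, x - i]          # left of the candidate center line
        s = p7 + p8                 # equation (17): total brightness
        d = p7 - p8                 # equation (18): left-right imbalance
        esym += abs(s) - abs(d)     # equation (19): accumulate over the range
    return esym
```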
- The symmetry feature quantity calculation unit 7 associates the Esym(x, y) calculated in S42 with the calculation center coordinate 61 (S43).
- The symmetry feature quantity calculation unit 7 performs the above calculation for all the pixels 54 in the distribution map 50.
- The line passing through the center of the vehicle 16 can also be detected by Equation (20), which obtains the center of gravity Rsym(x, y) of the vehicle 16. "t" indicates that the currently acquired image data 10 is the target.
- The vehicle position determination unit 8 obtains, from the recognition of the pattern of the vehicle 16 by the pattern feature quantity calculation unit 6, the coordinates in the vertical direction 13 and the horizontal direction 14 at which the vehicle 16 exists in the distribution map 50.
- When the vehicle 16 faces forward in the image data 10, the coordinate in the vertical direction 13 is the lower end of the front of the vehicle 16.
- The vehicle position determination unit 8 also obtains, from the processing by the symmetry feature quantity calculation unit 7 for finding the line through the left-right center of the vehicle 16, the coordinate in the horizontal direction 14 of the center of the vehicle 16 in the distribution map 50.
- The vehicle position determination unit 8 superimposes the calculation result of the pattern feature quantity calculation unit 6 and the calculation result of the symmetry feature quantity calculation unit 7 to obtain the reference coordinates at which the vehicle 16 exists in the image data 10.
- From the acquired reference coordinates, the vehicle position determination unit 8 detects the luminance peaks of the image data 10 or the distribution map 50 and acquires the number of pixels indicating the width of the front portion of the vehicle 16.
- The vehicle position determination unit 8 then searches the database 20 with the reference coordinate of the vehicle 16 in the vertical direction 13 and the number of pixels of the front-portion width obtained by the peak detection.
- In this way, the image processing apparatus 1 can detect the position of the vehicle 16 in the image data 10 and specify the vehicle type of the vehicle 16.
- FIG. 12 is an example of vehicle detection according to the present embodiment.
- The original image data (A) is an image taken by the camera 2.
- The vehicle portion (B) is the area of the original image data (A) in which the vehicle appears.
- Detection (C) is the detection range obtained by the processing of the present embodiment.
- Detection by background difference shows the detection range obtained with the conventional background difference method.
- Edge detection shows the detection range obtained with the conventional edge detection method. With the background difference method and the edge detection method, the range of the vehicle alone cannot be detected because of the influence of shadows.
- FIG. 13 is a hardware configuration example of the image processing apparatus 1 of the present embodiment.
- The image processing apparatus 1 is composed of a CPU 101, the camera 2, a memory 103, an output unit 104, and a bus 105.
- The CPU 101, the camera 2, the memory 103, and the output unit 104 are connected by the bus 105 and exchange data with one another.
- The CPU 101 is a central processing unit that reads the image processing program 100 stored in the memory 103 into a work area and executes it.
- The image processing program 100 causes the CPU 101 of the image processing apparatus 1 to function as the autocorrelation difference calculation unit 4, the road mark deletion unit 5, the pattern feature quantity calculation unit 6, the symmetry feature quantity calculation unit 7, and the vehicle position determination unit 8.
- The memory 103 stores various information of the image processing apparatus 1; it is, for example, a RAM or a hard disk device.
- The memory 103 also serves as a work area when the CPU 101 performs calculations, and stores the information of the database 20.
- The output unit 104 is an interface for outputting the position of the vehicle 16 detected in the image data 10 and the specified vehicle type.
- The output destination is, for example, a wireless terminal of another vehicle.
- The other vehicle receives the information that the vehicle 16 exists in the image data 10.
- The interface communicates with wireless terminals using a wireless protocol (for example, the procedures defined in IEEE 802.11).
- In the present embodiment, the difference 39 of the similarity values 38 between the pixels 11 is used as the basic feature quantity for detecting the vehicle 16.
- The feature quantity based on the difference 39 of the similarity value 38 takes large values at curves and corners, and the left-right symmetry of the front or back of the vehicle is also exploited. The feature quantity is therefore effective for detecting the front or back of the vehicle 16.
- The feature quantity based on the difference 39 of the similarity value 38 responds weakly to straight parts and parts with small curvature, and therefore responds little to linear features such as shadows or regions that tend to have simple shapes. As a result, false detections due to shadows and due to road surface reflections can be reduced.
- When the present embodiment is applied to a system that acquires the image data 10 at regular time intervals, the moving speed of the vehicle 16 can also be calculated by acquiring the difference between the positions of the vehicle 16 in successive sets of image data 10.
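- For example, if the detected positions in two frames have already been converted from image coordinates to distance along the roadway (a mapping that would come from the camera geometry and is assumed here), the speed follows directly:

```python
def vehicle_speed(pos1_m, pos2_m, t1_s, t2_s):
    """Moving speed in m/s from two positions along the roadway (in
    metres) detected at two frame times (in seconds)."""
    return (pos2_m - pos1_m) / (t2_s - t1_s)

# Example: the vehicle 16 advanced 8.2 m between frames 0.5 s apart -> 16.4 m/s.
speed = vehicle_speed(0.0, 8.2, 0.0, 0.5)
```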
- Although the above calculation processing uses individual pixels 11 as its unit, an area obtained by grouping a plurality of pixels 11 in the image data 10 can also be used as the unit of calculation. That is, when the image is resized to reduce the number of pixels of the digital image data 10, a plurality of pixels 11 are merged into one pixel; when the image is resized to increase the number of pixels, new pixels 11 are generated by interpolation from the relationship with the surrounding pixels 11. Performing detection that treats a plurality of pixels 11 as a single region is therefore substantially the same as resizing the image data 10 and then performing the detection processing for each pixel 11.
- The calculation area 32, the comparison area 34, and the search range 31 set by the autocorrelation difference calculation unit 4 have so far been described as uniform over the image data 10.
- However, the search range 31, the calculation region 32, and the comparison region 34 can be made smaller for regions that are far away when the scene is viewed as a three-dimensional space.
- Conversely, the search range 31, the calculation region 32, and the comparison region 34 can be enlarged for regions that are near in the three-dimensional space.
- By changing these ranges, the amount of computation executed by the autocorrelation difference calculation unit 4 can be reduced, and the detection accuracy can be adapted to each coordinate position. Specifically, data that changes the ranges of the search range 31, the calculation region 32, and the comparison region 34 according to the coordinate value in the vertical direction 13 of the calculation target pixel 33 in the image data 10 can be provided in advance.
- The similarity value can also be calculated not only from luminance but from a combination of lightness, hue, and saturation.
- The present invention can be used in a system that reports the state of vehicles in image data, since it enables the position of a vehicle in image data to be detected with a small amount of computation.
Description
明 細 書 Specification
画像データ認識方法、画像処理装置および画像データ認識プログラム 技術分野 Image data recognition method, image processing apparatus, and image data recognition program
[0001] 本発明は、画像データ認識に関する。特に、画像データ内の被写体を検出する画 像データ認識方法に関する。 [0001] The present invention relates to image data recognition. In particular, the present invention relates to an image data recognition method for detecting a subject in image data.
背景技術 Background art
[0002] 交通に関するシステムにおいて、固定されたカメラにより取得した画像データおよび 移動体に搭載したカメラにより取得した画像データから車両や歩行者を検出する技 術がある。車両や歩行者を検出する技術は、例えば、交差点の画像データを取得し 、交差点内の車両を検出し、検出された車両や歩行者の存在を走行中の車両に提 示することに利用される。車両や歩行者の存在を走行中の車両に提示することにより 、車両の運転者は交差点内の状況を未然に知ることができる。 In a system related to traffic, there is a technique for detecting a vehicle or a pedestrian from image data acquired by a fixed camera and image data acquired by a camera mounted on a moving body. The technology for detecting a vehicle or a pedestrian is used, for example, to acquire image data of an intersection, detect a vehicle in the intersection, and present the detected vehicle or pedestrian to the traveling vehicle. The By presenting the presence of a vehicle or a pedestrian to the traveling vehicle, the driver of the vehicle can know the situation in the intersection in advance.
[0003] Methods for extracting a vehicle to be detected from image data include the background subtraction method and the edge detection method. The background subtraction method acquires image data of the background without any objects in advance and extracts objects by comparing it with the image data from which objects are to be extracted. The edge detection method detects edges in the image data and identifies the target vehicle by comparing template data based on prestored edge data with the detected edge data.
[0004] Because the background subtraction method detects a region that includes the vehicle's shadow and the like, it cannot accurately detect the position of the vehicle alone in the image data (Patent Document 1). The edge detection method also detects shadows. Furthermore, because it identifies an object by comparing the shape data of the edge-detected portion with shape data stored in advance in a database of object shapes, the detection processing takes time (Patent Document 2).
Patent Document 1: Japanese Laid-Open Patent Publication No. 2005-190142
Patent Document 2: Japanese Laid-Open Patent Publication No. 2000-148977
Disclosure of the Invention
[0005] (Problems to Be Solved by the Invention) An object of the present invention is to provide an image data recognition method that detects a subject in image data without being affected by luminance and with a small amount of computation.
(Means for Solving the Problems)
To provide a method for detecting a subject in image data that solves the above problems, the present invention has the following configuration. First, a first pixel, a first region centered on the first pixel, a plurality of second pixels to be compared with the first pixel, and a plurality of second regions each centered on one of the second pixels are defined. Next, a similarity value calculated from the first region and each second region is associated with the corresponding second pixel. Next, among the similarity values associated with the plurality of second pixels, the similarity value that indicates the feature of the first region is associated with the first pixel. Finally, a region where the shape determined by the distribution of such feature-indicating similarity values matches a shape defined in advance as the feature of the subject is detected as the portion of the image data containing the subject.
[0006] Further, the ranges are made large for regions that lie toward the front of the three-dimensional space captured in the image data, and small for regions that lie toward the back of the three-dimensional space. This configuration makes it possible to reduce the amount of computation.
[0007] Further, the shape defined as the feature of the subject is characterized by being a condition that defines the shape of a vehicle.
[0008] Further, a coordinate where large similarity values are distributed on both sides of the detection range, the similarity values on the two sides are symmetric, and small similarity values are distributed below the detection range is characterized as being taken as the coordinate of the lowest part of the front of the vehicle.
[0009] Further, for the vehicle, the horizontal center position of the vehicle in the image data is obtained from the symmetry of the distribution of the vehicle's similarity values, and the vertical position is identified from the shape of that distribution and the condition defining the shape of the vehicle.
(Effects of the Invention)
According to the present invention, a similarity value is detected for each pixel in the image data based on the region centered on that pixel, and a region of the image data whose distribution of similarity values matches a predetermined condition becomes the detection target. Therefore, it is possible to detect the subject in the image data without being affected by luminance and with a small amount of computation.
Brief Description of the Drawings
[0010] [Fig. 1] Fig. 1 is a configuration diagram of the image processing apparatus 1 of this embodiment.
[Fig. 2] Fig. 2 shows image data captured by the camera 2 of this embodiment.
[Fig. 3] Fig. 3 is a configuration example of the database 20 that associates, for each type of vehicle 16, the vertical-direction 13 coordinate of a vehicle 16 in the image data 10 with the width of the vehicle 16.
[Fig. 4] Fig. 4 is a conceptual diagram of detecting similarity values 38 from the acquired image data 10.
[Fig. 5] Fig. 5 is a flowchart of the method for calculating the similarity values 38.
[Fig. 6] Fig. 6 shows an example of a crosswalk 41.
[Fig. 7] Fig. 7 is a flowchart of the processing, executed by the road mark deletion unit 5, for excluding the crosswalk 41 region from the image data 10.
[Fig. 8] Fig. 8 shows an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50.
[Fig. 9] Fig. 9 is a flowchart of the processing, executed by the pattern feature quantity calculation unit 6, for detecting the region where a vehicle 16 exists from the distribution map 50.
[Fig. 10] Fig. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16.
[Fig. 11] Fig. 11 is a flowchart of the processing, executed by the symmetry feature quantity calculation unit 7, for detecting the region where a vehicle 16 exists from the distribution map 50.
[Fig. 12] Fig. 12 is an example of vehicle detection according to this embodiment.
[Fig. 13] Fig. 13 is a hardware configuration example of the image processing apparatus 1 of this embodiment.
Explanation of Reference Numerals
[0011]
1 Image processing apparatus
2 Camera
3 Image memory
4 Autocorrelation difference calculation unit
5 Road mark deletion unit
6 Pattern feature quantity calculation unit
7 Symmetry feature quantity calculation unit
8 Vehicle position determination unit
10 Image data
11 Pixel
12 Vanishing point
13 Vertical direction
14 Horizontal direction
15 Roadway
16 Vehicle
17 Point indicating the nearest boundary between the roadway and the roadside belt
20 Database
22 Light vehicle
23 Standard-sized vehicle
24 Large vehicle
31 Search range
32 Calculation region
33 Calculation target pixel
35 Comparison target pixel
34 Comparison region
38 Similarity value
39 Difference
41 Crosswalk
42 Stripe pattern
43 Coordinate point subject to calculation
44 Coordinate moved in the horizontal direction 14 by the predetermined amount L
45 Coordinate moved in the horizontal direction 14 by the predetermined amount (-L/2)
46 Coordinate moved in the horizontal direction 14 by the predetermined amount (L/2)
47 Edge along a line segment toward the vanishing point 12
48 Horizontal edge
50 Distribution map
51 Pixel group located on the left and right of the vehicle 16
52 Pixel group located outside the left and right sides of the vehicle 16
53 Pixel group located in front of the vehicle 16
54 Pixel
55 Detection target pixel
56 Eleft
57 Eright
58 Eside
59 Ebottom
60 Value indicating the magnitude of the vehicle features
61 Calculation center coordinates
101 CPU
103 Memory
104 Output unit
105 Bus
Best Mode for Carrying Out the Invention
[0012] This embodiment describes the operation of acquiring vehicles from the image data of a fixed camera at an intersection.
[0013] Fig. 1 is a configuration diagram of the image processing apparatus 1 of this embodiment. The image processing apparatus 1 of the present invention comprises a camera 2, an image memory 3, an autocorrelation difference calculation unit 4, a road mark deletion unit 5, a pattern feature quantity calculation unit 6, a symmetry feature quantity calculation unit 7, and a vehicle position determination unit 8.
[0014] The image data of the intersection captured by the camera 2 is stored in the image memory 3. The autocorrelation difference calculation unit 4 creates a distribution map of differences of similarity values from the image data stored in the image memory 3: it obtains the difference of similarity values for each pixel and builds the distribution map from these differences. The road mark deletion unit 5 deletes the regions of road marks appearing in the image data stored in the image memory 3. The image processing apparatus 1 thus calculates the differences of similarity values from the original image data and obtains a distribution map from which road marks have been removed. Next, the pattern feature quantity calculation unit 6 detects the pattern of a vehicle from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4, and from the vehicle pattern acquires the vertical coordinate in the image data indicating the foremost or rearmost part of the vehicle. When detecting the vehicle pattern, the pattern feature quantity calculation unit 6 also takes into account the region information deleted by the road mark deletion unit 5. The symmetry feature quantity calculation unit 7 computes the symmetry of the vehicle from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4, and from this symmetry acquires the center line on which the vehicle lies in the image data. The vehicle position determination unit 8 determines the horizontal position of the vehicle in the image from the results of the pattern feature quantity calculation unit 6 and the symmetry feature quantity calculation unit 7. Further, the vehicle position determination unit 8 acquires the vehicle width determined from the position of the vehicle, and identifies the vehicle type by searching the database with the acquired vehicle width and the vertical position in the image data.
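The overall flow through these units can be pictured with the following structural sketch. The function names and placeholder bodies are illustrative only; the actual algorithms of each unit are described in the paragraphs that follow.

```python
# Structural sketch of the Fig. 1 pipeline. The stage names mirror the units
# of the apparatus; the bodies are placeholders, not the patented algorithms.
import numpy as np

def autocorrelation_difference(frame: np.ndarray) -> np.ndarray:
    # Unit 4: per-pixel difference of similarity values (placeholder).
    return np.zeros(frame.shape, dtype=np.float64)

def road_mark_mask(frame: np.ndarray) -> np.ndarray:
    # Unit 5: boolean mask of crosswalk pixels to exclude (placeholder).
    return np.zeros(frame.shape, dtype=bool)

def process_frame(frame: np.ndarray) -> np.ndarray:
    acd_map = autocorrelation_difference(frame)
    acd_map[road_mark_mask(frame)] = 0.0  # drop road-mark responses
    # Units 6-8 would scan acd_map for the vehicle pattern, its symmetry
    # axis, and finally the vehicle position and type.
    return acd_map
```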
[0015] Next, the image data that the image processing apparatus 1 acquires with the camera 2 will be described. Fig. 2 shows image data 10 captured by the camera 2 of this embodiment. The image data 10 is stored in the image memory 3 in digital form and is composed of a set of pixels 11, a pixel 11 being the smallest unit constituting the image data 10. In the following description, the vertical direction of the image data 10 means the direction of the vertical direction 13, and the horizontal direction means the direction of the horizontal direction 14. The vertical coordinate is the coordinate that specifies a position along the vertical direction 13 in the image data 10, and the horizontal coordinate is the coordinate that specifies a position along the horizontal direction 14. The vanishing point 12 is the coordinate that is deepest in the scene when the image data 10, projected in two dimensions, is viewed as the original three-dimensional space.
[0016] When the camera 2 is installed at a location looking down from above, the vanishing point 12 of the image data 10 is generally placed at an upper position along the vertical direction 13 of the image data 10. Placing the vanishing point 12 at an upper position makes it possible to enlarge the area of the image data 10 in which the roadway 15 appears. The image processing apparatus 1 can therefore resolve the vertical direction 13 of the roadway 15 with more vertical coordinates than when the vanishing point 12 is at a lower position in the image data 10, and the finer this vertical resolution, the more accurately the image processing apparatus 1 can acquire the position of a vehicle 16 on the roadway 15. On the other hand, when the camera 2 is mounted on a vehicle 16, the camera 2 is generally arranged to capture the scene horizontally with respect to the ground and is attached at a low position such as the front part of the vehicle. Consequently, if the camera 2 were angled downward, it could no longer capture the top of a vehicle traveling ahead.
[0017] The width of the roadway 15 is set so that, at a lower position along the vertical direction 13 of the image data 10, it equals the width of the roadway to be acquired. At this time, the points 17 indicating the nearest boundary between the roadway 15 and the roadside belt are at the bottom of the image data 10. By arranging the view so that the width of the roadway 15 is greatest at the lower position of the image data 10, the camera 2 can capture the width of a vehicle 16 at a large scale in the foreground of the image data 10. Capturing the vehicle width at a large scale lets the image processing apparatus 1 resolve it with more pixels, so the width information of the vehicle 16 can be acquired accurately. The angle of the camera 2 with respect to the vanishing point 12 and the roadway 15 width can be changed according to road conditions and the objects to be acquired.
[0018] Next, the data that the image processing apparatus 1 holds for detecting vehicles 16 will be described.
The image processing apparatus 1 holds the coordinate information of the vanishing point 12 in the image data 10. It also holds the coordinates of the points 17 indicating the nearest boundary between the roadway 15 and the roadside belt; these boundary points 17 are two points, one on each side of the road. From the straight lines connecting the coordinates of the vanishing point 12 with the coordinates of the boundary points 17, the image processing apparatus 1 obtains the straight lines of the boundary between the roadway 15 and the roadside belt. The image processing apparatus 1 further has a database 20 that associates, for each type of vehicle 16, the vertical-direction 13 coordinate of a vehicle 16 in the image data 10 with the width of the vehicle 16.
[0019] Fig. 3 is a configuration example of the database 20 that associates, for each type of vehicle 16, the vertical-direction 13 coordinate value 21 of a vehicle 16 in the image data 10 with the width of the vehicle 16. For example, for image data 10 whose vertical direction 13 consists of 480 pixels 11, the database holds width information of the vehicle 16 for each of the 480 coordinate values 21 of the vertical direction 13. The lateral width of a vehicle 16 is largely determined by whether it is a light vehicle 22, a standard-sized vehicle 23, or a large vehicle 24; for example, the width of a typical standard-sized vehicle 23 is 1700 mm. The image processing apparatus 1 can therefore determine in advance the number of pixels corresponding to the width of a vehicle 16 in the horizontal direction 14 at each vertical-direction 13 coordinate value 21 in the image data 10. The database 20 of this embodiment stores, as the width information of the vehicle 16, the number of pixels making up the horizontal-direction 14 width of a light vehicle 22, a standard-sized vehicle 23, a large vehicle 24, and so on, in association with each coordinate value 21 of the vertical direction 13.
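A minimal sketch of how such a table could be built under a simple linear perspective model follows. The real-world widths, the scaling model, and all constants are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch of the Fig. 3 database: expected vehicle width in
# pixels for each vertical coordinate value 21 (0..479) and vehicle type.
# The linear perspective model and all constants are illustrative.

REAL_WIDTH_MM = {"light": 1480, "standard": 1700, "large": 2500}

def width_pixels(y: int, vehicle_type: str, vanishing_y: int = 40,
                 bottom_y: int = 479, mm_per_pixel_at_bottom: float = 8.0) -> int:
    """Expected width in pixels of a vehicle whose base sits at row y."""
    # Width shrinks linearly toward the vanishing point in this simple model.
    scale = max((y - vanishing_y) / (bottom_y - vanishing_y), 0.0)
    return round(REAL_WIDTH_MM[vehicle_type] / mm_per_pixel_at_bottom * scale)

# Build the lookup table once; query it per detected vehicle.
TABLE = {(y, t): width_pixels(y, t) for y in range(480) for t in REAL_WIDTH_MM}
print(TABLE[(479, "standard")], TABLE[(240, "standard")])
```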
[0020] In terms of Fig. 2, the length of the bottom edge of the rectangle indicating a vehicle 16 on the roadway 15 in the image data 10 is the number of pixels corresponding to the vehicle width, and this number of pixels changes according to the vertical-direction 13 coordinate at which the rectangle is detected.
[0021] Next, the process by which the autocorrelation difference calculation unit 4 of the image processing apparatus 1 creates the distribution map of similarity-value differences will be described. The autocorrelation difference calculation unit 4 calculates the difference 39 of the similarity values 38 associated with each pixel 11 of the image data 10 stored in the image memory 3. A similarity value 38 is a numerical expression of how closely a first region and a second region set in the image data 10 approximate each other, taken as whole regions, when compared.
[0022] Fig. 4 is a conceptual diagram of how the autocorrelation difference calculation unit 4 detects similarity values 38 from the acquired image data 10.
[0023] The calculation target pixel 33 (first pixel) is the pixel 11 in the image data 10 for which similarity values 38 are currently being obtained. Every pixel 11 in the image data 10 becomes a calculation target pixel 33. However, the calculation of similarity values 38 may be omitted for pixels 11 located in regions of the image data 10 where a vehicle 16 cannot exist. For example, the autocorrelation difference calculation unit 4 can omit the calculation for pixels 11 beyond the roadside belt, in ranges where a vehicle 16 clearly cannot be present. The regions for which the calculation can be omitted are set in advance, and the autocorrelation difference calculation unit 4 calculates similarity values 38 for all pixels 11 outside those regions. Because omitting the calculation for pixels where a vehicle 16 clearly cannot exist reduces the amount of computation performed by the image processing apparatus 1, the time needed to obtain the calculation result can be shortened.
[0024] The comparison target pixel 35 (second pixel) is the pixel 11 currently subject to the calculation of a similarity value 38 with respect to the calculation target pixel 33 in the image data 10.
[0025] The calculation region 32 (first region) is the region, centered on the calculation target pixel 33, that is the current basis of comparison for obtaining similarity values 38. The comparison region 34 (second region) is the region, centered on the comparison target pixel 35, that is compared against the calculation region 32.
[0026] The search range 31 is the range within which comparison target pixels 35 are set for a calculation target pixel 33. The search range 31 is set because there is no need to calculate a similarity value 38 when the distance between the coordinates of the calculation target pixel 33 and the comparison target pixel 35 in the image data 10 is large: when the coordinates are far apart, the calculation region 32 and the comparison region 34 can be assumed not to cover the same single vehicle 16, and calculating similarity values 38 between entirely different objects would cause erroneous detection of vehicles 16 in the image data 10. For example, the search range 31 is set to the range, centered on the calculation target pixel 33, in which a vehicle 16 can exist.
[0027] Every pixel 11 within the search range 31 becomes a comparison target pixel 35. When the search range 31 overlaps a region of the image data 10 where a vehicle 16 cannot exist, the autocorrelation difference calculation unit 4 may omit the calculation of similarity values 38 for the comparison target pixels 35 in the overlapping range. Omitting these calculations reduces the amount of computation performed by the image processing apparatus 1, shortening the time needed to obtain the calculation result.
[0028] Next, the method by which the autocorrelation difference calculation unit 4 calculates the similarity values 38 will be described. Fig. 5 is a flowchart of the calculation method. The autocorrelation difference calculation unit 4 performs the following procedure for each calculation target pixel 33 in the image data 10. First it specifies the coordinates (x, y) of the calculation target pixel 33 (S01). Let the coordinates of the calculation target pixel 33 in Fig. 4 be (x, y), where "x" is the horizontal-direction 14 coordinate value and "y" is the vertical-direction 13 coordinate value of the calculation target pixel 33 in the image data 10. The region around the coordinates (x, y) of the calculation target pixel 33 is the calculation region 32, and each pixel 11 in the calculation region 32 is denoted (x+i, y+j), where "i" is the variable specifying the horizontal-direction 14 coordinate and "j" is the variable specifying the vertical-direction 13 coordinate of a pixel 11 in the region around the calculation target pixel 33.
[0029] The autocorrelation difference calculation unit 4 specifies the search range 31 (S02). The search range 31 has a preset size and is set around the coordinates (x, y) of the calculation target pixel 33.
[0030] The autocorrelation difference calculation unit 4 specifies the coordinates (x+u, y+v) of a comparison target pixel 35 within the search range 31 (S03), where "u" is the variable specifying the horizontal-direction 14 coordinate and "v" is the variable specifying the vertical-direction 13 coordinate within the search range 31. The region around the comparison target pixel 35 is the comparison region 34, and each pixel in the comparison region 34 is denoted (x+u+i, y+v+j), where "i" specifies the horizontal-direction 14 coordinate and "j" specifies the vertical-direction 13 coordinate of a pixel 11 in the region around the comparison target pixel (x+u, y+v).
[0031] Next, the autocorrelation difference calculation unit 4 calculates the similarity value 38 between the calculation region 32 and the comparison region 34. The calculation is performed by varying the values of "i" and "j" in (x+i, y+j), the coordinates of each pixel 11 in the region around the calculation target pixel 33, and in (x+u+i, y+v+j), the coordinates of each pixel 11 in the region around the comparison target pixel 35, and comparing the pixels at corresponding coordinates (S04).
[0032] For example, the autocorrelation difference calculation unit 4 calculates the similarity value 38 between the calculation region 32 and the comparison region 34 by one of the following methods: 1. SAD (Sum of Absolute Differences), which computes the sum of the absolute values of the differences between the luminance of the pixels in the calculation region 32 and the luminance of the pixels in the comparison region 34; 2. SSD (Sum of Squared Differences), which computes the sum of the squares of those luminance differences; and 3. NCOR (Normalized Correlation), which computes a normalized correlation. By these methods, the autocorrelation difference calculation unit 4 obtains the luminance difference between corresponding pixels of the calculation region 32 and the comparison region 34 and calculates from them the similarity of the comparison region 34 as a whole.
[0033] When calculating the similarity value 38 by SAD, the autocorrelation difference calculation unit 4 performs the following computation. It acquires the luminance P1(x+i, y+j) of the pixel 11 at each coordinate (x+i, y+j) in the calculation region 32 and the luminance P2(x+u+i, y+v+j) of the pixel 11 at the coordinate (x+u+i, y+v+j) in the comparison region 34, and computes the absolute value of the difference between the two. It then computes the sum of these absolute differences over all pixels 11 of the calculation region 32 and the comparison region 34, and takes the resulting value as the similarity value 38 corresponding to the comparison target pixel 35. The sum of the absolute values of the luminance differences is given by Equation (1), where "t" indicates that the currently acquired image data 10 is the target.
[0034] [Equation 1]

$$SAD_{(t,x,y,u,v)} = \sum_{i}\sum_{j}\left|P_{(t,\,x+i,\,y+j)} - P_{(t,\,x+u+i,\,y+v+j)}\right| \qquad (1)$$
[0035] When calculating the similarity value 38 by SSD, the autocorrelation difference calculation unit 4 performs the following computation.
[0036] It computes the square of the luminance difference between P1(x+i, y+j) and P2(x+u+i, y+v+j), computes the sum of these squared differences over all pixels of the calculation region 32 and the comparison region 34, and takes the resulting value as the similarity value 38 corresponding to the comparison target pixel 35. The sum of the squared luminance differences is given by Equation (2), where "t" indicates that the currently acquired image data 10 is the target.
[0037] [Equation 2]

$$SSD_{(t,x,y,u,v)} = \sum_{i}\sum_{j}\left(P_{(t,\,x+i,\,y+j)} - P_{(t,\,x+u+i,\,y+v+j)}\right)^{2} \qquad (2)$$
[0038] When calculating the similarity value 38 by NCOR, the autocorrelation difference calculation unit 4 performs the computation of Equation (3), where "t" indicates that the currently acquired image data 10 is the target.
[0039] [Equation 3]

$$NCOR_{(t,x,y,u,v)} = \frac{\displaystyle\sum_{i}\sum_{j} P_{(t,\,x+i,\,y+j)} \cdot P_{(t,\,x+u+i,\,y+v+j)}}{\sqrt{\displaystyle\sum_{i}\sum_{j} P_{(t,\,x+i,\,y+j)}^{2}}\;\sqrt{\displaystyle\sum_{i}\sum_{j} P_{(t,\,x+u+i,\,y+v+j)}^{2}}} \qquad (3)$$
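A compact sketch of the three measures applied to two equal-sized luminance regions follows. The formulas come from Equations (1) through (3); the NumPy-based implementation details are our own.

```python
# Sketch of the three similarity measures of Equations (1)-(3), applied to
# two equal-sized luminance regions given as NumPy arrays.
import numpy as np

def sad(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(np.float64), b.astype(np.float64)
    return float(np.abs(a - b).sum())        # Equation (1): 0 means identical

def ssd(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(np.float64), b.astype(np.float64)
    return float(((a - b) ** 2).sum())       # Equation (2): 0 means identical

def ncor(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(np.float64), b.astype(np.float64)
    denom = np.sqrt((a * a).sum()) * np.sqrt((b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0  # Equation (3): 1 means identical
```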
[0040] The autocorrelation difference calculation unit 4 repeats the process until the luminance differences have been calculated for all pixels between the comparison region 34 and the calculation region 32 (S05: no). When it has calculated the sum of the luminance differences (S05: yes), it stores the result as the similarity value 38 (the similarity value calculated from the first region and the second region) in association with the comparison target pixel 35 (S06).
[0041] The autocorrelation difference calculation unit 4 repeats the process until the similarity values 38 between the calculation target pixel 33 and all comparison target pixels 35 in the search range 31 have been calculated (S07: no). When the similarity values 38 have been calculated for all comparison target pixels 35 in the search range 31 (S07: yes), it associates with the calculation target pixel 33, out of the differences 39 of the similarity values 38 held by the comparison target pixels 35 in the search range 31, the difference 39 of the similarity value 38 that indicates the feature (the similarity value indicating the feature of the first region) (S08). The feature-indicating difference 39 of the similarity value 38 becomes a large value when the image in the calculation region 32 contains a corner, and a small value when the image in the calculation region 32 shows little variation, such as a straight line or an asphalt surface. The difference 39 of the similarity value 38 associated in S08 is used in the vehicle detection processing described later.
[0042] Here, the method by which the autocorrelation difference calculation unit 4 detects, in S08, the difference 39 of the similarity value 38 to be associated with the calculation target pixel 33 will be described. The autocorrelation difference calculation unit 4 computes this difference 39 by the following procedure.
[0043] The comparison target pixels 35 over which the autocorrelation difference calculation unit 4 computes within the search range 31 include the calculation target pixel 33 itself. The similarity value 38 calculated between the calculation target pixel 33 and itself is the value indicating the greatest similarity among the similarity values 38 of the comparison target pixels 35 in the search range 31. Therefore, the similarity value 38 calculated with the calculation target pixel 33 itself must be excluded.
[0044] When the values are calculated by SAD or SSD, the calculation target pixel 33 compared with itself is identical, so the similarity value 38 indicating the greatest similarity is 0. In this case, when the calculation target pixel 33 itself is included among the comparison target pixels 35, the autocorrelation difference calculation unit 4 detects the comparison target pixel 35 having the second-place similarity value 38 in the search range 31. For example, it excludes the similarity value 38 computed between the calculation target pixel 33 and itself and acquires the minimum of the similarity values 38 stored for the comparison target pixels 35 in the search range 31. The autocorrelation difference calculation unit 4 takes the acquired value as the difference 39 of the similarity value 38 associated with the calculation target pixel 33. The computation based on SAD is shown in Equation (4), and the computation based on SSD in Equation (5), where "t" indicates that the currently acquired image data 10 is the target.
[0045] [Equation 4]

$$ACD_{(t,x,y)} = \min_{(u,v)\neq(0,0)} SAD_{(t,x,y,u,v)} \qquad (4)$$

[0046] [Equation 5]

$$ACD_{(t,x,y)} = \min_{(u,v)\neq(0,0)} SSD_{(t,x,y,u,v)} \qquad (5)$$
[0048] [数 6] [0048] [Equation 6]
ACD 、 = 1— min NCOR 、 式 ί6) ACD, = 1—min NCOR, formula ί6)
ν , ' ノ ιι, ν(ιι = ν≠ 0) ν ノ ν, 'ノ ιιι, ν (ιι = ν ≠ 0) ν ノ
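Putting Equations (1) and (4) together, the ACD value for a single calculation target pixel can be sketched as follows; the region and search sizes and the border assumption are illustrative, not values from the specification.

```python
# Sketch combining Equations (1) and (4): the difference-of-similarity (ACD)
# value for one calculation target pixel, using SAD. (x, y) is assumed to lie
# far enough from the image border that every slice below is fully inside it.
import numpy as np

def acd_at(img: np.ndarray, x: int, y: int,
           half_region: int = 2, half_search: int = 8) -> float:
    """Smallest non-self SAD between the region around (x, y) and every
    region whose center lies inside the search range."""
    img = img.astype(np.float64)
    ref = img[y - half_region:y + half_region + 1,
              x - half_region:x + half_region + 1]
    best = np.inf
    for v in range(-half_search, half_search + 1):
        for u in range(-half_search, half_search + 1):
            if u == 0 and v == 0:
                continue  # Exclude the pixel itself; its SAD is always 0.
            cand = img[y + v - half_region:y + v + half_region + 1,
                       x + u - half_region:x + u + half_region + 1]
            best = min(best, float(np.abs(ref - cand).sum()))
    return best  # Large near corners, small on straight edges and asphalt.
```

Applying such a function to every calculation target pixel then yields the distribution map 50 described below.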
[0049] Next, the characteristics of the second-place similarity-value difference 39 will be described. The difference 39 calculated by the autocorrelation difference calculation unit 4 has the property of becoming large when the calculation region 32 contains a corner. A vehicle 16 seen from the front has many corners: for example, the frame portions on both sides of the windshield, the fender mirrors, the front grille, and the front license plate. These parts of the vehicle 16 appear in the image data 10 as corners, and the similarity values 38 calculated there are larger than those of regions where no corners appear. The similarity value 38 becomes large because the calculation region 32 and the comparison region 34 rarely contain corners of identical shape, and because many pixels then differ in luminance between the calculation region 32 and the comparison region 34.
[0050] On the other hand, when the calculation region 32 contains no corner, the difference 39 of the similarity values 38 calculated by the autocorrelation difference calculation unit 4 is small. Regions without corners are, for example, regions covering a straight edge such as the roof line of a vehicle 16, or regions of uniform color such as asphalt.
[0051] The autocorrelation difference calculation unit 4 performs the above computation, which obtains the difference 39 of the similarity value 38 for a calculation target pixel 33, with every pixel 11 in the image data 10 as the calculation target pixel 33 (S09). If a difference 39 has not yet been associated with every calculation target pixel 33 in the image data 10 (S09: no), the calculation target pixel 33 is moved to the next pixel 11 in the image data 10 and the difference 39 of the similarity value 38 is calculated in the same way. When a difference 39 of the similarity value 38 has been associated with every calculation target pixel 33 in the image data 10 (S09: yes), the autocorrelation difference calculation unit 4 creates the distribution map 50 from the differences 39 associated with the calculation target pixels 33 (S10).
[0052] Here, road marks in the image data 10 will be described. The distribution map 50 consists of the differences 39 of the similarity values 38, and pixels 11 with large difference 39 values are concentrated in the regions of the image data 10 where corners appear. However, the difference 39 also becomes large in regions where the corners of road marks appear, so such road marks cause erroneous detection. Therefore, unnecessary background portions are deleted from the image data 10 stored in the image memory 3. Unnecessary background portions are, for example, road marks: crosswalks, the center line of the road, and lines marking the boundary with the sidewalk. In this embodiment, processing that deletes the information related to crosswalks from the image data 10 is performed, and the crosswalk is assumed to be drawn in the image data 10 parallel to the horizontal direction 14. The road mark deletion processing is performed every time image data 10 is acquired: if the road mark regions were deleted in advance, the road mark region would be deleted even when a vehicle 16 is on the road mark, and the vehicle 16 on the road mark might not be detected.
[0053] Fig. 6 shows an example of a crosswalk 41. The crosswalk 41 in Figs. 6(A), (B), and (C) is a region in which a white stripe pattern 42 is painted on the roadway 15. The road mark deletion unit 5 detects the crosswalk 41 region from the image data 10 by the following procedure, and the detected crosswalk 41 region is excluded from the computation during vehicle 16 detection. Fig. 7 is a flowchart of the processing, executed by the road mark deletion unit 5, for excluding the crosswalk 41 region from the image data 10.
[0054] First, the road mark deletion unit 5 erases the edges 47 that lie along line segments pointing toward the vanishing point 12 in the image data 10 (S21). Next, it erases the horizontal-direction 14 edges 48 that occur periodically in the image data 10 (S22), since the stripe pattern 42 of a crosswalk 41 extends in the horizontal direction 14 of the image data 10. Next, the road mark deletion unit 5 detects repeating patterns whose horizontal-direction 14 period length matches that of a crosswalk 41 (S23). Because the width 49 of the stripes 42 of a crosswalk 41 is regulated, the road mark deletion unit 5 can store in advance the horizontal-direction 14 stripe width 49 corresponding to each vertical-direction 13 coordinate of the image data 10.
[0055] Ap(x, y) is the average of the luminance P3(x, y) at the coordinate 43 (x, y) currently subject to calculation and the luminance P4(x+L, y) at the coordinate 44 (x+L, y) obtained by moving from (x, y) in the horizontal direction 14 by the predetermined amount L. The predetermined amount L is the horizontal-direction 14 width of the crosswalk 41 stripe pattern 42 at the vertical-direction 13 coordinate "y" of the image data 10. Since Ap(x, y) is the average of the luminances P3 and P4, it becomes large on the white portions of the crosswalk. The road mark deletion unit 5 obtains Ap from Equation (7).
[0056] [Equation 7]

$$Ap_{(x,y)} = \frac{P3_{(x,y)} + P4_{(x+L,y)}}{2} \qquad (7)$$
[0057] Am(x, y) is the average of the luminance P5(x-L/2, y) at the coordinate 45 (x-L/2, y), obtained by moving from the coordinate 43 (x, y) currently subject to calculation in the horizontal direction 14 by the predetermined amount (-L/2), and the luminance P6(x+L/2, y) at the coordinate 46 (x+L/2, y), obtained by moving in the horizontal direction 14 by the predetermined amount (L/2). This average luminance becomes large on the white portions of the crosswalk. The road mark deletion unit 5 obtains Am from Equation (8).
[0058] [Equation 8]

$$Am_{(x,y)} = \frac{P5_{(x-L/2,\,y)} + P6_{(x+L/2,\,y)}}{2} \qquad (8)$$
[0059] The relationship between Ap and Am is one of inverted luminance, because P5 and P6 are at coordinates shifted by L/2 in the horizontal direction 14 relative to P3 and P4. The inverted relationship means that when P3 and P4 lie on the white-painted stripes 42 of the crosswalk 41, P5 and P6 lie on the asphalt between the stripes 42. That is, when Ap(x, y) is at a local maximum, Am(x, y) is at a local minimum, and when Ap(x, y) is at a local minimum, Am(x, y) is at a local maximum.
[0060] Cp(x, y) is the absolute value of the difference between the luminance P3(x, y) and the luminance P4(x+L, y). It therefore takes a small value when P3 and P4 are in the same state: for example, when P3 and P4 both lie on the white-painted stripes 42 of the crosswalk 41, or both lie on the asphalt between the stripes 42. The road mark deletion unit 5 obtains Cp from Equation (9).
[0061] [Equation 9]

$$Cp_{(x,y)} = \left| P3_{(x,y)} - P4_{(x+L,y)} \right| \qquad (9)$$
[0062] Cm(x, y) is the absolute value of the difference between the luminance P5(x-L/2, y) and the luminance P6(x+L/2, y). It therefore takes a small value when P5 and P6 are in the same state: for example, when P5 and P6 both lie on the white-painted stripes 42 of the crosswalk 41, or both lie on the asphalt between the stripes. The road mark deletion unit 5 obtains Cm from Equation (10).
[0063] [Equation 10]

$$Cm_{(x,y)} = \left| P5_{(x-L/2,\,y)} - P6_{(x+L/2,\,y)} \right| \qquad (10)$$
[0064] Based on the calculated Ap, Am, Cp, and Cm, the road mark deletion unit 5 calculates the crosswalk 41 feature point Ecw from Equation (11).
[0065] [Equation 11]

$$Ecw_{(x,y)} = \left| Ap_{(x,y)} - Am_{(x,y)} \right| - \left( Cp_{(x,y)} + Cm_{(x,y)} \right) \qquad (11)$$
[0066] Ecw takes a large value when Ap and Am are in the inverted relationship, and also when the absolute differences Cp and Cm are small. Therefore, it shows high feature values in the region of a crosswalk 41.
[0067] The road mark deletion unit 5 associates the calculated Ecw value with the corresponding pixel 11 (S24) and performs the above computation for each pixel 11 in the image data 10 (S25). The road mark deletion unit 5 excludes the regions judged to be a crosswalk 41 from the processing that identifies vehicles 16 (S26). Specifically, the pixels detected by the road mark deletion unit 5 as the crosswalk 41 are matched against the pixels of the distribution map 50 created by the autocorrelation difference calculation unit 4 and are excluded from the subsequent vehicle 16 detection processing. The autocorrelation difference calculation unit 4 can also create the distribution map 50 from the image data 10 after the crosswalk 41 has been removed by the road mark deletion unit 5. By excluding the crosswalk 41 region, the image processing apparatus 1 can reduce erroneous detection of vehicles 16 in the processing that identifies vehicles 16. For the original image data 10 of Fig. 6(B), the road mark deletion unit 5 obtains the crosswalk 41 region to be deleted shown in Fig. 6(C).
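Collecting Equations (7) through (11), with Equation (11) taken as reconstructed above, the per-pixel crosswalk feature could be sketched as follows. The stripe period L for each row is assumed to come from the prestored width table, and (x, y) is assumed to lie far enough from the image border for all four samples to exist.

```python
# Sketch of Equations (7)-(11) for a single pixel. L is the stripe period in
# pixels at row y; img holds luminance values.
import numpy as np

def ecw_at(img: np.ndarray, x: int, y: int, L: int) -> float:
    img = img.astype(np.float64)
    p3, p4 = img[y, x], img[y, x + L]                  # one period apart
    p5, p6 = img[y, x - L // 2], img[y, x + L // 2]    # half a period offset
    ap = (p3 + p4) / 2.0        # Equation (7): large on white stripe paint
    am = (p5 + p6) / 2.0        # Equation (8): inverted relative to Ap
    cp = abs(p3 - p4)           # Equation (9): small when P3 and P4 match
    cm = abs(p5 - p6)           # Equation (10): small when P5 and P6 match
    return abs(ap - am) - (cp + cm)   # Equation (11): large on a crosswalk
```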
[0068] Next, the image processing apparatus 1 detects, as the vehicle portion of the image data, a region where the shape determined by the distribution of the feature-indicating similarity values of the calculation regions 32 matches the shape defined in advance as the feature of a vehicle (the subject). The image processing apparatus 1 detects vehicles 16 from the distribution map 50 created by the autocorrelation difference calculation unit 4 and the road mark deletion unit 5.
[0069] First, the relationship between a vehicle 16 and the differences 39 of the similarity values 38 (the feature-indicating similarity values) will be described. The image processing apparatus 1 detects vehicles in the distribution map 50 by using the left-right symmetric shape of the vehicle 16. Fig. 8 shows an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50. The camera 2 installed in this embodiment captures vehicles 16 from the front or from behind, and the front and rear of a vehicle 16 are left-right symmetric in shape.
[0070] The differences 39 of the similarity values 38 for the pixel group 51 located at the left and right of the vehicle 16 are large, because the left and right sides of the vehicle 16 contain many corners. The differences 39 of the similarity values 38 for the pixel group located at the front portion of the vehicle 16 are also large, because the front portion likewise contains many corners; the front portion is also symmetric about the center of the vehicle 16. The differences 39 of the similarity values 38 for the pixel group 52 located outside the left and right side surfaces of the vehicle 16 are small, because the area outside the vehicle 16 is asphalt. The differences 39 of the similarity values 38 for the pixel group 53 located in front of the front portion of the vehicle 16, that is, ahead of the vehicle 16, are likewise small for the same reason: the area outside the vehicle 16 is asphalt, so small similarity values are distributed there. Note that, for a vehicle 16 facing toward the camera, the area ahead of its front portion corresponds to coordinates in the vertical direction 13 below the vehicle 16 in the two-dimensional image data 10.
[0071] Based on the states, in the distribution map 50, of the pixel group 51 located at the left and right of the vehicle 16, the pixel group 52 located outside the left and right side surfaces of the vehicle 16, and the pixel group 53 located in front of the vehicle 16, a pattern characterizing the vehicle is defined in advance. The pattern characterizing the vehicle 16 in this embodiment is defined by the following conditions.
• Condition 1. A pixel group 51 with large differences 39 of the similarity values 38 is distributed at the left and right of the vehicle 16.
• Condition 2. A pixel group 52 with small differences 39 of the similarity values 38 is distributed in the regions further outside the two side surfaces of the vehicle 16.
• Condition 3. A pixel group 53 with small differences 39 of the similarity values 38 is distributed below the vehicle 16.
[0072] The pattern feature quantity calculation unit 6 of the image processing apparatus 1 detects regions satisfying the above conditions in the distribution map 50.
[0073] FIG. 9 is a flowchart of the processing, executed by the pattern feature quantity calculation unit 6, for detecting the region in which the vehicle 16 exists from the distribution map 50. The pattern feature quantity calculation unit 6 performs the following calculation for each pixel 54 of the distribution map 50 in FIG. 8. First, the pattern feature quantity calculation unit 6 sets the pixel 54 that is the current calculation target in the distribution map 50 as the detection target pixel 55 (S31). The pattern feature quantity calculation unit 6 then calculates the feature quantity using the detection target pixel 55 as the reference position (S32). Specifically, the pattern feature quantity calculation unit 6 obtains the state of the distribution of the differences 39 of the similarity values 38 as the results of equations (12) to (16) below. Here, "t" indicates that the currently acquired image data 10 is the target.
[0074] [Equation 12]

Eleft(t, x, y) = Σ_(k1, l1) Δ(t, x + k1, y + l1)   (12)

[0075] [Equation 13]

Eright(t, x, y) = Σ_(k2, l2) Δ(t, x + k2, y + l2)   (13)

Here Δ(t, x, y) denotes the difference 39 of the similarity value 38 at the pixel (x, y) of the image data 10 acquired at time t.
[0076] Eleft 56, obtained from equation (12), is the sum of the differences 39 of the similarity values 38 of the pixels 54 within the region defined by "k1" and "l1" with the detection target pixel 55 as the reference coordinates. Eright 57, obtained from equation (13), is the sum of the differences 39 of the similarity values 38 of the pixels 54 within the region defined by "k2" and "l2" with the detection target pixel 55 as the reference position. Eleft 56 and Eright 57 correspond to the ranges from the center of the vehicle 16 to its left and right side portions. The side portions and the front portion of the vehicle 16 contain many corners, so large differences 39 of the similarity values 38 are distributed there. Here, "t" indicates that the currently acquired image data 10 is the target.
[0077] [Equation 14]

Eside(t, x, y) = Σ_(k3, l3) Δ(t, x + k3, y + l3)   (14)

[0078] [Equation 15]

Ebottom(t, x, y) = Σ_(k4, l4) Δ(t, x + k4, y + l4)   (15)
[0079] Eside 58, obtained from equation (14), is the sum of the differences 39 of the similarity values 38 of the pixels 54 within the region defined by "k3" and "l3" on the left side of the distribution map 50, with the detection target pixel 55 as the reference position. Eside 58 corresponds to the area further outside the side surface of the vehicle 16. That region is asphalt, so small differences 39 of the similarity values 38 are distributed there.
[0080] Ebottom 59, obtained from equation (15), is the sum of the differences 39 of the similarity values 38 of the pixels 54 within the region defined by "k4" and "l4" on the lower side of the distribution map 50, with the detection target pixel 55 as the reference position. Ebottom 59 corresponds to the area below the vehicle 16. When the reference position of the detection target pixel 55 is the lower end of the front portion of the vehicle 16, the region of Ebottom 59 is road surface, and the value of Ebottom 59 is therefore small.
[0081] The pattern feature quantity calculation unit 6 substitutes the values calculated by equations (12) to (15) into equation (16) below to obtain the value 60, E(x, y), indicating the strength of the vehicle 16 feature at the detection target pixel 55. Here, "t" indicates that the currently acquired image data 10 is the target.
[0082] [Equation 16]

E(t, x, y) = Eleft(t, x, y) + Eright(t, x, y) − |Eleft(t, x, y) − Eright(t, x, y)| − Cside · Eside(t, x, y) − Cbottom · Ebottom(t, x, y)   (16)
[0083] E(x, y) is large when the values of Eleft 56 and Eright 57 are large. When the detection target pixel 55 lies at the center of the vehicle 16, Eleft 56 and Eright 57 are positioned symmetrically to the left and right, so their values are close; the term |Eleft − Eright| in equation (16) is then small, and E(x, y) remains large for a balanced pair. Because Eside 58 and Ebottom 59 enter equation (16) with negative weights, E(x, y) becomes small when Eside 58 is large and when Ebottom 59 is large; for a true vehicle both correspond to quiet, road-like regions and stay small. The values of Cside and Cbottom are set by the designer according to the conditions of the road photographed by the camera 2 and the like. The pattern feature quantity calculation unit 6 associates the calculated E(x, y) with the detection target pixel 55 (S33). A sketch of this per-pixel evaluation follows.
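For illustration only, a minimal Python sketch of the evaluation of equations (12) to (16) at a single detection target pixel 55 follows. The encoding of the regions as lists of (dx, dy) offsets, the function name, and the default weights are assumptions, and equation (16) is used in the reconstructed form given above.

```python
def vehicle_pattern_feature(diff_map, x, y,
                            left_offsets, right_offsets,
                            side_offsets, bottom_offsets,
                            c_side=1.0, c_bottom=1.0):
    """Evaluate E(x, y) of equation (16) at one detection target pixel.

    diff_map holds the differences 39 of the similarity values 38 and
    is indexed as diff_map[y][x]; all offsets are (dx, dy) pairs
    relative to the detection target pixel and are assumed to stay
    inside the image.
    """
    def region_sum(offsets):
        return sum(diff_map[y + dy][x + dx] for dx, dy in offsets)

    e_left = region_sum(left_offsets)      # equation (12)
    e_right = region_sum(right_offsets)    # equation (13)
    e_side = region_sum(side_offsets)      # equation (14)
    e_bottom = region_sum(bottom_offsets)  # equation (15)

    # Equation (16): large for strong, balanced left/right responses
    # with quiet (asphalt-like) regions beside and below the vehicle.
    return (e_left + e_right
            - abs(e_left - e_right)
            - c_side * e_side
            - c_bottom * e_bottom)
```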
[0084] The pattern feature quantity calculation unit 6 performs the above processing for all pixels 54 in the distribution map 50 (S34). From the calculation results, the pattern feature quantity calculation unit 6 can obtain the coordinates in the vertical direction 13 and in the horizontal direction 14 of the region in which the vehicle 16 is considered to exist in the distribution map 50.
[0085] In the vehicle 16 detection calculation by the pattern feature quantity calculation unit 6, the influence of shadows and the like appearing in the image data 10 is reduced, because the autocorrelation difference calculation unit 4 obtains, for the original image data 10, the differences 39 of the similarity values 38 referenced to the operation target pixel 33 and creates the distribution map 50 from those differences 39.
[0086] Meanwhile, the symmetry feature quantity calculation unit 7 of the image processing apparatus 1 performs a calculation for obtaining the left-right center coordinates of the vehicle 16 from the distribution map 50. FIG. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16. The shape of the vehicle 16 is left-right symmetric, so the differences 39 of the similarity values 38 are distributed symmetrically to the left and right of the center of the vehicle 16. The symmetry feature quantity calculation unit 7 therefore identifies the position of the vehicle 16 using the left-right symmetry of the vehicle 16. The symmetry is evaluated by the following procedure.
[0087] FIG. 11 is a flowchart of the processing, executed by the symmetry feature quantity calculation unit 7, for detecting the region in which the vehicle 16 exists from the distribution map 50.
[0088] The symmetry feature quantity calculation unit 7 sets the calculation center coordinates 61 (x, y) of FIG. 10 as the coordinates subject to the current calculation (S41). Next, the symmetry feature quantity calculation unit 7 calculates the feature quantity of the calculation center coordinates 61 (x, y) (S42). Specifically, centered on the coordinate "x" in the horizontal direction 14 of the calculation center coordinates 61 (x, y), it obtains the luminance at the coordinate 63 displaced by "+i" in the positive horizontal direction 14 and the luminance at the coordinate 62 displaced by "−i" in the negative horizontal direction 14. The value "i" is chosen appropriately over the range in which symmetry is to be evaluated; for example, the range of "i" is set to half the width of a vehicle 16 corresponding to the coordinate in the vertical direction 13 in the image data 10. The symmetry feature quantity calculation unit 7 denotes the coordinate 63, displaced from the calculation center coordinates 61 (x, y) in the positive horizontal direction 14, as "x + i, y" with luminance P7(x + i, y), and the coordinate 62, displaced in the negative horizontal direction 14, as "x − i, y" with luminance P8(x − i, y).
[0089] The symmetry feature quantity calculation unit 7 calculates the sum of the current luminances P7 and P8 as S(i) of equation (17).
[0090] [Equation 17]

S(i) = P7(t, x + i, y) + P8(t, x − i, y)   (17)
[0091] S(i) takes a large value when both the luminance P7 and the luminance P8 are large. Here, "t" indicates that the currently acquired image data 10 is the target. The symmetry feature quantity calculation unit 7 calculates the difference between the current luminances P7 and P8 as D(i) of equation (18).
[0092] [Equation 18]

D(i) = P7(t, x + i, y) − P8(t, x − i, y)   (18)

[0093] D(i) takes a small value when the luminances P7 and P8 are close in value; D(i) therefore indicates the balance between the luminances P7 and P8. Here, "t" indicates that the currently acquired image data 10 is the target.
[0094] The symmetry feature quantity calculation unit 7 calculates a measure of the line passing through the center of the vehicle as Esym(x, y) of equation (19).
[0095] [Equation 19]

Esym(t, x, y) = Σ_i |S(i)| − Σ_i |D(i)|   (19)
[0096] Esym(x, y) is the value obtained by subtracting the sum of the absolute values of D(i) from the sum of the absolute values of S(i) as "i" is varied over the predetermined range about the calculation center coordinates 61 (x, y). Esym(x, y) therefore takes a large value when the left and right luminances within the region subject to the calculation are equal and large. The symmetry feature quantity calculation unit 7 associates the Esym(x, y) calculated in S42 with the calculation center coordinates 61 (S43).
[0097] By performing the above calculation for all pixels 54 in the distribution map 50 (S44), the symmetry feature quantity calculation unit 7 can detect the line passing through the center of the vehicle 16 in the distribution map 50. A sketch of this per-column symmetry evaluation follows.
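For illustration only, the following Python sketch evaluates equations (17) to (19) at one calculation center coordinate 61; the parameter half_width stands for the range of "i" (for example, half the vehicle width expected at that vertical coordinate), and the function name and indexing convention are assumptions.

```python
def symmetry_feature(values, x, y, half_width):
    """Esym(x, y) of equation (19) at one calculation center coordinate.

    values is a 2-D array (the distribution map 50 or the image data 10)
    indexed as values[y][x]; x - half_width and x + half_width are
    assumed to lie inside the array.
    """
    esym = 0.0
    for i in range(1, half_width + 1):
        p7 = float(values[y][x + i])   # luminance at coordinate 63
        p8 = float(values[y][x - i])   # luminance at coordinate 62
        s = p7 + p8                    # equation (17)
        d = p7 - p8                    # equation (18)
        esym += abs(s) - abs(d)        # summand of equation (19)
    return esym
```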
[0098] The symmetry feature quantity calculation unit 7 can also detect the line passing through the center of the vehicle 16 using equation (20), which obtains the center of gravity Rsym(x, y) of the vehicle 16. Here, "t" indicates that the currently acquired image data 10 is the target.
[0099] [Equation 20]
[0100] Next, the processing performed by the vehicle position determination unit 8 will be described.
[0101] The vehicle position determination unit 8 obtains, from the pattern recognition processing of the vehicle 16 by the pattern feature quantity calculation unit 6, the coordinates in the vertical direction 13 and in the horizontal direction 14 at which the vehicle 16 exists in the distribution map 50. When the vehicle 16 faces toward the camera in the image data 10, the coordinate in the vertical direction 13 corresponds to the lower end of the front of the vehicle 16. The vehicle position determination unit 8 also obtains, from the processing by the symmetry feature quantity calculation unit 7 for finding the line through the left-right center of the vehicle 16, the coordinate in the horizontal direction 14 of the center of the vehicle 16 in the distribution map 50. The vehicle position determination unit 8 superimposes the calculation result of the pattern feature quantity calculation unit 6 and the calculation result of the symmetry feature quantity calculation unit 7 to obtain the reference coordinates at which the vehicle 16 exists in the image data 10. From the obtained reference coordinates, the vehicle position determination unit 8 detects the luminance peaks of the image data 10 or of the distribution map 50 and obtains the number of pixels representing the width of the front portion of the vehicle 16.
[0102] From the reference coordinate in the vertical direction 13 of the vehicle 16 and the pixel count of the width of the front portion of the vehicle 16 obtained by peak detection, the matching vehicle type is retrieved from the database 20. Through the above, the image processing apparatus 1 can detect the position of the vehicle 16 in the image data 10 and identify the vehicle type of the vehicle 16. A sketch of this combination step follows.
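For illustration only, one way of superimposing the two results and reading off the front width is sketched below under heavy assumptions: superposition is taken as a plain sum of the two feature maps, the width is measured where the response stays above half the peak, and the database 20 lookup is reduced to a hypothetical dictionary keyed by (vertical coordinate, width). None of these choices is prescribed by the patent.

```python
import numpy as np

def locate_vehicle(e_map, esym_map, vehicle_db):
    """Return the reference coordinates, the pixel width of the vehicle
    front, and the vehicle type matched in a hypothetical database."""
    combined = e_map + esym_map                  # superimpose both results
    y, x = np.unravel_index(np.argmax(combined), combined.shape)
    row = combined[y]
    half = combined[y, x] / 2.0                  # half-peak criterion
    left = x
    while left > 0 and row[left - 1] >= half:
        left -= 1
    right = x
    while right < row.size - 1 and row[right + 1] >= half:
        right += 1
    width = right - left + 1
    return (x, y), width, vehicle_db.get((y, width))
```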
[0103] FIG. 12 shows an example of vehicle detection according to the present embodiment.
[0104] The original image data (A) is an image captured by the camera 2. The vehicle portion (B) is the region of the original image data (A) from which the vehicle portion has been extracted. The detection by the present embodiment (C) shows the detection range obtained by the processing of this embodiment. The detection by background subtraction (D) shows the detection range obtained by the conventional background subtraction method, and the edge detection (E) shows the detection range obtained by the conventional edge detection method. Because the background subtraction method and the edge detection method are affected by the shadow portion, neither can detect the range of the vehicle alone.
[0105] FIG. 13 shows an example hardware configuration of the image processing apparatus 1 of the present embodiment. The image processing apparatus 1 comprises a CPU 101, the camera 2, a memory 103, an output unit 104, and a bus 105. The CPU 101, the camera 2, the memory 103, and the output unit 104 are connected by the bus 105 and exchange data with one another. The CPU 101 is a central processing unit that reads the image processing program 100 stored in the memory 103 into a work area and executes it. The image processing program 100 causes the CPU 101 of the image processing apparatus 1 to function as the autocorrelation difference calculation unit 4, the road mark deletion unit 5, the pattern feature quantity calculation unit 6, the symmetry feature quantity calculation unit 7, and the vehicle position determination unit 8.
[0106] The memory 103 stores various information of the image processing apparatus 1; it is, for example, a RAM or a hard disk device. The memory 103 also serves as the work area when the CPU 101 performs calculations, and stores the information held in the database 20.
[0107] The output unit 104 is an interface for outputting the position of the vehicle 16 detected in the image data 10 and the information of the identified vehicle type. For example, the output destination is a wireless terminal carried by another vehicle; the other vehicle receives information indicating that the vehicle 16 exists in the image data 10. In the case of wireless communication, the interface communicates with the wireless terminal using a wireless protocol (for example, the procedures defined in IEEE 802.11).
[0108] The present embodiment uses the difference 39 of the similarity values 38 between pixels 11 as the basic feature quantity for detecting the vehicle 16. This feature quantity takes large values at curves and corners, and the embodiment additionally exploits the left-right symmetry of the front or rear of a vehicle; it is therefore effective for detecting the front or rear of the vehicle 16. Conversely, the feature quantity responds weakly to straight portions and to portions with little curvature, and hence to straight-edged features such as shadows and to portions that tend to have simple shapes. As a result, the feature quantity based on the difference 39 of the similarity values 38 can reduce erroneous detections caused by shadows and by road surface reflections.
[0109] Furthermore, when the present embodiment is applied to a system that acquires the image data 10 at fixed time intervals, the moving speed of the vehicle 16 can be calculated by obtaining the difference between the positions at which the vehicle 16 exists in successive items of image data 10, as in the sketch below.
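For illustration only, a minimal sketch of such a speed estimate, assuming a fixed frame interval dt in seconds and a rough meters-per-pixel calibration; both parameters are assumptions, since the patent leaves the camera geometry open.

```python
def vehicle_speed(pos_prev, pos_curr, meters_per_pixel, dt):
    """Approximate speed in m/s from the positions (x, y), in pixels,
    of the same vehicle detected in two items of image data captured
    dt seconds apart."""
    dx = (pos_curr[0] - pos_prev[0]) * meters_per_pixel
    dy = (pos_curr[1] - pos_prev[1]) * meters_per_pixel
    return (dx * dx + dy * dy) ** 0.5 / dt
```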
[0110] Although the present embodiment performs the calculation processing with the pixel 11 as the unit, it is also possible to use a region grouping a plurality of pixels 11 of the image data 10 as the unit of calculation. That is, when the image data is resized so as to reduce the number of pixels of the digital image data 10, a plurality of pixels 11 are merged into a single pixel; when it is resized so as to increase the number of pixels, new pixels 11 are produced by interpolation from the surrounding pixels 11. Performing detection that treats a plurality of pixels 11 as one grouped region is therefore substantially the same as resizing the image data 10 and then performing the detection processing per pixel 11.

[0111] Furthermore, in the present embodiment the operation region 32, the comparison region 34, and the search range 31 set by the autocorrelation difference calculation unit 4 have all been described as uniform within the image data 10. However, it is also possible to vary the search range 31, the operation region 32, and the comparison region 34 according to whether a region is near or far when the image data 10 is viewed as a three-dimensional space. For example, the search range 31, the operation region 32, and the comparison region 34 are made small for regions that are far away when viewed as a three-dimensional space, and large for regions that are near. Varying the ranges reduces the amount of computation executed by the autocorrelation difference calculation unit 4 and allows detection at an accuracy suited to each coordinate position. Concretely, this can be realized by providing in advance data for varying the search range 31, the operation region 32, and the comparison region 34 according to the coordinate value in the vertical direction 13 of the operation target pixel 33 in the image data 10, as in the sketch below.
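For illustration only, the coordinate-dependent ranges could be tabulated in advance or, as in this sketch, interpolated from the vertical coordinate; the endpoint sizes and the linear schedule are assumptions.

```python
def window_size(y, image_height, far_size=7, near_size=21):
    """Return an odd window size for the search range 31, the operation
    region 32, or the comparison region 34: small near the top of the
    image (far in the scene), large near the bottom (near in the scene).
    """
    t = y / max(image_height - 1, 1)
    size = int(round(far_size + t * (near_size - far_size)))
    return size | 1   # force an odd size so the window stays centered
```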
[0112] The present embodiment has described the method of calculating the similarity values from the luminance of the image data 10. The similarity values can also be calculated from combinations of lightness, hue, and saturation rather than from luminance alone.
Industrial Applicability
[0113] By enabling the position of a vehicle in image data to be detected with a small amount of computation, the present invention can be used in a system that reports the state of vehicles in image data.
Claims
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008529068A JP4743277B2 (en) | 2006-11-28 | 2006-11-28 | Image data recognition method, image processing apparatus, and image data recognition program |
| PCT/JP2006/323685 WO2008065707A1 (en) | 2006-11-28 | 2006-11-28 | Image data recognizing method, image processing device, and image data recognizing program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2006/323685 WO2008065707A1 (en) | 2006-11-28 | 2006-11-28 | Image data recognizing method, image processing device, and image data recognizing program |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/232,683 Continuation US8204278B2 (en) | Image recognition method | 2006-11-28 | 2008-09-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2008065707A1 true WO2008065707A1 (en) | 2008-06-05 |
Family
ID=39467511
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2006/323685 Ceased WO2008065707A1 (en) | 2006-11-28 | 2006-11-28 | Image data recognizing method, image processing device, and image data recognizing program |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP4743277B2 (en) |
| WO (1) | WO2008065707A1 (en) |
- 2006-11-28 JP JP2008529068A patent/JP4743277B2/en not_active Expired - Fee Related
- 2006-11-28 WO PCT/JP2006/323685 patent/WO2008065707A1/en not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06201379A (en) * | 1992-12-28 | 1994-07-19 | Mitsubishi Electric Corp | Range finder |
| JPH09330404A (en) * | 1996-06-10 | 1997-12-22 | Nippon Telegr & Teleph Corp <Ntt> | Object detection device |
| JPH11345336A (en) * | 1998-06-03 | 1999-12-14 | Nissan Motor Co Ltd | Obstacle detection device |
| JP2000039306A (en) * | 1998-07-22 | 2000-02-08 | Nec Corp | Device and method for vehicle system area detection |
| JP2001351200A (en) * | 2000-06-09 | 2001-12-21 | Nissan Motor Co Ltd | In-vehicle object detection device |
| JP2005149250A (en) * | 2003-11-18 | 2005-06-09 | Daihatsu Motor Co Ltd | Vehicle detection method and vehicle detection system |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011525005A (en) | 2008-05-21 | 2011-09-08 | ADC Automotive Distance Control Systems GmbH | Driver assistance system for avoiding collision between vehicle and pedestrian |
| WO2011039977A1 (en) | 2009-09-29 | 2011-04-07 | Panasonic Corporation | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
| CN102483881A (en) | 2009-09-29 | 2012-05-30 | Panasonic Corporation | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
| CN102483881B (en) | 2009-09-29 | 2014-05-14 | Panasonic Corporation | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
| US8744131B2 (en) | 2009-09-29 | 2014-06-03 | Panasonic Corporation | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
| JP5548212B2 (en) | 2009-09-29 | 2014-07-16 | Panasonic Corporation | Crosswalk sign detection method and crosswalk sign detection device |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2008065707A1 (en) | 2010-03-04 |
| JP4743277B2 (en) | 2011-08-10 |
Similar Documents
| Publication | Title |
|---|---|
| US8204278B2 (en) | Image recognition method |
| CN104700414B (en) | A kind of road ahead pedestrian's fast ranging method based on vehicle-mounted binocular camera |
| JP5267596B2 (en) | Moving body detection device |
| JP7050763B2 (en) | Detection of objects from camera images |
| CN112084822A (en) | Lane detection device and method, electronic equipment |
| JP5959073B2 (en) | Detection device, detection method, and program |
| CN109583267A (en) | Vehicle object detection method, vehicle object detecting device and vehicle |
| JPWO2013129361A1 (en) | Three-dimensional object detection device |
| CN103366155B (en) | Temporal coherence in unobstructed pathways detection |
| CN107491065B (en) | Method and apparatus for detecting side surface of object using ground boundary information of obstacle |
| JP6021689B2 (en) | Vehicle specification measurement processing apparatus, vehicle specification measurement method, and program |
| JP3729025B2 (en) | Pedestrian detection device |
| CN107004137A (en) | System and method for target detection |
| JP4341564B2 (en) | Object judgment device |
| US20200193184A1 (en) | Image processing device and image processing method |
| KR101236223B1 (en) | Method for detecting traffic lane |
| CN107992788B (en) | Method, device and vehicle for identifying traffic lights |
| JP5981284B2 (en) | Object detection device and object detection method |
| JP2010271969A (en) | Lane detection device |
| JP5786793B2 (en) | Vehicle detection device |
| JP2020095623A (en) | Image processing device and image processing method |
| CN108629226B (en) | Vehicle detection method and system based on image layering technology |
| WO2008065707A1 (en) | Image data recognizing method, image processing device, and image data recognizing program |
| JP4586571B2 (en) | Object judgment device |
| CN104252707A (en) | Object detecting method and device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | WIPO information: entry into national phase | Ref document number: 2008529068; Country of ref document: JP |
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 06833489; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | EP: PCT application non-entry in European phase | Ref document number: 06833489; Country of ref document: EP; Kind code of ref document: A1 |