WO2008065707A1 - Image data recognition method, image processing device, and image data recognition program - Google Patents
Image data recognition method, image processing device, and image data recognition program
- Publication number
- WO2008065707A1 (PCT application PCT/JP2006/323685)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- vehicle
- pixel
- region
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- Image data recognition method, image processing apparatus, and image data recognition program
- The present invention relates to image data recognition, and more particularly to an image data recognition method for detecting a subject in image data.
- There are techniques for detecting a vehicle or a pedestrian from image data acquired by a fixed camera or by a camera mounted on a moving body.
- Such technology is used, for example, to acquire image data of an intersection, detect a vehicle or a pedestrian in the intersection, and present the detected vehicle or pedestrian to an approaching vehicle.
- Methods for extracting a vehicle to be detected in image data include a background difference method and an edge detection method.
- the background difference method is a method in which image data of a background without an object is acquired in advance, and the object in the image data is extracted by comparing with image data to be extracted.
- the edge detection method is a method in which an edge in image data is detected, and model data based on edge data stored in advance is compared with detected edge data to identify a target vehicle.
- Because the background subtraction method detects a region that includes the shadow of the vehicle, there is a problem that the position of the vehicle alone in the image data cannot be accurately detected (Patent Document 1).
- the edge detection method also detects shadows.
- The detection process also takes time, because the shape is identified by comparing the shape data of the edge-detected part with shape data stored in advance in an object shape database (Patent Document 2).
- Patent Document 1 JP-A-2005-190142
- Patent Document 2 JP 2000-148977
- In order to provide a method for detecting a subject in image data that solves the above problems, the present invention has the following configuration.
- The similarity value calculated from the first region and each second region is associated with each second pixel.
- Among the similarity values associated with the plurality of second pixels, the similarity value indicating the feature of the first region is associated with the first pixel.
- An area in which the shape determined according to the distribution of the similarity values indicating the feature of the first region matches the shape previously determined as the feature of the subject is detected as the portion of the subject in the image data.
- The shape defined as the feature of the subject is a condition that determines the shape of a vehicle.
- The method obtains the horizontal center position of the vehicle in the image data from the symmetry of the distribution of the similarity values of the vehicle, and specifies the position of the vehicle from this center position and from the condition that determines the shape of the vehicle.
- FIG. 1 is a configuration diagram of an image processing apparatus 1 of the present embodiment.
- FIG. 2 shows image data taken by the camera 2 of the present embodiment.
- FIG. 3 is a configuration example of the database 20 in which the coordinates in the vertical direction 13 of the vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated with each type of the vehicle 16.
- FIG. 4 is a conceptual diagram for detecting a similarity value 38 from acquired image data 10.
- FIG. 5 is a flowchart of a method for calculating a similarity value 38.
- Figure 6 shows an example of a pedestrian crossing 41.
- FIG. 7 is a flowchart of processing for removing the pedestrian crossing 41 area from the image data 10 executed by the road mark deleting unit 5.
- FIG. 8 is a diagram showing an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50.
- FIG. 9 is a flowchart of processing, executed by the pattern feature calculation unit 6, for detecting an area where the vehicle 16 exists from the distribution map 50.
- FIG. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16.
- FIG. 11 is a flowchart of processing, executed by the symmetry feature calculation unit 7, for detecting an area where the vehicle 16 exists from the distribution map 50.
- FIG. 12 is an example of vehicle detection according to the present embodiment.
- FIG. 13 is a hardware configuration example of the image processing apparatus 1 of the present embodiment.
- Point database that indicates the boundary between the roadway and the roadside belt
- FIG. 1 is a configuration diagram of an image processing apparatus 1 according to the present embodiment.
- The image processing apparatus 1 of the present invention is composed of a camera 2, an image memory 3, an autocorrelation difference calculation unit 4, a road mark deletion unit 5, a pattern feature calculation unit 6, a symmetry feature calculation unit 7, and a vehicle position determination unit 8.
- the image data of the intersection imaged by the camera 2 is stored in the image memory 3.
- The autocorrelation difference calculation unit 4 creates a distribution map of similarity-value differences from the image data stored in the image memory 3: it obtains the difference between similarity values for each pixel and creates a distribution map based on these differences.
- The road mark deletion unit 5 deletes road mark areas in the image data stored in the image memory 3. In this way, the image processing apparatus 1 calculates the differences between similarity values from the original image data and obtains a distribution map from which road marks have been removed.
- The pattern feature calculation unit 6 detects the vehicle pattern from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4.
- From the vehicle pattern, the pattern feature calculation unit 6 acquires the vertical coordinate in the image data of the foremost or rearmost part of the vehicle. When detecting a vehicle pattern, it also takes into account the area information deleted by the road mark deletion unit 5.
- The symmetry feature calculation unit 7 evaluates the left-right symmetry of the vehicle from the distribution map of similarity-value differences created by the autocorrelation difference calculation unit 4.
- From the symmetry, the symmetry feature calculation unit 7 obtains the center line on which the vehicle lies in the image data.
- The vehicle position determination unit 8 specifies the position of the vehicle in the image from the results of the pattern feature calculation unit 6 and the symmetry feature calculation unit 7. Further, the vehicle position determination unit 8 acquires the vehicle width determined from the position of the vehicle, and searches the database with the acquired vehicle width and the vertical position in the image data to specify the vehicle type.
- FIG. 2 shows image data 10 taken by the camera 2 of this embodiment.
- Image data 10 is stored in image memory 3 in digital format.
- Image data 10 is composed of a set of pixels 11.
- Pixel 11 is the smallest unit constituting image data 10.
- the vertical direction of the image data 10 indicates the direction of the vertical direction 13
- the horizontal direction indicates the direction of the horizontal direction 14.
- the ordinate indicates the coordinates that specify the position in the vertical direction 13 in the image data 10
- the abscissa indicates the coordinates that specify the position in the horizontal direction 14 in the image data 10.
- The vanishing point 12 is the point of the two-dimensional image data 10 that corresponds to the greatest depth in the original three-dimensional space.
- When the camera 2 is installed at a position overlooking the roadway from above, the vanishing point 12 is generally located in the upper part of the image data 10 in the vertical direction 13.
- Arranging the vanishing point 12 at an upper position in the image data 10 increases the area in which the roadway 15 appears. The image processing apparatus 1 can therefore resolve the roadway 15 along the vertical direction 13 with more ordinate values than when the vanishing point 12 is at a lower position, and with this finer vertical resolution it can acquire the position of the vehicle 16 on the roadway 15 accurately.
- When the camera 2 is mounted on a vehicle 16, the camera 2 is generally arranged so as to capture images horizontally with respect to the ground.
- This is because a vehicle-mounted camera 2 is attached at a low position, such as the front portion of the vehicle. If the camera 2 were pointed at a downward angle, it could not capture the highest part of a vehicle traveling ahead.
- The width of the roadway 15 is set so that the width of a vehicle 16 can be acquired at the lower positions of the image data 10 in the vertical direction 13. At this time, the points 17 indicating the boundary between the roadway 15 and the roadside belt in the foreground are at the lowermost part of the image data 10.
- The camera 2 thus captures the width of a vehicle 16 in the foreground of the image data 10 with many pixels, so the image processing device 1 can resolve the width of the vehicle 16 finely and acquire the width information of the vehicle 16 accurately.
- the angle of the camera 2 with respect to the vanishing point 12 and the width of the roadway 15 can be changed according to the road condition and the object to be acquired.
- The image processing apparatus 1 has the coordinate information of the vanishing point 12 in the image data 10, as well as the coordinates of the points 17 that indicate the boundary between the roadway 15 and the roadside belt in the foreground. The points 17 are two points, one on each side of the road. The image processing apparatus 1 obtains the boundary lines between the roadway 15 and the roadside belt as the straight lines connecting the coordinates of the vanishing point 12 and the coordinates of the points 17. Further, the image processing apparatus 1 has a database 20 in which the coordinates in the vertical direction 13 of a vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated with each type of vehicle 16.
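- As an illustration (not part of the patent text), the following is a minimal Python sketch of deriving a roadway-boundary line from the vanishing point 12 and a foreground boundary point 17, as described above; all coordinate values are hypothetical examples.

```python
# Sketch: the boundary between roadway 15 and roadside belt is the straight line
# joining the vanishing point 12 and a foreground boundary point 17.

def boundary_x(vanishing, foreground, y):
    """Interpolate the horizontal coordinate of a roadway boundary at row y."""
    (vx, vy), (fx, fy) = vanishing, foreground
    t = (y - vy) / (fy - vy)          # 0 at the vanishing point, 1 at the foreground
    return vx + t * (fx - vx)

vanish = (320, 60)                       # vanishing point 12 (example values)
left17, right17 = (40, 479), (600, 479)  # boundary points 17 at the bottom row

for y in (200, 300, 470):
    print(y, boundary_x(vanish, left17, y), boundary_x(vanish, right17, y))
```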
- FIG. 3 is a configuration example of the database 20 in which the coordinate value 21 in the vertical direction 13 of the vehicle 16 existing in the image data 10 and the width of the vehicle 16 are associated with each type of the vehicle 16.
- For image data 10 whose vertical direction 13 is composed of 480 pixels 11, the database holds width information of the vehicle 16 for each of the 480 coordinate values 21 in the vertical direction 13.
- The width of a vehicle 16 is largely determined by its class: light car 22, normal car 23, or large car 24. For example, the width of a typical normal car 23 is 1700 mm.
- The image processing apparatus 1 can therefore obtain in advance the number of pixels of the width of the vehicle 16 in the lateral direction 14 corresponding to the position of the coordinate value 21 in the longitudinal direction 13 of the vehicle 16 in the image data 10.
- As the width information of the vehicle 16, the database 20 of the present embodiment stores, for each coordinate value 21 in the vertical direction 13, the number of pixels of the width in the lateral direction 14 of a light car 22, a normal car 23, and a large car 24.
- The length of the bottom of the quadrilateral indicating the vehicle 16 on the roadway 15 in the image data 10 is the number of pixels of the width of the vehicle 16, and this number of pixels changes according to the coordinate in the vertical direction 13 at which the quadrilateral is detected.
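- As an illustration of the database 20 lookup, here is a minimal Python sketch. Only the 1700 mm normal-car width appears in the text above; the other widths and the row-to-scale model are placeholders, not values from the patent.

```python
# Sketch of database 20: for each vertical coordinate 21 (image row), the expected
# pixel width of each vehicle class, used to classify a detected width.

WIDTH_MM = {"light car": 1480, "normal car": 1700, "large car": 2490}  # placeholders
                                                                       # except 1700 mm

def expected_widths(y, mm_per_pixel_at_row):
    """Pixel width of each class at row y, given the metric scale for that row."""
    scale = mm_per_pixel_at_row(y)
    return {cls: w / scale for cls, w in WIDTH_MM.items()}

def classify(y, measured_px, mm_per_pixel_at_row):
    """Pick the class whose expected pixel width at row y is closest to the measurement."""
    widths = expected_widths(y, mm_per_pixel_at_row)
    return min(widths, key=lambda c: abs(widths[c] - measured_px))

# Example: assume the scale varies linearly with the row (pure assumption).
mm_per_px = lambda y: 60.0 - 0.1 * y
print(classify(400, 80, mm_per_px))   # -> "normal car" for this toy calibration
```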
- the autocorrelation difference calculation unit 4 calculates a difference 39 of the similarity values 38 associated with each pixel 11 of the image data 10 stored in the image memory 3.
- the similarity value 38 is a numerical value indicating the degree of approximation of the entire area when the first area and the second area set in the image data 10 are compared.
- FIG. 4 is a conceptual diagram in which the autocorrelation difference calculation unit 4 detects the similarity value 38 from the acquired image data 10.
- the calculation target pixel 33 (first pixel) is a pixel 11 for obtaining the similarity value 38 in the image data 10 that is the current target of the calculation.
- the calculation target pixels 33 are all the pixels 11 in the image data 10. It should be noted that the calculation of the similarity value 38 can be omitted for the pixel 11 located in an area where the vehicle 16 cannot exist in the image data 10.
- the autocorrelation difference calculation unit 4 can omit the calculation for the pixel 11 that exceeds the roadside band and is in a range where the vehicle 16 is clearly not present.
- the area where the autocorrelation difference calculation unit 4 can omit the calculation is set in advance.
- The autocorrelation difference calculation unit 4 calculates the similarity value 38 for all the pixels 11 outside the regions where calculation can be omitted. Since the calculation is omitted for pixels in ranges where the vehicle 16 clearly cannot exist, the amount of computation performed by the image processing device 1 is reduced, and the time until the result is obtained is shortened.
- the comparison target pixel 35 (second pixel) is the pixel 11 that is the current target of the calculation for obtaining the similarity value 38 in the image data 10 for the calculation target pixel 33.
- the calculation area 32 (first area) is an area that is the current comparison target of the calculation for obtaining the similarity value 38 with the calculation target pixel 33 as the center.
- the comparison area 34 (second area) is an area to be compared with the calculation area 32 around the comparison target pixel 35.
- The search range 31 is the range in which the comparison target pixels 35 are set for the calculation target pixel 33.
- The search range 31 is set because the similarity value 38 need not be calculated when the calculation target pixel 33 and the comparison target pixel 35 are far apart in the image data 10: when the distance between the coordinates is large, the calculation area 32 and the comparison area 34 cannot be considered to belong to a single vehicle 16.
- calculating the similarity value 38 between completely different objects is a cause of erroneous detection of the vehicle 16 in the image data 10.
- the search range 31 is a range in which the vehicle 16 can exist around the calculation target pixel 33.
- The comparison target pixels 35 are all the pixels 11 within the search range 31. If the search range 31 overlaps a region where the vehicle 16 cannot exist in the image data 10, the autocorrelation difference calculation unit 4 can also omit the calculation of the similarity value 38 for the comparison target pixels 35 in the overlapping range. Omitting these calculations reduces the amount of computation performed by the image processing device 1 and shortens the time required to obtain the result.
- FIG. 5 is a flowchart of a method for calculating the similarity value 38.
- the autocorrelation difference calculation unit 4 performs a calculation for calculating the similarity value 38 for each calculation target pixel 33 in the image data 10 according to the following procedure.
- The autocorrelation difference calculation unit 4 specifies the coordinates (x, y) of the calculation target pixel 33 (S01).
- The coordinates of the calculation target pixel 33 in Fig. 4 are (x, y): “x” is the value of the coordinate in the horizontal direction 14, and “y” is the value of the coordinate in the vertical direction 13, of the calculation target pixel 33 in the image data 10.
- The area around the coordinates (x, y) of the calculation target pixel 33 is the calculation area 32, and each pixel 11 in the calculation area 32 is denoted (x + i, y + j).
- “i” is a variable that specifies the coordinate in the horizontal direction 14, and “j” is a variable that specifies the coordinate in the vertical direction 13, of a pixel 11 in the area around the coordinates (x, y) of the calculation target pixel 33.
- the autocorrelation difference calculation unit 4 specifies the search range 31 (S02).
- The search range 31 has a preset size and is set around the coordinates (x, y) of the calculation target pixel 33.
- The autocorrelation difference calculation unit 4 specifies the coordinates (x + u, y + v) of a comparison target pixel 35 in the search range 31 (S03).
- “u” is a variable for specifying the coordinate in the horizontal direction 14, and “v” is a variable for specifying the coordinate in the vertical direction 13, within the search range 31.
- The area around the comparison target pixel 35 is the comparison area 34, and each pixel in the comparison area 34 is denoted (x + u + i, y + v + j).
- “i” is a variable for specifying the coordinate in the horizontal direction 14, and “j” is a variable for specifying the coordinate in the vertical direction 13, of a pixel 11 in the area around the comparison target pixel (x + u, y + v).
- the autocorrelation difference calculation unit 4 calculates a similarity value 38 between the calculation area 32 and the comparison area 34.
- In the calculation of the similarity value 38, “i” and “j” designate the coordinates (x + i, y + j) of each pixel 11 in the area around the calculation target pixel 33 and the coordinates (x + u + i, y + v + j) of each pixel in the area around the comparison target pixel 35; the values of “i” and “j” are varied, and the pixels at each pair of corresponding coordinates are compared (S04).
- The autocorrelation difference calculation unit 4 calculates the similarity value 38 between the calculation region 32 and the comparison region 34 by one of the following methods: 1. SAD (Sum of Absolute Differences), calculated from the sum of the absolute values of the differences between the luminance of the pixels in the calculation area 32 and the luminance of the pixels in the comparison area 34; 2. SSD (Sum of Squared Differences), calculated from the sum of the squares of those luminance differences; 3. NCOR (Normalized Correlation), calculated by normalization.
- the autocorrelation difference calculation unit 4 obtains the luminance difference between each pixel between the calculation region 32 and the comparison region 34, and calculates the similarity 38 as the entire comparison region 34 from the sum of the luminance differences. calculate.
- the autocorrelation difference calculation unit 4 performs the following calculation when calculating the similarity value 38 by SAD.
- The luminance P1(x + i, y + j) of the pixel 11 at each coordinate (x + i, y + j) in the calculation area 32 is acquired, and the luminance P2(x + u + i, y + v + j) of the pixel 11 at the corresponding coordinate (x + u + i, y + v + j) in the comparison area 34 is acquired.
- The absolute value of the difference between the luminance P1(x + i, y + j) and the luminance P2(x + u + i, y + v + j) is calculated.
- Equation (1) is used to calculate the sum of the absolute values of the luminance differences; “t” indicates that the currently acquired image data 10 is the target:
  $SAD_t(x, y, u, v) = \sum_{i}\sum_{j} \left| P1_t(x+i,\, y+j) - P2_t(x+u+i,\, y+v+j) \right|$
- the autocorrelation difference calculation unit 4 performs the following calculation when calculating the similarity value 38 using the SSD.
- the autocorrelation difference calculation unit 4 calculates the square of the difference in luminance between the luminance Pl (x + i, y + j) and the luminance P2 (x + u + i, y + v + j). To do.
- the autocorrelation difference calculation unit 4 calculates the sum of squares of the differences in luminance calculated between all the pixels in the calculation region 32 and all the pixels in the comparison region 34.
- The autocorrelation difference calculation unit 4 sets the calculated value as the similarity value 38 corresponding to the comparison target pixel 35. Equation (2) is used to calculate the sum of the squares of the luminance differences; “t” indicates that the currently acquired image data 10 is the target:
  $SSD_t(x, y, u, v) = \sum_{i}\sum_{j} \left( P1_t(x+i,\, y+j) - P2_t(x+u+i,\, y+v+j) \right)^2$
- The autocorrelation difference calculation unit 4 performs the calculation of Equation (3) when calculating the similarity value 38 by NCOR; “t” indicates that the currently acquired image data 10 is the target:
  $NCOR_t(x, y, u, v) = \dfrac{\sum_{i}\sum_{j} P1_t(x+i,\, y+j)\, P2_t(x+u+i,\, y+v+j)}{\sqrt{\sum_{i}\sum_{j} P1_t(x+i,\, y+j)^2}\; \sqrt{\sum_{i}\sum_{j} P2_t(x+u+i,\, y+v+j)^2}}$
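- As an illustration (not part of the patent text), the following Python sketch implements the three measures of Equations (1)-(3) on a grayscale image; the window radius r is an example value and boundary handling is simplified.

```python
import numpy as np

def window(img, x, y, r):
    """(2r+1)x(2r+1) block of luminances around (x, y); img is indexed [row, col]."""
    return img[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)

def sad(img, x, y, u, v, r=2):   # Equation (1)
    return np.abs(window(img, x, y, r) - window(img, x + u, y + v, r)).sum()

def ssd(img, x, y, u, v, r=2):   # Equation (2)
    return ((window(img, x, y, r) - window(img, x + u, y + v, r)) ** 2).sum()

def ncor(img, x, y, u, v, r=2):  # Equation (3); epsilon avoids division by zero
    p1, p2 = window(img, x, y, r), window(img, x + u, y + v, r)
    return (p1 * p2).sum() / (np.sqrt((p1 ** 2).sum()) * np.sqrt((p2 ** 2).sum()) + 1e-12)

img = (np.random.rand(480, 640) * 255).astype(np.uint8)
print(sad(img, 100, 200, 3, 1), ssd(img, 100, 200, 3, 1), ncor(img, 100, 200, 3, 1))
```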
- the autocorrelation difference calculation unit 4 repeats the process until the luminance difference is calculated for all the pixels between the comparison area 34 and the calculation area 32 (S05: no).
- When the sum of the luminance differences has been calculated (S05: yes), the autocorrelation difference calculation unit 4 associates the similarity value 38 (the similarity value calculated from the first region and the second region) with the comparison target pixel 35 and stores it (S06).
- the autocorrelation difference calculation unit 4 repeats the process until the similarity value 38 between all the comparison target pixels 35 and the calculation target pixel 33 within the search range 31 is calculated (S07: no).
- When the autocorrelation difference calculation unit 4 has calculated the similarity value 38 for all the comparison target pixels 35 in the search range 31 (S07: yes), it associates with the calculation target pixel 33 the difference 39 of the similarity values 38 that indicates the feature of the first region, chosen from among the similarity values 38 of the comparison target pixels 35 in the search range 31 (S08).
- The difference 39 of the similarity value 38 indicating the feature takes a large value when the image of the calculation region 32 includes a corner, and a small value when the image of the calculation region 32 has few changes, such as a straight line or a flat surface.
- the difference 39 of the similarity value 38 associated in S08 is used for the vehicle detection process described later.
- the autocorrelation difference calculation unit 4 calculates the difference 39 of the similarity value 38 associated with the calculation target pixel 33 according to the following procedure.
- the comparison target pixel 35 calculated by the autocorrelation difference calculation unit 4 within the search range 31 includes the calculation target pixel 33 itself.
- The similarity value 38 calculated between the calculation target pixel 33 and itself is the most similar of the similarity values 38 of the comparison target pixels 35 in the search range 31, so it must be excluded.
- For SAD and SSD, the autocorrelation difference calculation unit 4 detects the comparison target pixel 35 having the second most similar similarity value 38 in the search range 31: it removes the similarity value 38 calculated between the calculation target pixel 33 and itself, and obtains the minimum similarity value 38 among those stored for the comparison target pixels 35 in the search range 31.
- The autocorrelation difference calculation unit 4 sets the obtained value as the difference 39 of the similarity value 38 that is associated with the calculation target pixel 33.
- The calculations obtained from SAD and SSD are shown in Equations (4) and (5); “t” indicates that the currently acquired image data 10 is the target:
  $D_t(x, y) = \min_{(u, v) \neq (0, 0)} SAD_t(x, y, u, v)$ (4)
  $D_t(x, y) = \min_{(u, v) \neq (0, 0)} SSD_t(x, y, u, v)$ (5)
- For NCOR, the autocorrelation difference calculation unit 4 likewise detects the comparison target pixel 35 having the second most similar similarity value 38 in the search range 31: excluding the similarity value 38 between the calculation target pixel 33 and itself, it obtains the maximum similarity value 38 among those stored for the comparison target pixels 35 in the search range 31.
- The autocorrelation difference calculation unit 4 sets the value obtained by subtracting this second most similar similarity value 38 from “1” as the difference 39 of the similarity value 38 associated with the calculation target pixel 33.
- The calculation obtained from NCOR is shown in Equation (6); “t” indicates that the currently acquired image data 10 is the target:
  $D_t(x, y) = 1 - \max_{(u, v) \neq (0, 0)} NCOR_t(x, y, u, v)$ (6)
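- As an illustration (not part of the patent text), the following Python sketch computes the difference 39 of Equation (4) — the similarity of the second most similar position, the best match being the pixel itself; the search radius s and window radius r are example values, and only the SAD variant is shown.

```python
import numpy as np

def sad(img, x, y, u, v, r):
    p1 = img[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
    p2 = img[y + v - r:y + v + r + 1, x + u - r:x + u + r + 1].astype(np.float64)
    return np.abs(p1 - p2).sum()

def difference39(img, x, y, s=4, r=2):
    vals = [sad(img, x, y, u, v, r)
            for v in range(-s, s + 1) for u in range(-s, s + 1)
            if (u, v) != (0, 0)]              # exclude the self-comparison
    return min(vals)                          # second-best match overall

img = (np.random.rand(480, 640) * 255).astype(np.uint8)
dist_map = np.zeros(img.shape)
for yy in range(20, 40):                      # small patch only, for speed
    for xx in range(20, 40):
        dist_map[yy, xx] = difference39(img, xx, yy)
print(dist_map[20:25, 20:25])
```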
- the difference 39 of the similarity value 38 calculated by the autocorrelation difference calculation unit 4 has a feature that increases when the calculation area 32 has a corner.
- The vehicle 16 has many corners when viewed from the front: for example, the frame on both sides of the windshield, the fender mirrors, the front grille, and the number plate. These parts of the vehicle 16 appear in the image data 10 as corners, and the difference 39 of the similarity value 38 calculated there is larger than in regions where no corners appear.
- The difference 39 of the similarity value 38 is large at corners because the calculation area 32 and the comparison area 34 rarely contain corners of exactly the same shape, so many pixels differ in luminance between the two areas.
- Conversely, when there is no corner, the difference 39 of the similarity values 38 calculated by the autocorrelation difference calculation unit 4 is small. This is the case, for example, for a region containing a straight edge, such as the roof of the vehicle 16, or a region composed of similar colors, such as asphalt.
- The autocorrelation difference calculation unit 4 performs the above calculation of the difference 39 of the similarity value 38 with every pixel 11 in the image data 10 as the calculation target pixel 33 (S09).
- If the difference 39 of the similarity value 38 has not yet been associated with all the calculation target pixels 33 in the image data 10 (S09: no), the calculation target pixel 33 is moved to the next pixel 11 in the image data 10 and the difference 39 of the similarity value 38 is calculated in the same way.
- The autocorrelation difference calculation unit 4 then creates a distribution map 50 based on the difference 39 of the similarity value 38 associated with each calculation target pixel 33 in the image data 10 (S10).
- The distribution map 50 is composed of the differences 39 of the similarity values 38. Pixels 11 having a large difference 39 are distributed over the areas of the image data 10 where corners appear. However, the difference 39 of the similarity value 38 is also large in regions where the corners of road marks appear, so road marks cause false detection. Therefore, the extra background portions are deleted from the image data 10 stored in the image memory 3.
- An extra background portion is, for example, a road mark: a pedestrian crossing, the center line of a road, or a line indicating the boundary of a sidewalk.
- Processing for deleting the information related to the pedestrian crossing in the image data 10 is therefore performed. In the image data 10, a pedestrian crossing is drawn parallel to the horizontal direction 14.
- The road mark deletion process is performed every time the image data 10 is acquired. If the road mark area were deleted in advance instead, a vehicle 16 located on the road mark would be deleted together with the area and might not be detected.
- FIG. 6 shows an example of a pedestrian crossing 41.
- the pedestrian crossing 41 is an area in which a white stripe pattern 42 is drawn on the roadway 15.
- The road mark deletion unit 5 detects the area of the pedestrian crossing 41 from the image data 10 by the following procedure.
- the area of the pedestrian crossing 41 detected by the road mark deletion unit 5 is excluded from the calculation when the vehicle 16 is detected.
- FIG. 7 is a flowchart of a process for excluding the pedestrian crossing 41 area from the image data 10 executed by the road mark deletion unit 5.
- the road mark deletion unit 5 deletes the edge 47 along the line segment facing the vanishing point 12 in the image data 10 (S21).
- the road mark deleting unit 5 deletes the edge 48 in the horizontal direction 14 periodically present in the image data 10 (S22).
- This is because the striped pattern 42 of the pedestrian crossing 41 runs in the horizontal direction 14 in the image data 10.
- The road mark deletion unit 5 detects a repeating pattern whose period length in the horizontal direction 14 matches the pedestrian crossing 41 (S23). The width 49 of the stripes 42 of a pedestrian crossing 41 is prescribed, so the road mark deletion unit 5 can store in advance the width 49 of the striped pattern 42 in the horizontal direction 14 according to the coordinate in the vertical direction 13 in the image data 10.
- Ap(x, y) is the average of the luminance P3(x, y) at the coordinate 43 (x, y) that is the target of the current calculation and the luminance P4(x + L, y) at the coordinate 44 (x + L, y), obtained by moving the target coordinate in the horizontal direction 14 by a predetermined amount L.
- The predetermined amount L is the width in the horizontal direction 14 of the striped pattern 42 of the pedestrian crossing 41 at the vertical coordinate “y” in the image data 10.
- Since Ap(x, y) is the average of the luminances P3 and P4, it becomes large when both points lie on the white parts of the pedestrian crossing.
- The road mark deletion unit 5 calculates Ap from Equation (7):
  $Ap_t(x, y) = \dfrac{P3_t(x, y) + P4_t(x + L, y)}{2}$
- Am(x, y) is the average of the luminance P5(x − L/2, y) at the coordinate 45 (x − L/2, y), obtained by moving the target coordinate 43 (x, y) in the horizontal direction 14 by −L/2, and the luminance P6(x + L/2, y) at the coordinate (x + L/2, y), obtained by moving it by +L/2. This average also becomes large when both points lie on the white parts of the pedestrian crossing.
- The road mark deletion unit 5 calculates Am from Equation (8):
  $Am_t(x, y) = \dfrac{P5_t(x - L/2, y) + P6_t(x + L/2, y)}{2}$
- Ap and Am stand in an inverted relationship, because P5 and P6 are offset in the horizontal direction 14 by L/2 with respect to P3 and P4.
- When P3 and P4 are located on the white painted parts of the striped pattern 42 of the pedestrian crossing 41, P5 and P6 are located on the asphalt parts of the striped pattern 42, and vice versa. That is, Am(x, y) takes a minimum value where Ap(x, y) takes a maximum value, and Am(x, y) takes a maximum value where Ap(x, y) takes a minimum value.
- Cp(x, y) is the absolute value of the difference between the luminance P3(x, y) and the luminance P4(x + L, y). It therefore takes a small value when P3 and P4 are in the same state: for example, when both lie on the white parts of the striped pattern 42 of the pedestrian crossing 41, or both lie on its asphalt parts.
- The road mark deletion unit 5 calculates Cp from Equation (9):
  $Cp_t(x, y) = \left| P3_t(x, y) - P4_t(x + L, y) \right|$
- Cm(x, y) is the absolute value of the difference between the luminance P5(x − L/2, y) and the luminance P6(x + L/2, y). It therefore takes a small value when P5 and P6 are in the same state: for example, when both lie on the white parts of the striped pattern 42 of the pedestrian crossing 41, or both lie on its asphalt parts.
- The road mark deletion unit 5 calculates Cm from Equation (10):
  $Cm_t(x, y) = \left| P5_t(x - L/2, y) - P6_t(x + L/2, y) \right|$
- The road mark deletion unit 5 calculates the feature value Ecw of the pedestrian crossing 41 from Equation (11).
- Ecw takes a large value when Ap and Am stand in an inverted relationship and when the absolute difference values Cp and Cm are small. The feature value is therefore high in the area of the pedestrian crossing 41.
- the road mark deletion unit 5 associates the calculated Ecw value with the corresponding pixel 11 (S24).
- the road mark deletion unit 5 performs the above calculation for each pixel 11 in the image data 10 (S25).
- the road mark deletion unit 5 excludes the area determined to be a pedestrian crossing 41 from the process of identifying the vehicle 16 (S26).
- The road mark deletion unit 5 maps the pixels detected as the pedestrian crossing 41 to the corresponding pixels in the distribution map 50 created by the autocorrelation difference calculation unit 4, and these pixels are excluded from the subsequent detection processing of the vehicle 16.
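- As an illustration (not part of the patent text), the following Python sketch evaluates the crosswalk feature of S23-S24. Ap, Am, Cp, and Cm follow Equations (7)-(10) as described above, but the exact combination in Equation (11) is not reproduced in this text, so the form |Ap − Am| − (Cp + Cm) below is an assumption chosen only to match the stated behaviour (large when Ap and Am are inverted and Cp and Cm are small).

```python
import numpy as np

def ecw(img, x, y, L):
    row = img[y].astype(np.float64)
    p3, p4 = row[x], row[x + L]
    p5, p6 = row[x - L // 2], row[x + L // 2]
    ap = (p3 + p4) / 2.0                      # Equation (7)
    am = (p5 + p6) / 2.0                      # Equation (8)
    cp = abs(p3 - p4)                         # Equation (9)
    cm = abs(p5 - p6)                         # Equation (10)
    return abs(ap - am) - (cp + cm)           # assumed form of Equation (11)

# Synthetic stripe pattern of period L: L/2 white band, then L/2 asphalt band.
L = 16
row = np.tile([220] * (L // 2) + [60] * (L // 2), 10)
img = np.tile(row, (3, 1))
print(ecw(img, 40, 1, L))                     # high on the stripe pattern
```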
- the autocorrelation difference calculation unit 4 can also create the distribution map 50 from the image data 10 after the crosswalk 41 is removed by the road mark deletion unit 5.
- the image processing apparatus 1 can reduce erroneous detection of the vehicle 16 in the process of specifying the vehicle 16.
- FIG. 6(C) shows the area of the pedestrian crossing 41 to be deleted, which the road mark deletion unit 5 obtains from the original image data 10 of FIG. 6(B).
- The image processing apparatus 1 detects, as the portion of the vehicle (subject) in the image data, an area in which the shape determined according to the distribution of the similarity-value differences indicating the feature of the calculation area 32 matches the shape previously determined as the feature of the vehicle.
- the image processing apparatus 1 detects the vehicle 16 from the distribution map 50 created by the autocorrelation difference calculation unit 4 and the road mark deletion unit 5.
- The image processing apparatus 1 detects the vehicle in the distribution map 50 using the left-right symmetric shape of the vehicle 16.
- FIG. 8 shows an image (A) of the vehicle 16 portion of the original image data 10 and an image (B) of the vehicle 16 portion of the distribution map 50.
- The camera 2 installed in this embodiment photographs the vehicle 16 from the front or the rear.
- The front and the back of the vehicle 16 have a left-right symmetric shape.
- The difference 39 of the similarity values 38 of the pixel groups 51 located on the left and right sides of the vehicle 16 is large, because the left and right sides of the vehicle 16 have many corners.
- The difference 39 of the similarity values 38 of the pixel group located at the front part of the vehicle 16 is also large, because the front part of the vehicle 16 has many corners. The front part is, moreover, symmetric about the center of the vehicle 16.
- The difference 39 of the similarity values 38 of the pixel groups 52 located outside the left and right sides of the vehicle 16 is small, because the area outside the vehicle 16 is asphalt, where the difference 39 of the similarity value 38 is small.
- The difference 39 of the similarity value 38 corresponding to the pixel group 53 located ahead of the front portion of the vehicle 16 is small for the same reason: this area is asphalt, so small values are distributed beneath the pixel group located at the front part of the vehicle 16. Note that, for a vehicle 16 facing forward, the area ahead of its front portion corresponds to the coordinates below the vehicle 16 in the vertical direction 13 of the two-dimensional image data 10.
- a pattern that is a characteristic of the vehicle is defined in advance.
- the pattern that is a feature of the vehicle 16 in this embodiment is defined by the following conditions.
- The pattern feature calculation unit 6 of the image processing apparatus 1 detects regions satisfying the above conditions from the distribution map 50.
- FIG. 9 is a flowchart of processing, executed by the pattern feature calculation unit 6, for detecting an area where the vehicle 16 exists from the distribution map 50.
- The pattern feature calculation unit 6 performs the following calculation for each pixel 54 in the distribution map 50 of FIG. 9. First, it sets the pixel 54 that is the current calculation target in the distribution map 50 as the detection target pixel 55 (S31). It then calculates the feature value using the detection target pixel 55 as the reference position (S32). Specifically, the pattern feature calculation unit 6 evaluates the following Expressions (12) to (16) on the distribution of the differences 39 of the similarity values 38. “t” indicates that the currently acquired image data 10 is the target.
- Eleft 56, obtained by Expression (12), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by “k1” and “l1”, with the detection target pixel 55 as the reference position.
- Eright 57, obtained by Expression (13), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by “k2” and “l2”, with the detection target pixel 55 as the reference position.
- Eleft 56 and Eright 57 correspond to the ranges from the center of the vehicle 16 to its left and right side parts.
- The side parts and the front part of the vehicle 16 have many corners, so many large values of the difference 39 of the similarity values 38 are distributed there.
- Eside 58, obtained by Expression (14), is the sum of the differences 39 of the similarity values 38 of the pixels 54 in the region defined by “k3” and “l3” on the left side of the distribution map 50, with the detection target pixel 55 as the reference position. Eside 58 corresponds to the area outside the side of the vehicle 16.
- The area outside the side of the vehicle 16 is asphalt, where small values of the difference 39 of the similarity values 38 are distributed.
- Ebottom 59, obtained by Expression (15), is the sum of the differences 39 of the similarity values 38 of the pixels 54 distributed below the detection target pixel 55, which is used as the reference position. Ebottom 59 corresponds to the area below the vehicle 16.
- When the detection target pixel 55 is at the lower end of the front part of the vehicle 16, the area of Ebottom 59 is road surface, so the value of Ebottom 59 is small.
- The pattern feature calculation unit 6 substitutes the values calculated by Expressions (12) to (15) into Expression (16) to obtain the value 60 E(x, y), which indicates how strongly the detection target pixel 55 exhibits the feature of the vehicle 16. “t” indicates that the currently acquired image data 10 is the target.
- E(x, y) becomes larger when the values of Eleft 56 and Eright 57 are large.
- E(x, y) becomes smaller when Eleft 56 and Eright 57 are out of balance.
- E(x, y) becomes larger when the value of Eside 58 is small, and larger when the value of Ebottom 59 is small.
- The values Cside and Cbottom are weights set by the designer according to the road conditions photographed by the camera 2 and the like.
- The pattern feature calculation unit 6 associates the calculated E(x, y) with the detection target pixel 55 (S33).
- The pattern feature calculation unit 6 repeats the above processing for all the pixels 54 in the distribution map 50 (S34).
- Based on the calculation result, the pattern feature calculation unit 6 can acquire the coordinates in the vertical direction 13 and the horizontal direction 14 of the areas of the distribution map 50 where a vehicle 16 is considered to exist.
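- As an illustration (not part of the patent text), the following Python sketch computes a pattern feature on a distribution map of differences 39. The region extents (k1, l1, and so on) and the exact form of Expression (16) are not reproduced in this text, so the regions and the combination below are assumptions chosen to match the stated behaviour of E(x, y).

```python
import numpy as np

def pattern_feature(D, x, y, k=10, l=6, cside=1.0, cbottom=1.0):
    eleft  = D[y - l:y, x - k:x].sum()            # cf. Expression (12): left half of the front
    eright = D[y - l:y, x:x + k].sum()            # cf. Expression (13): right half of the front
    eside  = D[y - l:y, x - 2 * k:x - k].sum()    # cf. Expression (14): outside the left side
    ebottom = D[y:y + l, x - k:x + k].sum()       # cf. Expression (15): below the vehicle front
    return (eleft + eright                        # large on the corner-rich vehicle body
            - abs(eleft - eright)                 # penalise left/right imbalance
            - cside * eside - cbottom * ebottom)  # asphalt regions should be quiet

D = np.zeros((100, 100))
D[40:50, 30:50] = 1.0                             # a corner-rich, symmetric "vehicle"
print(pattern_feature(D, 40, 50))                 # high at the front-bottom centre
```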
- Since the autocorrelation difference calculation unit 4 obtains the difference 39 of the similarity value 38 for each calculation target pixel 33 of the original image data 10 and creates the distribution map 50 from these differences 39, the influence of shadows and the like appearing in the image data 10 is reduced.
- Next, the symmetry feature calculation unit 7 of the image processing apparatus 1 performs an operation for obtaining the horizontal center coordinate of the vehicle 16 from the distribution map 50.
- FIG. 10 is a diagram showing the relationship between the distribution map 50 and the symmetry of the vehicle 16. The shape of the vehicle 16 is left-right symmetric, so the differences 39 of the similarity values 38 are distributed symmetrically to the left and right of the center of the vehicle 16. The symmetry feature calculation unit 7 therefore specifies the position of the vehicle 16 using this left-right symmetry. The symmetry is evaluated according to the following procedure.
- FIG. 11 is a flowchart of processing, executed by the symmetry feature calculation unit 7, for detecting an area where the vehicle 16 exists from the distribution map 50.
- The symmetry feature calculation unit 7 sets the calculation center coordinate 61 (x, y) in FIG. 10 as the coordinate that is the target of the current calculation (S41).
- The symmetry feature calculation unit 7 then calculates the feature value of the calculation center coordinate 61 (x, y) (S42).
- The symmetry feature calculation unit 7 acquires the luminance at each coordinate 63 “x + i” in the positive horizontal direction 14 and at each coordinate 62 “x − i” in the negative horizontal direction 14, centered on the horizontal coordinate “x” of the calculation center coordinate 61 (x, y). “i” is varied over the range in which the symmetry is to be evaluated; in the present embodiment, the range of “i” is set to half the width of the vehicle 16 corresponding to the coordinate in the vertical direction 13 in the image data 10.
- The luminance at the coordinate 63 (x + i, y) is denoted P7(x + i, y), and the luminance at the coordinate 62 (x − i, y) is denoted P8(x − i, y).
- The symmetry feature calculation unit 7 calculates the sum of the luminance P7 and the luminance P8 as S(i) in Equation (17); “t” indicates that the currently acquired image data 10 is the target:
  $S_t(i) = P7_t(x + i, y) + P8_t(x - i, y)$
- S(i) takes a large value when both the luminance P7 and the luminance P8 are large.
- The symmetry feature calculation unit 7 calculates the difference between the luminance P7 and the luminance P8 as D(i) in Equation (18); “t” indicates that the currently acquired image data 10 is the target:
  $D_t(i) = P7_t(x + i, y) - P8_t(x - i, y)$
- D(i) takes a small value when the luminance P7 and the luminance P8 are close in value, so D(i) represents the balance between P7 and P8.
- The symmetry feature calculation unit 7 then finds the line passing through the center of the vehicle by Esym(x, y) in Equation (19):
  $Esym_t(x, y) = \sum_{i} \left| S_t(i) \right| - \sum_{i} \left| D_t(i) \right|$
- Esym(x, y) is the value obtained by subtracting the sum of the absolute values of D(i) from the sum of the absolute values of S(i) as “i” is varied over the prescribed range around the calculation center coordinate 61 (x, y). Esym(x, y) therefore takes a large value where the luminances to the left and right of the evaluated region are both equal and large.
- The symmetry feature calculation unit 7 associates the Esym(x, y) calculated in S42 with the calculation center coordinate 61 (S43).
- The symmetry feature calculation unit 7 performs the above calculation for all the pixels 54 in the distribution map 50.
- Alternatively, the symmetry feature calculation unit 7 can detect the line passing through the center of the vehicle 16 by Equation (20), which obtains the center of gravity Rsym(x, y) of the symmetry values. “t” indicates that the currently acquired image data 10 is the target.
- The vehicle position determination unit 8 obtains, from the recognition of the pattern of the vehicle 16 by the pattern feature calculation unit 6, the coordinates in the vertical direction 13 and the horizontal direction 14 at which the vehicle 16 exists in the distribution map 50.
- The coordinate in the vertical direction 13 is that of the lower end of the front of the vehicle 16 when the vehicle 16 faces forward in the image data 10.
- The vehicle position determination unit 8 obtains the coordinate in the horizontal direction 14 of the center of the vehicle 16 in the distribution map 50 from the processing by the symmetry feature calculation unit 7 for obtaining the line passing through the left-right center of the vehicle 16.
- The vehicle position determination unit 8 superimposes the calculation result of the pattern feature calculation unit 6 and the calculation result of the symmetry feature calculation unit 7 to obtain the reference coordinates at which the vehicle 16 exists in the image data 10.
- From the acquired reference coordinates, the vehicle position determination unit 8 detects the luminance peaks of the image data 10 or of the distribution map 50 and acquires the number of pixels indicating the width of the front portion of the vehicle 16.
- The vehicle position determination unit 8 searches the database 20 with the coordinate in the vertical direction 13 serving as the reference of the vehicle 16 and with the number of pixels of the width of the front portion of the vehicle 16 obtained by the peak detection.
- In this way, the image processing apparatus 1 can detect the position of the vehicle 16 in the image data 10 and specify the vehicle type of the vehicle 16.
- FIG. 12 is an example of vehicle detection according to the present embodiment.
- the original image data (A) is an image taken by the camera 2.
- the vehicle portion (B) is an area where the vehicle portion of the original image data (A) is extracted.
- Detection (C) is a detection range when detection is performed by the processing of the present embodiment.
- Detection by background difference is the detection range when detecting by the background difference method, a conventional technique.
- Edge detection is the detection range when detecting by the edge detection method, a conventional technique. In the background difference method and the edge detection method, the range of the vehicle alone cannot be detected because the detection is affected by shadows.
- FIG. 13 is a hardware configuration example of the image processing apparatus 1 of the present embodiment.
- The image processing apparatus 1 is composed of a CPU 101, the camera 2, a memory 103, an output unit 104, and a bus 105.
- the CPU 101, the camera 2, the memory 103, and the output unit 104 are connected by a bus 105 and exchange data between the units.
- the CPU 101 is a central processing unit that reads the image processing program 100 stored in the memory 103 into a work area and executes it.
- The image processing program 100 causes the CPU 101 of the image processing apparatus 1 to function as the autocorrelation difference calculation unit 4, the road mark deletion unit 5, the pattern feature calculation unit 6, the symmetry feature calculation unit 7, and the vehicle position determination unit 8.
- The memory 103 stores various information of the image processing apparatus 1; it is, for example, a RAM or a hard disk device.
- The memory 103 also serves as the work area in which the CPU 101 performs calculations, and stores the information of the database 20.
- The output unit 104 is an interface for outputting the position of the vehicle 16 detected in the image data 10 and the specified vehicle type.
- the output destination is a wireless terminal of another vehicle.
- Other vehicles receive information that the vehicle 16 exists in the image data 10.
- the interface communicates with wireless terminals using a wireless protocol (for example, procedures defined in IEEE 802.11).
- As described above, the difference 39 of the similarity values 38 between the pixels 11 is used as the basic feature for detecting the vehicle 16.
- The feature value based on the difference 39 of the similarity value 38 takes a large value at curves and corners, and the method also exploits the fact that the front and back of a vehicle are left-right symmetric. The feature is therefore effective for detecting the front or back of the vehicle 16.
- The feature value based on the difference 39 of the similarity value 38 responds only weakly in linear parts or parts with small curvature, such as shadows and other simple shapes. As a result, it reduces false detections caused by shadows and by road surface reflections.
- When the present embodiment is applied to a system that acquires the image data 10 at regular time intervals, it is also possible to calculate the moving speed of the vehicle 16 by acquiring the difference between the positions of the vehicle 16 in the successive sets of image data 10.
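- As an illustration of this extension (the text above only mentions it; the calibration below is hypothetical), a minimal Python sketch:

```python
# Estimate vehicle speed from detected positions in frames taken at a fixed
# interval. ground_pos maps an image row to a distance along the road and is a
# hypothetical calibration function, not part of the patent.

def speed_mps(row_t0, row_t1, dt, ground_pos):
    """Average speed between two frames, given a row-to-metres calibration."""
    return (ground_pos(row_t1) - ground_pos(row_t0)) / dt

ground = lambda row: 2000.0 / (480 - row + 1)   # toy perspective model (assumption)
print(speed_mps(300, 330, 0.5, ground))         # m/s between two frames 0.5 s apart
```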
- Although the calculation processing above uses individual pixels 11 as the unit, an area obtained by grouping a plurality of pixels 11 in the image data 10 can also be used as the unit of calculation. When the digital image data 10 is resized to fewer pixels, a plurality of pixels 11 are merged into one pixel; when it is resized to more pixels, new pixels 11 are created by interpolation from the surrounding pixels 11. Therefore, performing detection that treats a plurality of pixels 11 as a single region is substantially the same as performing the per-pixel detection processing after resizing the image data 10.
- The calculation area 32, the comparison area 34, and the search range 31 set by the autocorrelation difference calculation unit 4 have so far been described as uniform over the image data 10, but they need not be.
- the range of the search range 31, the calculation region 32, and the comparison region 34 is reduced for the region that is the back when viewed as a three-dimensional space.
- the search range 31, the calculation region 32, and the comparison region 34 are enlarged for the region that comes to the front when viewed as a three-dimensional space.
- By changing these ranges, the amount of computation executed by the autocorrelation difference calculation unit 4 can be reduced, and the detection accuracy can be adapted to each coordinate position. Specifically, data that changes the ranges of the search range 31, the calculation region 32, and the comparison region 34 according to the coordinate value in the vertical direction 13 of the calculation target pixel 33 in the image data 10 can be provided in advance.
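- As an illustration (not part of the patent text), the following Python sketch scales the ranges per row: they shrink toward the vanishing point (the back of the scene) and grow toward the foreground. The linear model and its constants are assumptions.

```python
def ranges_for_row(y, vanish_y=60, bottom_y=479,
                   min_vals=(1, 1, 1), max_vals=(6, 4, 4)):
    """Return (search radius, calc-area radius, comparison-area radius) for row y."""
    t = min(max((y - vanish_y) / (bottom_y - vanish_y), 0.0), 1.0)
    return tuple(round(lo + t * (hi - lo)) for lo, hi in zip(min_vals, max_vals))

for y in (80, 250, 470):
    print(y, ranges_for_row(y))   # small radii near the top, large near the bottom
```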
- The similarity value need not be calculated from luminance alone; it can also be calculated from a combination of lightness, hue, and saturation.
- the present invention can be used in a system that notifies the state of a vehicle in image data by enabling detection of the position of the vehicle in image data with a small amount of computation.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to an image data recognition method for detecting a subject in image data without being influenced by luminance and with a small amount of computation. The image data recognition method for detecting the subject in image data has the following structure. First, the method determines a first pixel, a first region around the first pixel, a plurality of second pixels to be compared with the first pixel, and a second region around each second pixel. Next, similarity values calculated from the first region and the second region are associated with the second pixels. Next, the maximum similarity value among those associated with the second pixels is associated with the first pixel. Then, a region in which the shape determined according to the distribution state of the similarity values associated with the first pixels matches the shape predetermined as a feature of the subject is detected as the portion of the subject in the image data.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008529068A JP4743277B2 (ja) | 2006-11-28 | 2006-11-28 | 画像データ認識方法、画像処理装置および画像データ認識プログラム |
| PCT/JP2006/323685 WO2008065707A1 (fr) | 2006-11-28 | 2006-11-28 | Procédé de reconnaissance de données d'image, dispositif de traitement d'image et programme de reconnaissance de données d'image |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2006/323685 WO2008065707A1 (fr) | 2006-11-28 | 2006-11-28 | Procédé de reconnaissance de données d'image, dispositif de traitement d'image et programme de reconnaissance de données d'image |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/232,683 Continuation US8204278B2 (en) | 2006-11-28 | 2008-09-22 | Image recognition method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2008065707A1 true WO2008065707A1 (fr) | 2008-06-05 |
Family
ID=39467511
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2006/323685 Ceased WO2008065707A1 (fr) | 2006-11-28 | 2006-11-28 | Procédé de reconnaissance de données d'image, dispositif de traitement d'image et programme de reconnaissance de données d'image |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP4743277B2 (fr) |
| WO (1) | WO2008065707A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011039977A1 (fr) * | 2009-09-29 | 2011-04-07 | パナソニック株式会社 | Procédé et dispositif de détection de marquages de passage pour piétons |
| JP2011525005A (ja) * | 2008-05-21 | 2011-09-08 | アーデーツエー・オートモテイブ・デイスタンス・コントロール・システムズ・ゲゼルシヤフト・ミツト・ベシユレンクテル・ハフツング | 車両と歩行者との衝突を回避するための運転者援助システム |
- 2006-11-28 JP JP2008529068A patent/JP4743277B2/ja not_active Expired - Fee Related
- 2006-11-28 WO PCT/JP2006/323685 patent/WO2008065707A1/fr not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06201379A (ja) * | 1992-12-28 | 1994-07-19 | Mitsubishi Electric Corp | 距離測定装置 |
| JPH09330404A (ja) * | 1996-06-10 | 1997-12-22 | Nippon Telegr & Teleph Corp <Ntt> | 物体検出装置 |
| JPH11345336A (ja) * | 1998-06-03 | 1999-12-14 | Nissan Motor Co Ltd | 障害物検出装置 |
| JP2000039306A (ja) * | 1998-07-22 | 2000-02-08 | Nec Corp | 車両領域検出装置及び車両領域検定方法 |
| JP2001351200A (ja) * | 2000-06-09 | 2001-12-21 | Nissan Motor Co Ltd | 車載用物体検知装置 |
| JP2005149250A (ja) * | 2003-11-18 | 2005-06-09 | Daihatsu Motor Co Ltd | 車両検出方法及び車両検出装置 |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011525005A (ja) * | 2008-05-21 | 2011-09-08 | アーデーツエー・オートモテイブ・デイスタンス・コントロール・システムズ・ゲゼルシヤフト・ミツト・ベシユレンクテル・ハフツング | 車両と歩行者との衝突を回避するための運転者援助システム |
| WO2011039977A1 (fr) * | 2009-09-29 | 2011-04-07 | パナソニック株式会社 | Procédé et dispositif de détection de marquages de passage pour piétons |
| CN102483881A (zh) * | 2009-09-29 | 2012-05-30 | 松下电器产业株式会社 | 人行横道线检测方法及人行横道线检测装置 |
| CN102483881B (zh) * | 2009-09-29 | 2014-05-14 | 松下电器产业株式会社 | 人行横道线检测方法及人行横道线检测装置 |
| US8744131B2 (en) | 2009-09-29 | 2014-06-03 | Panasonic Corporation | Pedestrian-crossing marking detecting method and pedestrian-crossing marking detecting device |
| JP5548212B2 (ja) * | 2009-09-29 | 2014-07-16 | パナソニック株式会社 | 横断歩道標示検出方法および横断歩道標示検出装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2008065707A1 (ja) | 2010-03-04 |
| JP4743277B2 (ja) | 2011-08-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8204278B2 (en) | Image recognition method | |
| CN104700414B (zh) | 一种基于车载双目相机的前方道路行人快速测距方法 | |
| JP5267596B2 (ja) | 移動体検出装置 | |
| JP7050763B2 (ja) | カメラ画像からのオブジェクトの検出 | |
| CN112084822A (zh) | 车道检测装置及方法、电子设备 | |
| JP5959073B2 (ja) | 検出装置、検出方法、及び、プログラム | |
| JPWO2013129361A1 (ja) | 立体物検出装置 | |
| CN103366155B (zh) | 通畅路径检测中的时间相干性 | |
| CN107491065B (zh) | 利用障碍物的地面边界信息检测物体的侧面的方法和装置 | |
| JP6021689B2 (ja) | 車両諸元計測処理装置、車両諸元計測方法及びプログラム | |
| JP3729025B2 (ja) | 歩行者検知装置 | |
| CN107004137A (zh) | 用于目标检测的系统和方法 | |
| JP4341564B2 (ja) | 対象物判定装置 | |
| US20200193184A1 (en) | Image processing device and image processing method | |
| KR101236223B1 (ko) | 차선 검출 방법 | |
| CN107992788B (zh) | 识别交通灯的方法、装置及车辆 | |
| JP5981284B2 (ja) | 対象物検出装置、及び対象物検出方法 | |
| JP2010271969A (ja) | 車線検出装置 | |
| JP5786793B2 (ja) | 車両検出装置 | |
| JP2020095623A (ja) | 画像処理装置および画像処理方法 | |
| CN108629226B (zh) | 一种基于图像分层技术的车辆检测方法及系统 | |
| WO2008065707A1 (fr) | Procédé de reconnaissance de données d'image, dispositif de traitement d'image et programme de reconnaissance de données d'image | |
| JP4586571B2 (ja) | 対象物判定装置 | |
| CN104252707A (zh) | 对象检测方法和装置 | |
| KR101371875B1 (ko) | 스테레오 비전을 이용한 차량검출과 차간거리 산출 방법 및 그 장치 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 2008529068; Country of ref document: JP |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 06833489; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 06833489; Country of ref document: EP; Kind code of ref document: A1 |