
WO2024069682A1 - Device and method for forming a pattern on a road surface - Google Patents


Info

Publication number
WO2024069682A1
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
light
road surface
light projection
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/035638
Other languages
English (en)
Japanese (ja)
Inventor
敏 川村
悟 井上
誠治 長谷川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2022/035638 priority Critical patent/WO2024069682A1/fr
Priority to JP2024548814A priority patent/JP7607842B2/ja
Publication of WO2024069682A1 publication Critical patent/WO2024069682A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic

Definitions

  • This disclosure relates to a technology for drawing patterns such as figures or letters by shining light from a vehicle onto a road surface.
  • The road surface drawing device described in Patent Document 1 corrects the desired image before projecting it, based on the positional relationship of the headlights, such as their height or the distance between them, in order to draw the desired image on the road surface without distortion.
  • However, the road surface drawing device described in Patent Document 1 is premised on drawing an image on a horizontal road surface, so if the road surface is inclined or uneven, the drawn image becomes distorted.
  • This disclosure has been made to solve the above problem, and aims to draw the desired drawing pattern without distortion by projecting light from a vehicle onto a road surface, regardless of the slope or unevenness of the road surface.
  • The road surface drawing device disclosed herein includes an image acquisition unit that acquires captured image information, which is information on a captured image of the road surface surrounding the vehicle captured by at least one camera mounted on the vehicle, and a light projection control unit that controls light projection onto the surrounding road surface by at least one light projection unit mounted on the vehicle.
  • The light projection control unit causes at least one light projection unit to project light of a first reference light projection pattern that draws a first reference drawing pattern on the reference road surface, extracts a first actual drawing pattern drawn on the surrounding road surface by the light of the first reference light projection pattern from the captured image information, calculates a correction amount based on the difference between the first actual drawing pattern and the first reference drawing pattern, creates a corrected light projection pattern by applying the correction amount to a second reference light projection pattern that is the same as or different from the first reference light projection pattern and that draws on the reference road surface a second reference drawing pattern that is the same as or different from the first reference drawing pattern, and causes at least one light projection unit to project the light of the corrected light projection pattern onto the surrounding road surface, thereby drawing a second actual drawing pattern on the surrounding road surface.
  • Thus the road surface drawing device disclosed herein can project light from a vehicle onto a road surface and draw the intended drawing pattern without distortion, regardless of the gradient of the road surface.
  • FIG. 1 is a diagram showing a state in which light is projected onto a surrounding road surface from a light projecting unit of a vehicle;
  • FIG. 2 is a diagram showing a reference drawing pattern drawn on a reference road surface.
  • FIG. 3 is a diagram showing an actual drawing pattern drawn on a road surface with a slope.
  • FIG. 4 is a block diagram showing the configuration of a road surface drawing device and its peripheral devices according to a first embodiment.
  • FIG. 5 is a diagram showing changes in the actual drawing pattern according to the gradient of the road surface.
  • FIG. 6 is a flowchart showing the operation of the road surface drawing device according to the first embodiment.
  • FIG. 7 is a diagram showing the area of an actual drawing pattern.
  • FIG. 8 is a diagram showing the area of each divided region of an actual drawing pattern.
  • FIG. 9 is a diagram showing feature points of an actual drawing pattern.
  • FIG. 10 is a diagram showing dimensions of an actual drawing pattern.
  • FIG. 11 is a diagram showing dimensions of each divided area of an actual drawing pattern.
  • FIG. 12 is a diagram showing a reference drawing pattern projected onto a flat road surface S0.
  • FIG. 13 is a diagram showing an actual drawing pattern projected onto a non-flat road surface S1.
  • A flowchart showing the operation of the road surface drawing device according to the fourth embodiment.
  • A diagram showing how the azimuth angle of a projection point P1 of an actual drawing pattern as viewed from a viewpoint V is corrected to match the azimuth angle of a projection point P0 of a reference drawing pattern.
  • A diagram showing how the depression angle of a projection point P1 of an actual drawing pattern as viewed from a viewpoint V is corrected to match the depression angle of a projection point P0 of a reference drawing pattern.
  • A diagram showing a virtual rectangular parallelepiped viewed stereoscopically from a viewpoint V1.
  • An xz plan view of the virtual rectangular parallelepiped viewed stereoscopically from viewpoint V1.
  • An xy plan view of the virtual rectangular parallelepiped viewed stereoscopically from viewpoint V1.
  • An xy plan view of a virtual rectangular parallelepiped drawing pattern stereoscopically viewed from viewpoint V1.
  • A diagram showing a virtual rectangular parallelepiped viewed stereoscopically from a viewpoint V2.
  • An xz plan view of the virtual rectangular parallelepiped viewed stereoscopically from viewpoint V2.
  • An xy plan view of the virtual rectangular parallelepiped viewed stereoscopically from viewpoint V2.
  • An xy plan view of a virtual rectangular parallelepiped drawing pattern stereoscopically viewed from viewpoint V2.
  • A block diagram showing the configuration of a road surface drawing device and its peripheral devices according to a seventh embodiment.
  • A diagram showing a missing actual drawing pattern.
  • A flowchart showing the operation of the road surface drawing device according to the seventh embodiment.
  • Diagrams showing alignment patterns projected in the same direction from two light-projecting units.
  • A diagram showing a grid-like alignment pattern.
  • A diagram illustrating one hardware configuration of the road surface drawing device according to each embodiment.
  • A diagram illustrating another hardware configuration of the road surface drawing device according to each embodiment.
  • Fig. 1 shows how light is projected from light-projecting units 22L, 22R of a vehicle M onto a surrounding road surface 3.
  • The light-projecting units 22L, 22R are, for example, headlights of the vehicle M.
  • The surrounding road surface 3 is the road surface ahead of the vehicle M.
  • The illumination range of the light-projecting units 22L, 22R is indicated by the reference symbol 6.
  • The road surface drawing device in Embodiment 1 controls the light projection of the light-projecting units 22L, 22R so that a predetermined pattern is drawn on the surrounding road surface 3.
  • The pattern drawn on the surrounding road surface 3 by the light projection of the light-projecting units 22L, 22R (hereinafter referred to as the "drawing pattern") is directed, for example, at the driver 1 of the vehicle M.
  • An example of a drawing pattern for the driver 1 is an arrow representing the desired traveling direction of the vehicle M.
  • The driver 1 can drive the vehicle M along the traveling direction indicated by the drawing pattern on the surrounding road surface 3.
  • The drawing pattern may also be directed at traffic participants, such as a pedestrian 7 in the vicinity of the vehicle M.
  • An example of a drawing pattern directed at the pedestrian 7 is an arrow indicating the traveling direction of the vehicle M.
  • From it, the pedestrian 7 can grasp the traveling direction of the vehicle M and take action to avoid the vehicle M.
  • The road surface drawing device controls the pattern of light projected from the light-projecting units 22L, 22R (hereinafter, the "light projection pattern") so that a predetermined drawing pattern, such as an arrow, is drawn on the surrounding road surface 3.
  • The drawing pattern may also be drawn by projecting light from only one of the light-projecting units 22L, 22R.
  • FIG. 1 shows a case where the surrounding road surface 3 is horizontal and its gradient does not change between the traveling position of the vehicle M and the projection position of the light-projecting unit 22.
  • Such a horizontal road surface is also called the reference road surface.
  • FIG. 2 shows a case in which the gradient of the surrounding road surface 3 changes between the location where the vehicle M is traveling and the projection position of the light-projecting unit 22.
  • The gradient of the surrounding road surface 3 is horizontal at the location where the vehicle M is traveling, but becomes a downward slope in front of the vehicle M. If a light projection pattern intended for the reference road surface is projected onto such a surrounding road surface 3, which is not the reference road surface, the desired drawing pattern cannot be obtained.
  • In FIG. 2, the surrounding road surface 3 that is assumed to be horizontal is shown by a dashed line. If the surrounding road surface 3 were horizontal, a circular drawing pattern 8 would be drawn on it. This drawing pattern 8 is the desired drawing pattern.
  • FIG. 3 is a view of the surrounding road surface 3 in FIG. 2 as seen from above the vehicle M.
  • The positions of the solid line A and the dashed lines B and C in FIG. 3 correspond to the positions of the solid line A and the dashed lines B and C in FIG. 2.
  • The drawing pattern 9 on the surrounding road surface 3, which has a downward slope, ends up stretched in the direction of travel of the vehicle M. In this way, even if the same pattern is projected from the light-projecting unit 22, the pattern actually drawn on the surrounding road surface 3 differs depending on the gradient of the surrounding road surface 3.
  • The road surface drawing device therefore corrects the light projection pattern of the light-projecting unit 22 according to the gradient of the surrounding road surface 3 of the vehicle M, so that the desired pattern is drawn on a surrounding road surface 3 of any gradient.
  • FIG. 4 is a block diagram showing the configuration of a road surface drawing device 101 and its peripheral devices according to the first embodiment.
  • The road surface drawing device 101 corrects the light projection pattern of the light-projecting unit 22 mounted on the vehicle M in accordance with the gradient of the surrounding road surface 3, thereby suppressing distortion of the pattern actually drawn on the surrounding road surface 3 by the light projection of the light-projecting unit 22 (hereinafter referred to as the "actual drawing pattern").
  • There may be one or more light-projecting units 22; in FIGS. 1 to 3, the two light-projecting units provided at the front left and front right of the vehicle M are shown as light-projecting units 22L and 22R.
  • The road surface drawing device 101 comprises an image acquisition unit 11 and a light projection control unit 12.
  • The image acquisition unit 11 acquires an image captured by the front camera 21 of the vehicle M from the front camera 21.
  • The front camera 21 is an example of a camera that captures an image of the surrounding road surface 3 of the vehicle M, including the light projection range of the light-projecting unit 22.
  • The front camera 21 captures an image of the area in front of the vehicle M.
  • The front camera 21 outputs captured image information, which is information on the captured image of the surrounding road surface 3, to the image acquisition unit 11.
  • The captured image of the front camera 21 includes the actual drawing pattern drawn on the surrounding road surface 3 by the light projection of the light-projecting unit 22.
  • There may be one or multiple front cameras 21.
  • FIG. 1 illustrates one front camera 21A provided above the head of the driver 1. That is, the front camera 21 may be a single front camera 21A.
  • FIGS. 2 and 3 illustrate two front cameras 21B provided near the two light-projecting units 22L, 22R of the vehicle M. That is, the front camera 21 may be multiple front cameras 21B provided at multiple locations on the vehicle M corresponding to the multiple light-projecting units 22L, 22R.
  • The light projection control unit 12 first creates a first reference light projection pattern and causes the light-projecting unit 22 to project it.
  • The first reference light projection pattern is a light projection pattern of the light-projecting unit 22 that, when projected onto the reference road surface, draws a first reference drawing pattern on the reference road surface.
  • The first reference drawing pattern is a pattern that is compared with the actual drawing pattern, described later, to calculate the amount of correction according to the gradient of the surrounding road surface 3.
  • The first reference drawing pattern is, for example, the desired pattern that the road surface drawing device 101 wants to draw on the surrounding road surface 3.
  • Alternatively, the first reference drawing pattern may be a simpler pattern.
  • The pattern drawn on the surrounding road surface 3 by the light of the first reference light projection pattern is referred to as the actual drawing pattern.
  • An image of the actual drawing pattern is captured by the front camera 21 and acquired by the image acquisition unit 11.
  • The light projection control unit 12 extracts the actual drawing pattern from the captured image information acquired from the image acquisition unit 11, and calculates the amount of correction by comparing the actual drawing pattern with the first reference drawing pattern.
  • The light projection control unit 12 then creates a corrected light projection pattern by correcting the second reference light projection pattern using the calculated correction amount, and causes the light-projecting unit 22 to project this corrected light projection pattern.
  • The second reference light projection pattern is a light projection pattern that enables the road surface drawing device 101 to draw the desired pattern on the reference road surface. Therefore, when the desired pattern that the road surface drawing device 101 wants to draw on the surrounding road surface 3 is used as the first reference drawing pattern, the second reference light projection pattern is the same pattern as the first reference light projection pattern. When a simpler pattern is used as the first reference drawing pattern, the second reference light projection pattern is a pattern different from the first reference light projection pattern.
  • The actual drawing pattern drawn on the surrounding road surface 3 by the first reference light projection pattern is also referred to as the first actual drawing pattern.
  • The actual drawing pattern drawn on the surrounding road surface 3 by the corrected light projection pattern is also referred to as the second actual drawing pattern.
  • Here, the first reference light projection pattern and the second reference light projection pattern are the same pattern.
  • The deviation between the second actual drawing pattern and the reference drawing pattern is smaller than the deviation between the first actual drawing pattern and the reference drawing pattern. In this way, by correcting the light projection pattern of the light-projecting unit 22, it is possible to draw the desired pattern regardless of the gradient of the surrounding road surface 3.
  • The first reference drawing pattern is the desired pattern that the road surface drawing device 101 wishes to draw on the surrounding road surface 3, and the first reference light projection pattern and the second reference light projection pattern are the same pattern.
  • In the example of FIG. 2, a circular drawing pattern 8 was the reference drawing pattern.
  • The shape of the reference drawing pattern is not limited to a circle.
  • FIG. 5 shows reference drawing patterns of various shapes and their corresponding actual drawing patterns.
  • The dashed lines indicate the reference drawing patterns, and the hatched areas indicate the actual drawing patterns.
  • The reference drawing patterns are patterns such as circles, straight arrows, or right-turn arrows. If the surrounding road surface 3 is a flat reference road surface, the actual drawing pattern matches the reference drawing pattern. However, if the surrounding road surface 3 has a downward slope, the actual drawing pattern is the reference drawing pattern extended in the traveling direction of the vehicle M. If the surrounding road surface 3 has an upward slope, the actual drawing pattern is the reference drawing pattern shortened in the traveling direction of the vehicle M.
  • FIG. 6 is a flowchart showing the operation of the road surface drawing device 101. Below, the operation of the road surface drawing device 101 is explained following the flow in FIG. 6.
  • In step S101, the light projection control unit 12 creates a reference light projection pattern.
  • The light projection control unit 12 creates the reference light projection pattern based on the reference drawing pattern to be drawn on the surrounding road surface 3.
  • In step S102, the light projection control unit 12 outputs the reference light projection pattern to the light-projecting unit 22 and instructs it to project the reference light projection pattern onto the surrounding road surface 3. The light-projecting unit 22 thereby projects the reference light projection pattern onto the surrounding road surface 3, and the actual drawing pattern is drawn. This actual drawing pattern is photographed by the front camera 21.
  • In step S103, the image acquisition unit 11 acquires a captured image of the actual drawing pattern from the front camera 21.
  • In step S104, the light projection control unit 12 compares the actual drawing pattern with the reference drawing pattern and determines whether or not the reference light projection pattern needs to be corrected. If correction is not required in step S104, the processing of the road surface drawing device 101 returns to step S103. If correction is required in step S104, the amount of correction to the reference light projection pattern is calculated in step S105.
  • In step S106, the light projection control unit 12 corrects the reference light projection pattern using the correction amount calculated in step S105 to create a corrected light projection pattern.
  • In step S107, the light projection control unit 12 outputs the corrected light projection pattern to the light-projecting unit 22 and instructs it to project the corrected light projection pattern onto the surrounding road surface 3. The light-projecting unit 22 thereby projects the corrected light projection pattern onto the surrounding road surface 3. The processing of the road surface drawing device 101 then returns to step S103.
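The closed loop of steps S101 to S107 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; `project`, `capture_actual_pattern`, and `compute_correction` are hypothetical stand-ins for the light-projecting unit 22, the front camera 21, and the comparison of steps S104/S105.

```python
def run_drawing_loop(reference_pattern, capture_actual_pattern,
                     compute_correction, project, max_iterations=3):
    """Closed-loop road-surface drawing: project, observe, correct (S101-S107)."""
    projection = reference_pattern            # S101: start from the reference light-projection pattern
    for _ in range(max_iterations):
        project(projection)                   # S102/S107: project onto the road surface
        actual = capture_actual_pattern()     # S103: capture the actual drawing pattern
        correction = compute_correction(actual, reference_pattern)  # S104/S105
        if correction is None:                # S104: no correction needed
            break
        projection = correction(projection)   # S106: apply the correction amount
    return projection
```

In a toy 1-D model where the "pattern" is just an area and a downward slope doubles the drawn area, the loop converges to a projection of half the reference area, so that the drawn area matches the reference.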
  • The light projection control unit 12 compares the area of the reference drawing pattern with the area of the actual drawing pattern, and creates a corrected light projection pattern if the difference between the areas is equal to or greater than a predetermined threshold.
  • Here, it is assumed that the relative positions of the eyes of the driver 1 of the vehicle M, the front camera 21, and the light-projecting unit 22 are fixed. It is also assumed that the reference drawing pattern, the image of the drawing pattern captured by the front camera 21, and the drawing pattern viewed by the driver 1 are similar in shape.
  • Figure 7 shows an arrow-shaped actual drawing pattern.
  • The light projection control unit 12 compares the area ratio AK of the actual drawing pattern to the reference drawing pattern with a predetermined threshold value AKth, and determines that correction is not necessary if AK < AKth, and that correction is necessary if AK ≥ AKth. In other words, the light projection control unit 12 creates a corrected light projection pattern when the difference in area between the actual drawing pattern and the reference drawing pattern is equal to or greater than a predetermined threshold value. If correction is necessary, the light projection control unit 12 creates the corrected light projection pattern by enlarging or reducing the area of the reference light projection pattern by a factor of 1/AK.
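As a sketch, the area comparison can be written as follows, reading the AK-versus-AKth test as a tolerance on the deviation of AK from 1 (an assumption; the exact form of the comparison is not spelled out here). Scaling a pattern's area by 1/AK corresponds to scaling its linear dimensions by 1/sqrt(AK).

```python
import math

def area_correction_scale(actual_area, reference_area, ak_th=0.1):
    """Return the linear scale factor for the reference light-projection
    pattern, or None when the area ratio AK is within tolerance."""
    ak = actual_area / reference_area          # area ratio AK
    if abs(ak - 1.0) < ak_th:                  # assumed form of the AK/AKth test
        return None                            # correction not necessary
    return 1.0 / math.sqrt(ak)                 # area scales by 1/AK
```

For example, if a downward slope draws the pattern with four times the reference area, the reference pattern's linear dimensions are halved.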
  • The light projection control unit 12 may also divide the actual drawing pattern into a plurality of regions, as shown by the dashed lines in FIG. 8, and perform an area comparison for each divided region.
  • In FIG. 8, the actual drawing pattern is divided into six regions.
  • The areas of the divided regions of the actual drawing pattern are designated A1 to A6.
  • The light projection control unit 12 divides the reference drawing pattern into six regions corresponding to the divided regions of the actual drawing pattern.
  • The areas of the divided regions of the reference drawing pattern are designated Ao1 to Ao6.
  • The light projection control unit 12 compares the area ratio AKi of the actual drawing pattern to the reference drawing pattern for each divided region with a predetermined threshold value AKith, and determines that correction is not necessary if AKi < AKith for all divided regions, and that correction is necessary if there is even one divided region where AKi ≥ AKith. If correction is necessary, the light projection control unit 12 creates a corrected light projection pattern by enlarging or reducing the area of each region of the reference light projection pattern corresponding to a divided region where AKi ≥ AKith by a factor of 1/AKi.
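The per-region comparison can be sketched the same way; again the AKi-versus-AKith test is read as a tolerance on the deviation of each ratio AKi from 1, which is an assumption.

```python
def regional_correction_scales(actual_areas, reference_areas, ak_th=0.1):
    """Per-region area comparison: map each region index i to the area
    scale 1/AKi whenever the ratio AKi = Ai / Aoi deviates beyond the
    tolerance. An empty result means no correction is needed."""
    scales = {}
    for i, (ai, aoi) in enumerate(zip(actual_areas, reference_areas), start=1):
        aki = ai / aoi                         # AKi = Ai / Aoi
        if abs(aki - 1.0) >= ak_th:            # assumed form of the AKi/AKith test
            scales[i] = 1.0 / aki              # area scale for this region
    return scales
```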
  • The road surface drawing device 101 includes the image acquisition unit 11 and the light projection control unit 12.
  • The image acquisition unit 11 acquires captured image information, which is information on a captured image of the surrounding road surface 3 of the vehicle M captured by at least one camera mounted on the vehicle M.
  • The light projection control unit 12 controls the projection of light onto the surrounding road surface 3 by at least one light-projecting unit 22 mounted on the vehicle M.
  • The light projection control unit 12 causes at least one light-projecting unit 22 to project light of a first reference light projection pattern that draws a first reference drawing pattern on the reference road surface, extracts the first actual drawing pattern drawn on the surrounding road surface 3 by the light of the first reference light projection pattern from the captured image information, calculates a correction amount based on the difference between the first actual drawing pattern and the first reference drawing pattern, creates a corrected light projection pattern by applying the correction amount to a second reference light projection pattern that is the same as or different from the first reference light projection pattern and that draws on the reference road surface a second reference drawing pattern that is the same as or different from the first reference drawing pattern, and causes at least one light-projecting unit 22 to project the light of the corrected light projection pattern, thereby drawing the second actual drawing pattern on the surrounding road surface 3.
  • As a result, the desired pattern is drawn on the surrounding road surface 3 regardless of the gradient of the surrounding road surface 3.
  • The road surface drawing device 102 of the second embodiment has the same configuration as the road surface drawing device 101 of the first embodiment.
  • However, the specific processes of steps S104 to S106 in FIG. 6 differ from those of the first embodiment as follows.
  • In the first embodiment, the light projection control unit 12 calculates the amount of correction by comparing the areas of the reference drawing pattern and the actual drawing pattern.
  • In the second embodiment, the light projection control unit 12 calculates the amount of correction by comparing the distances between feature points of the reference drawing pattern and of the actual drawing pattern.
  • FIG. 9 shows feature points a, b, d, e, f, and g of an arrow-shaped actual drawing pattern.
  • The light projection control unit 12 can extract these feature points a, b, d, e, f, and g by using blinking of the image of the actual drawing pattern, pattern matching, or a detection algorithm such as SURF (Speeded-Up Robust Features) or SIFT (Scale-Invariant Feature Transform).
  • The light projection control unit 12 calculates the distance L1 between feature points b and e, the distance L2 between feature points a and g or a and f, and the distance L3 between feature points f and g. Similarly, the light projection control unit 12 extracts feature points ao, bo, do, eo, fo, and go of the reference drawing pattern that correspond to feature points a, b, d, e, f, and g of the actual drawing pattern (not shown), and calculates the distance Lo1 between feature points bo and eo, the distance Lo2 between feature points ao and go or ao and fo, and the distance Lo3 between feature points fo and go.
  • In step S104, the light projection control unit 12 compares each ratio LKi of the distances between corresponding feature points with a predetermined threshold value LKth, and determines that correction is not required if LKi < LKth, and that correction is required if LKi ≥ LKth.
  • In step S106, the light projection control unit 12 creates a corrected light projection pattern by enlarging or reducing the reference light projection pattern, multiplying the distance between the feature points of the reference light projection pattern corresponding to the feature points of the reference drawing pattern by 1/LKi.
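The feature-point distance comparison can be sketched as follows, with feature points as simple (x, y) tuples; the pairing of points (e.g. b and e for L1) follows the description above, and the dictionary-based representation is an assumption for illustration.

```python
import math

def feature_distance(p, q):
    """Euclidean distance between two feature points given as (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def distance_ratio(actual_pts, reference_pts, pair):
    """Ratio LKi of an actual feature-point distance (e.g. L1 between b and e)
    to the corresponding reference distance (Lo1 between bo and eo). The
    reference pattern is stretched or shrunk by 1/LKi when LKi exceeds the
    threshold LKth."""
    i, j = pair
    li = feature_distance(actual_pts[i], actual_pts[j])
    loi = feature_distance(reference_pts[i], reference_pts[j])
    return li / loi
```

For instance, if the drawn distance b-e is twice the reference distance bo-eo, LKi is 2 and the corresponding reference distance is multiplied by 1/2.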
  • The distance comparison between feature points of the actual drawing pattern and the reference drawing pattern may also be performed for each divided area.
  • Here, the actual drawing pattern is divided into four areas horizontally and vertically, and the distances between feature points in each divided area are calculated as L11, L12, L21, L22, L23, L24, L31, and L32.
  • The captured image information includes information on the distances between feature points in the captured image.
  • The light projection control unit 12 creates a corrected light projection pattern based on a comparison between the distance between each pair of feature points of the first reference drawing pattern and the corresponding distance between feature points of the actual drawing pattern obtained from the captured image information.
  • The road surface drawing device 103 of the third embodiment has a configuration similar to the road surface drawing devices 101 and 102 of the first and second embodiments.
  • However, the specific processes of steps S104 to S106 in FIG. 6 differ from those of the first and second embodiments as follows.
  • The light projection control unit 12 can measure the distance between each feature point of the actual drawing pattern and the front camera 21B from the images of the actual drawing pattern captured by the two front cameras 21B. Therefore, the light projection control unit 12 can correct the reference light projection pattern so that the distance between each feature point of the actual drawing pattern and the front camera 21B matches the distance between the corresponding feature point of the reference drawing pattern and the front camera 21B.
  • FIG. 3 shows the distances D1, D2 between feature points of the drawing pattern 9 (the actual drawing pattern) and the front camera 21B.
  • The light projection control unit 12 also calculates the distances Do1, Do2 between the corresponding feature points of the reference drawing pattern and the front camera 21B.
  • In step S106, the light projection control unit 12 obtains a corrected light projection pattern by multiplying the distance between each feature point of the reference drawing pattern and the front camera 21B by 1/DKi, where DKi is the ratio of the measured distance Di to the reference distance Doi. With this method, the pattern can always be drawn at a constant distance from the vehicle M even if the gradient of the surrounding road surface 3 changes.
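The distance-holding correction can be sketched as below. The reading of DKi as the ratio of the measured feature-point-to-camera distance Di to the reference distance Doi is an assumption, chosen to be consistent with the ratios AK and LKi above.

```python
def camera_distance_correction(measured, reference, dk_th=0.05):
    """For each feature point, multiply the reference feature-point-to-camera
    distance Doi by 1/DKi (DKi = Di / Doi) when the deviation exceeds the
    tolerance, so the drawn pattern stays a constant distance from the vehicle."""
    corrected = {}
    for key, doi in reference.items():
        dki = measured[key] / doi
        if abs(dki - 1.0) < dk_th:
            corrected[key] = doi           # within tolerance: keep the reference distance
        else:
            corrected[key] = doi / dki     # scale by 1/DKi
    return corrected
```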
  • As shown in FIG. 4, the road surface drawing device 104 of the fourth embodiment has the same configuration as the first to third embodiments. In the fourth embodiment, the specific processing of steps S104 to S106 in FIG. 6 differs from the first to third embodiments.
  • The road surface drawing device 104 creates a corrected light projection pattern by correcting the reference light projection pattern so that the azimuth angle and depression angle at which each feature point of the second actual drawing pattern is viewed from a specific viewpoint match the azimuth angle and depression angle at which the corresponding feature point of the first reference drawing pattern is viewed from the same viewpoint.
  • Figure 12 shows a state in which a reference light projection pattern is projected from a light source O toward a flat road surface S0.
  • the xyz coordinates have their origin on the body of the vehicle M.
  • the x-axis direction is the forward direction of the vehicle body
  • the y-axis direction is the rightward direction
  • the z-axis is the height direction.
  • a perpendicular line drawn from the light source O to the frame surface f of the light projecting units 22L, 22R or the front camera 21 intersects the frame surface at the center of the frame surface f.
  • the position within the frame is indicated by the angle with respect to the origin or by the distance along the coordinate axes; these are called the frame coordinates.
  • if the shape of the projection surface is known, the frame coordinates can be converted to the coordinates of the projection position using a formula.
  • when an arrow-shaped drawing pattern having an outer frame ABCD is drawn in the frame coordinates of the light projecting units 22L, 22R, a figure with converted coordinates is drawn on the road surface S0.
  • the rectangle in the frame coordinates is projected onto the road surface S0 as a trapezoid.
  • the road surface S0 is the reference road surface
  • the pattern drawn on the road surface S0 is the reference drawing pattern.
  • the projection point that constitutes the reference drawing pattern on the road surface S0 is P0 (x0, y0, z0).
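The frame-coordinate-to-road conversion described above can be sketched for the simple flat case. This is an illustrative toy, not the patent's formula: the frame position is expressed as an azimuth/depression angle pair (azimuth measured from the +y axis in the xy view, matching the conventions used later in this description), and the road is the flat reference surface z = −h below the light source O.

```python
import math

def frame_to_road(theta_deg, phi_deg, h):
    """Project a ray from the light source O=(0,0,0), given by azimuth theta
    (from the +y axis in the xy view) and depression phi (below the xy
    plane), onto the flat reference road surface z = -h."""
    theta, phi = math.radians(theta_deg), math.radians(phi_deg)
    r = h / math.tan(phi)          # horizontal distance travelled to reach z=-h
    return (r * math.sin(theta), r * math.cos(theta), -h)

# A ray pointing straight ahead (theta = 90 deg, i.e. the +x direction) at a
# 45-degree depression from a source 0.8 m above the road lands 0.8 m ahead.
p0 = frame_to_road(90.0, 45.0, 0.8)
```

Applying this conversion to every frame-coordinate point of the light projection pattern yields the reference drawing pattern on S0, which is why a rectangle in the frame becomes a trapezoid on the road.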
  • Figure 13 shows the state when the same reference light projection pattern as in Figure 12 is projected from light source O towards road surface S1. Because road surface S1 is not flat, the actual drawing pattern drawn on road surface S1 differs from the reference drawing pattern in Figure 12 and is distorted.
  • the outer frame ABCD in the reference drawing pattern in Figure 12 is transformed into an outer frame A'B'C'D' in the actual drawing pattern in Figure 13. As with the outer frame, the arrow within the frame is also transformed in the same way.
  • the projection point constituting the actual drawing pattern on road surface S1 is P1 (x1, y1, z1).
  • the road surface drawing device 104 therefore creates a corrected light projection pattern by correcting the reference light projection pattern projected onto the road surface S1 so that the direction (azimuth and depression angle) in which each feature point of the actual drawing pattern on the road surface S1 is viewed from an arbitrary viewpoint V matches the direction (azimuth and depression angle) in which each feature point of the reference drawing pattern on the road surface S0 is viewed from the same viewpoint V. In this way, the road surface drawing device 104 corrects differences in the appearance of the drawing pattern depending on the road surface.
  • FIG. 14 is a flowchart of the light projection pattern correction process performed by the road surface drawing device 104. Below, the light projection pattern correction process performed by the road surface drawing device 104 will be explained according to the flow of FIG. 14.
  • In step S201, the light projection control unit 12 acquires a projection image to be drawn on the road surface S1.
  • the projection image may be a simple two-dimensional figure, a three-dimensional figure, or a number of figures that change over time.
  • In step S202, the light projection control unit 12 acquires the position of viewpoint V (hereinafter, the viewpoint position) (Xv, Yv, Zv).
  • the viewpoint position may be a fixed value, or may be a value acquired using a three-dimensional distance sensor such as LiDAR (Light Detection and Ranging) or a camera.
  • In step S203, the light projection control unit 12 creates a reference drawing pattern by correcting the projected image based on the viewpoint position.
  • when the reference drawing pattern drawn on the reference road surface is viewed from the viewpoint position, it appears as the projected image.
  • the reference drawing pattern is determined based on the projected image, the position of the light projection unit 22, and the viewpoint position.
  • In step S204, the light projection control unit 12 creates a reference light projection pattern based on the reference drawing pattern.
  • the reference light projection pattern is a pattern set in the light projection unit 22, and is created by calculating backwards from the reference drawing pattern. This step corresponds to step S101 in FIG. 6.
  • In step S205, the light projection control unit 12 extracts feature points of the reference light projection pattern.
  • In step S206, the light projection control unit 12 calculates the trajectory of the feature points of the reference drawing pattern when the depth of the projection surface (the z-axis position in FIG. 13) is changed.
  • the trajectory of the feature points may be represented by a table or a formula, etc.
  • In step S207, the light-projection control unit 12 controls the projection of the reference light-projection pattern.
  • This step corresponds to step S102 in FIG. 6.
  • the reference light-projection pattern is projected from the light-projection unit 22 onto the road surface S1, and the reference drawing pattern is drawn on the road surface S1.
  • Also in step S207, the front camera 21 captures the actual drawing pattern drawn on the road surface S1.
  • In step S208, the image acquisition unit 11 acquires captured image information including the actual drawing pattern from the front camera 21. This step corresponds to step S103 in FIG. 6.
  • In step S209, the light projection control unit 12 extracts feature points of the actual drawing pattern.
  • the feature points are extracted by binarizing the image of the actual drawing pattern, by pattern matching, or by a detection algorithm such as SURF (Speeded-Up Robust Features) or SIFT (Scale-Invariant Feature Transform).
  • In step S210, the light projection control unit 12 compares the reference drawing pattern with the actual drawing pattern, and calculates (or, in later iterations, updates) the correction amount of the reference light projection pattern by changing the positions of the feature points of the actual drawing pattern along the trajectory calculated in step S206. Specifically, the light projection control unit 12 moves the feature points of the actual drawing pattern along the trajectory calculated in step S206 so that the azimuth angle and depression angle of the actual drawing pattern as seen from viewpoint V approach the azimuth angle and depression angle of the reference drawing pattern as seen from viewpoint V.
  • the amount of correction of the reference light projection pattern is determined based on the movement direction and movement distance of the feature points of the actual drawing pattern.
  • In step S211, the light-projection control unit 12 creates a corrected light-projection pattern by correcting the reference light-projection pattern based on the correction amount calculated in step S210.
  • In step S212, the light projection control unit 12 determines whether there has been a change in the projected image. If there has been a change in the projected image, the processing of the road surface drawing device 104 returns to step S201. If there has been no change in the projected image, in step S213, the light projection control unit 12 determines whether there has been a change in the viewpoint V.
  • If there has been no change in the viewpoint V either, the processing returns to step S207, and the light projection control unit 12 performs light projection control using the corrected light projection pattern created in step S211 as a new reference light projection pattern.
  • the light projection control unit 12 repeats the process of calculating the correction amount in step S210.
  • the second and subsequent calculations of the correction amount are performed based on the previous correction amount, the position of the feature point of the actual drawing pattern moved in the previous correction amount calculation process, and the position of the feature point of the current actual drawing pattern.
  • the azimuth angle and depression angle of the actual drawing pattern as viewed from the viewpoint V gradually approach the azimuth angle and depression angle of the reference drawing pattern as viewed from the viewpoint V.
  • the first reference light-projection pattern and the second reference light-projection pattern are the same. Then, the light-projection control unit 12 creates a new correction light-projection pattern by setting the most recent correction light-projection pattern as a new second reference light-projection pattern and setting the surrounding road surface at the time of creating the most recent correction light-projection pattern as a new reference road surface.
  • the light projection control unit 12 controls the azimuth angle and depression angle of the reference light projection pattern projected from the light projection unit 22 so that the azimuth angle and depression angle of the projection point P1 on the distorted road surface S1 as seen from an arbitrary viewpoint V match the azimuth angle and depression angle of the projection point P0 on the flat road surface S0.
  • the specific procedure is as follows.
  • Step 1: The light projecting unit 22 projects a reference light projection pattern in the direction of azimuth angle θ0 and depression angle φ0.
  • Step 2: Calculate the amount of correction for the projection azimuth angle of the light projector 22 in the xy view.
  • Figure 15 shows the projection state in the xy view.
  • P1 is set as the feature point of the actual drawing pattern.
  • the amount of correction for the projection azimuth angle is calculated so that P1 moves to P1' on the extension of the straight line V-P0.
  • Step 3: Calculate the correction amount for the projection depression angle of the light projector 22 in the xz view.
  • Figure 16 shows the projection state in the xz view.
  • the correction amount for the projection depression angle is calculated so that P1 moves to P1' on the extension of the straight line V-P0.
  • Step 4: Correct the projection azimuth and depression angles of the light-projecting unit 22 using the correction amounts calculated in steps 2 and 3.
  • Step 5: Repeat steps 2, 3, and 4.
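The five steps above can be sketched as a feedback loop. The toy model below works in the xz view only: the actual road shape is unknown to the controller, which repeatedly measures the depression angle at which the projected point P1 is seen from viewpoint V and nudges the projection depression angle until that angle matches the one for P0 on the flat reference road. All geometry, the road profile, and the gain are illustrative assumptions, not values from the description.

```python
import math

O = (0.0, 0.0)            # light source position (x, z)
V = (-1.0, 1.0)           # viewpoint, behind and above the source
H = 1.0                   # height of O above the flat reference road z = -1

def road(x):              # actual road: flat, then an upward slope after x = 3
    return -1.0 if x <= 3.0 else -1.0 + 0.15 * (x - 3.0)

def hit_point(phi):
    """Where the ray z = -x*tan(phi) from O meets the actual road (bisection)."""
    lo, hi = 0.1, 50.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if -mid * math.tan(phi) > road(mid):   # ray still above the road
            lo = mid
        else:
            hi = mid
    return (lo, -lo * math.tan(phi))

def viewed_depression(p):
    """Depression angle of point p as seen from viewpoint V (xz view)."""
    return math.atan2(V[1] - p[1], p[0] - V[0])

# Step 1: the target is P0 = (5, -1) on the flat reference road.
target = viewed_depression((5.0, -1.0))
phi = math.atan(H / 5.0)                 # reference projection depression angle

# Steps 2-5: measure the error as seen from V, correct phi, and repeat.
for _ in range(100):
    p1 = hit_point(phi)
    phi += 0.5 * (target - viewed_depression(p1))   # damped corrective step
```

After convergence, the projected point lies on the extension of the straight line V-P0, so the pattern appears undistorted from the viewpoint even though the road rises.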
  • O (0,0,0), C (Xc,Yc,Zc), and V (Xv,Yv,Zv) are fixed points and respectively represent the positions of the light projector 22, the front camera 21, and the viewpoint.
  • the direction vector of O, i.e., the projection direction vector of the light projector 22, is defined as ΩO(θO, φO).
  • the direction vector of C, i.e., the direction vector seen from the front camera 21, is defined as ΩC(θC, φC).
  • the direction vector seen from V is defined as ΩV(θV, φV).
  • S0 is a flat road surface that serves as the reference road surface.
  • S0 is parallel to the xy plane, and the z-axis coordinate of S0 is -h.
  • S1 is the surrounding road surface that is not flat, and is the road surface onto which the light-projecting unit 22 actually projects light. The shape of S1 is unknown.
  • P represents the projection point of the light-projecting unit 22.
  • P0 (x0, y0, z0) represents the projection point on the road surface S0 by the reference light-projection pattern.
  • P1 (x1, y1, z1) represents the projection point on the road surface S1 by the reference light-projection pattern.
  • P1' (x1', y1', z1') represents the projection point on the road surface S1 by the corrected light-projection pattern.
  • P1' is determined so that the direction in which P1' is viewed from viewpoint V matches the direction in which P0 is viewed from viewpoint V. In other words, P1' is on the extension of the straight line V-P0.
  • θ is an azimuth angle, which is the angle between a straight line in the xy view and the positive y-axis direction.
  • θO is the projection azimuth angle of the light projector 22.
  • θC is the azimuth angle at which the projection point is viewed from the front camera 21.
  • θV is the azimuth angle at which the projection point is viewed from viewpoint V.
  • θO,0 is the projection azimuth angle from the light projector 22 to P0.
  • θO,1' is the projection azimuth angle from the light projector 22 to P1'.
  • θC,1 is the azimuth angle from the front camera 21 to P1.
  • θV,0 is the azimuth angle at which P0 is viewed from viewpoint V.
  • θV,1' is the azimuth angle at which P1' is viewed from viewpoint V.
  • φ is the depression angle, which is the angle with the xy plane.
  • φxz is the depression angle when viewed in the xz view.
  • φxz is taken to be positive in the counterclockwise direction.
  • −φO,xz,0 is the depression angle when light is projected from O to P0, viewed in the xz view.
  • −φO,xz,1' is the depression angle when light is projected from O to P1', viewed in the xz view.
  • −φC,xz,1 is the depression angle when P1 is viewed from C, viewed in the xz view.
  • −φV,xz,0 is the depression angle when P0 is viewed from V, viewed in the xz view.
  • X1_Azimuth is the x-position of P1' used in the calculation of θO,1'.
  • Z1_Elevation is the z-position of P1' used in the calculation of φO,1'.
  • the light projection control unit 12 manipulates the light projection azimuth of projection point P1 in an xy view to place it on an extension line of V-P0. In an xy view, the position in the z direction can be ignored.
  • the azimuth angles θV, θC, and θO when an arbitrary projection point P(x,y,z) is viewed from V, C, and O are expressed by the following equations.
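Since the equation images are not reproduced here, the angle conventions above can be sketched in code: the azimuth is measured from the +y axis in the xy view, and the depression is measured from the xy plane. The helper names are assumptions for illustration.

```python
import math

def azimuth(origin, p):
    """Azimuth (rad) of the segment origin->p, measured from the +y axis
    in the xy view, per the convention defined above."""
    return math.atan2(p[0] - origin[0], p[1] - origin[1])

def depression(origin, p):
    """Depression (rad) of the segment origin->p below the xy plane."""
    dx, dy, dz = (p[i] - origin[i] for i in range(3))
    return math.atan2(-dz, math.hypot(dx, dy))

O = (0.0, 0.0, 0.0)
P = (1.0, 1.0, -1.0)
# azimuth(O, P) = 45 deg; depression(O, P) = atan(1/sqrt(2)), about 35.26 deg
```

Evaluating these helpers at V, C, and O for a projection point P gives θV, θC, θO and the corresponding φ values used in the correction.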
  • light projecting unit 22 projects light in the ΩO,1(θO,1, φO,1) direction to form projection point P1.
  • Point P1 appears in the ΩC,1(θC,1, φC,1) direction from position C of front camera 21.
  • θO,1 and θC,1 can be expressed by the following equations from FIG. 15.
  • the position of the corrected projection point P1' is determined so that the azimuth angle at which the corrected projection point P1' is viewed from viewpoint V is the same as the azimuth angle at which the projection point P0 on the road surface S0 is viewed from viewpoint V. Therefore, the y-coordinate y1' of the corrected projection point P1' is expressed as follows.
  • the light projection azimuth angle θO,1' from the position O of the light projector 22 to the corrected projection point P1' is expressed by the following equation.
  • the projection azimuth θO,1' to the corrected projection point P1' is determined by formulas (8), (12), and (13).
  • the actually projected point P1' is often close to the extension of V-P0, but is not necessarily located exactly on the extension of V-P0.
  • the light projection control unit 12 operates the projection depression angle of the projection point P1 in the xz view to place it on the extension line of V-P0.
  • the depression angles φV,xz, φC,xz, and φO,xz in the xz view when an arbitrary point P(x,y,z) is viewed from the viewpoint V, the position C of the front camera 21, and the position O of the light projection unit 22 are expressed by the following equations.
  • light projector 22 projects light from O in the ΩO,1(θO,1, φO,1) direction to form projection point P1.
  • Point P1 is seen in the ΩC,1(θC,1, φC,1) direction from position C of front camera 21.
  • the depression angles φO,xz,1 and φC,xz,1 when viewing point P1 from O and C in the xz view are expressed by the following equations from FIG. 16.
  • the depression angle φV,xz,0 from the viewpoint V to P0 and the depression angle φV,xz,1' from the viewpoint V to the corrected projection point P1' are respectively expressed by the following equations.
  • equation (25) transforms into equation (26).
  • the position of the corrected projection point P1' is determined so that the depression angle when viewing the corrected projection point P1' from viewpoint V is the same as the depression angle when viewing the projection point P0 on the road surface S0 from viewpoint V. Therefore, from equations (24) and (26), the x-coordinate x1' of the corrected projection point P1' is expressed by the following equation.
  • the depression angle φO,xz,1' in the xz view from the position O of the light projecting unit 22 to the corrected projection point P1' is expressed by the following equation.
  • the light-projection azimuth angle θO,1' from the position O of the light-projecting unit 22 is as shown in formula (13). Therefore, the depression angle φO,1' from the position O of the light-projecting unit 22 to the corrected projection point P1' is expressed by the following formula, obtained by correcting formula (28) with the light-projection azimuth angle θO,1'.
  • the corrected light projection azimuth angle θO,1' to the projection point P1' is determined by equations (8), (12), and (13).
  • the corrected light projection depression angle φO,1' to the projection point P1' is determined by equations (12), (13), (27), and (29).
  • the light projection control unit 12 causes the light projection unit 22 to project light at the light projection azimuth angle θO,1' and the light projection depression angle φO,1'.
  • the light projection control unit 12 creates a corrected light projection pattern so that the azimuth angle and depression angle at which each feature point of the second actual drawing pattern is viewed from a specific viewpoint matches the azimuth angle and depression angle at which each feature point of the first reference drawing pattern corresponding to each feature point of the second actual drawing pattern is viewed from the specific viewpoint.
  • the road surface drawing device 105 of the fifth embodiment has the same configuration as those of the first to fourth embodiments, as shown in Fig. 4.
  • the second reference drawing pattern is a pattern that is stereoscopically viewed from a specific viewpoint.
  • Figures 17 to 20 show an example of creating a second reference drawing pattern that is viewed stereoscopically from a specific viewpoint, viewpoint V1.
  • the other projection points are determined in a similar manner. Note that FIG. 17 only shows projection points b1, c1, and d1, which are visible and not hidden by the rectangular parallelepiped.
  • Figure 18 shows Figure 17 viewed from the y-axis direction.
  • Figure 19 shows Figure 17 as seen from the z-axis direction.
  • This figure is a reference drawing pattern that appears as a rectangular parallelepiped from viewpoint V1 when drawn on the reference road surface.
  • Figures 21 to 24 show an example of creating a second reference drawing pattern that is viewed stereoscopically from a specific viewpoint, viewpoint V2.
  • Figure 22 shows Figure 21 as seen from the y-axis direction.
  • Figure 23 shows Figure 21 as seen from the z-axis direction.
  • This figure is the second reference drawing pattern, which appears as a rectangular parallelepiped from viewpoint V2 when drawn on the reference road surface.
  • the light projection control unit 12 uses the pattern shown in FIG. 20 or FIG. 24 as the reference drawing pattern for stereoscopic vision depending on the viewpoint.
  • the three-dimensional object viewed stereoscopically does not have to be a rectangular parallelepiped.
  • the surface of the three-dimensional object may have a pattern or letters drawn on it.
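The stereoscopic drawing of FIGS. 17 to 24 amounts to projecting each vertex of the three-dimensional object from the viewpoint onto the reference road plane. A minimal sketch follows (the plane z = −h, the specific coordinates, and the names are assumptions for illustration).

```python
def project_to_road(viewpoint, vertex, h):
    """Point where the line from `viewpoint` through `vertex` meets the
    reference road plane z = -h (assumes vertex is below the viewpoint)."""
    t = (viewpoint[2] + h) / (viewpoint[2] - vertex[2])
    return tuple(viewpoint[i] + t * (vertex[i] - viewpoint[i]) for i in range(3))

# A cuboid vertex 1 m above a road 1 m below a viewpoint at height 2 m is
# drawn farther away on the road so that it appears at its original height.
V1 = (0.0, 0.0, 2.0)
a1 = project_to_road(V1, (3.0, 0.0, 1.0), 1.0)   # elevated vertex moves out
b1 = project_to_road(V1, (3.0, 0.0, -1.0), 1.0)  # vertex on the road stays put
```

Because the drawn positions depend on the viewpoint, changing from V1 to V2 produces a different flattened pattern, which is why FIGS. 20 and 24 differ.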
  • the road surface drawing device 106 of the sixth embodiment has the same configuration as those of the first to fifth embodiments, as shown in FIG.
  • In embodiment 6, the first reference light-projection pattern, which is the reference light-projection pattern used to calculate the correction amount of the light-projection pattern according to the surrounding road surface 3, and the second reference light-projection pattern, which is the reference light-projection pattern to which the correction amount is applied, are different patterns.
  • the second reference drawing pattern is the desired pattern that the road surface drawing device 106 wants to draw on the surrounding road surface 3.
  • the second reference light projection pattern is a detailed pattern that is set by working backwards from the desired pattern, the second reference drawing pattern.
  • the first reference light projection pattern and the first reference drawing pattern are simple patterns with a small number of feature points, such as circles, rectangles, and triangles.
  • the first reference light projection pattern since the first reference light projection pattern is only for calculating the correction amount, it does not need to be visible to the driver 1, etc. Therefore, the first reference light projection pattern may be an invisible light pattern such as infrared light or ultraviolet light. In other words, the light of the first reference light projection pattern is invisible light, and the light of the second reference light projection pattern and the correction light projection pattern may be visible light.
  • <G. Seventh embodiment> FIG. 25 is a block diagram showing the configuration of a road surface drawing device 107 and its peripheral devices according to the seventh embodiment.
  • the road surface drawing device 107 is obtained by adding an alarm control unit 13 to the configuration of the road surface drawing device 101 according to the first embodiment.
  • the vehicle M is provided with an output device 23 in addition to the front camera 21 and the light projecting unit 22.
  • the output device 23 is an LED light, a display, or a speaker mounted on the vehicle M.
  • Figure 26 shows an actual drawing pattern with defects.
  • An actual drawing pattern with defects is one that is missing feature points compared to the reference drawing pattern.
  • the actual drawing pattern in Figure 26 is an arrow-shaped actual drawing pattern that is originally composed of feature points a, b, c, d, e, f, and g as shown in Figure 9, but is missing feature points a, d, and e.
  • if an adhering substance such as snow or mud partially blocks the light-projecting unit 22, a defect as shown in FIG. 26 will occur in the actual drawing pattern.
  • in this case, the area determination method of embodiment 1 will produce divided regions whose area cannot be calculated.
  • likewise, the distance determination method of embodiment 2 will produce feature points whose distances cannot be extracted.
  • abnormalities such as missing feature points will occur in the method of embodiment 4 as well.
  • FIG. 27 is a flowchart showing the operation of the road surface drawing device 107.
  • the flow in FIG. 27 is obtained by adding step S103A and step S108 to the flow of the road surface drawing device 101 shown in FIG. 6.
  • the light projection control unit 12 compares the actual drawing pattern with the reference drawing pattern in step S103A and determines whether the number of feature points in the actual drawing pattern is insufficient.
  • if the number of feature points is insufficient, the alarm control unit 13 determines that there is a defect in the actual drawing pattern and performs alarm control in step S108. In this step, the alarm control unit 13 causes the output device 23 to output a light abnormality alarm. Upon receiving this alarm, the driver 1 can stop the vehicle M, visually inspect the light-projecting unit 22, and clean it if there is any adhering substance such as snow or mud. Furthermore, if there is nothing adhering to the light-projecting unit 22, the driver 1 can determine that there is an internal failure of the light-projecting unit 22 and take appropriate measures.
  • the alarm control unit 13 may operate a wiper (not shown) attached to the front of the light-projecting unit 22 to clean the front of the light-projecting unit 22.
  • the road surface drawing device 107 of the seventh embodiment includes an alarm control unit 13 that causes an output device mounted on the vehicle to output an alarm if the difference between the number of feature points of the first actual drawing pattern and the number of feature points of the first reference drawing pattern is equal to or greater than a predetermined threshold value. This allows the driver 1 of the vehicle M to notice a malfunction of the light projecting unit 22 or the front camera 21 and take the necessary measures.
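The feature-point check of steps S103A and S108 can be sketched as follows. The expected set corresponds to the arrow pattern's feature points a to g; the threshold value and all names are illustrative assumptions.

```python
EXPECTED = {'a', 'b', 'c', 'd', 'e', 'f', 'g'}   # arrow pattern of FIG. 9
THRESHOLD = 1                                     # assumed alarm threshold

def check_drawing(detected, expected=EXPECTED, threshold=THRESHOLD):
    """Return an alarm decision when too many feature points are missing."""
    missing = expected - set(detected)
    if len(missing) >= threshold:
        return 'alarm', sorted(missing)   # e.g. trigger the output device 23
    return 'ok', []
```

With the defect of FIG. 26 (feature points a, d, and e missing), the check returns an alarm listing the missing points; a complete pattern passes.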
  • the road surface drawing device 108 of the eighth embodiment has the same configuration as that of the first to sixth embodiments, as shown in FIG.
  • when the surrounding road surface 3 slopes downward in the traveling direction, the specular reflection component of the light projected by the light-projecting unit 22 onto the surrounding road surface 3 becomes larger and the component reflected back toward the vehicle M decreases, compared to when the road surface is horizontal. Therefore, the actual drawing pattern appears dark to the driver 1 or to pedestrians who are not in the direction of travel of the vehicle M, and appears excessively bright to pedestrians 7 who are in the direction of travel of the vehicle M.
  • in this case, when the viewer of the drawing pattern is the driver 1, the light-projection control unit 12 increases the brightness of the corrected light-projection pattern. Conversely, when the viewer of the drawing pattern is a pedestrian 7 in the traveling direction of the vehicle M, the light-projection control unit 12 decreases the brightness of the corrected light-projection pattern.
  • the light-projection control unit 12 can calculate the gradient of the surrounding road surface 3 by comparing the actual drawing pattern with the reference drawing pattern. Furthermore, when the actual drawing pattern is divided into multiple regions and compared with the reference drawing pattern as in FIG. 8 or FIG. 11, if the correction amount of the forward divided region in the traveling direction of the vehicle M is greater than the correction amount of the rear divided region, it can be determined that the gradient of the surrounding road surface 3 changes midway through the projection range of the light-projection unit 22. In this case, the light-projection control unit 12 increases the brightness of the forward divided region in the traveling direction of the corrected light-projection pattern compared to the brightness of the rear divided region.
  • when the downward gradient value of the surrounding road surface 3 exceeds a predetermined threshold, the light projection control unit 12 increases the brightness of the light of the corrected light projection pattern compared to when the downward gradient value does not exceed the threshold.
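The brightness policy described above can be sketched as a small decision function. The threshold, the boost/cut factors, and the names are illustrative assumptions, not values from the description.

```python
def pattern_brightness(base, down_gradient_deg, viewer_ahead,
                       threshold_deg=3.0, boost=1.5, cut=0.7):
    """Brightness policy sketch: on a steep downward slope, brighten the
    pattern for viewers at/behind the vehicle and dim it for pedestrians
    ahead of the vehicle."""
    if down_gradient_deg <= threshold_deg:
        return base                    # near-horizontal road: no change
    return base * (cut if viewer_ahead else boost)
```

For example, on a 5-degree downgrade a base brightness of 100 would be raised to 150 for the driver and lowered to 70 for a pedestrian in the traveling direction.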
  • the road surface drawing device 109 of the ninth embodiment has the same configuration as those of the first to sixth and eighth embodiments, as shown in FIG.
  • the x-y-z coordinate system shown in Figures 12, 15, and 16 has its origin at the body of vehicle M, or more specifically, the light sources of light-projecting units 22L and 22R.
  • there are also a coordinate system xc-yc-zc recognized from the camera image of front camera 21, with front camera 21 as its origin, a coordinate system xl-yl-zl of light-projecting unit 22L, and a coordinate system xr-yr-zr of light-projecting unit 22R.
  • the coordinate systems xl-yl-zl and xr-yr-zr are determined by the projection angles of light-projecting units 22L and 22R. Since each coordinate system is independent, even if coordinate values indicating a specific position in one coordinate system are entered into another coordinate system, light cannot be projected to that same position unless the position and rotation of each coordinate system's origin are known.
  • Figure 28 shows light projection points P1L and P1R, which are alignment patterns projected in the same direction from both light projection unit 22L and light projection unit 22R so that they project light onto the same point.
  • Figure 28 shows the xc-yc positions of light projection points P1L and P1R in an image captured by front camera 21.
  • Light projection point P1L by light projection unit 22L is represented by a + mark.
  • the light projection control unit 12 of the road surface drawing device 109 uses the light projection points P1L and P1R as an alignment pattern, and corrects the coordinate system xl-yl-zl of the light projection unit 22L and the coordinate system xr-yr-zr of the light projection unit 22R from the misalignment between them.
  • the correction is performed by moving the origin position of each coordinate system and rotating about each coordinate axis.
  • each coordinate system therefore has six adjustment parameters: the x, y, and z positions of the origin, and rotations about the x, y, and z axes. It is not necessary to identify all adjustment parameters; only those that are likely to have a large error or are easy to identify may be identified.
  • the direction of the z rotation is the same as θ, which is expressed as the azimuth angle of the light projection direction of the light projection units 22L and 22R in embodiment 4.
  • the direction of the y rotation is the same as φ, which is expressed as the depression angle of the light projection direction of the light projection units 22L and 22R in embodiment 4.
  • the rotation of the light projectors 22L, 22R in the depression angle direction and the y direction of the origin O in FIG. 12 are corrected based on the misalignment of P1L, P1R.
  • the x- and y-direction position coordinates of P1L and P1R in the frame coordinate system can be obtained from the image captured by the front camera 21. From these position coordinates, the azimuth angle and depression angle of P1L and P1R in the xc-yc-zc coordinate system can be calculated.
  • the azimuth angle and xz-view depression angle of P1L as viewed from the front camera 21 are θC,(1,l) and φC,xz,(1,l).
  • the azimuth angle and xz-view depression angle of P1R as viewed from the front camera 21 are θC,(1,r) and φC,xz,(1,r).
  • the xc and yc coordinates are expressed by the following formulas.
  • the xc coordinate xc(1,l) and yc coordinate yc(1,l) of P1L, and the xc coordinate xc(1,r) and yc coordinate yc(1,r) of P1R, are expressed by the following equations.
  • P1R is located near P1L, so there is no major contradiction in assuming that P1L and P1R are on the same plane.
  • the misalignment Δyc,(1) between P1L and P1R in the yc direction can be calculated from formulas (33) and (35) using the following formula.
  • the y-direction position of the light projector 22R, which is the origin position of the coordinate system xr-yr-zr, is moved by −Δyc,(1). This causes the yc-direction positions of P1R and P1L to match.
  • the xr coordinate in the xr-yr-zr coordinate system of the light-projecting unit 22R is expressed by the following formula.
  • the correction amount Δr,(1) is determined by the following formula.
  • equation (41) which includes the differentiation of xr with respect to hr, is used instead of equation (40).
  • the parameters identified are the xc and yc coordinates. However, more parameters may be identified. Because the image captured by the front camera 21 has information in the x and y directions, two parameters can be identified simultaneously using one point on the captured image. The number of parameters that can be identified simultaneously is twice the number of points used in the alignment pattern. If the alignment pattern has two points, four parameters can be identified simultaneously, and if it has three points, six parameters can be identified simultaneously.
  • the light projection control unit 12 may perform the above correction once and then repeat the same correction again. In this way, correction is possible even if the road surface is not flat and level like S0. Correction is also possible if the road surface shape changes while driving.
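The repeat-and-refine nature of the alignment can be sketched as a toy feedback loop: the camera measures the residual yc misalignment Δyc between P1R and P1L, the 22R origin is shifted by −Δyc, and the measurement repeats. The linear measurement model below is purely illustrative; on a non-flat road the measurement would be only approximately linear, which is exactly why iterating helps.

```python
def align_y(true_offset, iters=3):
    """Toy model: iteratively cancel the yc misalignment of projector 22R's
    origin relative to projector 22L."""
    shift = 0.0                           # accumulated correction applied to 22R
    for _ in range(iters):
        measured = true_offset + shift    # residual Δyc seen by the front camera
        shift -= measured                 # move the 22R origin by -Δyc
    return shift

# A 3 cm origin error between the projectors is cancelled after one pass;
# the extra iterations confirm that the residual stays at zero.
```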
  • Figure 29 shows grid-like alignment patterns P1L, P2L, P3L, P4L, P5L, P6L, P7L, P8L, P9L, P10L, P11L, P12L and P1R, P2R, P3R, P4R, P5R, P6R, P7R, P8R, P9R, P10R, P11R, P12R.
  • the front camera 21, which uses an imaging element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor, captures images one frame at a time at its frame rate. While projecting a drawing pattern for the driver 1, the light projecting unit 22 may project an alignment pattern instantaneously, timed to the moment the front camera 21 captures an image. For example, the light projecting unit 22L may project an alignment pattern in one frame, and the light projecting unit 22R may project an alignment pattern in another frame.
  • the light projection control unit 12 may also embed an alignment pattern in a drawing pattern for the driver 1, etc.
  • the light projection unit 22L and the light projection unit 22R may be made distinguishable from each other by projecting light at different frame timings.
  • light-projecting unit 22 may be composed of three or more light-projecting units.
  • Figures 28 and 29 show alignment patterns of simple shapes, but the alignment pattern may be an image such as a figure or a photograph.
  • in that case, the correction may use the positions of points that have a specific relationship, such as being parallel, rotated, or enlarged.
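When the alignment pattern is a figure or photograph, the deviation between the reference pattern and the actually drawn pattern can be summarized by a similarity transform (enlargement, rotation, translation) fitted to matched points. The following sketch uses the standard Kabsch/Umeyama-style fit with synthetic data; it is one common technique, not necessarily the exact method of the embodiment.

```python
import numpy as np

def fit_similarity(ref, act):
    """Fit scale s, rotation R, translation t so that act ≈ s * ref @ R.T + t."""
    ref, act = np.asarray(ref, float), np.asarray(act, float)
    mr, ma = ref.mean(0), act.mean(0)
    r0, a0 = ref - mr, act - ma
    # scale from RMS radii; rotation from the 2x2 cross-covariance
    s = np.sqrt((a0 ** 2).sum() / (r0 ** 2).sum())
    H = r0.T @ a0
    U, _, Vt = np.linalg.svd(H)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:          # keep a proper rotation (no reflection)
        Vt[-1] *= -1
        R = (U @ Vt).T
    t = ma - s * R @ mr
    return s, R, t

# synthetic check: the drawn pattern is the reference pattern rotated
# 30 degrees, enlarged by 1.2, and shifted by (0.5, -0.3)
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
ref = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
act = 1.2 * ref @ R_true.T + np.array([0.5, -0.3])
s, R, t = fit_similarity(ref, act)
```

The recovered (s, R, t) then play the role of the correction amount applied to the light projection pattern.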
  • the irradiation direction or irradiation position of the light projector 22 may be mechanically changed and adjusted.
  • the road surface drawing device 110 of the tenth embodiment has the same configuration as those of the first to sixth, eighth and ninth embodiments, as shown in FIG.
  • the correction of the deviation of the coordinate system caused by the different positions of the light projectors 22L and 22R was described.
  • the light projector control unit 12 corrects the deviation of the coordinate system caused by the different positions of the multiple front cameras 21B.
  • the correction method is the same as in the ninth embodiment.
  • two front cameras 21B are installed at the same positions as the light projecting units 22L, 22R. These two front cameras 21B capture the same object on the surrounding road surface.
  • the light projecting control unit 12 determines the coordinates of the object in a coordinate system viewed from the origin of each front camera 21B from each camera frame image of the front cameras 21B, and corrects at least one of the parameters that represent the positional or rotational relationship between the two front cameras 21B.
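As a minimal illustration of correcting the relationship between the two front cameras from a commonly observed object, the sketch below updates only the translation part of assumed extrinsics (R, t); all values are synthetic and the rotational correction is omitted for brevity.

```python
import numpy as np

# Hypothetical sketch: the two front cameras observe the same object on
# the surrounding road surface. Its coordinates in each camera's frame
# should be related by the stored extrinsics (R, t); the residual is
# used to correct the positional relationship.
R = np.eye(2)                      # assumed rotation between the cameras
t = np.array([1.20, 0.00])         # stored baseline, drifted from truth
t_true = np.array([1.25, 0.02])    # actual (unknown) baseline

# the same object's points, expressed in the left camera's frame
p_left = np.array([[0.5, 4.0], [-0.8, 6.0], [1.1, 9.0]])
# what the right camera actually measures for those points
p_right = p_left @ R.T + t_true

# deviation between measured and predicted right-frame coordinates
residual = p_right - (p_left @ R.T + t)
t = t + residual.mean(axis=0)      # corrected positional relationship
```

A full implementation would also re-estimate R (for example with the similarity fit shown earlier, with scale fixed to 1).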
  • the object on the surrounding road surface used for correction does not have to be a pattern drawn on the surrounding road surface by the light projector 22.
  • a position detection sensor such as Lidar or radar may be used instead of the front camera 21B.
  • the road surface drawing device 111 of the eleventh embodiment has the same configuration as those of the first to sixth and eighth to tenth embodiments as shown in FIG.
  • two front cameras 21B are installed at the same position as the light projectors 22L and 22R.
  • the light projector 22L projects light in a certain direction and at a certain depression angle, and a pattern is drawn on the road surface.
  • the light projection control unit 12 can grasp the coordinates of the light projection point on the road surface in a coordinate system with the left front camera 21B installed at the same position as the light projector 22L as the origin.
  • the right front camera 21B captures the pattern drawn on the road surface.
  • the light projection control unit 12 can grasp the coordinates of the light projection point in the coordinate system with the right front camera 21B as the origin by converting the frame coordinates of the image captured by the right front camera 21B into the coordinates of the projection position.
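Converting frame coordinates into road-surface coordinates can be done, under the flat-road assumption, by intersecting the pixel's viewing ray with the road plane. The pinhole-model sketch below assumes a camera at height h with depression angle phi; the variable names and the model itself are illustrative, not the embodiment's exact geometry.

```python
import math

def pixel_to_road(u, v, f, h, phi):
    """Pixel offsets (u, v) from the image center, focal length f (same
    units as u, v), camera height h, depression angle phi (radians)
    -> lateral and forward coordinates (x, y) on a flat road plane."""
    angle = phi + math.atan2(v, f)         # total downward angle of the ray
    y = h / math.tan(angle)                # forward distance on the road
    slant = math.hypot(h, y)               # slant range to the road point
    x = u * slant / math.hypot(f, v)       # lateral offset by similar triangles
    return x, y

# a pixel on the optical axis of a camera 1.2 m high, pitched down 10 deg
x, y = pixel_to_road(u=0.0, v=0.0, f=800.0, h=1.2, phi=math.radians(10.0))
```

Applying this conversion to the right front camera's frame coordinates yields the light projection point in that camera's road coordinate system, as described above.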
  • the light projection control unit 12 can correct the deviation between the coordinate systems, in the same manner as in the ninth embodiment, based on the deviation between the coordinates in the coordinate system with the left front camera 21B as the origin and the coordinates in the coordinate system with the right front camera 21B as the origin, which are obtained in this manner.
  • the tenth and eleventh embodiments may be combined.
  • the light-projection control unit 12 may simultaneously perform the correction described in the tenth embodiment and the correction described in the eleventh embodiment to adjust the coordinate system of a sensor such as a lidar or radar that measures the distance to an object and the coordinate system of the light-projection unit 22.
  • the light-projection control unit 12 may correct at least one of the positional relationship or the rotational relationship between the light-projection unit 22 and the front camera 21B.
  • the image acquisition unit 11, the light projection control unit 12, and the warning control unit 13 in the above-mentioned road surface drawing devices 101-111 are realized by a processing circuit 81 shown in Fig. 30. That is, the processing circuit 81 includes the image acquisition unit 11, the light projection control unit 12, and the warning control unit 13 (hereinafter referred to as "image acquisition unit 11, etc.”). Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied.
  • the processor is, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), etc.
  • When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • Each function of each part, such as the image acquisition unit 11, may be realized by multiple processing circuits 81, or the functions of each part may be combined and realized by a single processing circuit.
  • When the processing circuit 81 is a processor, the functions of the image acquisition unit 11, etc. are realized by software, etc. (software, firmware, or software and firmware).
  • the software etc. is written as a program and stored in memory.
  • the processor 82 applied to the processing circuit 81 realizes the functions of each unit by reading and executing the program stored in the memory 83.
  • the road surface drawing devices 101-108 include a memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of the following steps: acquiring photographed image information of the surrounding road surface; having the light projector 22 project light of a first reference light projection pattern that projects a first reference drawing pattern on the reference road surface; extracting from the photographed image information the actual drawing pattern that is drawn on the surrounding road surface by the light of the first reference light projection pattern; calculating a correction amount based on the difference between the extracted actual drawing pattern and the first reference drawing pattern; creating a corrected light projection pattern by correcting, with the correction amount, a second reference light projection pattern that projects a second reference drawing pattern that is the same as or different from the first reference drawing pattern on the reference road surface; and having the light projector 22 project light of the corrected light projection pattern.
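The program's steps can be outlined schematically as below. Every class, name, and the offset-based "correction" are illustrative stand-ins for the real projection geometry and hardware interfaces, used only to show the sequence of steps.

```python
from dataclasses import dataclass

@dataclass
class FakeProjector:
    bias: tuple            # unknown mounting error that shifts every pattern
    drawn: tuple = None
    def project(self, pattern):
        # the pattern lands on the road displaced by the mounting bias
        self.drawn = (pattern[0] + self.bias[0], pattern[1] + self.bias[1])

@dataclass
class FakeCamera:
    projector: FakeProjector
    def capture(self):
        return self.projector.drawn       # image of the drawn pattern

def correction_cycle(camera, projector, first_ref, second_ref):
    projector.project(first_ref)                      # 1) project 1st reference pattern
    actual = camera.capture()                         # 2) acquire road-surface image
    delta = (first_ref[0] - actual[0],                # 3) correction amount from
             first_ref[1] - actual[1])                #    reference-vs-actual difference
    corrected = (second_ref[0] + delta[0],            # 4) correct the 2nd reference
                 second_ref[1] + delta[1])            #    light projection pattern
    projector.project(corrected)                      # 5) project corrected pattern
    return camera.capture()

proj = FakeProjector(bias=(0.2, -0.1))
cam = FakeCamera(proj)
final = correction_cycle(cam, proj, first_ref=(1.0, 2.0), second_ref=(3.0, 4.0))
```

Here the correction derived from the first pattern cancels the projector's bias, so the second pattern lands where intended.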
  • the memory 83 may be, for example, non-volatile or volatile semiconductor memory such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read Only Memory), HDD (Hard Disk Drive), magnetic disk, flexible disk, optical disk, compact disk, mini disk, DVD (Digital Versatile Disk) and their drive devices, or any storage medium to be used in the future.
  • the above describes a configuration in which the functions of the image acquisition unit 11, etc. are realized either by hardware or software, etc.
  • this is not limited to the above; a configuration is also possible in which part of the image acquisition unit 11, etc. is realized by dedicated hardware and another part is realized by software, etc.
  • for example, the function of the light projection control unit 12 can be realized by a processing circuit as dedicated hardware, while the other functions can be realized by the processing circuit 81 as the processor 82 reads and executes a program stored in the memory 83.
  • the processing circuit can realize each of the above-mentioned functions by hardware, software, etc., or a combination of these.
  • although the road surface drawing devices 101-108 have been described above as in-vehicle devices, they can also be applied to a system constructed by appropriately combining a PND (Portable Navigation Device), a communications terminal (e.g., a mobile phone, smartphone, tablet, or other mobile terminal), the functions of applications installed on these, a server, etc.
  • each function or component of the road surface drawing devices 101-108 described above may be distributed among the devices that make up the system, or may be concentrated in one of the devices.
  • the origin of the coordinate system used to indicate the projection pattern position or calculate coordinates does not have to be the center of the vehicle width.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The purpose of the present disclosure is to use light projection from a vehicle onto a road surface to form a desired pattern without distortion, regardless of the gradient of the road surface. A road surface pattern forming device (101) comprises: an image acquisition unit (11) that acquires captured image information of the road surface around a vehicle (M); and a light projection control unit (12) that controls light projection onto the surrounding road surface by a light projection unit (22) mounted on the vehicle (M). The light projection control unit (12): causes the light projection unit (22) to project light of a first reference light projection pattern for forming a first reference pattern on a reference road surface; extracts, from the captured image information, a first actual pattern formed on the surrounding road surface by the light of the first reference light projection pattern; calculates a correction amount based on the difference between the first actual pattern and the first reference pattern; corrects, with the correction amount, a second reference light projection pattern for forming a second reference pattern on the reference road surface, thereby preparing a corrected light projection pattern; and causes the light projection unit (22) to project light of the corrected light projection pattern.
PCT/JP2022/035638 2022-09-26 2022-09-26 Dispositif et procédé de formation de motif sur la surface de la route Ceased WO2024069682A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/035638 WO2024069682A1 (fr) 2022-09-26 2022-09-26 Dispositif et procédé de formation de motif sur la surface de la route
JP2024548814A JP7607842B2 (ja) 2022-09-26 2022-09-26 路面描画装置および路面描画方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/035638 WO2024069682A1 (fr) 2022-09-26 2022-09-26 Dispositif et procédé de formation de motif sur la surface de la route

Publications (1)

Publication Number Publication Date
WO2024069682A1 true WO2024069682A1 (fr) 2024-04-04

Family

ID=90476555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/035638 Ceased WO2024069682A1 (fr) 2022-09-26 2022-09-26 Dispositif et procédé de formation de motif sur la surface de la route

Country Status (2)

Country Link
JP (1) JP7607842B2 (fr)
WO (1) WO2024069682A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019197008A (ja) * 2018-05-10 2019-11-14 日立オートモティブシステムズ株式会社 撮像装置
JP2021046206A (ja) * 2015-04-10 2021-03-25 マクセル株式会社 車両

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021046206A (ja) * 2015-04-10 2021-03-25 マクセル株式会社 車両
JP2019197008A (ja) * 2018-05-10 2019-11-14 日立オートモティブシステムズ株式会社 撮像装置

Also Published As

Publication number Publication date
JPWO2024069682A1 (fr) 2024-04-04
JP7607842B2 (ja) 2024-12-27

Similar Documents

Publication Publication Date Title
US9225942B2 (en) Imaging surface modeling for camera modeling and virtual view synthesis
US8355565B1 (en) Producing high quality depth maps
JP6883608B2 (ja) 深度マップに対して画像位置合せを行って深度データを最適化することができる深度データ処理システム
CN112050751B (zh) 一种投影仪标定方法、智能终端及存储介质
CN107792179A (zh) 一种基于车载环视系统的泊车引导方法
CN104937634B (zh) 用于生成环绕视图的方法和系统
US11134204B2 (en) Image processing apparatus and image transformation method
CN103136720A (zh) 车载360度全景拼接方法
JPWO2019194255A1 (ja) 演算処理装置、オブジェクト識別システム、オブジェクト識別方法、自動車、車両用灯具
CN107240065A (zh) 一种3d全景图像生成系统和方法
CN111429531A (zh) 标定方法、标定装置和非易失性计算机可读存储介质
JP2016143381A (ja) 画像生成装置、座標変換テーブル作成装置および作成方法
TWI898822B (zh) 應用於環景拼接的魚眼相機外部參數優化方法
CN105365659B (zh) 照射系统
JP7607842B2 (ja) 路面描画装置および路面描画方法
TWM568376U (zh) 多攝像頭系統及多攝像頭模組
WO2013157184A1 (fr) Dispositif d'aide à la visibilité arrière pour véhicule et procédé d'aide à la visibilité arrière pour véhicule
CN118864241A (zh) 一种车灯显示融合拼接方法,系统及存储介质
CN118426621A (zh) 投影触控方法、车辆及计算机可读存储介质
CN106352847A (zh) 基于相位差的距离测量装置及距离测量方法
CN109993802B (zh) 一种城市环境中的混合相机标定方法
TWI908320B (zh) 應用於環景拼接的拼接縫的調整方法
TW202034024A (zh) 結構光校正系統
CN119672235B (zh) 一种基于无人机的带有浮雕的三维建筑生成方法及系统
CN120689428A (zh) 投影标定处理方法及相关装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22960747

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024548814

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22960747

Country of ref document: EP

Kind code of ref document: A1