
WO2019008716A1 - Non-visible measurement device and non-visible measurement method - Google Patents

Non-visible measurement device and non-visible measurement method

Info

Publication number
WO2019008716A1
WO2019008716A1 PCT/JP2017/024748 JP2017024748W
Authority
WO
WIPO (PCT)
Prior art keywords
unit
range
information
electromagnetic wave
visible
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/024748
Other languages
English (en)
Japanese (ja)
Inventor
白水 信弘
利樹 石井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maxell Ltd
Original Assignee
Maxell Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maxell Ltd filed Critical Maxell Ltd
Priority to PCT/JP2017/024748 priority Critical patent/WO2019008716A1/fr
Publication of WO2019008716A1 publication Critical patent/WO2019008716A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/46Indirect determination of position data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • An object of the present invention is to provide a technique capable of detecting, with high accuracy, a non-visible object hidden by a wall or the like.
  • the analysis unit includes a shape recognition unit, a range analysis unit, a signal analysis unit, and a position analysis unit.
  • the shape recognition unit generates three-dimensional shape information from the image information acquired by the image input unit.
  • the range analysis unit calculates, from the shape information generated by the shape recognition unit, the direction in which the electromagnetic wave emitted by the radar irradiation unit is reflected at the reflection surface, and calculates range information indicating the irradiation range of the electromagnetic wave from the calculated reflection direction.
  • Embodiment 1. Hereinafter, the first embodiment will be described in detail.
  • FIG. 1 is an explanatory view showing an example of the configuration of the non-visible position measurement apparatus according to the first embodiment.
  • the non-visible measurement device 110 inputs, via the image input unit 103, image information of the obstacle 115 and the reflective surface 114 that hide the target object 111 to be measured in the measurement range and the periphery thereof, and outputs the image information to the analysis unit 104.
  • the radar irradiation unit 101 irradiates, for example, an electromagnetic wave as the irradiation wave 112, reflects the electromagnetic wave on the reflection surface 114, and irradiates the target object 111 to be measured.
  • the electromagnetic wave reflected by the object 111 reaches the radar detection unit 102 as a reflected wave 113 via the reflection surface 114.
  • Based on the image information from the image input unit 103, the analysis unit 104 extracts the size and position of a structure around the object, such as a wall or mirror, that can serve as a reflection surface, emits an electromagnetic wave toward the reflection surface, and performs position analysis, thereby measuring the non-visible object.
  • the analysis unit 104 includes a shape recognition unit 105, a range analysis unit 106, a position analysis unit 107, and a signal analysis unit 108.
  • the shape recognition unit 105 has an image recognition function and creates three-dimensional shape information using the image information from the image input unit 103, that is, the above-described camera image, infrared camera image, stereo image, map image, or TOF image.
  • the range analysis unit 106 calculates range information from the shape information output by the shape recognition unit 105, using electromagnetic field analysis, a ray tracing method, or the like to determine the angle at which the irradiation wave is reflected and spread at the reflection surface.
  • the range information includes the irradiation angle range θt over which the irradiation wave hits the reflection surface, the arrival range of the irradiation wave spread at the reflection surface, and the arrival angle range θr of the reflected wave estimated to arrive at the radar detection unit 102 from the reflection surface.
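The reflection-direction calculation described above is essentially specular ray folding. The patent does not give formulas, so the following is a minimal 2-D sketch assuming the wall's unit normal is known from the shape information; the vectors are hypothetical.

```python
def reflect(d, n):
    """Mirror direction vector d about a surface with unit normal n: r = d - 2(d.n)n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

# Irradiation wave travelling forward-left (+x forward, +y left) hits a wall
# whose unit normal points back toward the road (hypothetical geometry).
d = (1.0, 1.0)      # incident direction of the irradiation wave
n = (0.0, -1.0)     # unit normal of the reflecting wall
r = reflect(d, n)   # the ray is folded toward the hidden side of the obstacle
print(r)
```

Sweeping this reflection over the extent of the wall extracted from the shape information yields the irradiation angle range θt and arrival range used as range information.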
  • the collision warning signal may be output by predicting the risk of collision from the movement speed of the object and the movement speed of the outside-of-visible measurement device 110.
  • the radar irradiation unit 101 and the radar detection unit 102 may be independent or separated, or may be an integral module.
  • FIG. 2 is an explanatory view showing an example of a front view when the non-visible measurement device of FIG. 1 is mounted on a car.
  • the shape recognition unit 105 detects the reflection surface 114 or the curve mirror 206 from the image information acquired by the forward monitoring camera, that is, the image input unit 103, and the range analysis unit 106 restricts the analysis range of the radar detection signal to the reflection surface.
  • the head-up display 205 or monitor displays the position of the object 111 such as a pedestrian based on the information output from the position analysis unit 107. As a result, the driver is alerted.
  • the front right wall in FIG. 2 may be used as the reflective surface 114a, or a curved mirror 206 installed at an intersection or curve with poor visibility may be used.
  • FIG. 3 is an explanatory view showing an example of detection of an object by the outside-of-visible measurement device of FIG.
  • the outside-of-visible measurement device 110 is installed, for example, at the center of the front of the car.
  • although the example in which the non-visible measurement device 110 is installed at the front center of the car is shown, only the radar irradiation unit 101 and the radar detection unit 102 may be installed at the front center of the car, for example.
  • the car is in a state of traveling and approaching an intersection.
  • the object 111 which is a pedestrian, is located on the roadside of the road on the left side of the intersection, which is closer to the car, with respect to the traveling direction of the car, and proceeds from the roadside to the intersection. Further, the object 111 which is a pedestrian is in a state of being hidden by the obstacle 115.
  • the irradiation wave 112 emitted from the radar irradiation unit 101 reaches the object via the reflection surface 114, and the reflected wave 113 returns to the radar detection unit 102 via the reflection surface 114.
  • the wave propagation distance is about 21.2 m from the non-visible measurement device 110 to the reflection surface 114 and about 5.1 m from the reflection surface 114 to the object 111 which is a pedestrian.
  • the signal intensity of the reflected wave from the pedestrian, that is, the object 111 in the arrangement example shown in FIG. 3 was calculated.
  • FIG. 4 is an example of the level diagram used for calculating the signal strength of the reflected wave of FIG.
  • the solid line is the level diagram calculated for the example of FIG. 3, and the dotted line is the level diagram for a vehicle 100 m ahead within the visible range, calculated from standard specification values.
  • the output 10 dBm of the radar irradiation unit 101 is amplified to 33 dBm by the antenna gain 23 dBi.
  • the attenuation factor of the propagation path is 63.5 dB from the non-visible measurement device 110 to the object 111 which is a pedestrian.
  • the scattering cross section representing the reflectance at the object to be measured 111 was set to standard -15 dBsm.
  • the power of the reflected wave reflected by the object 111 is -51.9 dBm.
  • the propagation attenuation of the reflected wave is also 63.5 dB and the attenuation due to reflection is 6.4 dB as in the case of the irradiation wave.
  • the attenuation during rainfall is known to be about 20 dB/km, which amounts to about 1 dB over the 52.5 m total path.
  • the reflected wave reaches the radar detection unit 102 through the receiving antenna gain of 16 dBi.
  • the power of the radio wave reaching the radar detection unit 102 is -117 dBm, giving a margin of 3 dB with respect to the minimum detection sensitivity of -120 dBm of the radar detection unit 102, so detection is possible. Further, for a bicycle or an automobile, which has a larger scattering cross section than a pedestrian such as the object 111, the margin is even larger.
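The level diagram of FIG. 4 is additive dB bookkeeping. A minimal sketch using the stage values quoted above; only the bookkeeping style is illustrated, and the final received power is taken from the text rather than re-derived, since the propagation figures fold in radar-equation constants not restated here.

```python
# Stage values as quoted in the text (FIG. 4 level diagram).
tx_power_dbm = 10.0            # radar irradiation unit output
tx_antenna_gain_dbi = 23.0     # transmit antenna gain
path_loss_db = 63.5            # device -> object propagation attenuation
reflection_loss_db = 6.4       # attenuation at the reflection surface
rcs_dbsm = -15.0               # scattering cross section of a pedestrian

eirp_dbm = tx_power_dbm + tx_antenna_gain_dbi  # 33 dBm, as in the text
# Power re-radiated by the object after path loss, reflection loss, and scattering
power_at_object_dbm = eirp_dbm - path_loss_db - reflection_loss_db + rcs_dbsm  # -51.9 dBm

received_dbm = -117.0          # power reaching the radar detection unit (from the text)
sensitivity_dbm = -120.0       # minimum detection sensitivity (from the text)
margin_db = received_dbm - sensitivity_dbm     # 3 dB margin -> detection possible

print(eirp_dbm, round(power_at_object_dbm, 1), margin_db)
```

A larger scattering cross section (bicycle, automobile) raises `rcs_dbsm` and hence the margin, matching the remark above.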
  • the distance d = 15 m from the car to the intersection includes the distance traveled during the 0.75 seconds from when the driver perceives the warning until the driver steps on the brake, and is a distance at which the vehicle can stop when its speed is about 50 km/h.
  • the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the vehicle, that is, in the same direction as the position of the object 111. Therefore, the electromagnetic wave is emitted from the radar irradiation unit 101 to the left front side of the center line in the vehicle width direction of the vehicle as illustrated.
  • the irradiation range of the electromagnetic wave in this case is taken as a first range.
  • the first range is a range having a positive horizontal angle, for example, a horizontal angle of about 90 degrees.
  • the horizontal angle in the left direction is taken as positive, and the horizontal angle in the right direction as negative, with respect to the center of the entire irradiation range of the radar irradiation unit 101, in other words, the center line in the vehicle width direction of the vehicle.
  • the detection range becomes wide.
  • whereas the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the vehicle as described above, in the example shown in FIG. 5 the reflecting surface 114a is a wall on the right front side of that center line, on the opposite-lane side, that is, not on the lane side of the vehicle.
  • the electromagnetic wave is emitted from the radar irradiation unit 101 on the right front side of the center line in the vehicle width direction of the vehicle.
  • the irradiation range of the electromagnetic wave in this case is taken as a second range.
  • the second range is a range having the above-described negative horizontal angle, for example, a horizontal angle of about -90 degrees.
  • the object may be detected using paths from a plurality of reflecting surfaces.
  • FIG. 6 is an explanatory view showing an example of detection of an object by the outside-of-visible measurement device of FIG.
  • FIG. 6 shows an example in which the target object 111 is measured via paths formed by two reflecting surfaces.
  • One is a path of the electromagnetic wave by the reflecting surface 114 as in FIG. 3, and the reflecting surface 114 is on the left front side of the center line in the vehicle width direction of the vehicle.
  • the other is the electromagnetic wave path by the reflecting surface 114a as in FIG. 5, and the reflecting surface 114a is on the right front side of the center line in the vehicle width direction of the vehicle.
  • the path of the electromagnetic wave by the reflecting surface 114 is indicated by the irradiation wave 112 and the reflected wave 113
  • the path of the electromagnetic wave by the reflecting surface 114a is indicated by the irradiation wave 112a and the reflected wave 113a.
  • the position of the object 111 is measured a plurality of times via the plurality of electromagnetic wave paths formed by the two reflecting surfaces 114 and 114a. The most probable position of the target object 111 may then be calculated and output from the plurality of measured position results.
  • the measurement accuracy of the position can be further improved by performing measurement a plurality of times.
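The patent leaves open how "the most probable" position is chosen from the repeated measurements. A minimal sketch of one plausible fuse, the per-coordinate median, which is robust against a single badly estimated path; the positions below are made up.

```python
from statistics import median

def fuse_positions(estimates):
    """Fuse repeated (x, y) position estimates from several reflection paths.
    The per-coordinate median is one robust choice (an assumption; the
    document does not specify the estimator)."""
    xs = [p[0] for p in estimates]
    ys = [p[1] for p in estimates]
    return (median(xs), median(ys))

# Two consistent estimates and one outlier from a mis-detected path
print(fuse_positions([(20.1, 5.0), (20.3, 5.2), (35.0, 9.9)]))
```

Averaging would also work when all paths are trustworthy; the median simply discards the outlier here.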
  • FIG. 7 is an explanatory view showing another example of detection of an object by the outside-of-visible measurement device of FIG.
  • FIG. 3, FIG. 5, and FIG. 6 described above show examples in which the non-visible measurement device 110 is installed at the front center of a car, whereas FIG. 7 shows an example in which two non-visible measurement devices are mounted on a car.
  • two non-visible measurement devices 110 and 110a are mounted, for example, at both front ends of the car.
  • the connection configuration of the outside-of-visible measurement device 110a is the same as that of the outside-of-visible measurement device 110 in FIG.
  • the non-visible measurement devices 110 and 110a are mounted in the vicinity of the headlights of a car.
  • the radar irradiation unit 101 and the radar detection unit 102 may be installed at both front end portions of the vehicle.
  • by mounting the non-visible measurement devices 110 and 110a in the vicinity of the headlights of the vehicle in this manner, measurement can be performed over a wider range.
  • the irradiation wave 112 and the reflected wave 113 can take the path shown in FIG. 7, allowing the electromagnetic wave to reach the object 111 at the intersection.
  • when the non-visible measurement device 110a mounted on the left side is used, a wall surface on the right front side of the center line in the vehicle width direction of the vehicle serves as the reflection surface 114a, as in FIG. 5, so the irradiation wave 112a and the reflected wave 113a follow the path shown.
  • since the non-visible measurement device 110a is installed offset to the left with respect to the center of the vehicle, the paths of the irradiation wave 112a and the reflected wave 113a differ from those in FIG. 5. As a result, the electromagnetic waves reach a position farther from the intersection, making it possible to detect an object 111a farther from the intersection.
  • two radar irradiation units 101 and two radar detection units 102 may be provided, and the remaining analysis units 104 may be common.
  • the image input unit 103 outputs the acquired image information (step S101).
  • the shape recognition unit 105 recognizes the shape and position of the reflective surface from the image information output from the image input unit 103, and outputs the recognition result as shape information (step S102).
  • the range analysis unit 106 calculates the irradiation range and the detection range of the electromagnetic wave based on the shape information output from the shape recognition unit 105, and outputs the calculation result as the range information (step S103).
  • the position analysis unit 107 integrates the shape information and the reflection position and outputs the position information of the object (step S107). The position analysis unit 107 determines whether or not the measurement has been completed (step S108). If the measurement has not been completed, the process returns to the process of step S101 to perform measurement again.
  • in the process of step S106, if signal analysis were performed over the entire field of view, the analysis time would become long, and the analysis might not finish in time to warn of the danger of collision. Therefore, in the process of step S105, the range analysis unit 106 shortens the measurement time by limiting the analysis range to, for example, only the part having the reflection surface. Analysis can thereby be performed within a practical time.
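The range restriction of step S105 can be pictured as a simple angular filter applied to the detection samples before signal analysis. A minimal sketch; the record layout and field names are assumptions, not the patent's data format.

```python
def limit_to_range(detections, angle_min, angle_max):
    """Keep only radar detections whose arrival angle (degrees) lies inside
    the range information [angle_min, angle_max] from range analysis, so
    signal analysis runs over far fewer samples than the full field of view."""
    return [d for d in detections if angle_min <= d["angle_deg"] <= angle_max]

# Hypothetical detections spread over the full field of view
detections = [
    {"angle_deg": -60.0, "power_dbm": -110.0},
    {"angle_deg": 38.0,  "power_dbm": -117.0},  # inside the reflected-wave arrival range
    {"angle_deg": 85.0,  "power_dbm": -119.0},
]
print(limit_to_range(detections, 30.0, 50.0))
```

Only the sample near the reflection surface's arrival angle range survives, which is what shortens the analysis time.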
  • the object 111 hidden by the obstacle 115 can be easily detected.
  • since the reflection position on the object can be identified with high accuracy in a shorter time, the position of the non-visible object 111 can be detected at high speed with a low calculation load.
  • the first embodiment shows an example in which the non-visible measurement device 110 is used in a vehicle to support safe driving, for example to prevent collisions at intersections with poor visibility.
  • however, the application of the non-visible measurement device 110 is not limited to driving safety support.
  • for example, it can be applied to monitoring the surroundings of vehicles and robots, safety support for pedestrians and bicycles, perimeter monitoring sensors for virtual reality (VR), augmented reality (AR), and mixed reality (MR), monitoring devices installed in drones, and shape recognition.
  • VR virtual reality
  • AR augmented reality
  • MR mixed reality
  • the image information from the image input unit 103, the shape information calculated by the shape recognition unit 105, and the range information calculated by the range analysis unit 106 may be accumulated and held as analysis information and reused in the next measurement.
  • the analysis information is accumulated, for example, in a storage unit (not shown) of the non-visible measurement device 110 or the like.
  • by reading out previously acquired analysis information from the above-described storage unit and performing range analysis and signal analysis with it, the processing time of the signal analysis unit 108 can be shortened and the calculation load can be reduced.
  • the danger position information can be stored and shared with other non-visible measurement devices to warn of the danger without analysis, thereby improving safety.
  • as described above, the non-visible measurement device detects the position of an out-of-view object at high speed, with a low calculation load, using a low-cost radar.
  • FIG. 9 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement device according to the second embodiment.
  • the map input unit 109 acquires map information from an internal storage device or recording medium (not shown), an external server connected by communication, or the like, based on current position information acquired from a satellite positioning system (GPS) or the like, and outputs it to the shape recognition unit 105 of the analysis unit 104.
  • GPS satellite positioning system
  • the shape recognition unit 105 can estimate the existing position of the reflective surface by using the map information acquired by the map input unit 109.
  • the positional accuracy of the reflective surface is improved, and the analysis time of the image information can be shortened.
  • the map information acquired by the map input unit 109 may be information including a three-dimensional structure of a building, in which case the calculation time of shape recognition can be further shortened.
  • FIG. 10 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement apparatus according to the third embodiment.
  • the non-visible measurement device 110 shown in FIG. 10 is different from the non-visible measurement device 110 of FIG. 1 of the first embodiment in the irradiation technique by the radar irradiation unit 101.
  • the connection configuration is the same as the outside-of-visible measurement device 110 in FIG.
  • the radar irradiation unit 101 irradiates a diffracted wave as the irradiation wave 601.
  • the radar detection unit 102 also detects a diffracted wave as the reflected wave 602.
  • the irradiation wave 601, which is a diffracted wave irradiated from the radar irradiation unit 101, is diffracted by the wall surface of the obstacle 115 and the like, turns around, and reaches the object 111.
  • the reflected wave 602 of the diffracted wave reflected by the object of measurement 111 is diffracted again by the wall surface of the obstacle 115 or the like to wrap around and reach the radar detection unit 102. At this time, the paths of the irradiation wave and the reflected wave may not be straight.
  • the radar irradiation unit 101 can limit the irradiation range of the electromagnetic wave based on the irradiation range information output from the range analysis unit 106. By limiting the irradiation range of the radar irradiation unit 101, the measurement time can be shortened and the amount of analysis calculation can be reduced.
  • the signal analysis unit 108 determines whether or not the irradiation range of the electromagnetic wave covers the range information (step S206). When the range is covered, signal analysis is performed (step S207). When it is not covered, the process returns to step S204, changes the irradiation range, and irradiates the electromagnetic wave again.
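The loop of steps S204 to S206 can be sketched as repeatedly adding irradiated intervals until their union covers the range information. A minimal sketch; the fixed beam width and step size are illustrative assumptions, not values from the patent.

```python
def scan_until_covered(required, beam_width, start):
    """Sweep a beam of fixed angular width (degrees) until the union of
    irradiated intervals covers the required range, mirroring steps
    S204-S206: irradiate, check coverage, shift the irradiation range."""
    lo, hi = required
    irradiated = []
    angle = start
    while True:
        irradiated.append((angle, angle + beam_width))   # step S204: irradiate
        covered_to = lo
        for a, b in sorted(irradiated):                  # step S206: coverage check
            if a <= covered_to:
                covered_to = max(covered_to, b)
        if covered_to >= hi:                             # covered -> step S207 (analysis)
            return irradiated
        angle += beam_width                              # not covered -> change range

beams = scan_until_covered(required=(30.0, 50.0), beam_width=10.0, start=30.0)
print(len(beams))  # two 10-degree beams cover the 20-degree required range
```

Limiting `required` to the reflection-surface angles from range analysis is what keeps the number of irradiations, and hence the measurement time, small.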
  • the processing time can be shortened, and low-cost non-visible measurement can be performed at high speed.
  • FIG. 13 is a flowchart showing an example of measurement processing in the non-visible measurement device according to the fifth embodiment.
  • FIG. 13 shows another example of the process shown in FIG. 12 of the fourth embodiment. Further, the connection configuration of the outside-of-visible measurement device 110 is the same as the outside-of-visible measurement device 110 of FIG.
  • step S304 irradiation with electromagnetic wave
  • step S305 electromagnetic wave detection
  • step S306 determination of range end
  • steps S301 to S303 correspond to the first to third steps, respectively, step S304 to the fourth step, step S305 to the fifth step, and step S306 to the ninth step.
  • steps S301 to S303 are the same as step S201 (image acquisition), step S202 (shape recognition), and step S203 (range analysis) in FIG. 12.
  • step S201 image acquisition
  • step S202 shape recognition
  • step S203 range analysis
  • the electromagnetic wave irradiation in step S304, the electromagnetic wave detection in step S305, and the range end determination in step S306 are performed simultaneously.
  • the range serving as the reference of the determination in the process of step S306 is the range set in advance by the process of step S303 in the previous measurement.
  • FIG. 14 is an explanatory drawing showing an example of the configuration of the outside-of-visible measurement device according to the sixth embodiment.
  • the speed input unit 141 acquires the moving speed of the mounting device on which the non-visible measurement device 110 is mounted. For example, when the non-visible measurement device 110 is mounted on a car, the car is the mounting device.
  • the mounting device will be described as a car.
  • the speed input unit 141 acquires the moving speed from a speed measuring device or a satellite positioning system provided in the car.
  • the speed input unit 141 may be configured to include a speed measuring device.
  • the analysis unit 104 sets a measurement range, an analysis range, an irradiation range of the radar, a detection range, and the like based on the moving speed acquired from the speed input unit 141.
  • the position analysis unit 107 calculates the distance from the car to the object from the moving speed of the car and the moving direction of the object and the moving speed, and outputs a collision warning signal when the distance is less than a certain value.
  • the distance from the car to the object 111 is the distance from the car to the intersection of the moving direction of the car and the moving direction of the object 111.
  • step S401 speed acquisition
  • steps S402 to S410 after the processing of step S401 is the same as the processing of steps S201 to S209 of FIG.
  • the speed input unit 141 acquires the moving speed from a speed measuring device of a car that is a mounted device, a satellite positioning system, or the like.
  • the acquired moving speed is used in the range analysis of step S404, the range end determination of step S407, the signal analysis range of step S408, and the position analysis of step S409 to set a narrow, distant area during high-speed traveling and a wide, nearby area during low-speed traveling.
  • the moving direction and moving speed of the object 111 are calculated from the difference between the measured reflection position and the reflection position measured one cycle earlier. In the position analysis of step S409, the distance from the car to the object 111 at the intersection of the moving direction of the car and the moving direction of the object is calculated from the moving speed and moving direction of the car and of the object 111, and a collision warning signal is output when the distance falls below a certain value.
  • a collision warning signal between the mounting apparatus and the object 111 can be output with high accuracy.
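The collision prediction described above can be sketched as a constant-velocity closest-approach test on the two motion vectors. The geometry, threshold, and time horizon below are assumptions for illustration; the text only states that a warning is output when the predicted distance falls below a certain value.

```python
import math

def collision_warning(p_car, v_car, p_obj, v_obj, warn_dist, horizon):
    """Warn if car and object come within warn_dist metres of each other
    inside the next `horizon` seconds, assuming both keep constant velocity
    (a simplification of the patent's moving-direction/speed prediction)."""
    # Relative position and velocity of the object as seen from the car
    dp = (p_obj[0] - p_car[0], p_obj[1] - p_car[1])
    dv = (v_obj[0] - v_car[0], v_obj[1] - v_car[1])
    dv2 = dv[0] ** 2 + dv[1] ** 2
    t = 0.0 if dv2 == 0 else -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2
    t = min(max(t, 0.0), horizon)          # time of closest approach, clamped
    dmin = math.hypot(dp[0] + t * dv[0], dp[1] + t * dv[1])
    return dmin < warn_dist

# Car heading +x at 13.9 m/s (about 50 km/h); pedestrian 15 m ahead and 2 m
# to the left, walking toward the road at 1.5 m/s (made-up layout).
print(collision_warning((0, 0), (13.9, 0), (15, 2), (0, -1.5), warn_dist=2.0, horizon=3.0))
```

With the pedestrian 5 m from the road instead of 2 m, the closest approach stays above the threshold and no warning is raised.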
  • FIG. 16 is an explanatory drawing showing an example of the configuration of the image input unit 103 included in the outside-of-visible measurement device 110 according to the seventh embodiment.
  • the non-visible measurement device 110 is assumed to be the same as that shown in FIG. 1 of the first embodiment.
  • the image input unit 103 includes a light irradiation unit 161, a light detection unit 162, and a control calculation unit 163, as shown in FIG.
  • the light irradiation unit 161 irradiates the object and its periphery with, for example, pulsed light in accordance with the light control signal from the control calculation unit 163.
  • the light is reflected by an obstacle or a wall or a structure serving as the reflective surface 114, and the reflected light is incident on the light detection unit 162.
  • the light detection unit 162 detects the light of the reflected pulse waveform.
  • the control calculation unit 163 uses the TOF (Time Of Flight) method to calculate the distance L from the image input unit 103 to the reflection surface 114, from the time difference ΔT between the irradiated pulse and the light detection signal of the reflected pulse, and from the speed of light.
  • TOF Time Of Flight
  • the light to be irradiated may be a continuous wave, and in this case, the control calculation unit 163 may calculate the distance L using a phase difference or a frequency difference.
  • the control calculation unit 163 scans the irradiation direction so as to cover the object and the range around it, and maps the value of the distance L to a two-dimensional image for each irradiation direction, thereby generating image information including distance information.
  • the analysis unit 104 can reduce the calculation load of shape recognition and range analysis by using the image information including the distance information, and as a result, it is possible to realize speeding up of the non-visible measurement.
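The pulsed TOF relation used by the control calculation unit 163 is the round-trip identity L = c·ΔT/2. A minimal sketch (the 10 m example is hypothetical):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(delta_t_s):
    """Distance from the round-trip time difference of the pulse: L = c * dT / 2."""
    return C * delta_t_s / 2.0

# A reflector 10 m away returns the pulse after about 66.7 ns
delta_t = 2 * 10.0 / C
print(tof_distance(delta_t))  # approximately 10.0
```

For a continuous wave, the same distance follows from the measured phase or frequency difference instead of ΔT, as noted above.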
  • it goes without saying that the present invention is not limited to the above-described embodiments and can be variously modified without departing from the gist thereof.
  • the present invention is not limited to the above-described embodiment, but includes various modifications.
  • the above-described embodiments are described in detail to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the described configurations.
  • part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • the image input unit is an imaging device for capturing a video, and outputs the captured image data as image information.
  • the non-visible measurement device according to (1) includes a plurality of radar irradiation units and a plurality of radar detection units.
  • the analysis unit generates three-dimensional shape information from the image information acquired by the image input unit, and a radar from the shape information generated by the shape recognition unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention detects, with high accuracy, a non-visible object that is hidden by a wall or the like. A non-visible measurement device (110) comprises a radar irradiation unit (101), a radar detection unit (102), an image input unit (103), and an analysis unit (104). The radar irradiation unit (101) emits electromagnetic waves toward an object (111). The radar detection unit (102) detects electromagnetic waves reflected by the object (111) and outputs a detection signal for the detected electromagnetic waves. The image input unit (103) acquires topographic image information. The analysis unit (104) analyzes the detection signal output by the radar detection unit (102) and the image information acquired by the image input unit (103), and calculates position information of the object (111). The analysis unit (104) calculates the position information of a non-visible object (111) not included in the image information by extracting, on the basis of the image information acquired by the image input unit (103), the position of a reflection surface that reflects electromagnetic waves, emitting electromagnetic waves toward the extracted reflection surface, and performing position analysis.
PCT/JP2017/024748 2017-07-06 2017-07-06 Non-visible measurement device and non-visible measurement method Ceased WO2019008716A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/024748 WO2019008716A1 (fr) Non-visible measurement device and non-visible measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/024748 WO2019008716A1 (fr) Non-visible measurement device and non-visible measurement method

Publications (1)

Publication Number Publication Date
WO2019008716A1 true WO2019008716A1 (fr) 2019-01-10

Family

ID=64949780

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024748 Ceased WO2019008716A1 (fr) Non-visible measurement device and non-visible measurement method

Country Status (1)

Country Link
WO (1) WO2019008716A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004301649A (ja) * 2003-03-31 2004-10-28 Kitakyushu Foundation For The Advancement Of Industry Science & Technology Detection radar system for out-of-sight vehicles and the like
WO2006123628A1 (fr) * 2005-05-17 2006-11-23 Murata Manufacturing Co., Ltd. Radar and radar system
JP2015230566A (ja) * 2014-06-04 2015-12-21 トヨタ自動車株式会社 Driving support device
JP2016110629A (ja) * 2014-11-27 2016-06-20 パナソニックIpマネジメント株式会社 Object detection device and road mirror
JP2017097581A (ja) * 2015-11-24 2017-06-01 マツダ株式会社 Object detection device
JP2017162178A (ja) * 2016-03-09 2017-09-14 パナソニックIpマネジメント株式会社 Determination device, determination method, and determination program

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190329768A1 (en) * 2017-01-12 2019-10-31 Mobileye Vision Technologies Ltd. Navigation Based on Detected Size of Occlusion Zones
US11738741B2 (en) * 2017-01-12 2023-08-29 Mobileye Vision Technologies Ltd. Navigation based on detected occlusion overlapping a road entrance
WO2020054108A1 (fr) * 2018-09-14 2020-03-19 オムロン株式会社 Detection device, moving body system, and detection method
JP2020046755A (ja) * 2018-09-14 2020-03-26 オムロン株式会社 Detection device, moving body system, and detection method
JP7063208B2 (ja) 2018-09-14 2022-05-09 オムロン株式会社 Detection device, moving body system, and detection method
WO2020070909A1 (fr) * 2018-10-05 2020-04-09 オムロン株式会社 Detection device, moving body system, and detection method
WO2020070908A1 (fr) * 2018-10-05 2020-04-09 オムロン株式会社 Detection device, moving body system, and detection method
JP2020060864A (ja) * 2018-10-05 2020-04-16 オムロン株式会社 Detection device, moving body system, and detection method
JP2020060863A (ja) * 2018-10-05 2020-04-16 オムロン株式会社 Detection device, moving body system, and detection method
US12148303B2 (en) 2018-10-05 2024-11-19 Omron Corporation Sensing device, moving body system and sensing method
JP7070307B2 (ja) 2018-10-05 2022-05-18 オムロン株式会社 Detection device, moving body system, and detection method
JP7067400B2 (ja) 2018-10-05 2022-05-16 オムロン株式会社 Detection device, moving body system, and detection method
CN113994404A (zh) * 2019-06-21 2022-01-28 松下電器産業株式会社 Monitoring device and monitoring method
CN113994404B (zh) 2019-06-21 2023-11-07 松下控股株式会社 Monitoring device and monitoring method
JP7328043B2 (ja) 2019-07-18 2023-08-16 古河電気工業株式会社 Radar device, target detection method for radar device, and target detection system
JP2021018121A (ja) * 2019-07-18 2021-02-15 古河電気工業株式会社 Radar device, target detection method for radar device, and target detection system
WO2021010083A1 (fr) * 2019-07-18 2021-01-21 ソニー株式会社 Information processing device, information processing method, and information processing program
CN113138660A (zh) * 2020-01-17 2021-07-20 北京小米移动软件有限公司 Information acquisition method and apparatus, mobile terminal, and storage medium
JP2021135149A (ja) * 2020-02-26 2021-09-13 Jrcモビリティ株式会社 Dynamic target detection system, dynamic target detection method, and computer-executable program
CN113763383A (zh) * 2021-11-09 2021-12-07 常州微亿智造科技有限公司 Method and apparatus for measuring elongation of reinforcing bars
CN114674256A (zh) * 2022-04-02 2022-06-28 四川豪智融科技有限公司 Method for determining a target rotation angle based on radar polarization direction
CN114674256B (zh) 2022-04-02 2023-08-22 四川豪智融科技有限公司 Method for determining a target rotation angle based on radar polarization direction
JPWO2023199873A1 (fr) * 2022-04-14 2023-10-19
WO2023199873A1 (fr) * 2022-04-14 2023-10-19 三菱電機株式会社 Target position estimation device, radar device, and target position estimation method
JP7551033B2 (ja) 2022-04-14 2024-09-13 三菱電機株式会社 Target position estimation device, radar device, and target position estimation method

Similar Documents

Publication Publication Date Title
WO2019008716A1 (fr) Non-visible measurement device and non-visible measurement method
JP7645341B2 (ja) Detection of emergency vehicles
CN111837085B (zh) Adjusting sensor transmit power based on map, vehicle state, and environment
US9759812B2 (en) System and methods for intersection positioning
US11608055B2 (en) Enhanced autonomous systems with sound sensor arrays
CN103874931B (zh) Method and device for determining the position of objects in the surroundings of a vehicle
US11011063B2 (en) Distributed data collection and processing among vehicle convoy members
EP3273423B1 (fr) Device and method for a vehicle for recognizing a pedestrian
CN109631782B (zh) System and method for measuring bridge clearance
US10823844B2 (en) Method and apparatus for analysis of a vehicle environment, and vehicle equipped with such a device
US10783384B2 (en) Object detection using shadows
KR20200102004A (ko) 충돌 방지 장치, 시스템 및 방법
EP4102251A1 (fr) Détermination de la visibilité atmosphérique dans des applications de véhicule autonome
US10906542B2 (en) Vehicle detection system which classifies valid or invalid vehicles
US11731622B2 (en) Prediction of dynamic objects at concealed areas
WO2015009218A1 (fr) Lane position determination
JP2015161968A (ja) Traveling lane identification device, lane change support device, and traveling lane identification method
EP4060373A2 (fr) Détection multispectrale d'objets par imagerie thermique
JP6414539B2 (ja) Object detection device
CN116087955A (zh) 使用多路径雷达反射和地图数据检测和定位非视线对象
WO2020039840A1 (fr) Radar processing device
KR102017958B1 (ko) Augmented reality head-up display system for railway vehicles
JP2022060075A (ja) Driving support device
CN111025332B (zh) Environment sensing system for a motor vehicle
KR102185743B1 (ko) Method and device for determining whether an object ahead of a vehicle actually exists

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17917118

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17917118

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP