
CN111164456B - Prompted automotive sensor fusion - Google Patents

Prompted automotive sensor fusion

Info

Publication number
CN111164456B
CN111164456B (application CN201880064466.4A)
Authority
CN
China
Prior art keywords
sensor
module
data
object sensor
detection module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880064466.4A
Other languages
Chinese (zh)
Other versions
CN111164456A (en)
Inventor
马修·费特曼
阿雷特·卡尔森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anzher Software Co ltd
Original Assignee
Anzher Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anzher Software Co ltd filed Critical Anzher Software Co ltd
Publication of CN111164456A publication Critical patent/CN111164456A/en
Application granted granted Critical
Publication of CN111164456B publication Critical patent/CN111164456B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66Tracking systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93271Sensor installation details in the front of the vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

Methods and systems for remote detection of objects involving prompted (cued) sensor fusion. In some implementations, a first set of sensing data can be generated using a first object sensor, and a second set of sensing data can be generated using a second object sensor. An object can then be detected using the sensing data from the first object sensor. When an object is detected using the sensing data from the first object sensor, a setting associated with the second object sensor can be changed to increase the likelihood that the object is also detected using the second set of sensing data from the second object sensor.

Description

Prompted Automotive Sensor Fusion

Summary of the Invention

Object-level fusion is often used in conjunction with sensor systems to detect remote objects from a host vehicle. With this type of architecture, data from a first sensor can be combined with data from a second sensor to provide more information for identifying and tracking objects. This type of architecture can also allow sensors to be combined as "black boxes." In other words, instead of detection-level fusion, which may require extensive tuning and adjustment of detection and/or tracking modules for every possible combination of sensors, sensors can be combined and used together in an essentially plug-and-play architecture. However, this object-level fusion fails if one sensor cannot identify a target. In other words, with typical object-level fusion, if one sensor cannot identify an object identified by another sensor, the system's fusion tracker or fusion tracking module has no data from that sensor to combine and use for tracking the object.

Accordingly, the present inventors have determined that it would be desirable to provide systems and methods that overcome one or more of the foregoing and/or other limitations of the prior art. Thus, in some embodiments, the inventive concepts disclosed herein may be used to provide a prompted (cued) fusion architecture, which for the purposes of this disclosure may be considered a modified object-level fusion architecture. In some embodiments, a cue from a first sensor may be used to trigger one or more setting or parameter changes on a second sensor (such as operating in a special mode) to increase the likelihood of detection by the second sensor.

For example, in some embodiments, one or more thresholds of a second sensor, such as a detection threshold and/or a tracker threshold, may be lowered in response to a first sensor detecting an object. The change may be temporary and/or may apply only in the direction or general direction (e.g., within an angular range) of the object detected by the first sensor. Thus, a special operating mode may lower the detection threshold and/or require fewer consistent detections to maintain a track. Preferably, this does not require any physical rotation of the second sensor. This may increase the likelihood that the second sensor detects the object without increasing the likelihood of false alarms/detections by the second sensor. For example, in embodiments in which the special mode operates only for a relatively short period of time and/or within a relatively small spatial region (corresponding to the object detected by the first sensor), the overall false alarm rate should remain constant, or may increase only slightly within a relatively small window of time and/or angle, and thus provide a negligible increase in the overall false alarm rate while increasing the true detection rate for the target of interest.
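
The temporally and angularly limited threshold relaxation described above can be sketched as follows. This is a minimal illustration only; the class name, parameter names, and numeric values (SNR levels, cue lifetime, angular half-width) are assumptions, not part of the disclosure:

```python
import time

class CuedSensorSettings:
    """Second sensor's detection threshold, with a temporary, angle-limited relaxation."""

    def __init__(self, base_snr_db=13.0, cued_snr_db=10.0, cue_lifetime_s=0.5):
        self.base_snr_db = base_snr_db      # normal detection threshold
        self.cued_snr_db = cued_snr_db      # relaxed threshold while cued
        self.cue_lifetime_s = cue_lifetime_s
        self._cues = []                     # list of (expiry_time, az_min_deg, az_max_deg)

    def cue(self, azimuth_deg, half_width_deg=5.0, now=None):
        """Register a cue from the first sensor at a given bearing."""
        now = time.monotonic() if now is None else now
        self._cues.append((now + self.cue_lifetime_s,
                           azimuth_deg - half_width_deg,
                           azimuth_deg + half_width_deg))

    def detection_threshold(self, azimuth_deg, now=None):
        """Return the SNR threshold to apply at this bearing at this moment."""
        now = time.monotonic() if now is None else now
        self._cues = [c for c in self._cues if c[0] > now]  # drop expired cues
        for _, az_min, az_max in self._cues:
            if az_min <= azimuth_deg <= az_max:
                return self.cued_snr_db     # relaxed only inside the cued window
        return self.base_snr_db
```

Because the relaxed threshold applies only inside the cued angular window and only until the cue expires, the normal threshold governs everywhere else, which is why the overall false alarm rate is essentially unchanged.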

Advantageously, some embodiments can be configured to retain the black-box advantage of object-level sensor fusion; in other words, sensors can be combined without extensive custom tuning/adjustment. In some embodiments, sensors can be combined in a plug-and-play fashion without any such customization.

In a more specific example of a method for detecting objects from within a host vehicle according to some implementations, the method may include using a first object sensor (such as a radar sensor, a vision sensor/camera, a lidar sensor, etc.) to generate, receive, and/or transmit a first set of sensing data. The method may further include generating, receiving, and/or transmitting a second set of sensing data using a second object sensor. In preferred embodiments and implementations, the first object sensor may differ from the second object sensor, to allow the advantageous aspects of different sensor technologies to be combined. An object or suspected object may then be detected using the sensing data from the first object sensor. When an object is detected using the sensing data from the first object sensor, a setting associated with the second object sensor may be changed to place the second object sensor in a special mode, to increase the likelihood that the object is detected using the second set of sensing data from the second object sensor.

In some implementations, the step of changing a setting associated with the second object sensor to increase the likelihood that the object is detected using the second set of sensing data from the second object sensor may include changing a threshold of the second object sensor, such as lowering a threshold of the second object sensor. In some such implementations, this step may include changing at least one of a detection threshold of a detection module of a remote object detection and/or tracking system and a tracking threshold of a tracker or tracking module. In some such implementations, this step may include changing at least one of a signal-to-noise-ratio threshold of the detection module and a threshold applied by the tracking module for initiating or maintaining a track of an object detected by the detection module.

Some implementations may also include, when an object is detected using the sensing data from the second object sensor, changing a setting associated with the first object sensor to increase the likelihood that the object is detected using the first set of sensing data from the first object sensor.

In an example of a system for remotely detecting objects from within a host vehicle according to some embodiments, the system may include a first object sensor configured to receive a first set of sensing data and a second object sensor configured to receive a second set of sensing data. As noted above, the object sensors are preferably of different types; furthermore, it should be understood that more than two such object sensors may be used in some embodiments, or in any of the methods/implementations disclosed herein. The system may also include an object detection module configured to receive data from the first object sensor. The object detection module may be configured to identify remote objects from within the host vehicle using data from the first object sensor and/or the second object sensor, and, when the object detection module detects an object using the first set of sensing data from the first object sensor, to change a setting associated with the second object sensor to increase the likelihood that the object detection module detects the object using data from the second object sensor.

In some embodiments, the object detection module may be part of a first sensor module that includes the first object sensor. In some such embodiments, the first sensor module may also include a first tracking module configured to receive processed data from the object detection module to track remote objects identified by the object detection module.

Some embodiments may also include a second sensor module. The second object sensor may be part of the second sensor module. The second sensor module may also include a second object detection module configured to receive data from the second object sensor. The second object detection module may be configured to identify remote objects from within the host vehicle using data from the second object sensor. In some embodiments, the second sensor module may also include a second tracking module configured to receive processed data from the second object detection module to track remote objects identified by the second object detection module.

Some embodiments may also include a fusion tracking module configured to receive data from both the first tracking module and the second tracking module, to track objects detected by both the first tracking module and the second tracking module.

In some embodiments, the system may be configured to lower at least one of a detection threshold and a tracker threshold associated with the second object sensor when the object detection module detects an object using the first set of sensing data from the first object sensor, to increase the likelihood that the object is detected using data from the second object sensor. Similarly, some embodiments may be configured to lower at least one of a detection threshold and a tracker threshold associated with the first object sensor when the object detection module detects an object using the second set of sensing data from the second object sensor, to increase the likelihood that the object is detected using data from the first object sensor.

In some embodiments, the system may be configured, when the object detection module detects an object using the first set of sensing data from the first object sensor, to lower at least one of the detection threshold and the tracker threshold associated with the second object sensor only within an angular range associated with the detected object.

In another example of a system for remotely detecting objects from within a host vehicle, the system may include a first object sensor configured to receive a first set of sensing data and a second object sensor configured to receive a second set of sensing data. Again, the first object sensor may be a different type of object sensor than the second object sensor. Thus, in some embodiments, the first object sensor may comprise a camera and the second object sensor may comprise a radar sensor.

The system may also include a first object detection module configured to receive raw data from the first object sensor. The first object detection module may be configured to attempt to identify remote objects using the raw data from the first object sensor. The system may also include a second object detection module configured to receive raw data from the second object sensor. The second object detection module may be configured to attempt to identify remote objects using the raw data from the second object sensor.

The system may also include a first tracking module configured to receive processed data from the first object detection module to track remote objects identified by the first object detection module, and a second tracking module configured to receive processed data from the second object detection module to track remote objects identified by the second object detection module.

The system may also include a fusion tracking module configured to receive data from both the first tracking module and the second tracking module, to track objects detected by both the first tracking module and the second tracking module.

The system for remote detection may be configured, upon confirming that at least one of the first object detection module and the first tracking module has detected an object, to lower at least one of a detection threshold of the second object detection module and a tracker threshold of the second tracking module, to increase the likelihood that the object is detected using data from the second object sensor.

In some embodiments, the system may be further configured, upon confirming that at least one of the first object detection module and the first tracking module has detected an object, to lower at least one of a detection threshold and a tracker threshold associated with the radar sensor, to increase the likelihood that the object is detected using data from the radar sensor.

In some embodiments, the system for remote detection may be configured, when at least one of the first object detection module and the first tracking module detects an object, to lower at least one of the detection threshold and the tracker threshold associated with the radar sensor only within a region associated with the object detected by the camera/first sensor (such as at an angle, or within an angular range, associated with the object detected by the camera).

Features, structures, steps, or characteristics disclosed herein in connection with one embodiment may be combined in any suitable manner in one or more alternative embodiments.

Brief Description of the Drawings

Non-limiting and non-exhaustive embodiments of the present disclosure, including various embodiments of the disclosure, are described with reference to the accompanying drawings, in which:

FIG. 1 is a schematic diagram depicting an example of a system for remotely detecting objects from within a host vehicle, according to some embodiments;

FIG. 2 depicts a vehicle detecting an object, the vehicle having a system for remotely detecting objects from within a host vehicle, according to some embodiments;

FIG. 3 depicts another vehicle having a system for remotely detecting objects, according to other embodiments; and

FIG. 4 is a flowchart depicting an example of a method for remotely detecting objects from within a host vehicle, according to some implementations.

Detailed Description

A detailed description of apparatus, systems, and methods consistent with various embodiments of the present disclosure is provided below. While several embodiments are described, it should be understood that this disclosure is not limited to any particular embodiment disclosed, but instead encompasses numerous alternatives, modifications, and equivalents. Additionally, although numerous specific details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed herein, some embodiments may be practiced without some or all of these details. Moreover, for purposes of clarity, certain technical material that is known in the related art has not been described in detail, to avoid unnecessarily obscuring the present disclosure.

Disclosed herein are apparatus, methods, and systems relating to new methods and systems for combining sensor data in remote object detection. In some embodiments, the inventive principles disclosed herein may be used to allow two or more sensors to be combined in such a way that each sensor is a black box and does not require extensive modification/customization to fuse its data. This may be accomplished by using a trigger from one sensor to prompt a change in the operating mode of one or more other sensors. Thus, for example, a radar sensor that detects an object may be used as a trigger to temporarily change a camera's operating mode so as to increase the likelihood that the camera will also detect the object. Similarly, in some embodiments, a camera or other sensor that detects an object may be used as a trigger to temporarily change the operating mode of a radar sensor or other second sensor, so as to increase the likelihood that the radar sensor or other second sensor will detect the object. Compared with conventional object-level fusion, this may reduce the likelihood that one sensor misses an object.

As a more specific example of an embodiment in which a visible-light sensor/camera may cue a radar sensor: in some systems, a camera may require about six frames to declare a target. However, the system may be configured so that the camera can declare a target after fewer frames (e.g., in some cases, after a single frame), but with a lower figure of merit. Thus, some systems according to certain embodiments utilizing the principles disclosed herein may be configured to use a lower figure-of-merit threshold to cue the radar sensor. By the time the radar sensor has formed a track, the camera may have improved the figure of merit of its track. The fusion tracking module may then be configured to ignore the lower-quality initial object data from the camera until a higher-quality track is obtained from the camera. In this way, the remote object detection system can be configured to utilize a lower detection threshold with the camera or another first sensor, and to use a special operating mode to cue the radar sensor to detected objects more quickly. This can allow greater certainty about potential objects/targets before a track is confirmed and/or data is sent to the fusion tracker or tracking module.
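
The figure-of-merit cueing scheme described above can be illustrated as follows. The thresholds, the figure-of-merit formula, and all names are illustrative assumptions; only the six-frame declaration count comes from the text:

```python
CAMERA_DECLARE_FRAMES = 6   # frames normally needed to declare a target
CUE_FOM_THRESHOLD = 0.15    # lower figure-of-merit bar used only for cueing the radar
FUSION_FOM_THRESHOLD = 0.7  # quality bar before fusion uses the camera track

class RadarStub:
    """Stand-in for the radar sensor module's cueing interface."""
    def __init__(self):
        self.cued = False
    def cue(self):
        self.cued = True

def camera_figure_of_merit(consistent_frames):
    """Toy figure of merit that grows with the number of consistent frames."""
    return min(1.0, consistent_frames / CAMERA_DECLARE_FRAMES)

def process_camera_frames(consistent_frames, radar):
    """Cue the radar early, but only feed fusion once the track is high quality."""
    fom = camera_figure_of_merit(consistent_frames)
    if fom >= CUE_FOM_THRESHOLD:
        radar.cue()                          # cue radar after as little as one frame
    send_to_fusion = fom >= FUSION_FOM_THRESHOLD
    return fom, send_to_fusion
```

Note the two separate bars: a low one that merely cues the radar, and a high one that gates what the fusion tracking module actually consumes, mirroring the "ignore lower-quality initial data" behavior described above.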

Embodiments of the present disclosure are best understood by referring to the drawings, in which like parts are designated by like numerals. It will be readily understood that the components of the disclosed embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a variety of different configurations. Accordingly, the following detailed description of embodiments of the apparatus and methods of the present disclosure is not intended to limit the scope of the claimed disclosure, but merely represents possible embodiments of the disclosure. In addition, unless otherwise stated, the steps of a method need not be performed in any particular order, or even sequentially, nor need the steps be performed only once. Additional details regarding certain preferred embodiments and implementations will now be described with reference to the accompanying drawings.

FIG. 1 depicts a schematic diagram of a system 100 for remotely detecting objects from within a host vehicle. As depicted in the figure, the system 100 includes a first object sensor module 110 that includes a first sensor 112. The first object sensor module 110 also includes a first object detection submodule 120 configured to receive a first set of raw sensing data from the first sensor 112. The first object sensor module 110 also includes a first tracking submodule 130 configured to receive processed data from the first object detection module 120, to track remote objects identified by the first object detection module 120 using the raw data from the first sensor 112.

Similarly, the system 100 includes a second object sensor module 150 that includes a second sensor 152. The second object sensor module 150 also includes a second object detection submodule 160 configured to receive a second set of raw sensing data from the second sensor 152. The second object sensor module 150 also includes a second tracking submodule 170 configured to receive processed data from the second object detection submodule 160, to track remote objects identified by the second object sensor module 150 using the raw data from the second sensor 152.

One or both of the object sensor modules 110 and 150 may be configured to detect and/or track one or more sets of data representing identified objects (such as objects 115 and 155, respectively). These objects from modules 110 and 150 may then be combined using a fusion tracking module 180, which may be configured to receive data from the first tracking module 130 (or otherwise from the first object sensor module 110) and from the second tracking module 170 (or otherwise from the second object sensor module 150), to track one or more objects detected by both the first sensor module 110 and the second sensor module 150. The fusion tracking module 180 may then be configured to fuse these tracked objects and process the combined data to arrive at one or more fused objects 185. The system 100 is thus an example of a modified version of an object-level fusion system.
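
The overall structure of FIG. 1 (two black-box sensor chains feeding a fusion tracker, with a threshold that a cue from the other chain can relax) might be sketched as follows. This is a structural sketch only; the class names, data layout, and threshold values are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class SensorModule:
    """One black-box sensor chain: sensor -> detection submodule -> tracking submodule."""
    name: str
    snr_threshold_db: float = 13.0
    tracks: list = field(default_factory=list)

    def process(self, detections):
        """Keep detections above the current threshold and promote them to tracks."""
        self.tracks = [d for d in detections if d["snr_db"] >= self.snr_threshold_db]
        return self.tracks

@dataclass
class FusionTracker:
    """Object-level fusion: keeps only objects tracked by both sensor modules."""
    def fuse(self, tracks_a, tracks_b):
        ids_b = {t["id"] for t in tracks_b}
        return [t for t in tracks_a if t["id"] in ids_b]
```

A usage sketch shows the problem the disclosure addresses: with plain object-level fusion, a marginal radar return below threshold yields no fused object, whereas a cue from the camera module that relaxes the radar threshold recovers it.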

However, unlike known object-level fusion systems, system 100 is configured to change an operating parameter of one of the sensor modules 110 and 150 in response to the other of the sensor modules 110 and 150 identifying an object. Thus, as indicated by the arrows in FIG. 1, upon detection of an object 115 by the first sensor module 110, system 100 may be configured to change one or more operating parameters of the second sensor module 150, such as of one or both of the detection module 160 and the tracking module 170.

Accordingly, in some embodiments, system 100 may be configured to lower the detection threshold of the second detection module 160 and the tracker threshold of the second tracking module 170 in order to increase the likelihood that an object detected by sensor 112 and/or sensor module 110 will also be detected using data from the second object sensor 152 and/or the second sensor module 150.

To provide a more specific example, in some embodiments, sensors 112 and 152 may differ from each other. For example, sensor 112 may comprise a camera and sensor 152 may comprise a radar sensor. In some such embodiments, system 100 may be configured, upon confirming that camera 112 and/or module 110 has detected an object, to lower at least one of a detection threshold and a tracker threshold associated with radar sensor 152 and/or module 150 in order to increase the likelihood that an object detected by camera 112 is also detected using data from radar sensor 152.

Thus, as a more specific example, if a detection module is configured to apply a detection threshold based on signal-to-noise ratio ("SNR") to establish object detections, this SNR threshold may be temporarily reduced after the detection module of another sensor has detected an object. Similarly, in some embodiments, a tracking threshold reduction may be applied instead of, or in addition to, a detection threshold reduction. Thus, for example, if the tracking module is configured to apply a threshold for establishing a track based on detecting an object in four consecutive cycles (a 4/4 tracker), this threshold may be temporarily lowered to, for example, a 3/3 tracker. In some embodiments, the change or changes may be temporary and may involve additional thresholds, such as time thresholds. Such thresholds may be varied as needed according to the desired error rate of the system.
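The cueing behavior described above can be sketched in a few lines. The class name, the `cue()` interface, and the specific threshold values are illustrative assumptions; the patent itself does not prescribe an implementation.

```python
class CuedDetector:
    """Detection/tracking thresholds that relax temporarily when cued
    by another sensor module (values are illustrative)."""

    def __init__(self, snr_threshold_db=18.0, track_m=4, track_n=4):
        self.base_snr_db = snr_threshold_db   # normal SNR detection threshold
        self.base_track = (track_m, track_n)  # e.g. a 4/4 tracker
        self.cued = False

    def cue(self):
        """Called when the other sensor module reports a detection."""
        self.cued = True

    def uncue(self):
        """Restore the original settings."""
        self.cued = False

    @property
    def snr_threshold_db(self):
        # Lower the detection threshold while cued.
        return 15.0 if self.cued else self.base_snr_db

    @property
    def track_threshold(self):
        # Relax the M-of-N track-confirmation rule, e.g. 4/4 -> 3/3.
        return (3, 3) if self.cued else self.base_track

    def detects(self, snr_db):
        return snr_db >= self.snr_threshold_db
```

With this sketch, a 16 dB return is rejected under the normal 18 dB threshold but accepted once the module has been cued and the threshold drops to 15 dB.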

In some embodiments, both sensor modules 110/150 may be configured to change the settings of the other module/sensor upon detecting an object. Thus, with respect to the embodiment depicted in FIG. 1, module 150 may also, or alternatively, be configured to change one or more operating parameters of module 110 upon detecting an object in order to increase the likelihood that module 110 will detect and/or track the same object.

In some embodiments, one or both of the sensor modules 110/150 may be configured to apply a setting change to the other module that is applied only at a particular angle or within a particular range of angles. Thus, for example, if the camera detects an object at an angle of thirty degrees relative to the host vehicle in a given direction, the radar sensor may be configured to lower its detection threshold at that angle, or within a desired range about that angle, for a predetermined amount of time. For example, if the detection threshold is 18 dB SNR, the threshold may be lowered to 15 dB SNR at or about 30 degrees for 100 ms. Additionally or alternatively, the radar may lower a tracking threshold; for example, the tracking threshold may be temporarily lowered from 4/4 to 3/3. In some embodiments, one or more settings of the non-detecting sensor/module may be adjusted only at the angle, or within the range of angles, associated with the detected object.
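The angle- and time-limited threshold reduction can be sketched as follows, using the example values from the text (18 dB lowered to 15 dB for 100 ms near 30 degrees). The class structure, the 5-degree half-width, and the use of a monotonic clock are assumptions for illustration.

```python
import time


class AngleGatedCue:
    """Cue that lowers a detection threshold only near a given bearing,
    and only for a limited time window (all values illustrative)."""

    def __init__(self, base_snr_db=18.0, cued_snr_db=15.0,
                 half_width_deg=5.0, duration_s=0.100):
        self.base_snr_db = base_snr_db
        self.cued_snr_db = cued_snr_db
        self.half_width_deg = half_width_deg
        self.duration_s = duration_s
        self._cues = []  # list of (bearing_deg, expiry_time)

    def cue(self, bearing_deg, now=None):
        """Register a cue at a bearing; it expires after duration_s."""
        now = time.monotonic() if now is None else now
        self._cues.append((bearing_deg, now + self.duration_s))

    def threshold_at(self, bearing_deg, now=None):
        """Return the threshold in effect at a bearing right now."""
        now = time.monotonic() if now is None else now
        self._cues = [c for c in self._cues if c[1] > now]  # drop expired
        for cue_deg, _ in self._cues:
            if abs(bearing_deg - cue_deg) <= self.half_width_deg:
                return self.cued_snr_db
        return self.base_snr_db
```

A cue placed at 30 degrees lowers the threshold only near that bearing, and the cue expires on its own once the 100 ms window has passed; elsewhere the radar continues to apply its normal threshold.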

FIG. 2 depicts a host vehicle 200 including a system for remotely detecting objects, according to some embodiments. As depicted in this figure, and as discussed above, vehicle 200 may include two separate sensors and two corresponding sensor modules. Thus, the first sensor module 210 may include a radar sensor 212, and the second sensor module 250 may include a camera 252. As discussed above, upon detection of an object, such as a pedestrian 10, by one of the radar sensor 212 and the camera 252, the system may be configured to temporarily change one or more settings of the non-detecting sensor/module in order to increase the likelihood that the non-detecting sensor/module also detects the pedestrian 10.

FIG. 3 depicts another vehicle 300 including a system 305 for remotely detecting objects from within the vehicle 300, according to some embodiments. System 305 includes a first object sensor module 310. The first object sensor module 310 includes a first sensor 312, a detection module 320, and a tracking module 330. Similarly, system 305 includes a second object sensor module 350, which includes a second sensor 352, a second object detection module 360, and a second tracking module 370.

The detection modules 320/360 are configured to receive raw sensing data from sensors 312/352, respectively, and to attempt to identify/detect remote objects using such data. Similarly, the tracking modules 330/370 are configured to receive processed data from detection modules 320/360, respectively, in order to track the remote objects that the detection modules 320/360 have identified using the raw data from sensors 312/352.
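The per-module pipeline described above (raw sensing data into a detection module, whose output feeds a tracking module) can be sketched structurally as follows. The SNR test and the M-of-N confirmation rule below are simple stand-ins, assumed for illustration, for whatever detection and tracking algorithms a real module would use.

```python
class SensorModule:
    """One sensor module: raw data -> detection module -> tracking module."""

    def __init__(self, snr_threshold_db=18.0, m_required=4, n_window=4):
        self.snr_threshold_db = snr_threshold_db
        self.m_required = m_required      # detections required...
        self.n_window = n_window          # ...within this many cycles
        self._history = []                # recent per-cycle detection flags

    def detect(self, raw_snr_db):
        """Detection module: threshold the raw sensor data."""
        return raw_snr_db >= self.snr_threshold_db

    def update(self, raw_snr_db):
        """Run one cycle; return True once a track is confirmed."""
        self._history.append(self.detect(raw_snr_db))
        self._history = self._history[-self.n_window:]
        return sum(self._history) >= self.m_required
```

With the default 4/4 rule, a track is confirmed only after four consecutive cycles of above-threshold returns.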

System 305 also includes a controller 340, which may be configured to process data from sensors 312/352. As used herein, the term "controller" refers to a hardware device that includes a processor and preferably also a memory element. The memory may be configured to store one or more of the modules referenced herein, and the controller 340 and/or processor may be configured to execute such modules in order to carry out one or more of the processes described herein.

As used herein, a software module or component may include any type of computer instructions or computer-executable code residing in a memory device and/or machine-readable storage medium. A software module may, for example, comprise one or more physical or logical blocks of computer instructions, which may be organized as routines, programs, objects, components, data structures, etc., that perform one or more tasks or implement particular abstract data types.

In certain embodiments, a particular software module may comprise disparate instructions stored in different locations of a memory device, which together implement the described functionality of the module. Indeed, a module may comprise a single instruction or many instructions, and may be distributed over several different code segments, among different programs, and across several memory devices. Some embodiments may be practiced in a distributed computing environment where tasks are performed by remote processing devices linked through a communications network. In a distributed computing environment, software modules may be located in local and/or remote memory storage devices. In addition, data being tied or rendered together in a database record may be resident in the same memory device or across several memory devices, and may be linked together in fields of a record in a database across a network.

Furthermore, the embodiments and implementations of the inventions disclosed herein may include various steps, which may be embodied in machine-executable instructions to be executed by a general-purpose or special-purpose computer (or other electronic device). Alternatively, the steps may be performed by hardware components that include specific logic for performing the steps, or by a combination of hardware, software, and/or firmware.

Embodiments and/or implementations may also be provided as a computer program product including a machine-readable storage medium having stored thereon instructions that may be used to program a computer (or other electronic device) to perform the processes described herein. The machine-readable storage medium may include, but is not limited to: hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, solid-state memory devices, or other types of media/machine-readable media suitable for storing electronic instructions. Memory and/or a data store may also be provided, which in some cases may comprise a non-transitory machine-readable storage medium containing executable program instructions configured for execution by a processor, controller/control unit, or the like.

System 305 also includes a fusion tracking module 380, which may be configured to receive data from modules 310 and 350 and to fuse data from detected and/or tracked objects by processing the combined data. The controller 340 may be configured to process data from the fusion tracking module 380. Of course, as those of ordinary skill in the art will appreciate, any suitable number of controllers and/or processors may be used as desired.

FIG. 4 is a flow chart depicting an example of a method 400 for detecting remote objects from within a host vehicle, according to some implementations. As shown in this figure, method 400 may begin at step 405, at which point a first set of sensing data may be generated and/or received using a first object sensor, such as a radar sensor, a LIDAR sensor, a camera, or the like. Similarly, at step 410, a second set of sensing data may be generated and/or received using a second object sensor, which in some implementations may differ from the first object sensor. In some implementations, steps 405 and/or 410 may comprise receiving raw data from one or both of these sensors.

Step 415 may comprise detecting one or more objects using one or both of the object sensors. In some implementations, step 415 may be performed using, for example, a detection module, such as detection modules 120/160 or 320/360. Upon detecting an object using data from a given sensor, method 400 may then comprise, at 420, changing one or more settings of the sensor and/or module(s) associated with the non-detecting sensor. Thus, for example, if the first object sensor has detected an object, step 420 may comprise changing operational parameters and/or settings of the second object sensor, or of one or more modules associated with the second object sensor, in order to increase the likelihood that the object will also be detected by the second object sensor. In preferred embodiments and implementations, step 420 may comprise temporarily changing such parameters and/or settings. The settings/parameters may then be reset to their original states after, for example, a predetermined period of time, or after the object is no longer being detected and/or tracked.

In some implementations, step 420 may comprise changing a threshold of the non-detecting object sensor. In some such implementations, step 420 may comprise lowering at least one of a detection threshold and a tracking threshold of, or otherwise associated with, the non-detecting object sensor. In some such implementations, step 420 may comprise lowering at least one of a signal-to-noise ratio threshold of the non-detecting sensor and a tracking threshold used to initiate or maintain tracking of an object by the non-detecting sensor.

At step 425, a determination may be made as to whether, with the changed settings/parameters, the non-detecting sensor (the sensor that did not initially detect the object) has detected the object detected by the detecting sensor. If not, the previous settings of the non-detecting sensor and/or associated module(s) may be restored at 430. Of course, as previously mentioned, in some implementations a time limit may instead be imposed on maintaining the lowered threshold settings, if desired. If the non-detecting sensor does detect the object detected by the detecting sensor, the data from the two sensors may be fused at 435 using, for example, a fusion tracking module or the like. As another alternative, in some implementations, step 425 may comprise imposing a time limit for the non-detecting sensor to detect the object detected by the detecting sensor. If this time limit expires, method 400 may proceed to step 430, and the one or more temporary setting changes may be reset. If the object is detected within the time limit, the objects detected by the two sensors may be fused at step 435 by a fusion tracker or the like.
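The flow of steps 415 through 435 can be condensed into a short sketch. The function below is hypothetical: the frame-based timing, the threshold values, and the return labels are assumptions used to mirror the decision points of FIG. 4, not the patented implementation.

```python
def cued_detection_step(first_sensor_detected, second_snr_frames,
                        base_thr_db=18.0, cued_thr_db=15.0,
                        time_limit_frames=10):
    """Sketch of steps 415-435 of method 400 (illustrative values only).

    Returns "fused" if the second sensor confirms the object while the
    lowered threshold is in effect, or "reverted" once the time limit
    expires and the original settings are restored.
    """
    if not first_sensor_detected:
        return "no-cue"                       # step 415: nothing to cue on
    threshold_db = cued_thr_db                # step 420: lower the threshold
    for frame, snr_db in enumerate(second_snr_frames):
        if frame >= time_limit_frames:
            break                             # time limit expired
        if snr_db >= threshold_db:            # step 425: confirmed?
            return "fused"                    # step 435: hand off to fusion
    threshold_db = base_thr_db                # step 430: restore settings
    return "reverted"
```

A marginal 16 dB return on the second sensor, which the normal 18 dB threshold would reject, is accepted while the cue is active; if no frame clears the lowered threshold before the time limit, the settings revert.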

The foregoing specification has been described with reference to various embodiments and implementations. However, those of ordinary skill in the art will appreciate that various modifications and changes may be made without departing from the scope of the present disclosure. For example, the various operational steps, as well as the components for carrying out the operational steps, may be implemented in various ways depending upon the particular application or in consideration of any number of cost functions associated with the operation of the system. Accordingly, any one or more of the steps may be deleted, modified, or combined with other steps. Further, this disclosure is to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within its scope. Likewise, benefits, other advantages, and solutions to problems have been described above with regard to various embodiments. However, benefits, advantages, solutions to problems, and any element or elements that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements.

Those having skill in the art will appreciate that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. The scope of the present invention should, therefore, be determined only by the following claims.

Claims (15)

1. A method for remotely detecting an object from within a host vehicle, the method comprising the steps of:
generating a first set of sensed data using a first object sensor;
generating a second set of sensed data using a second object sensor;
detecting an object using the sensing data from the first object sensor;
when the object is detected using the sensing data from the first object sensor, changing a setting associated with the second object sensor to increase a likelihood of detecting the object using the second set of sensing data from the second object sensor, wherein the step of changing the setting associated with the second object sensor to increase the likelihood of detecting the object using the second set of sensing data from the second object sensor comprises: reducing at least one of a detection threshold and a tracking threshold associated with the second object sensor only within an angular range associated with the detected object to increase the likelihood of detecting the object using the second set of sensed data from the second object sensor.
2. The method of claim 1, wherein the first object sensor comprises a different type of object sensor than the second object sensor.
3. The method of claim 2, wherein one of the first object sensor and the second object sensor comprises a radar sensor, and wherein the other of the first object sensor and the second object sensor comprises a camera.
4. The method of claim 1, wherein the step of changing settings associated with the second object sensor to increase a likelihood of detecting the object using the second set of sensed data from the second object sensor comprises: lowering at least one of a detection threshold of the detection module and a tracking threshold of the tracking module.
5. The method of claim 4, wherein the step of changing settings associated with the second object sensor to increase a likelihood of detecting the object using the second set of sensed data from the second object sensor comprises: changing at least one of a signal-to-noise ratio of the detection module and a threshold of the tracking module for initiating or maintaining tracking of an object detected by the detection module.
6. The method of claim 1, further comprising: upon detecting an object using the sensing data from the second object sensor, changing a setting associated with the first object sensor to increase a likelihood of detecting the object using the first set of sensing data from the first object sensor.
7. A system for remotely detecting an object from within a host vehicle, comprising:
a first object sensor configured to receive a first set of sensed data;
a second object sensor configured to receive a second set of sensed data;
an object detection module configured to receive data from the first object sensor, wherein the object detection module is configured to use sensed data from the first object sensor and the second object sensor to identify a remote object from within the host vehicle, wherein the system for remote detection is configured to: when the object detection module detects an object using the first set of sensed data from the first object sensor, change a setting associated with the second object sensor to increase a likelihood that the object detection module detects the object using the sensed data from the second object sensor, and wherein the system for remote detection is configured to: when the object detection module detects an object using the first set of sensed data from the first object sensor, reduce at least one of a detection threshold and a tracker threshold associated with the second object sensor only within an angular range associated with the detected object to increase a likelihood of detecting the object using sensed data from the second object sensor.
8. The system of claim 7, wherein the object detection module is part of a first sensor module that includes the first object sensor.
9. The system of claim 8, wherein the first sensor module further comprises a first tracking module configured to receive processed data from the object detection module to track a remote object identified by the object detection module.
10. The system of claim 9, further comprising a second sensor module, wherein the second sensor module comprises the second object sensor, wherein the second sensor module further comprises a second object detection module configured to receive data from the second object sensor, wherein the second object detection module is configured to use the data from the second object sensor to identify a remote object from within the host vehicle, and wherein the second sensor module further comprises a second tracking module configured to receive processed data from the second object detection module to track the remote object identified by the second object detection module.
11. The system of claim 10, further comprising a fusion tracking module configured to receive data from both the first tracking module and the second tracking module to track objects detected by both the first tracking module and the second tracking module.
12. The system of claim 7, wherein the first object sensor comprises a camera, and wherein the second object sensor comprises a radar sensor.
13. The system of claim 7, wherein the system for remote detection is further configured to: when the object detection module detects an object using the second set of sensed data from the second object sensor, change a setting associated with the first object sensor to increase a likelihood that the object detection module detects the object using data from the first object sensor.
14. A system for remotely detecting an object from within a host vehicle, comprising:
a first object sensor configured to receive a first set of sensed data;
a second object sensor configured to receive a second set of sensed data, wherein the first object sensor comprises a different type of object sensor than the second object sensor;
a first object detection module configured to receive raw data from the first object sensor, wherein the first object detection module is configured to attempt to identify a remote object using the raw data from the first object sensor;
a second object detection module configured to receive raw data from the second object sensor, wherein the second object detection module is configured to attempt to identify a remote object using the raw data from the second object sensor;
a first tracking module configured to receive processed data from the first object detection module to track a remote object identified by the first object detection module;
a second tracking module configured to receive processed data from the second object detection module to track the remote object identified by the second object detection module; and
a fusion tracking module configured to receive data from both the first tracking module and the second tracking module to track objects detected by both the first tracking module and the second tracking module,
wherein the system for remote detection is configured to: upon confirming that at least one of the first object detection module and the first tracking module detects an object, lower at least one of a detection threshold of the second object detection module and a tracker threshold of the second tracking module only within an angular range associated with the detected object to increase a likelihood of detecting the object using data from the second object sensor.
15. The system of claim 14, wherein the first object sensor comprises a camera, and wherein the second object sensor comprises a radar sensor.
CN201880064466.4A 2017-11-06 2018-11-05 Prompted automotive sensor fusion Active CN111164456B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15/804,821 US10641888B2 (en) 2017-11-06 2017-11-06 Cued automobile sensor fusion
US15/804,821 2017-11-06
PCT/US2018/059289 WO2019090276A1 (en) 2017-11-06 2018-11-05 Cued automobile sensor fusion

Publications (2)

Publication Number Publication Date
CN111164456A CN111164456A (en) 2020-05-15
CN111164456B true CN111164456B (en) 2023-08-01

Family

ID=66327000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880064466.4A Active CN111164456B (en) 2017-11-06 2018-11-05 Prompted automotive sensor fusion

Country Status (5)

Country Link
US (1) US10641888B2 (en)
EP (1) EP3707527A4 (en)
JP (1) JP6946570B2 (en)
CN (1) CN111164456B (en)
WO (1) WO2019090276A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020009060A1 (en) * 2018-07-02 2020-01-09 Sony Semiconductor Solutions Corporation Information processing device, information processing method, computer program, and moving body device
JP7185547B2 (en) * 2019-02-07 2022-12-07 Denso Corporation Vehicle detector
DE102019006243A1 (en) * 2019-09-04 2021-03-04 Man Truck & Bus Se Method for operating a turning assistance system, turning assistance system and motor vehicle with such a turning assistance system
CN111025959B (en) * 2019-11-20 2021-10-01 Huawei Technologies Co., Ltd. A data management method, device, equipment and smart car
US11593597B2 (en) * 2020-11-16 2023-02-28 GM Global Technology Operations LLC Object detection in vehicles using cross-modality sensors
US12493118B2 (en) * 2022-06-20 2025-12-09 Honeywell International Inc. Integrated surveillance radar system
CN116824322A * 2023-06-30 2023-09-29 Lenovo (Beijing) Co., Ltd. A data fusion method, device and computer equipment


Family Cites Families (24)

Publication number Priority date Publication date Assignee Title
US7596328B2 (en) * 2003-04-22 2009-09-29 Hewlett-Packard Development Company, L.P. Efficient sensing system
DE10339645A1 (en) 2003-08-28 2005-04-14 Robert Bosch Gmbh Method and device for determining the size and position of a parking space
US7483551B2 (en) 2004-02-24 2009-01-27 Lockheed Martin Corporation Method and system for improved unresolved target detection using multiple frame association
JP4396400B2 (en) 2004-06-02 2010-01-13 Toyota Motor Corporation Obstacle recognition device
JP2007232653A (en) * 2006-03-02 2007-09-13 Alpine Electronics, Inc. On-vehicle radar system
JP4211809B2 (en) 2006-06-30 2009-01-21 Toyota Motor Corporation Object detection device
JP5101075B2 (en) * 2006-10-11 2012-12-19 Alpine Electronics, Inc. Perimeter monitoring system
WO2010042483A1 (en) 2008-10-08 2010-04-15 Delphi Technologies, Inc. Integrated radar-camera sensor
US8416123B1 (en) 2010-01-06 2013-04-09 Mark Resources, Inc. Radar system for continuous tracking of multiple objects
US8775064B2 (en) * 2011-05-10 2014-07-08 GM Global Technology Operations LLC Sensor alignment process and tools for active safety vehicle applications
US9429650B2 (en) 2012-08-01 2016-08-30 Gm Global Technology Operations Fusion of obstacle detection using radar and camera
JP5835243B2 (en) * 2013-02-07 2015-12-24 Denso Corporation Target recognition device
SE537621C2 (en) * 2013-09-10 2015-08-11 Scania Cv Ab Detection of objects using a 3D camera and a radar
JP5991332B2 (en) * 2014-02-05 2016-09-14 Toyota Motor Corporation Collision avoidance control device
US9563808B2 (en) 2015-01-14 2017-02-07 GM Global Technology Operations LLC Target grouping techniques for object fusion
US9599706B2 (en) 2015-04-06 2017-03-21 GM Global Technology Operations LLC Fusion method for cross traffic application using radars and camera
EP3158412B8 (en) * 2015-05-23 2019-05-22 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
JP6645074B2 (en) * 2015-08-25 2020-02-12 IHI Corporation Obstacle detection system
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
US20170242117A1 (en) * 2016-02-19 2017-08-24 Delphi Technologies, Inc. Vision algorithm performance using low level sensor fusion
US10317522B2 (en) * 2016-03-01 2019-06-11 GM Global Technology Operations LLC Detecting long objects by sensor fusion
US10445928B2 (en) * 2017-02-11 2019-10-15 Vayavision Ltd. Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
US10602242B2 (en) * 2017-06-14 2020-03-24 GM Global Technology Operations LLC Apparatus, method and system for multi-mode fusion processing of data of multiple different formats sensed from heterogeneous devices
US10788583B2 (en) * 2017-10-16 2020-09-29 Raytheon Company Rapid robust detection decreaser

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN103823208A (en) * 2012-11-16 2014-05-28 通用汽车环球科技运作有限责任公司 Method and apparatus for state of health estimation of object sensing fusion system
DE102012024334A1 (en) * 2012-12-13 2013-08-01 Daimler Ag Method for determining existence and e.g. position of objects for operating driver assistance device of motor car, involves performing retrodiction of existence probabilities of objects during supply of unactual sensor data to fusion system
DE102014208006A1 (en) * 2014-04-29 2015-11-26 Bayerische Motoren Werke Aktiengesellschaft Method for detecting the surroundings of a vehicle
DE102016201814A1 (en) * 2016-02-05 2017-08-10 Bayerische Motoren Werke Aktiengesellschaft Method and device for sensory environment detection in a vehicle

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Fusion of Velodyne and camera data for scene parsing; Zhao G et al.; 2012 15th International Conference on Information Fusion, IEEE; 1-8 *
Obstacle detection based on millimeter-wave radar and machine vision information fusion; Zhai Guangyao; Chen Rong; Zhang Jianfeng; Zhang Jiguang; Wu Cheng; Wang Yiming; Chinese Journal on Internet of Things (Issue 02); 80-87 *
A survey of autonomous visual navigation methods; Huang Xianlin et al.; Journal of Jilin University (Information Science Edition); Vol. 28 (No. 2); 158-162 *

Also Published As

Publication number Publication date
CN111164456A (en) 2020-05-15
JP2021501344A (en) 2021-01-14
US10641888B2 (en) 2020-05-05
EP3707527A4 (en) 2021-08-25
JP6946570B2 (en) 2021-10-06
WO2019090276A1 (en) 2019-05-09
EP3707527A1 (en) 2020-09-16
US20190137619A1 (en) 2019-05-09

Similar Documents

Publication Publication Date Title
CN111164456B (en) Prompted automotive sensor fusion
JP2021501344A5 (en)
US10600197B2 (en) Electronic device and method for recognizing object by using plurality of sensors
US10571913B2 (en) Operation-security system for an automated vehicle
US8554462B2 (en) Unmanned aerial vehicle and method for controlling the unmanned aerial vehicle
US10586102B2 (en) Systems and methods for object tracking
US20190285726A1 (en) Malfunction detecting device
US8878696B2 (en) Parking control apparatus and method for providing an alarm thereof
WO2018068771A1 (en) Target tracking method and system, electronic device, and computer storage medium
JP2018510398A (en) Event-related data monitoring system
JP2014137288A (en) Device and method for monitoring surroundings of vehicle
JP5904069B2 (en) Image processing apparatus, object detection method, and object detection program
US20170357850A1 (en) Human detection apparatus and method using low-resolution two-dimensional (2d) light detection and ranging (lidar) sensor
WO2014031056A3 (en) Surveillance system with motion detection and suppression of alarms in non-alarm areas
KR101767237B1 (en) Apparatus and method for detecting object
EP3205084A1 (en) Image processing method
US12093348B2 (en) Systems and methods for bayesian likelihood estimation of fused objects
KR102535618B1 (en) Driver assistance system for detecting obstacle using fusion object sensing information and method thereof
US11209517B2 (en) Mobile body detection device, mobile body detection method, and mobile body detection program
US20200189569A1 (en) Driver verified self parking
US11341773B2 (en) Detection device and control method of the same
CN105518490A (en) Object detection method, device, remote control mobile device, and aircraft
EP4607463A1 (en) Object tracking method, smart device, and computer-readable storage medium
US10545230B2 (en) Augmented reality view activation
US11175148B2 (en) Systems and methods to accommodate state transitions in mapping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220831

Address after: Michigan, USA

Applicant after: Anzher Software Co.,Ltd.

Address before: Michigan, USA

Applicant before: Vennell America

GR01 Patent grant