
WO2018068771A1 - Target tracking system and method, electronic device and computer storage medium - Google Patents


Info

Publication number
WO2018068771A1
WO2018068771A1 (PCT/CN2017/110745, CN2017110745W)
Authority
WO
WIPO (PCT)
Prior art keywords
motion
tracking
electronic device
target
time period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/110745
Other languages
English (en)
Chinese (zh)
Inventor
卿明
陈鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ninebot Beijing Technology Co Ltd
Original Assignee
Ninebot Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ninebot Beijing Technology Co Ltd filed Critical Ninebot Beijing Technology Co Ltd
Publication of WO2018068771A1 publication Critical patent/WO2018068771A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/166Mechanical, construction or arrangement details of inertial navigation systems
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes

Definitions

  • This application claims priority to Chinese patent application No. 201611001901.8, filed on November 14, 2016, and to Chinese patent application No. 201610891388.8, filed on October 12, 2016.
  • The entire contents of these Chinese patent applications are hereby incorporated herein by reference.
  • the present invention relates to the field of target tracking, and in particular to a target tracking method, system, electronic device and computer storage medium.
  • target tracking is an indispensable part of most visual systems, and has applications in video surveillance, human-computer interaction, and military fields.
  • current target tracking technology suffers from reduced accuracy and stability when the target's motion mode, motion scene, or external features change suddenly, or when the shooting conditions change abruptly during tracking.
  • For example, vision-based tracking has poor stability during long-term tracking, and the target is easily lost when it suddenly accelerates.
  • In vision-based motion behavior analysis technology, errors easily accumulate in the tracking estimate during the tracking process, so a closed-loop adjustment cannot be formed.
  • Embodiments of the present invention are expected to provide a target tracking method, system, electronic device, and computer storage medium to solve at least the technical problem of low tracking accuracy in long-term visual tracking.
  • a target tracking method applied to a first electronic device, the method comprising: performing vision-based target tracking to obtain a first tracking result for a tracked target; and, according to motion information describing the motion state of the tracked target and in combination with the first tracking result, analyzing to obtain a second tracking result for the tracked target; wherein the second tracking result is the same as or different from the first tracking result, and the motion information describing the motion state of the tracked target is obtained based on measurement by a second electronic device disposed on the tracked target.
  • the motion information includes motion measurement parameters for the tracked target, the motion measurement parameters being recorded in a predetermined storage space of the second electronic device and including at least motion acceleration and/or motion angular velocity; the first electronic device obtains the motion information by periodically or aperiodically accessing the predetermined storage space of the second electronic device and acquiring the motion measurement parameters collected by the second electronic device within a predetermined time period; analyzing, according to the motion information describing the motion state of the tracked target and in combination with the first tracking result, to obtain the second tracking result includes: performing statistical analysis on the motion measurement parameters within the predetermined time period to obtain motion description information of the tracked target within the predetermined time period, and correcting the first tracking result according to the motion description information to obtain the second tracking result; wherein the motion description information includes at least a motion speed and/or a motion direction.
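As a rough illustration of the statistical analysis described above, the sketch below integrates buffered planar acceleration samples into a motion speed and motion direction. The sample format, the simple Euler integration, and the function name are illustrative assumptions, not part of the patent.

```python
import math

def motion_description(accel_samples, dt, v0=(0.0, 0.0)):
    """Reduce buffered (ax, ay) acceleration samples taken every dt seconds
    to a (average speed, final heading) motion description.
    Euler integration and the planar model are illustrative assumptions."""
    vx, vy = v0
    speeds, headings = [], []
    for ax, ay in accel_samples:
        vx += ax * dt                      # integrate acceleration to velocity
        vy += ay * dt
        speeds.append(math.hypot(vx, vy))  # instantaneous speed
        headings.append(math.atan2(vy, vx))
    avg_speed = sum(speeds) / len(speeds)  # motion speed over the period
    return avg_speed, headings[-1]         # heading at the end of the period
```

In practice the buffer would be the motion measurement parameters read from the second device's storage space over one predetermined time period.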
  • performing statistical analysis on the motion measurement parameters within the predetermined time period to obtain the motion description information of the tracked target within the predetermined time period includes: combining the motion measurement parameters of the second electronic device within the predetermined time period with the first electronic device's own motion measurement parameters within the same period, and analyzing to obtain the motion description information of the tracked target relative to the first electronic device within the predetermined time period.
  • the motion information includes motion description information for the tracked target.
  • the motion description information is obtained by the second electronic device through statistical analysis of the motion measurement parameters it collects within the predetermined time period, where the motion measurement parameters include at least motion acceleration and/or motion angular velocity, the motion description information includes at least a motion speed and/or a motion direction, and the motion description information is recorded in a predetermined storage space of the second electronic device; the first electronic device obtains the motion information by periodically or aperiodically accessing the predetermined storage space of the second electronic device and acquiring the motion description information of the second electronic device within the predetermined time period.
  • obtaining, according to the motion information describing the motion state of the tracked target and in combination with the first tracking result, the second tracking result for the tracked target includes: correcting the first tracking result according to the motion description information within the predetermined time period to obtain the second tracking result.
  • the motion description information includes state probabilities of the tracked target in different motion directions; correcting the first tracking result according to the motion description information within the predetermined time period to obtain the second tracking result includes: determining, according to the state probabilities of the tracked target in different motion directions and the at least one candidate tracking area in the tracking image represented by the first tracking result, a trusted tracking area of the tracked target in the tracking image, and generating the second tracking result based on the trusted tracking area; wherein a state probability indicates the probability that the tracked target moves in a given motion direction.
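Selecting a trusted tracking area from candidates using direction state probabilities could be sketched as follows. The candidate format, the four-direction scheme, and the multiplicative fusion of visual score and state probability are illustrative assumptions, not the patent's specified method.

```python
def pick_trusted_area(candidates, prev_center, direction_probs):
    """Pick the candidate area whose implied motion direction best matches
    the IMU-derived state probabilities. Each candidate is ((cx, cy),
    visual_score); the 4-way direction scheme is an assumed simplification."""
    def implied_direction(center):
        dx, dy = center[0] - prev_center[0], center[1] - prev_center[1]
        if abs(dx) >= abs(dy):
            return 'right' if dx >= 0 else 'left'
        return 'forward' if dy >= 0 else 'backward'

    best, best_score = None, -1.0
    for area in candidates:
        center, visual_score = area
        p = direction_probs.get(implied_direction(center), 0.0)
        score = visual_score * p  # fuse visual confidence with motion prior
        if score > best_score:
            best, best_score = area, score
    return best  # the trusted tracking area
```

Here a visually strong candidate can still lose to one that agrees with the measured motion direction, which is the corrective effect the bullet describes.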
  • the method further includes: when the motion description information within the predetermined time period indicates that the moving direction of the tracked target within that period is a first direction, the first tracking result within the same period indicates that the moving direction of the tracked target within that period is a second direction, the first direction is different from the second direction, and the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold, generating alarm prompt information, where the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
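A minimal sketch of this alarm condition, assuming direction labels and a scalar moved distance are already derived from the two sources (the function name and message text are illustrative assumptions):

```python
def lost_target_alarm(imu_direction, visual_direction, imu_distance,
                      distance_threshold):
    """Raise the 'target lost' prompt when the IMU-reported direction over
    the period disagrees with the visually tracked direction AND the distance
    moved in the IMU direction exceeds the preset threshold."""
    if imu_direction != visual_direction and imu_distance > distance_threshold:
        return "ALARM: tracked target lost, re-detection required"
    return None  # directions agree or motion too small: no alarm
```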
  • the method further comprises: re-detecting the tracked target, and performing the vision-based target tracking again.
  • an electronic device comprising: a target tracking unit configured to perform vision-based target tracking to obtain a first tracking result for a tracked target; and a processing unit configured to analyze, according to motion information describing the motion state of the tracked target and in combination with the first tracking result, to obtain a second tracking result for the tracked target; wherein the second tracking result is the same as or different from the first tracking result, and the motion information describing the motion state of the tracked target is obtained based on measurement by another electronic device disposed on the tracked target.
  • the motion information includes motion measurement parameters for the tracked target, the motion measurement parameters being recorded in a predetermined storage space of the other electronic device and including at least motion acceleration and/or motion angular velocity; the electronic device further comprises a first reading unit configured to periodically or aperiodically access the predetermined storage space of the other electronic device to acquire the motion measurement parameters collected by the other electronic device within a predetermined time period; the processing unit is configured to perform statistical analysis on the motion measurement parameters within the predetermined time period acquired by the first reading unit, to obtain motion description information of the tracked target within the predetermined time period, and to correct the first tracking result according to the motion description information to obtain the second tracking result; wherein the motion description information includes at least a motion speed and/or a motion direction.
  • the processing unit is configured to combine the motion measurement parameters of the other electronic device within the predetermined time period, acquired by the first reading unit, with the first electronic device's own motion measurement parameters within the same period, and to analyze them to obtain the motion description information of the tracked target relative to the first electronic device within the predetermined time period.
  • the motion information includes motion description information of the tracked target, the motion description information being obtained by the other electronic device through statistical analysis of the motion measurement parameters it collects within the predetermined time period, where the motion measurement parameters include at least motion acceleration and/or motion angular velocity, the motion description information includes at least a motion speed and/or a motion direction, and the motion description information is recorded in a predetermined storage space of the other electronic device;
  • the electronic device further includes a second reading unit configured to periodically or aperiodically access the predetermined storage space of the other electronic device and acquire the motion description information of the other electronic device within the predetermined time period;
  • the processing unit is configured to correct the first tracking result according to the motion description information within the predetermined time period acquired by the second reading unit, to obtain the second tracking result.
  • the motion description information includes state probabilities of the tracked target in different motion directions;
  • the processing unit is configured to determine, based on the state probabilities of the tracked target in different motion directions and the at least one candidate tracking area in the tracking image represented by the first tracking result, a trusted tracking area of the tracked target in the tracking image, and to generate the second tracking result based on the trusted tracking area; wherein a state probability indicates the probability of the tracked target moving in a given motion direction.
  • the electronic device further includes an alarm unit configured to generate alarm prompt information when the motion description information within the predetermined time period indicates that the moving direction of the tracked target within that period is a first direction, the first tracking result within the same period indicates that the moving direction of the tracked target within that period is a second direction, the first direction is different from the second direction, and the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold.
  • the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
  • the target tracking unit is further configured to re-detect the tracked target and re-execute the vision-based target tracking after the alarm unit generates the alarm prompt information.
  • a target tracking system includes: an electronic device according to an embodiment of the present invention; and another electronic device disposed on the tracked target, configured to obtain, based on measurement, the motion information describing the motion state of the tracked target.
  • a computer storage medium having stored therein computer-executable instructions for performing a target tracking method according to an embodiment of the present invention.
  • in the process of tracking the target, the first electronic device may obtain motion information describing the motion state of the tracked target as measured by the second electronic device, and combine this motion information with the first tracking result obtained by the first electronic device in real time to obtain a second tracking result for the tracked target, so that the motion state of the tracked target is detected in real time.
  • Because the second tracking result is obtained by combining the motion information of the tracked target measured by the second electronic device, the resulting tracking result is more accurate, which solves the prior-art problem of low tracking accuracy and increases the robustness of visual tracking.
  • FIG. 1 is a flow chart 1 of an optional target tracking method according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of an alternative electronic device in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of an optional target tracking system in accordance with an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of an implementation scenario of an optional target tracking method according to an embodiment of the present invention.
  • FIG. 5 is a flow chart 2 of an alternative target tracking method in accordance with an embodiment of the present invention.
  • Visual tracking refers to detecting, extracting, recognizing and tracking a moving target in an image sequence to obtain the motion parameters of the moving target, such as position, velocity, acceleration and motion trajectory, so that further processing and analysis can be performed to understand the behavior of the moving target and complete higher-level tasks.
  • an Inertial Measurement Unit (IMU) is a device that measures the three-axis attitude angle (or angular velocity) and acceleration of an object.
  • Gyros and accelerometers are the main components of the IMU, and their accuracy directly affects the accuracy of the inertial system.
  • typically, an IMU consists of three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration signals of the object along three independent axes in the carrier coordinate system, while the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system.
  • From these signals, the angular velocity and acceleration of the object in three-dimensional space are measured, and the attitude of the object is solved.
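A common way to solve attitude from gyroscope and accelerometer signals is a complementary filter. The one-axis sketch below is a generic illustration of the idea, not the patent's method, and the blending factor is an assumed value.

```python
import math

def complementary_pitch(pitch, gyro_rate, ax, az, dt, alpha=0.98):
    """One step of a complementary filter for pitch: trust the integrated
    gyroscope rate short-term, and the gravity direction sensed by the
    accelerometer long-term. alpha=0.98 is an illustrative choice."""
    gyro_pitch = pitch + gyro_rate * dt   # integrate angular velocity
    accel_pitch = math.atan2(ax, az)      # pitch implied by gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Running this every sample keeps the gyro's drift bounded by the accelerometer's absolute (but noisy) gravity reference.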
  • an embodiment of a target tracking method is provided; it should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as by a set of computer-executable instructions, and that although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that described herein.
  • FIG. 1 is a flowchart 1 of an optional target tracking method according to an embodiment of the present invention. As shown in FIG. 1 , the method includes the following steps:
  • Step S102 performing vision-based target tracking to obtain a first tracking result for the tracked target.
  • Step S104 according to the motion information for describing the motion state of the tracked target, combined with the first tracking result, the analysis obtains the second tracking result for the tracked target.
  • the method is applied to a first electronic device, the first electronic device has a target tracking unit, and the target tracking unit is configured to perform vision-based target tracking; the second tracking result is the same as or different from the first tracking result; The motion information of the target motion state is obtained based on the measurement by the second electronic device disposed on the tracked target.
  • in the process of tracking the target, the first electronic device may receive the motion information describing the motion state of the tracked target measured by the second electronic device, and analyze the motion information acquired from the second electronic device together with the first tracking result obtained by the first electronic device in real time to obtain the second tracking result, so that the motion state of the tracked target can be detected in real time.
  • With the above solution, the second tracking result is obtained by correcting the first tracking result, acquired by the first electronic device, with the motion information measured in real time by the second electronic device disposed on the tracked target, rather than directly taking the first tracking result of the first electronic device as the final result; the result obtained by also analyzing the motion information of the tracked target measured by the second electronic device is more accurate, which solves the prior-art problem of low tracking accuracy and increases the robustness of visual tracking.
  • a mobile device (i.e., a second electronic device) capable of detecting the target's motion state may be disposed on the tracked target without cumbersome peripheral devices, and the motion information obtained by the second electronic device is combined with the first tracking result obtained by the first electronic device to obtain the second tracking result, which not only yields an accurate tracking result but also reduces the tracking cost.
  • the embodiment of the present application is applied to the first electronic device, which may be a mobile device that detects the motion state of the tracked target; the mobile device may be provided with a target tracking unit, and the target tracking unit may be used to perform vision-based target tracking.
  • the first electronic device in the above embodiment may have a corresponding tracking system, which may be installed on the first electronic device and used to control the operation of each unit of the first electronic device; the tracking system issues instructions to the units of the first electronic device, instructing each unit to work so that the first electronic device can track the target, where an instruction may be represented as a data packet or code; after receiving an instruction, each unit in the first electronic device executes the corresponding task according to the instruction information, so that it can interact with the second electronic device to achieve accurate target tracking.
  • the first electronic device can be a robot.
  • the vision-based target tracking is performed by the target tracking unit to obtain the first tracking result for the tracked target; the target tracking unit may be provided with a detecting device that detects the first position information of the tracked target and the motion state of the target, including the direction of motion and the angle of motion.
  • the detecting device detects the position of the tracked target and acquires its first position; after the target tracking unit acquires the first position of the tracked target, the position information of the first position forms first data, which may include the coordinate information of the tracked target, the current time, and the distance between the tracked target and the first electronic device, where the coordinate information may be acquired from a positioning unit in the first electronic device; the target tracking unit stores the acquired first data and transmits it to the storage unit of the first electronic device.
  • After the first electronic device acquires the first data, a detection instruction is sent to the target tracking unit again; the target tracking unit acquires the second position information of the tracked target according to the received instruction, forms the second position information into second data, and sends the second data to the first electronic device; the first electronic device then compares and analyzes the first data and the second data to obtain the moving direction, the moving speed, and the current coordinates of the tracked target.
  • The moving direction of the tracked target may be determined from the coordinates of the tracked target in the second data relative to its coordinates in the first data, and the moving speed may be determined from the times recorded in the first data and the second data together with the distance moved by the tracked target between them.
  • The first electronic device forms a first tracking result from the analyzed moving direction, coordinate information, moving speed, moving time, and the distance between the tracked target and the first electronic device; the first electronic device stores the first tracking result and then sends it to the target tracking unit.
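The direction-and-speed analysis from the first and second data could look like the following sketch, assuming each data record reduces to an (x, y, t) tuple (an illustrative layout, not specified by the patent):

```python
import math

def analyze_two_samples(first, second):
    """Derive moving speed and moving direction from two timestamped
    positions, mirroring the first/second data comparison described above."""
    x1, y1, t1 = first
    x2, y2, t2 = second
    dx, dy = x2 - x1, y2 - y1
    distance = math.hypot(dx, dy)
    speed = distance / (t2 - t1)      # moving speed between the two samples
    direction = math.atan2(dy, dx)    # moving direction in radians
    return speed, direction
```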
  • the first electronic device may transmit data wirelessly; its system sends an instruction to the target tracking unit according to the first tracking result so that the first electronic device moves, and the direction and speed of its movement may match the moving direction and moving speed of the tracked target in the first tracking result; the tracking speed and direction may also be adjusted appropriately: for example, if the distance between the first electronic device and the tracked target is large, the first electronic device can start with a tracking speed that exceeds the moving speed in the first tracking result, and once the distance reaches a preset value, its tracking speed can be reduced to match the moving speed in the first tracking result.
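The speed-adjustment behavior just described (exceed the target's speed while far away, then match it within the preset distance) can be sketched as below; the catch-up factor is an illustrative assumption.

```python
def follow_speed(target_speed, gap, preset_gap, catchup_factor=1.5):
    """Speed control for the following device: move faster than the target
    while the gap exceeds the preset distance, then keep pace with it."""
    if gap > preset_gap:
        return target_speed * catchup_factor  # close the gap
    return target_speed                       # match the target's speed
```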
  • the moving direction of the moving target includes at least one of: forward, backward, stop, left shift, right shift.
  • the target tracking unit in the foregoing embodiment may include radio frequency identification (RFID).
  • the tracking target in the above embodiment may be mobile, such as a moving person or item.
  • step S104 is performed: according to the motion information describing the motion state of the tracked target, combined with the first tracking result, the second tracking result for the tracked target is obtained by analysis, where the motion information describing the motion state of the tracked target is acquired by the second electronic device disposed on the tracked target; the second electronic device may include a detecting unit, a processing unit, and a storage unit, and the detecting unit may detect the movement of the tracked target.
  • the detecting operation may be performed according to an instruction issued by the system of the second electronic device: after receiving the instruction, the detecting unit starts detecting the coordinate information of the second electronic device and simultaneously acquires information about the tracked target; the detecting unit transmits this information to the storage unit to form first data, where the first data includes the current coordinates of the tracked target, the current time, and the moving direction of the tracked target.
  • After the first data is acquired, the system of the second electronic device sends an instruction to the detecting unit again; the detecting unit detects the coordinates of the second electronic device and the information of the tracked target once more, and transmits this information to the storage unit to form second data.
  • The second electronic device analyzes the moving angle and the moving speed of the tracked target from the first data and the second data, then continues to detect the movement information of the tracked target in the same way, and stores the moving direction, moving speed, time of change, and similar information to form second motion data.
  • The first electronic device may read the second motion data through its reading unit and, after acquiring the second motion data, compare it with the first tracking result to obtain a second tracking result.
  • the second tracking result may be the same as or different from the first tracking result.
  • when the second tracking result is the same as the first tracking result, the first tracking result may not be updated; the first electronic device can move according to the first tracking result and can predict the moving direction and moving speed of the tracked target from it, and the prediction may be the same as or different from the first tracking result.
  • When the first tracking result differs from the second tracking result, the first electronic device determines the magnitude of the deviation; a threshold may be set for the deviation of the comparison, determined according to the motion state of the tracked target (for example, a moving-speed threshold of 5 m/min).
  • If the difference between the first tracking result and the second tracking result is large (for example, the deviation of the comparison exceeds the threshold), the first tracking result needs to be changed: the moving direction, moving speed, coordinates, and change of moving angle of the tracked target may be updated, and the updated data may be the data in the second tracking result, i.e., the data may be adjusted according to the second tracking result; if the difference between the first tracking result and the second tracking result is small, for example, when the deviation of the comparison is below the threshold, the first tracking result may not be updated.
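The threshold-based reconciliation of the two tracking results might be sketched as follows, assuming each result is a dict with a 'speed' field in m/s and using the 5 m/min example threshold from the text converted to m/s; the data layout is an assumption for illustration.

```python
def reconcile_results(first, second, speed_threshold=5 / 60):
    """Keep the first (visual) tracking result when it agrees with the
    second (IMU-corrected) result within the threshold; otherwise adopt
    the second result. Threshold default: 5 m/min expressed in m/s."""
    deviation = abs(first['speed'] - second['speed'])
    if deviation > speed_threshold:
        return second  # large deviation: update to the corrected result
    return first       # small deviation: no update needed
```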
  • in this way, the result of the first electronic device's "following" during the tracking process may be evaluated, so that the data is updated and the movement of the first electronic device is adjusted accordingly.
  • target modeling is divided into offline target modeling and online target modeling.
  • Offline target modeling refers to training a model of the tracked target before tracking begins and using it to identify the tracked target during real-time tracking; online target modeling refers to learning and updating the target model during the tracking process while simultaneously identifying the tracked target.
  • the second electronic device in the foregoing embodiment may be a mobile communication device (e.g., a remote controller or a mobile phone) with a built-in IMU chip, and the device may communicate the IMU information to the first electronic device in real time; the IMU chip of the second electronic device may serve as the detecting unit of the above embodiment.
  • the second electronic device may be disposed on the body of the tracked target; it may be an item worn by the tracked target or an item held in the tracked target's hand.
  • the number of wires may be According to the transmission, it can also be wireless data transmission.
• The first electronic device may include a photographing device (e.g., a camera) that captures the tracking target and transmits the captured information to the first electronic device in real time.
• The tracking in the above embodiment may be visual tracking based on what the photographing device in the first electronic device captures of the current environment, where the current environment includes the walking path of the tracking target, obstacles in the walking direction, the tracking target itself, and the light intensity. If there is an obstacle in the walking direction of the first electronic device, the coordinates and size of the obstacle are analyzed. The current tracking environment further includes the environment between the first electronic device and the tracking target, which includes the obstacles, light intensity, and background encountered when the first electronic device moves toward the tracking target.
• The motion information includes motion measurement parameters for the tracking target, recorded in a predetermined storage space of the second electronic device; the motion measurement parameters include at least motion acceleration and/or motion angular velocity. The first electronic device acquires the motion information by periodically or aperiodically accessing the predetermined storage space of the second electronic device and obtaining the motion measurement parameters collected by the second electronic device within a predetermined time period. "Periodically" means that the first electronic device sets a time interval (for example, 0.5, 1, 5, 10, or 20 seconds) and, after each interval elapses, acquires the tracking-target motion change data stored in the predetermined storage space of the second electronic device.
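• The periodic access just described can be sketched as a polling loop. The list-based model of the device's storage, the function name, and the clearing of read samples are all illustrative assumptions; the patent only specifies that the storage is read at a chosen interval.

```python
# Assumed sketch: every `interval` seconds the first electronic device reads
# the (acceleration, angular_velocity) samples accumulated in the second
# device's predetermined storage space, modeled here as a plain list.
import time

def poll_motion_samples(storage, interval=0.5, cycles=3):
    """Collect one batch of motion samples per polling period."""
    batches = []
    for _ in range(cycles):
        time.sleep(interval)           # wait one polling period
        batches.append(list(storage))  # read the samples gathered so far
        storage.clear()                # device may then reuse the space
    return batches
```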
• The analysis that obtains the second tracking result for the tracking target includes performing statistical analysis on the motion measurement parameters within the predetermined time period to obtain motion description information of the tracking target within that period, and correcting the first tracking result according to the motion description information to obtain the second tracking result; the motion description information includes at least a motion speed and/or a motion direction. The motion measurement parameters of the second electronic device within the predetermined time period may be combined with the first electronic device's own motion measurement parameters within the same period, and comprehensive analysis then yields the motion description information of the tracking target relative to the first electronic device within the predetermined time period, which includes at least a motion speed and/or a motion direction.
• The first electronic device obtains the second tracking result by continuously acquiring the first tracking result and analyzing the motion data of the second electronic device, so the accuracy of tracking is increased by the motion data acquired in real time; the target is tracked more accurately, and the stability of tracking is also increased.
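• One way the statistical analysis above could look in code: integrate the sampled accelerations and angular velocities over the predetermined time period to obtain an average speed and heading (the "motion description information"), then overwrite those fields of the first tracking result. Simple dead-reckoning is an assumption here; the patent does not fix the analysis method, and all names are illustrative.

```python
# Assumed sketch: derive motion description information (speed, direction)
# from IMU samples by dead-reckoning, then correct the first tracking result.

def motion_description(samples, dt, v0=0.0, heading0=0.0):
    """samples: list of (acceleration m/s^2, angular_velocity rad/s);
    dt: sampling interval in seconds."""
    v, heading = v0, heading0
    for accel, omega in samples:
        v += accel * dt        # integrate acceleration -> motion speed
        heading += omega * dt  # integrate angular velocity -> motion direction
    return {"speed": v, "direction": heading}

def correct_first_result(first, description):
    """Overwrite speed/direction of the first result: second tracking result."""
    corrected = dict(first)
    corrected.update(description)
    return corrected
```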
• The motion measurement parameters may be continuously updated: new data is stored in the predetermined storage space of the second electronic device, and the original data may be deleted or temporarily retained. For example, when the predetermined storage space of the second electronic device is about to be full, the older data may be deleted.
• The acceleration in the above embodiment may be obtained by measuring the speed change of the tracking target; after the tracking target changes its traveling speed, its motion acceleration can be calculated.
• Statistical analysis may also be performed on the motion measurement parameters obtained by the first electronic device itself, including analysis of the position change, moving direction, and moving speed of the tracking target. The position obtained at the second measurement may be compared with the position obtained at the first measurement to derive the motion measurement parameters. For example, in a scene 10 meters long and 10 meters wide, the tracking target is first at the center with coordinates (5, 5), and at the second measurement is at the center of the northern edge with coordinates (5, 10); analysis shows that during the interval between the two measurements the target moved 5 meters due north, and the moving speed is then calculated from the moving distance and the moving time.
• The first electronic device can store the motion change data and, after comprehensive analysis, finally obtains the motion description information of the tracking target within the predetermined time period, where the predetermined time period is the transmission interval between the first electronic device and the second electronic device; that is, the data is acquired by the first electronic device once per time interval (for example, every 5 seconds).
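• The 10 m × 10 m example above can be written out directly: from two measured positions and the time between measurements, recover the movement distance, direction, and speed. The coordinate convention (north along +y, bearing measured in degrees from north) is an assumption for illustration.

```python
# Sketch of the position-comparison analysis: (5, 5) -> (5, 10) over one
# interval means 5 m due north; speed = distance / elapsed time.
import math

def motion_between(p1, p2, dt_minutes):
    """Return (distance_m, bearing_deg, speed_m_per_min) between two fixes."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)               # metres moved
    bearing = math.degrees(math.atan2(dx, dy))  # 0 degrees = due north (+y)
    speed = distance / dt_minutes               # metres per minute
    return distance, bearing, speed


distance, bearing, speed = motion_between((5, 5), (5, 10), dt_minutes=1.0)
```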
• In an embodiment, the motion information includes motion description information for the tracking target. The first electronic device obtains the motion description information by periodically or aperiodically accessing a predetermined storage space of the second electronic device, whose size is determined relative to the storage device of the second electronic device: if the predetermined storage space is large, it may hold many items of motion change information (for example, ten); if it is small, it may hold fewer (for example, two).
• The first tracking result is corrected according to the motion description information within the predetermined time period, and the second tracking result is finally obtained; the motion change parameters may be adopted for this correction.
• A value of the motion change angle may be set, for example within 10 degrees. If the motion direction in the second tracking result changes within this value relative to the motion direction of the first tracking result, the motion direction or angle information of the first tracking result is not corrected; otherwise, the motion direction or angle information of the first tracking result is corrected, and the motion direction or motion angle in the first tracking result is modified to the motion direction or motion angle in the second motion data acquired from the second electronic device.
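• The 10-degree rule above can be sketched as a small predicate: keep the first result's direction while the IMU-derived direction differs by no more than the set angle, otherwise replace it. The 360-degree wrap-around handling is an assumed detail not stated in the text.

```python
# Assumed sketch of the motion-change-angle correction: compare the two
# directions (in degrees) and correct only when the change exceeds the limit.

def corrected_direction(first_deg, second_deg, max_change_deg=10.0):
    # Smallest signed angular difference, handling wrap-around at 360 degrees.
    diff = abs((second_deg - first_deg + 180.0) % 360.0 - 180.0)
    if diff <= max_change_deg:
        return first_deg   # within tolerance: keep the first tracking result
    return second_deg      # outside tolerance: adopt the IMU-derived direction
```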
• A large error may occur when tracking the target over a long time, for example in a vision-based tracking technique. When the first tracking result measured at the first electronic device is compared with the second motion data measured by the second electronic device and an error is found, the data of the first tracking result is corrected to ensure stability during the long-term tracking process.
• When the tracking target suddenly accelerates, it is easily lost. When the measured first tracking result is compared with the acquired second motion data and a large error value appears, the data of the first tracking result can be corrected (for example, the first electronic device also accelerates and correspondingly changes its direction of motion) to re-track the target; this ensures the target is re-acquired after being lost.
• In an embodiment, the first tracking result is corrected according to the motion description information within the predetermined time period to obtain the second tracking result; this may be done according to the state probability of the tracking target in different motion directions together with the first tracking result. The motion description information includes the state probability of the tracking target in different motion directions, which refers to estimating the target's motion direction or speed when performing motion estimation on the tracking target. The estimate may be obtained by the first electronic device according to the second tracking result or the first tracking result, and may be the same as or different from the data of the second tracking result. Once the moving direction of the tracking target is estimated, the corresponding activity area and moving line of the tracking target in the next time interval can be obtained.
• The area in which the tracking target is active may lie within a fixed scene, and the next candidate activity area may be determined according to the second tracking result; the first electronic device then determines its traveling direction and traveling speed based on the candidate activity area and the tracking target's activity area and moving route within a certain future time interval, thereby improving the accuracy of tracking.
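• The candidate-area step above can be illustrated as follows: each candidate tracking area proposed by the visual tracker carries an approach direction, and the area whose direction is most probable under the motion-state estimate becomes the confidence (trusted) tracking area. The data structures here are illustrative assumptions; the patent does not specify how candidates or probabilities are represented.

```python
# Assumed sketch: pick the confidence tracking area by weighting each
# candidate area with the state probability of its motion direction.

def confidence_area(candidates, state_prob):
    """candidates: list of (area_id, direction);
    state_prob: mapping direction -> probability of moving that way."""
    return max(candidates, key=lambda c: state_prob.get(c[1], 0.0))[0]


candidates = [("A", "north"), ("B", "east")]
state_prob = {"north": 0.7, "east": 0.2}   # target most likely heading north
best = confidence_area(candidates, state_prob)
```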
• In an embodiment, the method may further include: when the motion description information within the predetermined time period indicates that the motion direction of the tracking target in that period is a first direction, the first tracking result within the predetermined time period indicates that the motion direction of the tracking target in that period is a second direction, the first direction differs from the second direction, the moving distance of the tracking target in the first direction within the predetermined time period is greater than a preset first distance threshold, and the moving distance of the tracking target in the second direction within the predetermined time period is greater than a preset second distance threshold, alarm prompt information is generated; the prompt information is used to prompt that the first electronic device has lost the tracking target.
• The alarm information is obtained according to the second tracking result, and may be an instruction sent by the processing center of the first electronic device to the target tracking unit; after receiving the instruction, the target tracking unit may modify the motion data of the first electronic device, where the motion data may be derived from the second motion data. If the second tracking result differs from the data of the first tracking result, the first electronic device adjusts its moving direction and moving speed, for example by speeding up its movement.
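• The target-lost condition described above reduces to a simple predicate: the directions from the IMU-derived motion description and from the visual first tracking result disagree, and the distances moved in each direction both exceed their thresholds. The threshold values and argument names are illustrative assumptions.

```python
# Assumed sketch of the alarm condition: the tracker has likely lost the
# target when the two sources report different directions AND both reported
# moving distances exceed their respective preset thresholds.

def target_lost(desc_dir, desc_dist, track_dir, track_dist,
                first_threshold=2.0, second_threshold=2.0):
    """desc_*: from motion description info; track_*: from first tracking result."""
    return (desc_dir != track_dir
            and desc_dist > first_threshold
            and track_dist > second_threshold)
```

When the predicate is true, alarm prompt information would be generated and tracking re-initialized, as the text describes.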
• The electronic device includes: a target tracking unit 21 configured to perform vision-based target tracking to obtain a first tracking result for the tracking target; and a processing unit 22 configured to combine, according to the motion information used to describe the motion state of the tracking target, the first tracking result and obtain by analysis a second tracking result for the tracking target, where the second tracking result is the same as or different from the first tracking result; the motion information describing the motion state of the tracking target is obtained based on measurement by another electronic device disposed on the tracking target.
  • the motion information includes motion measurement parameters for the tracked target, the motion measurement parameters being recorded in a predetermined storage space of the another electronic device, the motion measurement parameters including at least: Acceleration and/or angular velocity of motion;
• The electronic device further includes a first reading unit configured to periodically or aperiodically access a predetermined storage space of the another electronic device and acquire the motion measurement parameters collected by the another electronic device within a predetermined time period;
  • the processing unit 22 is configured to perform statistical analysis on the motion measurement parameter in the predetermined time period acquired by the first reading unit, to obtain motion description information of the tracked target in a predetermined time period, And correcting the first tracking result according to the motion description information to obtain the second tracking result; wherein the motion description information includes at least a motion speed and/or a motion direction.
  • the processing unit 22 is configured to combine the other acquired by the first reading unit. Obtaining, by the electronic device, the motion measurement parameter within the predetermined time period and the self-motion measurement parameter of the first electronic device within the predetermined time period, and obtaining the tracked target for the predetermined time period relative to the The motion description information of the first electronic device.
• In an embodiment, the motion information includes motion description information for the tracked target, obtained by the another electronic device through statistical analysis of the motion measurement parameters collected within a predetermined time period; the motion measurement parameters include at least motion acceleration and/or motion angular velocity, the motion description information includes at least a motion speed and/or a motion direction, and the motion description information is recorded in a predetermined storage space of the another electronic device;
  • the electronic device further includes a second reading unit configured to periodically or non-periodically access a predetermined storage space of the another electronic device, and acquire motion description information of the another electronic device within a predetermined time period;
  • the processing unit 22 is configured to correct the first tracking result according to the motion description information in a predetermined time period acquired by the second reading unit, to obtain the second tracking result.
• The electronic device may receive the motion information of the target's motion state measured by the another electronic device and, according to that motion information, combine the first tracking result obtained by the first electronic device in real time; analysis by the processing unit 22 then yields a second tracking result for the tracking target. The target tracking unit 21 detects the motion state of the tracking target in real time, and the another electronic device set on the tracking target measures it in real time. The second tracking result obtained by combining the motion information measured by the another electronic device is more accurate, which solves the problem of low accuracy when tracking a target in the prior art and increases the robustness of visual tracking.
• The motion description information includes the state probability of the tracked target in different motion directions; the processing unit 22 is configured to determine, according to the state probability of the tracked target in different moving directions and at least one candidate tracking area of the tracked target represented by the first tracking result in the tracking image, a confidence tracking area of the tracked target in the tracking image, and generate the second tracking result based on the confidence tracking area; the state probability is used to indicate the probability that the tracked target moves in different motion directions.
• The electronic device further includes an alarm unit configured to generate alarm prompt information when the motion description information within the predetermined time period indicates that the moving direction of the tracked target in that period is a first direction, the first tracking result within the predetermined time period characterizes the moving direction of the tracked target in that period as a second direction, the first direction is different from the second direction, the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold, and the moving distance of the tracked target in the second direction within the predetermined time period is greater than a preset second distance threshold; the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
  • the target tracking unit 21 is further configured to re-detect the tracking target and re-execute the vision-based target tracking after the alarm unit generates the alarm prompt information.
• In practical applications, the target tracking unit 21, the processing unit 22, and the alarm unit in the electronic device may be implemented by a central processing unit (CPU), a digital signal processor (DSP), a micro-control unit (MCU), or a field-programmable gate array (FPGA); the first reading unit and the second reading unit in the electronic device may be implemented by a communication module (including a basic communication suite, operating system, communication module, standardized interface and protocol, etc.) and a transceiver antenna.
• The electronic device provided by the foregoing embodiment is illustrated only by the division of the program modules described above. In actual applications, the processing may be allocated to different program modules as needed; that is, the internal structure of the electronic device may be divided into different program modules to perform all or part of the processing described above.
• The electronic device embodiment and the target tracking method embodiment belong to the same inventive concept; for details, refer to the method embodiment, and details are not described herein again.
  • Embodiments of the present invention also provide an electronic device including: a processor and a memory for storing a computer program executable on the processor. among them,
  • the memory can be implemented by any type of volatile or non-volatile storage device, or a combination thereof.
• The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a disk memory or a tape memory.
  • the volatile memory can be a random access memory (RAM) that acts as an external cache.
• By way of example and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDRSDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM).
  • the method disclosed in the foregoing embodiments of the present invention may be applied to a processor or implemented by a processor.
  • the processor may be an integrated circuit chip with signal processing capabilities.
  • each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
• The above-described processor may be a general-purpose processor, a DSP, another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the processor may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention.
  • a general purpose processor can be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in the embodiment of the present invention may be directly implemented as a hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a storage medium, the storage medium being located in the memory, the processor reading the information in the memory, and completing the steps of the foregoing methods in combination with the hardware thereof.
  • the processor when the processor is used to run the computer program, performing: performing vision-based target tracking to obtain a first tracking result for the tracked target; according to describing the motion state of the tracked target Motion information, in combination with the first tracking result, analyzing to obtain a second tracking result for the tracked target; wherein the second tracking result is the same as or different from the first tracking result;
  • the motion information of the tracked target motion state is obtained based on the measurement by the second electronic device disposed on the tracked target.
  • the motion information includes motion measurement parameters for the tracked target, the motion measurement parameters are recorded in a predetermined storage space of the second electronic device, and the motion measurement parameters include at least: Acceleration and/or angular velocity of movement;
  • the processor is configured to: periodically or aperiodically access a predetermined storage space of the second electronic device, and acquire the second electronic device at a predetermined time when the computer program is executed Collecting the motion measurement parameters obtained in the segment; performing statistical analysis on the motion measurement parameters in the predetermined time period, Obtaining motion description information of the tracked target within a predetermined time period, correcting the first tracking result according to the motion description information, and obtaining the second tracking result; wherein the motion description information includes at least motion Speed and / or direction of movement.
• In an embodiment, when the computer program is executed, the processor is configured to combine the motion measurement parameters of the second electronic device within a predetermined time period with the first electronic device's own motion measurement parameters within the same period, and obtain by analysis the motion description information of the tracked target relative to the first electronic device during the predetermined time period.
• In an embodiment, the motion information includes motion description information of the tracked target, obtained by the second electronic device through statistical analysis of the motion measurement parameters collected within the predetermined time period; the motion measurement parameters include at least motion acceleration and/or motion angular velocity, the motion description information includes at least a motion speed and/or a motion direction, and the motion description information is recorded in a predetermined storage space of the second electronic device.
  • the processor is configured to: periodically or non-periodically access a predetermined storage space of the second electronic device, and acquire motion description information of the second electronic device within a predetermined time period; And correcting the first tracking result according to the motion description information in the predetermined time period to obtain the second tracking result.
  • the processor when the processor is configured to run the computer program, performing: a state probability according to the tracked target in different motion directions, and the tracked target represented by the first tracking result Determining, in at least one candidate tracking area in the tracking image, a confidence tracking area of the tracked object in the tracking image, generating a second tracking result based on the trusted tracking area; wherein the state probability is used to represent the Track the probability that a target will move in different directions of motion.
• In an embodiment, when the processor is configured to run the computer program, it performs: generating alarm prompt information when the motion description information within the predetermined time period indicates that the moving direction of the tracked target in that period is a first direction, the first tracking result within the predetermined time period characterizes the moving direction of the tracked target in that period as a second direction, the first direction is different from the second direction, the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold, and the moving distance of the tracked target in the second direction within the predetermined time period is greater than a preset second distance threshold; the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
  • the processor when the processor is configured to run the computer program, performing: re-detecting the tracked target after the generating the alert prompt information, and performing the vision-based target tracking again.
  • an embodiment of the present invention further provides a computer readable storage medium, such as a memory including a computer program, which may be executed by a processor of an electronic device to perform the steps of the foregoing method.
  • the computer readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories, such as Mobile phones, computers, tablet devices, personal digital assistants, etc.
  • Embodiments of the present invention provide a computer storage medium, where the computer storage medium stores computer executable instructions for performing: performing vision-based target tracking to obtain a first target for a tracked target Tracking the result; according to the motion information for describing the motion state of the tracked target, combining with the first tracking result, analyzing and obtaining a second tracking result for the tracked target; wherein the second tracking result and the The first tracking result is the same or different; the motion information for describing the motion state of the tracked target is obtained based on the measurement by the second electronic device disposed on the tracked target.
  • the motion information includes motion measurement parameters for the tracked target, the motion measurement parameters are recorded in a predetermined storage space of the second electronic device, and the motion measurement parameters include at least: Acceleration and/or angular velocity of movement; the computer executable instructions for performing: periodically or aperiodically accessing a predetermined storage space of the second electronic device Obtaining the motion measurement parameter collected by the second electronic device in a predetermined time period; performing statistical analysis on the motion measurement parameter in the predetermined time period to obtain the tracked target in a predetermined time period The motion description information is performed, and the first tracking result is corrected according to the motion description information to obtain the second tracking result; wherein the motion description information includes at least a motion speed and/or a motion direction.
  • the computer executable instructions are configured to: combine motion measurement parameters of the second electronic device for a predetermined period of time, and self-motion of the first electronic device during the predetermined time period Measuring parameters, the analysis obtains motion description information of the tracked target relative to the first electronic device during the predetermined time period.
  • the motion information includes motion description information of the tracked target, and the motion description information is obtained by the second electronic device according to the motion measurement parameter in the predetermined time period obtained by the acquisition.
  • the motion measurement parameter includes at least: motion acceleration and/or motion angular velocity
  • the motion description information includes at least a motion speed and/or a motion direction
  • the motion description information is recorded in a predetermined storage space of the second electronic device.
  • the computer executable instructions are configured to: periodically or non-periodically access a predetermined storage space of the second electronic device, and acquire motion description information of the second electronic device within a predetermined time period; The motion description information in a predetermined time period, the first tracking result is corrected, and the second tracking result is obtained.
  • the computer executable instructions are configured to perform: a state probability according to the tracked target in different motion directions, and the tracked target represented by the first tracking result is in a tracking image Determining at least one candidate tracking area, determining a confidence tracking area of the tracked object in the tracking image, generating a second tracking result based on the trusted tracking area; wherein the state probability is used to indicate that the tracked target is different The probability of motion in the direction of motion.
• In an embodiment, the computer executable instructions are configured to perform: generating alarm prompt information when the motion description information within the predetermined time period indicates that the moving direction of the tracked target in that period is a first direction, the first tracking result within the predetermined time period represents that the moving direction of the tracked target in that period is a second direction, the first direction is different from the second direction, the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold, and the moving distance of the tracked target in the second direction within the predetermined time period is greater than a preset second distance threshold; the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
  • the computer executable instructions are configured to: after the generating the alert prompt information, re-detect the tracked target and re-execute the vision-based target tracking.
  • FIG. 3 is a schematic diagram of an optional target tracking system according to an embodiment of the present invention.
  • the target tracking system includes: the first electronic device 31 and the second electronic device 32 in the above embodiment.
  • the first electronic device 31 is configured to perform vision-based target tracking to obtain a first tracking result for the tracked target, and to analyze, according to motion information describing the motion state of the tracked target in combination with the first tracking result, a second tracking result for the tracked target; the second electronic device 32, disposed on the tracked target, is configured to obtain, based on measurement, the motion information describing the motion state of the tracked target.
  • the motion information includes motion measurement parameters for the tracked target; the motion measurement parameters are recorded in a predetermined storage space of the second electronic device and include at least: motion acceleration and/or motion angular velocity.
  • the first electronic device 31 is configured to periodically or non-periodically access the predetermined storage space of the second electronic device, acquire the motion measurement parameters recorded by the second electronic device within a predetermined time period, and, according to those parameters, correct the first tracking result to obtain the second tracking result; wherein the motion description information includes at least a motion speed and/or a motion direction.
  • the first electronic device 31 is configured to combine the motion measurement parameters of the second electronic device within a predetermined time period with the first electronic device's own motion measurement parameters within the same time period, and to analyze and obtain motion description information of the tracked target relative to the first electronic device during the predetermined time period.
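The relative-motion analysis described here can be illustrated with a short Python sketch. It is a minimal illustration under assumed interfaces (planar acceleration samples, a fixed sampling interval, simple Euler integration), not the patent's actual implementation:

```python
# Sketch: derive the tracked target's motion relative to the robot by
# integrating both devices' acceleration samples over the same predetermined
# time window and subtracting the robot's own motion.

def integrate_acceleration(samples, dt):
    """Euler-integrate (ax, ay) acceleration samples into the velocity and
    displacement accumulated over the window."""
    vx = vy = 0.0   # velocity accumulated over the window
    dx = dy = 0.0   # displacement accumulated over the window
    for ax, ay in samples:
        vx += ax * dt
        vy += ay * dt
        dx += vx * dt
        dy += vy * dt
    return (vx, vy), (dx, dy)

def relative_motion(target_accel, robot_accel, dt=0.01):
    """Motion of the target relative to the robot over the window."""
    (tvx, tvy), (tdx, tdy) = integrate_acceleration(target_accel, dt)
    (rvx, rvy), (rdx, rdy) = integrate_acceleration(robot_accel, dt)
    return (tvx - rvx, tvy - rvy), (tdx - rdx, tdy - rdy)

# Example: the target accelerates to the right while the robot stays still.
vel, disp = relative_motion([(1.0, 0.0)] * 100, [(0.0, 0.0)] * 100)
```

In practice the window length, sampling rate, and noise handling would come from the IMU configuration; the values above are placeholders.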
  • the motion information includes motion description information of the tracked target, the motion description information being obtained by the second electronic device according to the acquired motion measurement parameters within the predetermined time period.
  • the motion measurement parameter includes at least: motion acceleration and/or motion angular velocity
  • the motion description information includes at least a motion speed and/or a motion direction
  • the motion description information is recorded in a predetermined storage space of the second electronic device.
  • the first electronic device 31 is configured to periodically or non-periodically access the predetermined storage space of the second electronic device, acquire the motion description information recorded by the second electronic device within a predetermined time period, and correct the first tracking result according to that motion description information to obtain the second tracking result.
  • the motion description information includes state probabilities of the tracked target in different motion directions; the first electronic device 31 is configured to determine, based on those state probabilities and on at least one candidate tracking area of the tracked target in a tracking image represented by the first tracking result, a confidence tracking area of the tracked target in the tracking image, and to generate the second tracking result based on the confidence tracking area; wherein a state probability indicates the probability of the tracked target moving in a given motion direction.
  • the first electronic device 31 is configured to: when the motion description information within the predetermined time period indicates that the moving direction of the tracked target in the predetermined time period is a first direction, the first tracking result in the predetermined time period represents that the moving direction is a second direction, the first direction is different from the second direction, the moving distance of the tracked target in the first direction within the predetermined time period is greater than a preset first distance threshold, and the moving distance in the second direction is greater than a preset second distance threshold, generate alarm prompt information, where the alarm prompt information is used to prompt that the first electronic device has lost the tracked target.
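The alarm condition above reduces to a simple predicate over the two direction estimates and their moving distances. A hedged Python sketch; the threshold values and direction labels are illustrative assumptions:

```python
def follow_lost(imu_direction, imu_distance,
                visual_direction, visual_distance,
                d1=0.5, d2=0.5):
    """Return True when the IMU and the visual tracker disagree on the
    motion direction within the window, and both claimed moving distances
    exceed their thresholds (ruling out small, noisy motions).
    d1/d2 correspond to the first/second preset distance thresholds."""
    return (imu_direction != visual_direction
            and imu_distance > d1
            and visual_distance > d2)

# IMU says "right" for 1.2 m, vision says "left" for 0.9 m: target lost.
lost = follow_lost("right", 1.2, "left", 0.9)
if lost:
    print("alarm: tracked target lost")
```

The thresholds keep a momentary disagreement (e.g. a sensor glitch or a brief occlusion) from immediately triggering a re-detection.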
  • the first electronic device 31 is configured to re-detect the tracked target and re-execute the vision-based target tracking after generating the alarm prompt information.
  • the first electronic device 31 can obtain the motion information of the tracked target's motion state measured by the second electronic device 32, and analyze that motion information together with the first tracking result acquired by the first electronic device 31 to obtain the second tracking result, so that the motion state of the tracked target is detected in real time. Because the second tracking result is obtained by correcting the first tracking result with the motion information measured in real time by the second electronic device 32 set on the tracked target, rather than by directly using the first tracking result of the first electronic device 31 as the final result, the obtained result is more accurate; this solves the prior-art problem of low accuracy when tracking a target and increases the robustness of visual tracking.
  • in this solution, the tracked target is a person in the screen, and the measurements of the device are interpreted relative to the screen.
  • the first electronic device is a robot, and the second electronic device is a device containing an IMU chip.
  • the solution assists visual tracking by collecting the feedback information sent by the IMU.
  • the robot continuously scans to detect the motion state of the person in the screen; the robot has a memory in which the detected motion-state data of the person in the screen is stored, forming first motion data in the robot's storage space. The second electronic device containing the IMU chip is placed on the person in the screen, and the IMU chip can detect the person's motion state in real time. The second electronic device includes a storage unit in which the detected motion-state data is stored, forming second motion data, and the robot performs statistical analysis on the second motion data returned by the IMU device to learn a statistical result.
  • through the resulting data the robot can analyze the motion behavior of the person in the screen and obtain probability-based target motion information, for example estimating the current moving direction of the moving person, where the moving direction includes: forward, backward, stop, left shift, and right shift.
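The probability-based direction estimate can be sketched in a few lines of Python. The classification rule, dead zone, and axis convention below are illustrative assumptions, not the patent's statistical-learning method:

```python
from collections import Counter

STATES = ("forward", "backward", "left", "right", "stop")

def direction_probabilities(velocity_samples, dead_zone=0.05):
    """Classify each buffered (vx, vy) velocity sample into one of the five
    motion states, then normalise the counts into a probability estimate.
    Axis convention (an assumption): vx > 0 is 'forward', vy > 0 is 'left'."""
    counts = Counter()
    for vx, vy in velocity_samples:
        if abs(vx) < dead_zone and abs(vy) < dead_zone:
            counts["stop"] += 1            # too little motion on either axis
        elif abs(vx) >= abs(vy):
            counts["forward" if vx > 0 else "backward"] += 1
        else:
            counts["left" if vy > 0 else "right"] += 1
    total = sum(counts.values()) or 1
    return {s: counts[s] / total for s in STATES}

# 8 samples suggest forward motion, 2 suggest a rightward shift.
probs = direction_probabilities([(0.2, 0.0)] * 8 + [(0.0, -0.3)] * 2)
```

The output is a distribution over the five states, which is the form the later confidence-area step consumes.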
  • the effective motion data in the motion information is integrated into the visual tracking algorithm to increase the accuracy of tracking the target.
  • the position, motion direction, and motion speed of the person in the screen can be predicted: the robot analyzes, from the data returned by the IMU, the motion of the second electronic device carried by the person relative to the robot, and through this motion analysis the robot can estimate the movement of the target in the screen and generate a confidence area, thereby increasing the stability and accuracy of target tracking.
  • if the robot predicts from the obtained IMU feedback information that the pedestrian is moving to the right, the confidence area appears on the right; the confidence of Rectangle2 is then higher, Rectangle1 can be suppressed, and the intermediate or final filter result of visual tracking is Rectangle2, that is, the person in the screen is predicted to move to the right.
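The suppression of Rectangle1 in favour of Rectangle2 can be sketched as a score boost applied to candidates whose centre falls inside the IMU-predicted confidence area. The image width, boost factor, and candidate format are illustrative assumptions:

```python
def pick_candidate(candidates, predicted_direction,
                   image_width=640, boost=2.0):
    """candidates: list of ((x, y, w, h), score) pairs, where score is the
    visual tracker's own confidence. Candidates on the side of the image
    matching the IMU-predicted direction ('left'/'right') are boosted,
    then the highest-scoring candidate is kept."""
    best_rect, best_score = None, float("-inf")
    for (x, y, w, h), score in candidates:
        cx = x + w / 2.0
        on_right = cx > image_width / 2.0
        if (predicted_direction == "right" and on_right) or \
           (predicted_direction == "left" and not on_right):
            score *= boost   # candidate lies inside the confidence area
        if score > best_score:
            best_rect, best_score = (x, y, w, h), score
    return best_rect

# Rectangle1 (left side) scores slightly higher visually, but the IMU
# predicts rightward motion, so Rectangle2 wins after boosting.
rect = pick_candidate([((100, 50, 80, 160), 0.55),    # Rectangle1
                       ((420, 50, 80, 160), 0.50)],   # Rectangle2
                      "right")
```

A real implementation would weight candidates by the full state-probability distribution rather than a binary left/right split; the hard boost here just makes the suppression mechanism concrete.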
  • the machine can be corrected in time when its visual tracking algorithm fails. For example, if the robot finds that the target motion direction obtained from the IMU is seriously inconsistent with the motion direction estimated by the vision technique, the robot can make a "target lost" judgment. At this time, the robot can re-determine the moving direction and moving speed of the target in the screen through the information returned by the IMU-equipped second electronic device, so that the robot can correct its future moving direction through the controller.
  • if the visual tracking result shows that the person has been moving to the left for n frames of video with a leftward moving distance greater than the distance threshold, while the IMU feedback indicates that the target has been moving to the right, it can basically be judged that visual tracking has lost the target; the visual tracking algorithm can then be re-initialized and the robot's moving direction re-determined, so that the robot can continue to track the target.
  • the visual tracking algorithm of the above embodiments may include at least the following algorithms: the CamShift algorithm, an optical flow tracking algorithm, and a particle filtering algorithm.
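As a rough illustration of the last of these, a minimal 1-D particle filter in pure Python is shown below. It is a generic textbook sketch tracking a horizontal position, not the tracker used in the embodiment:

```python
import random

def particle_filter_step(particles, weights, measurement,
                         motion_noise=2.0, meas_noise=5.0):
    """One predict-update-resample cycle; incoming weights are replaced."""
    # Predict: diffuse each particle with Gaussian motion noise.
    particles = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Update: weight each particle by closeness to the measurement
    # (a heavy-tailed kernel keeps far particles from collapsing to zero).
    weights = [1.0 / (1.0 + ((p - measurement) / meas_noise) ** 2)
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample particles in proportion to their weights.
    particles = random.choices(particles, weights=weights, k=len(particles))
    return particles, weights

random.seed(0)
particles = [random.uniform(0.0, 640.0) for _ in range(200)]
weights = [1.0 / 200.0] * 200
for measurement in (300.0, 305.0, 310.0):   # target drifting right
    particles, weights = particle_filter_step(particles, weights, measurement)
estimate = sum(particles) / len(particles)  # concentrates near the measurements
```

A 2-D tracker would carry (x, y) state per particle and use the image-based likelihood of each candidate region as the weight; the structure of the loop is the same.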
  • in the process of tracking the person in the screen, the robot can receive the motion information of the person's motion state measured by the second electronic device on the person, and analyze it together with the first tracking result obtained by the robot in real time to obtain the second tracking result, so that the person's motion state is detected in real time. Because the second tracking result is obtained by correcting the first tracking result with the motion information measured in real time by the second electronic device set on the person, rather than by directly using the robot's first tracking result, the obtained result is more accurate; this solves the prior-art problem of low accuracy when tracking a target and increases the robustness of visual tracking.
  • FIG. 5 is a second flowchart of an optional target tracking method according to an embodiment of the present invention.
  • a visual tracking algorithm is run on a ground mobile robot with a dynamic balance vehicle as its motion chassis, and the user's motion information is fed back to the robot through an IMU-equipped mobile device carried on the body (such as a mobile phone, wristband, remote control, or tablet), assisting the visual tracking so that the robot tracks the user carrying the mobile device.
  • the method includes:
  • step S501 the robot visual tracking algorithm and the IMU device held by the user are initialized.
  • Step S502 the visual tracking algorithm establishes a target model, and tracks the user online in real time.
  • step S503 the IMU data buffer is periodically accessed during the tracking process, and statistical learning is performed on the IMU data for this stage to obtain probability-based user motion information.
  • the user motion information can be divided into five states: front, back, left, right, and stop.
  • step S504 an image confidence area is obtained.
  • step S505 it is determined whether the confidence region and the user's measured motion result are completely inconsistent, that is, whether a certain "target lost" criterion is met; if yes, go to step S506; if no, go to step S508.
  • step S506 the visual algorithm model is no longer updated.
  • step S507 the visual algorithm model is re-initialized or pedestrian detection is performed again, and the process returns to step S503.
  • step S508 the intermediate or final result of the visual tracking algorithm is corrected using the confidence region obtained in step S504, thereby improving the accuracy of the visual tracking algorithm.
  • step S509 the visual algorithm updates the model, and returns to step S503.
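The loop S501-S509 can be summarised as a control-flow skeleton. Every function and the `StubTracker` class below are hypothetical stand-ins, shown only to make the branching between S505, S506/S507, and S508/S509 concrete:

```python
class StubTracker:
    """Hypothetical stand-in for the visual tracking algorithm (S501)."""
    def __init__(self):
        self.model_updates = 0
        self.reinits = 0
    def track(self, frame):                 # S502: online visual tracking
        return frame["visual_region"]
    def confidence_area(self, probs):       # S504: area from IMU statistics
        return max(probs, key=probs.get)
    def follow_lost(self, region, probs):   # S505: "target lost" criterion
        return region != max(probs, key=probs.get)
    def freeze_model(self):                 # S506: stop updating the model
        pass
    def reinitialize(self, frame):          # S507: re-detect the pedestrian
        self.reinits += 1
    def correct(self, region, area):        # S508: correct the result
        return area
    def update_model(self, frame, region):  # S509: update the model
        self.model_updates += 1

def run_tracking_loop(frames, read_imu_probs, tracker):
    for frame in frames:
        region = tracker.track(frame)                  # S502
        probs = read_imu_probs(frame)                  # S503
        area = tracker.confidence_area(probs)          # S504
        if tracker.follow_lost(region, probs):         # S505 -> S506/S507
            tracker.freeze_model()
            tracker.reinitialize(frame)
            continue                                   # back to S503
        region = tracker.correct(region, area)         # S508
        tracker.update_model(frame, region)            # S509
    return tracker

frames = [{"visual_region": "right", "imu": {"right": 0.9, "left": 0.1}},
          {"visual_region": "left",  "imu": {"right": 0.9, "left": 0.1}}]
tracker = run_tracking_loop(frames, lambda f: f["imu"], StubTracker())
```

In the example run, the first frame agrees with the IMU and updates the model (S508/S509), while the second frame contradicts it and triggers re-initialization (S506/S507).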
  • in the process of tracking the target, the traveling robot can receive the motion information of the user's motion state measured by the IMU-equipped mobile device on the user, and analyze it together with the first tracking result obtained by the robot in real time to obtain the second tracking result of the tracked target, providing accurate real-time detection of the user's motion state. The motion information extracted by the IMU-equipped mobile device is used for object motion analysis, which can predict the user's motion direction or motion angle and can also expose errors in the predicted target motion state that occur during tracking, thereby solving the technical problem of low tracking accuracy caused by long-term visual tracking and increasing the robustness of visual tracking.
  • embodiments of the present invention are not limited to ground mobile robots, and are also applicable to aerial robots (such as drones).
  • the disclosed technical contents may be implemented in other manners.
  • the embodiment of the electronic device described above is merely illustrative. The division into units may be a logical functional division; in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, unit or module, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • the foregoing storage medium includes various media that can store program code, such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disk.
  • the technical solution of the embodiments of the present invention combines the motion information of the tracked target measured by the second electronic device to obtain the second tracking result, so that the obtained tracking result is more accurate; this solves the prior-art problem of low accuracy when tracking a target and increases the robustness of visual tracking.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a target tracking method and system, an electronic device, and a computer storage medium. The method comprises: performing vision-based target tracking and obtaining a first tracking result for a tracked target (S102); according to motion information used to describe the motion state of the tracked target, in combination with the first tracking result, performing analysis to obtain a second tracking result for the tracked target (S104), the second tracking result being identical to or different from the first tracking result, and the motion information used to describe the motion state of the tracked target being obtained, based on measurement, from a second electronic device (32) arranged on the tracked target.
PCT/CN2017/110745 2016-10-12 2017-11-13 Systeme et procédé de suivi de cible, dispositif électronique et support de memoire informatique Ceased WO2018068771A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610891388 2016-10-12
CN201610891388.8 2016-10-12
CN201611001901.8 2016-11-14
CN201611001901.8A CN106682572B (zh) 2016-10-12 2016-11-14 目标跟踪方法、系统和第一电子设备

Publications (1)

Publication Number Publication Date
WO2018068771A1 true WO2018068771A1 (fr) 2018-04-19

Family

ID=58840208

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/110745 Ceased WO2018068771A1 (fr) 2016-10-12 2017-11-13 Systeme et procédé de suivi de cible, dispositif électronique et support de memoire informatique

Country Status (2)

Country Link
CN (1) CN106682572B (fr)
WO (1) WO2018068771A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751022A (zh) * 2019-09-03 2020-02-04 平安科技(深圳)有限公司 基于图像识别的城市宠物活动轨迹监测方法及相关设备
CN111126807A (zh) * 2019-12-12 2020-05-08 浙江大华技术股份有限公司 行程切分方法和装置、存储介质及电子装置
CN111753599A (zh) * 2019-03-29 2020-10-09 杭州海康威视数字技术股份有限公司 人员操作流程检测方法、装置、电子设备及存储介质
CN112651414A (zh) * 2019-10-10 2021-04-13 马上消费金融股份有限公司 运动数据处理和模型训练方法、装置、设备及存储介质
CN113701746A (zh) * 2020-05-21 2021-11-26 华为技术有限公司 目标朝向确定方法和装置
CN114972415A (zh) * 2021-12-28 2022-08-30 广东东软学院 机器人视觉跟踪方法、系统、电子设备及介质
CN117423051A (zh) * 2023-10-18 2024-01-19 广州元沣智能科技有限公司 一种基于场所活动对象的信息监测分析方法
CN120405758A (zh) * 2025-04-07 2025-08-01 河北省地震局 基于视频的地震烈度计算方法、装置和设备

Families Citing this family (13)

Publication number Priority date Publication date Assignee Title
CN106682572B (zh) * 2016-10-12 2020-09-08 纳恩博(北京)科技有限公司 目标跟踪方法、系统和第一电子设备
JP6962878B2 (ja) * 2018-07-24 2021-11-05 本田技研工業株式会社 操作補助システムおよび操作補助方法
CN110657794A (zh) * 2019-08-21 2020-01-07 努比亚技术有限公司 可穿戴设备的指南针校准方法、可穿戴设备及存储介质
CN112750301A (zh) * 2019-10-30 2021-05-04 杭州海康威视系统技术有限公司 目标对象追踪方法、装置、设备及计算机可读存储介质
CN113029190B (zh) * 2019-12-09 2024-12-24 未来市股份有限公司 运动跟踪系统和方法
CN111487993A (zh) * 2020-04-26 2020-08-04 重庆市亿飞智联科技有限公司 信息获取方法、装置、存储介质、自动驾驶仪及无人机
CN111759239A (zh) * 2020-06-08 2020-10-13 江苏美的清洁电器股份有限公司 一种区域确定方法、装置和计算机存储介质
CN111652911B (zh) * 2020-06-10 2023-07-28 创新奇智(南京)科技有限公司 目标监控方法、装置和设备
CN113870977A (zh) * 2020-06-30 2021-12-31 华为技术有限公司 运动数据的处理方法和装置
CN111986224B (zh) * 2020-08-05 2024-01-05 七海行(深圳)科技有限公司 一种目标行为预测追踪方法及装置
CN111932579A (zh) * 2020-08-12 2020-11-13 广东技术师范大学 基于被跟踪目标运动轨迹对设备角度的调整方法及装置
CN114001738B (zh) * 2021-09-28 2024-08-30 浙江大华技术股份有限公司 视觉巡线定位方法、系统以及计算机可读存储介质
CN116263958A (zh) * 2021-12-10 2023-06-16 浙江宇视科技有限公司 一种跟踪自调节方法、装置以及跟踪系统

Citations (5)

Publication number Priority date Publication date Assignee Title
CN101859439A (zh) * 2010-05-12 2010-10-13 合肥寰景信息技术有限公司 一种用于人机交互的运动追踪装置及其追踪方法
CN103135549A (zh) * 2012-12-21 2013-06-05 北京邮电大学 一种具有视觉反馈的球形机器人运动控制系统及运动控制方法
CN103149939A (zh) * 2013-02-26 2013-06-12 北京航空航天大学 一种基于视觉的无人机动态目标跟踪与定位方法
CN103411621A (zh) * 2013-08-09 2013-11-27 东南大学 一种面向室内移动机器人的光流场视觉/ins组合导航方法
CN106682572A (zh) * 2016-10-12 2017-05-17 纳恩博(北京)科技有限公司 目标跟踪方法、系统和第一电子设备

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9495759B2 (en) * 2014-02-26 2016-11-15 Apeiros, Llc Mobile, wearable, automated target tracking system


Cited By (13)

Publication number Priority date Publication date Assignee Title
CN111753599B (zh) * 2019-03-29 2023-08-08 杭州海康威视数字技术股份有限公司 人员操作流程检测方法、装置、电子设备及存储介质
CN111753599A (zh) * 2019-03-29 2020-10-09 杭州海康威视数字技术股份有限公司 人员操作流程检测方法、装置、电子设备及存储介质
CN110751022A (zh) * 2019-09-03 2020-02-04 平安科技(深圳)有限公司 基于图像识别的城市宠物活动轨迹监测方法及相关设备
CN110751022B (zh) * 2019-09-03 2023-08-22 平安科技(深圳)有限公司 基于图像识别的城市宠物活动轨迹监测方法及相关设备
CN112651414A (zh) * 2019-10-10 2021-04-13 马上消费金融股份有限公司 运动数据处理和模型训练方法、装置、设备及存储介质
CN111126807B (zh) * 2019-12-12 2023-10-10 浙江大华技术股份有限公司 行程切分方法和装置、存储介质及电子装置
CN111126807A (zh) * 2019-12-12 2020-05-08 浙江大华技术股份有限公司 行程切分方法和装置、存储介质及电子装置
CN113701746A (zh) * 2020-05-21 2021-11-26 华为技术有限公司 目标朝向确定方法和装置
CN114972415B (zh) * 2021-12-28 2023-03-28 广东东软学院 机器人视觉跟踪方法、系统、电子设备及介质
CN114972415A (zh) * 2021-12-28 2022-08-30 广东东软学院 机器人视觉跟踪方法、系统、电子设备及介质
CN117423051A (zh) * 2023-10-18 2024-01-19 广州元沣智能科技有限公司 一种基于场所活动对象的信息监测分析方法
CN117423051B (zh) * 2023-10-18 2024-03-26 广州元沣智能科技有限公司 一种基于场所活动对象的信息监测分析方法
CN120405758A (zh) * 2025-04-07 2025-08-01 河北省地震局 基于视频的地震烈度计算方法、装置和设备

Also Published As

Publication number Publication date
CN106682572B (zh) 2020-09-08
CN106682572A (zh) 2017-05-17

Similar Documents

Publication Publication Date Title
WO2018068771A1 (fr) Systeme et procédé de suivi de cible, dispositif électronique et support de memoire informatique
KR102226846B1 (ko) Imu 센서와 카메라를 이용한 하이브리드 실내 측위 시스템
EP3014476B1 (fr) Utilisation de motifs de mouvement pour anticiper les attentes d'un utilisateur
US10748061B2 (en) Simultaneous localization and mapping with reinforcement learning
US9377310B2 (en) Mapping and positioning system
EP2769574B1 (fr) Suivi d'activité, de vitesse et de cap à l'aide de capteurs dans des dispositifs mobiles ou autres systèmes
US9442564B1 (en) Motion sensor-based head location estimation and updating
Rambach et al. Learning to fuse: A deep learning approach to visual-inertial camera pose estimation
Wu et al. Indoor positioning based on walking-surveyed Wi-Fi fingerprint and corner reference trajectory-geomagnetic database
JP2020505614A (ja) 1つ以上の慣性センサからの向き情報を補正する装置および方法
US20140348380A1 (en) Method and appratus for tracking objects
Wang et al. A comprehensive review on sensor fusion techniques for localization of a dynamic target in GPS-denied environments
CN111595344B (zh) 一种基于地图信息辅助的多姿态下行人行位推算方法
WO2019119289A1 (fr) Procédé et dispositif de positionnement, appareil électronique et produit-programme d'ordinateur
US10242281B2 (en) Hybrid orientation system
CN110942474B (zh) 机器人目标跟踪方法、设备及存储介质
CN109769206B (zh) 一种室内定位融合方法、装置、存储介质及终端设备
CN112087728A (zh) 获取Wi-Fi指纹空间分布的方法、装置和电子设备
KR20190081334A (ko) 복합 측위 기반의 이동 궤적 추적 방법 및 그 장치
CN109945864A (zh) 室内行车定位融合方法、装置、存储介质及终端设备
Hamadi et al. An accurate smartphone-based indoor pedestrian localization system using ORB-SLAM camera and PDR inertial sensors fusion approach
US10871547B1 (en) Radiofrequency based virtual motion model for localization using particle filter
Silva et al. Towards a grid based sensor fusion for visually impaired navigation using sonar and vision measurements
CN112291701B (zh) 定位验证方法、装置、机器人、外部设备和存储介质
Wang et al. AsynEIO: Asynchronous monocular event-inertial odometry using Gaussian process regression

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17860727

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17860727

Country of ref document: EP

Kind code of ref document: A1