
WO2014198227A1 - Line laser ranging method for a self-moving robot - Google Patents

Line laser ranging method for a self-moving robot

Info

Publication number
WO2014198227A1
WO2014198227A1 · PCT/CN2014/079742 · CN2014079742W
Authority
WO
WIPO (PCT)
Prior art keywords
line
image
line laser
self
energy center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2014/079742
Other languages
English (en)
Chinese (zh)
Inventor
汤进举
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Publication of WO2014198227A1 publication Critical patent/WO2014198227A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • The invention relates to a line laser ranging method, in particular to a line laser ranging method for a self-moving robot, belonging to the technical field of small household appliance manufacturing. Background art
  • the core technology of the mobile robot is to perceive the surrounding environment, then plan the walking route, effectively traverse the various areas, and complete the operations in each area.
  • the most basic and critical part of this technology is the "sensing" module, the laser ranging sensor.
  • One type of existing sweeping robot uses no distance measuring sensor: obstacles are detected by a collision plate, and the floor is then cleaned in a random pattern. This results in many invalid cleaning passes, such as corners that are never cleaned while other areas are cleaned repeatedly; the robot works blindly, with a low level of intelligence, wasting time and effort.
  • Some similar applications now use laser ranging sensors as their intelligent front end. In these applications, the principle is to use a single-point laser as the light source together with a receiving device, forming a transmitting-receiving system that calculates the distance of a space object from time, spatial geometry, and similar quantities.
  • By rotating such a system, a two-dimensional cross-section distance measuring device is constructed. The drawbacks of this type of distance measuring device and method are that the rotating mechanism introduces noise, the system is not stable enough, the service life is short, and the cost is high.
  • Another type of existing sweeping robot uses a distance detector to detect the distance between the robot and the obstacle.
  • the sweeping robot avoids the ground obstacle in real time according to the distance detected by the detector to avoid collision.
  • the sweeping robot can also position itself according to the distance information, and can also use the distance information to establish an indoor map, which facilitates the planning and setting of the subsequent walking path.
  • The laser detector on the sweeping robot is also provided with a rotatable base, so that the laser detector can continuously scan the obstacles in the room through 360 degrees.
  • However, the rotating mechanism of such a rotating laser ranging sensor introduces noise; the system is not stable enough, the service life is short, and the cost is high.
  • Chinese patent CN101210800 also discloses a line laser ranging sensor comprising a line laser source and a camera. That line laser ranging sensor calculates the distance of the obstacle according to the pixel position of the laser stripe in the image; its ranging method binarizes the whole image to obtain the laser stripe data and then obtains the obstacle distance by subsequent triangulation processing of that data.
  • Because the whole image is processed, the amount of information is large and the processing speed is slow. Summary of the invention
  • In view of the deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a line laser ranging method for a self-moving robot that processes only a local area of the captured image of the line laser stripe projected on the target object.
  • Processing only this area greatly reduces the data that needs to be processed and improves system efficiency without affecting accuracy, so that the entire system runs at double speed, saving working time while remaining convenient and reliable.
  • A line laser ranging method for a self-moving robot comprises the following steps:
  • Step 100: Using a line laser disposed on the self-moving robot as a light source, the emitted line laser is projected on the target object, and the image sensor captures an image containing the line laser stripe projected on the target object;
  • Step 200: Determine an image processing local area on the image of the line laser stripe;
  • Step 300: Perform distortion correction on the image processing local area to determine the true position of the line laser stripe on the image;
  • Step 400: Find an energy center line in the line laser stripe after distortion correction, and determine the position of the energy center line on the image;
  • Step 500: According to the pixel coordinates of the energy center line on the image, obtain the actual distance from the self-moving robot to the target object.
  • The step 200 specifically includes:
  • Step 201: Traverse each line in the captured image containing the line laser stripe to find the point at which the laser gray value is the largest in that line;
  • Step 202: N1 pixels are taken on each side of the maximum gray value point found in each line, forming the image processing local area, where 10 ≤ N1 ≤ 30.
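Steps 201 and 202 can be sketched as follows (a minimal Python illustration, not the patent's implementation; the row-major image layout and the default N1 value are assumptions):

```python
def local_region(image, n1=15):
    """For each row, find the column with the maximum gray value and
    keep n1 pixels on each side, forming the image-processing local area.

    image: list of rows, each row a list of gray values (0-255).
    Returns a list of (row_index, start_col, end_col) windows,
    end_col exclusive.
    """
    regions = []
    for r, row in enumerate(image):
        peak = max(range(len(row)), key=lambda c: row[c])  # brightest column
        start = max(0, peak - n1)
        end = min(len(row), peak + n1 + 1)
        regions.append((r, start, end))
    return regions
```

Only the roughly 2·N1+1 pixels per row inside these windows are passed on to distortion correction, instead of every pixel of the full row.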
  • the step 400 uses a centroid algorithm to find an energy center line, which specifically includes:
  • Step 401 In the image of the line laser stripe after the distortion correction, traverse each line to find the point at which the laser gradation value is the largest in the line;
  • Step 402 N2 pixels are respectively taken on both sides of the point where the gray value of the laser light having the largest value is found, wherein 5 ⁇ N2 ⁇ 20;
  • Step 403 Performing a centroid method on the pixels to obtain an energy center line of the image of the line laser stripe
  • The centroid algorithm formula is:

    ȳ_x = Σ_i y_i · f(x, y_i) / Σ_i f(x, y_i)

  • where ȳ_x is the pixel column coordinate obtained by the centroid algorithm for line x; (x, y_i) are the original pixel coordinates (the coordinates obtained after distortion correction); and f(x, y_i) is the gray value at position (x, y_i).
  • The ȳ_x obtained from the above formula is the position of the energy center point of the line laser stripe image in each line; the energy center points of all the lines constitute the energy center line.
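The centroid computation for one line can be sketched as follows (a hedged illustration; the function and variable names are mine, not the patent's):

```python
def centroid_column(cols, grays):
    """Gray-value-weighted centroid of one row of the laser stripe.

    cols:  column coordinates y_i of the pixels in the local window
    grays: gray values f(x, y_i) at those pixels
    Returns the sub-pixel energy-center column for this row.
    """
    total = sum(grays)
    if total == 0:
        raise ValueError("no laser energy in this row")
    return sum(y * g for y, g in zip(cols, grays)) / total
```

Applying this to every row yields the energy center points whose sequence forms the energy center line.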
  • the curve fitting method can also be used in the step 400 to find the energy center line of the line laser stripe.
  • The step 500 specifically includes: establishing a triangular relationship formula according to the principle of triangulation, and obtaining the actual distance q from the self-moving robot to the target object.
  • the step 500 specifically includes:
  • Step 501: Measure the energy center column coordinates ȳ of the image of the line laser stripe corresponding to n actual distances q, and establish a comparison table between the actual distance q and the energy center column coordinates;
  • Step 502: According to the energy center column coordinate ȳ of the image of the actual line laser stripe, the actual distance q is obtained by an interpolation algorithm.
  • the n in the step 501 is greater than or equal to 20.
  • the present invention only performs local area processing on the captured image of the line laser stripe projected on the target object, so that the data to be processed is greatly reduced, and the system efficiency is improved without affecting the accuracy. It doubles the operating speed of the entire system and saves working time.
  • Figure 1 is a schematic diagram of a similar triangular principle of the present invention. detailed description
  • Figure 1 is a schematic diagram of a similar triangular principle of the present invention.
  • the light emitted from the light source A is irradiated onto the target object C, and an image of the line-containing laser stripe projected on the target object C is captured by the image sensor B.
  • the light source A is a line laser, and therefore, the spot shape projected on the target object C is stripe-like.
  • A triangle ΔABC is formed by the light source A, the image sensor B, and the target object C.
  • The light emitted by the light source A is reflected back from the target object C; its falling point on the sensor film is b, and another triangle ΔabB is formed between the stripe image and the image sensor.
  • ΔABC and ΔabB are similar triangles with a fixed proportional relationship between each other. Since the light source A and the image sensor B are both disposed on the body of the self-moving robot, the perpendicular distance from the target object C to the line connecting the two can be regarded as the actual distance q between the target object C and the self-moving robot. In ΔABC, the value of the distance s between the light source A and the image sensor B is known.
  • In ΔabB, the value of the focal length f is known, and the length x of the line stripe image can be measured. Using the fact that ΔABC and ΔabB are similar triangles with a fixed proportional relationship, the actual distance q between the target object C and the self-moving robot can be obtained.
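Under the similar-triangle relation just described, the distance takes the standard laser-triangulation form q = s·f / x. Reading x as the stripe's offset on the sensor is my interpretation of the figure, not an exact quote of the patent:

```python
def triangulate(s, f, x):
    """Distance from the robot to the target by similar triangles.

    s: baseline between light source A and image sensor B (mm)
    f: focal length of the image sensor (mm)
    x: offset of the stripe image on the sensor (mm)
    Returns the actual distance q (mm).
    """
    if x <= 0:
        raise ValueError("stripe offset must be positive")
    return s * f / x
```

Note how a small image offset maps to a large distance, which is why the method suits the short-range 200 mm to 5000 mm regime mentioned later.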
  • The image sensor B can be disposed directly in front of the self-moving robot and connected to the robot's motherboard, providing the motherboard with current location information to process and make decisions on.
  • the light source A is disposed at the top of the front end of the mobile robot, substantially in the same vertical plane as the image sensor B.
  • The present invention provides a line laser ranging method for a self-moving robot, the method comprising the following steps:
  • Step 100: Using a line laser disposed on the self-moving robot as a light source, the emitted line laser is projected on the target object, and the image sensor captures an image containing the line laser stripe projected on the target object;
  • Step 200: Determine an image processing local area on the image of the line laser stripe;
  • Step 300: Perform distortion correction on the image processing local area to determine the true position of the line laser stripe on the image;
  • Step 400: Find an energy center line in the line laser stripe after distortion correction, and determine the position of the energy center line on the image;
  • Step 500: According to the pixel coordinates of the energy center line on the image, obtain the actual distance from the self-moving robot to the target object.
  • The step 200 specifically includes:
  • Step 201: Traverse each line in the image of the captured line laser stripe to find the point at which the laser gray value is the largest in that line;
  • Step 202: N1 pixels are taken on each side of the maximum gray value point found in each line, forming the image processing local area, where 10 ≤ N1 ≤ 30.
  • the step 400 uses a centroid algorithm to find an energy center line, which specifically includes:
  • Step 401 traverse each line in the image of the line laser stripe after the distortion correction, and find a point at which the laser gradation value is the largest in the line;
  • Step 402 N2 pixels are respectively taken on both sides of the point where the gray value of the laser is found to be the largest, wherein 5 ⁇ N2 ⁇ 20;
  • Step 403 Performing a centroid method on the pixels to obtain an energy center line of the image of the line laser stripe
  • The centroid algorithm formula is:

    ȳ_x = Σ_i y_i · f(x, y_i) / Σ_i f(x, y_i)

  • where ȳ_x is the pixel column coordinate obtained by the centroid algorithm for line x; (x, y_i) are the original pixel coordinates (the coordinates obtained after distortion correction); and f(x, y_i) is the gray value at position (x, y_i).
  • The ȳ_x obtained from the above formula is the position of the energy center point of the line laser stripe image in each line; the energy center points of all the lines constitute the energy center line.
  • The step 500 specifically includes: establishing a triangular relationship formula according to the principle of triangulation, and obtaining the actual distance q from the self-moving robot to the target object.
  • the step 500 specifically includes:
  • Step 501: Measure the energy center column coordinates ȳ of the image of the line laser stripe corresponding to n actual distances q, and establish a comparison table between the actual distance q and the energy center column coordinates;
  • Step 502: According to the energy center column coordinate ȳ of the image of the actual line laser stripe, the actual distance q is obtained by an interpolation algorithm.
  • the n in the step 501 is greater than or equal to 20.
  • The light source A in the present invention uses a line laser, and the emitted line laser is projected on the target object C (FIG. 1 shows a single-point laser; the line laser is an extension of this laser in the direction perpendicular to the paper).
  • A 640 × 480 CMOS image sensor is used to capture images containing the line laser stripe. Each line of data containing the line laser stripe is scanned by line scan, and the unwanted background image is discarded, thereby isolating the line laser stripe region (for example, the maximum gray value point and the 30 points around it) in the column direction of the image sensor.
  • the line laser stripe processing is then performed to obtain the laser fringe energy centerline in the column direction.
  • The values of N1, N2, and n can be appropriately selected or adjusted according to the required processing accuracy and speed.
  • step 400 in addition to the centroid method for finding the energy center line of the line laser stripe, other methods such as a curve fitting algorithm may be used.
  • the curve fitting method utilizes the principle that the cross-sectional intensity of the laser stripe approximates the Gaussian distribution.
  • a sub-pixel-level optical strip center coordinate is obtained by curve fitting the pixel point coordinates and the gray value in the vicinity of the peak point of the light intensity distribution in the cross section.
  • The specific steps are as follows: (1) Search each line of the processed image sequentially and find the intensity peak point, i.e. the maximum gray value point in that line; denote it M0(x0, y0), with gray value g0. (2) Select the left neighbors M−2(x0, y−2), M−1(x0, y−1) and the right neighbors M1(x0, y1), M2(x0, y2) of M0, with gray values g−2, g−1, g1, g2, and perform quadratic curve fitting according to the least squares principle. (3) Obtain the coordinate of the maximum point in the column direction from the fitted curve; the resulting (x0, ŷ) is the sought center point of the light stripe.
  • The fitted curve equation can be a parabola or a Gaussian curve. This method is suitable for straight light stripes whose normal direction does not change much in the image.
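The three-point parabolic version of this sub-pixel fit can be sketched as below. It uses the closed-form vertex of a parabola through the peak and its two immediate neighbors; the patent's five-point least-squares variant would refine this, so treat the sketch as a generic illustration rather than the exact method:

```python
def subpixel_peak(col0, g_left, g_center, g_right):
    """Sub-pixel stripe center from a parabola fitted through the
    gray-value peak g_center at column col0 and its two neighbors.

    Returns the column coordinate of the parabola's vertex.
    """
    denom = g_left - 2 * g_center + g_right
    if denom == 0:  # flat top: keep the integer peak position
        return float(col0)
    return col0 + 0.5 * (g_left - g_right) / denom
```

A symmetric profile returns the integer peak; an asymmetric one shifts the center toward the brighter side, giving the sub-pixel accuracy described above.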
  • The deformed image must be corrected by the process of distortion correction. Since the line laser spans a certain direction of the image, deformation correction is indispensable for recovering a real image and obtaining more accurate ranging. Through the distortion correction process, the deformed laser stripes in the image, such as the curved shapes on both sides, can be corrected to their true straight shape. However, if the entire image captured by the image sensor is subjected to distortion correction processing as in the prior art, every pixel of the entire image must be traversed, which requires a large amount of data and information processing.
  • step 200 the image processing local area is first determined on the image containing the line laser stripe, and then, in step 300, the image processing local area is corrected for distortion to determine the true position of the line laser stripe on the image.
  • the distortion correction algorithm is used only for the obtained partial image, which can greatly speed up the calculation and speed up the image processing speed.
  • Local image distortion correction includes the following:
  • A laser stripe grayscale image is captured, and each line of the image is traversed to find the point where the laser gray value is the largest;
  • 30 pixels are taken on each side of that maximum gray value point in each line, and local image distortion correction processing is performed on this region.
  • The number of selected pixels is not limited to 30; it can be adjusted according to the required processing precision or speed;
  • The distortion correction algorithm maps each point as follows: (x, y) is the original position of the distorted point captured on the image sensor, and (x′, y′) is the corrected new position.
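The patent's exact correction formula is not reproduced in this text. A commonly used model for such camera distortion correction is the radial (Brown) model, sketched below purely as an assumption; the distortion center and the coefficients k1, k2 would come from camera calibration:

```python
def undistort_point(x, y, cx, cy, k1, k2=0.0):
    """Map a distorted image point (x, y) to a corrected position
    (x', y') using a simple radial distortion model.

    (cx, cy): distortion center (principal point)
    k1, k2:   radial distortion coefficients from calibration
    NOTE: this is an assumed standard model, not the formula
    from the patent itself.
    """
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy            # squared radius from the center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale
```

Whatever the exact formula, applying it only to the roughly 2·N1+1 pixels per row of the local area, rather than to all 640 × 480 pixels, is what produces the speedup described above.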
  • The centroid algorithm is then used to find the energy center line of the laser stripe. Specifically, the previous step only yields the real laser stripe image, in which the stripe data of each line still occupies several pixel locations, so no accurate calculation is possible yet; each line must be condensed to a single center point.
  • the actual distance q between the self-mobile robot and the target object C can be calculated.
  • the calculation can be performed in the following two ways.
  • The first method: after the stripe energy center coordinates are obtained, the triangulation principle is used to obtain the spatial distance corresponding to each center point of the laser stripe in the column direction, thereby obtaining the section distance corresponding to the laser line.
  • This method is also based on the triangulation principle of the figure above: a triangular relation formula is established in which the variable is ȳ, and the distance is then calculated from the other known parameters.
  • The other method calculates the distance by table look-up; that is, each stripe center column coordinate corresponds to a spatial distance. On this basis, a fixed table of distance q versus stripe center column coordinate ȳ is established: n distances q and the corresponding column coordinates ȳ are measured in advance, giving the relationship between ȳ and q, from which a reference table is built. In actual use, n is determined by the measurement
  • distance range. For example, a measurement range within 5 meters needs at least 20 sets of correspondence data, that is, a value of n greater than or equal to 20, to be reliable. After this reference table is obtained by pre-testing, only the actual stripe center line coordinates need to be obtained during measurement; an interpolation algorithm then maps them to a spatial distance.
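The look-up-table method of steps 501 and 502 can be sketched as linear interpolation over the pre-measured (ȳ, q) pairs. This is a minimal illustration: a real table would hold at least the 20 calibrated pairs the text requires, and another interpolation scheme could be substituted:

```python
def distance_from_table(y_center, table):
    """Interpolate the distance q for a measured stripe-center column.

    table: list of (y, q) pairs measured in advance, sorted by y.
    Linear interpolation between neighboring entries; values outside
    the table are clamped to its end entries.
    """
    if y_center <= table[0][0]:
        return table[0][1]
    if y_center >= table[-1][0]:
        return table[-1][1]
    for (y0, q0), (y1, q1) in zip(table, table[1:]):
        if y0 <= y_center <= y1:
            t = (y_center - y0) / (y1 - y0)
            return q0 + t * (q1 - q0)
```

During operation only the stripe center ȳ is extracted per row, and this function replaces any per-row triangulation arithmetic with a table walk.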
  • the mobile robot After obtaining all the spatial distances, the mobile robot can control the path planning and walking mode according to the relevant information.
  • The line laser ranging method for the self-moving robot solves, through the line laser ranging sensor, the lack of intelligence of existing self-moving robots, and overcomes the instability, high cost, and short service life brought about by the single-point laser and rotating device of existing ranging sensors.
  • the distance measuring sensor of the present invention has high efficiency and cost performance in a short-range ranging application of 200 mm to 5000 mm indoors.
  • the laser ranging sensor of the present invention uses line structured light as a light source, which is easier to calibrate and adjust than a spot laser.
  • the line structure light cooperates with the area array CMOS sensor to extract the depth distance information of the measurement target by the two-dimensional gray scale information in the CMOS image.
  • the camera needs to be calibrated to obtain the internal and external parameters of the camera. These parameters are used to correct the deformation of the image to obtain a true laser fringe image. Since the laser line segment of the line structure light modulated by the target object is a laser stripe having a certain width, the laser stripe obeys a Gaussian distribution in the cross section, so when calculating the spatial distance, the energy center of the line laser stripe needs to be extracted in the image.
  • The line structured light sensor serves as the "eye" of the robot, obtaining two-dimensional distance information within the sensor's set range and providing the robot host with the indoor scene seen by the "eye". The host uses this information to locate itself, plan routes, and build an indoor map. The robot then follows its own route according to the map to clean the floor, which makes the cleaning robot's work truly intelligent and greatly improves its efficiency.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a line laser ranging method for a self-moving robot, comprising the following steps: step 100: using a line laser (A) disposed on a self-moving robot as the emitting light source, the emitted line laser being projected onto a target object (C), and an image sensor (B) capturing an image containing the laser stripe projected on the target object; step 200: determining an image processing local area on the image containing the line laser stripe; step 300: performing distortion correction on the image processing local area and determining the true position of the line laser stripe; step 400: finding an energy center line in the line laser stripe after distortion correction, and determining the position of the energy center line on the image; and step 500: obtaining, from the pixel coordinates of the energy center line on the image, the actual distance (q) from the self-moving robot to the target object (C). The method processes only a local region of the image, which reduces the amount of data requiring processing, thereby improving processing efficiency and saving working time without affecting the accuracy rate.
PCT/CN2014/079742 2013-06-14 2014-06-12 Line laser ranging method for a self-moving robot Ceased WO2014198227A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310234951.0 2013-06-14
CN201310234951.0A CN104236521A (zh) 2013-06-14 2013-06-14 用于自移动机器人的线激光测距方法

Publications (1)

Publication Number Publication Date
WO2014198227A1 true WO2014198227A1 (fr) 2014-12-18

Family

ID=52021667

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/079742 Ceased WO2014198227A1 (fr) 2013-06-14 2014-06-12 Line laser ranging method for a self-moving robot

Country Status (2)

Country Link
CN (1) CN104236521A (fr)
WO (1) WO2014198227A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109774197A (zh) * 2018-07-13 2019-05-21 中国航空工业集团公司济南特种结构研究所 一种复合材料曲面铺层激光投影仪位置的确定方法

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657981B (zh) * 2015-01-07 2017-05-24 大连理工大学 一种移动机器人运动中三维激光测距数据动态补偿方法
CN105890047A (zh) * 2015-01-21 2016-08-24 深圳市沃森空调技术有限公司 空调机器人
CN104859562A (zh) * 2015-05-27 2015-08-26 北京信息科技大学 用于检测车辆后方障碍物的方法及装置
CN106371101B (zh) * 2015-07-20 2019-08-16 北醒(北京)光子科技有限公司 一种智能测距及避障的装置
CN105759280A (zh) * 2016-05-17 2016-07-13 上海酷哇机器人有限公司 对人眼安全的激光三角测量系统
CN107436439A (zh) * 2016-05-27 2017-12-05 科沃斯机器人股份有限公司 激光测距装置及其感光芯片的安装方法
CN106125738A (zh) * 2016-08-26 2016-11-16 北京航空航天大学 一种基于agv的货架识别装置及方法
CN106226755B (zh) * 2016-08-30 2019-05-21 北京小米移动软件有限公司 机器人
CN106092146A (zh) * 2016-08-30 2016-11-09 宁波菜鸟智能科技有限公司 激光测距校正方法及系统
WO2018127209A1 (fr) * 2017-01-09 2018-07-12 苏州宝时得电动工具有限公司 Dispositif mobile autonome et système de positionnement, procédé de positionnement et son procédé de commande
CN108398694B (zh) * 2017-02-06 2024-03-15 苏州宝时得电动工具有限公司 激光测距仪及激光测距方法
CN109143167B (zh) * 2017-06-28 2021-07-23 杭州海康机器人技术有限公司 一种障碍信息获取装置及方法
CN107607960B (zh) * 2017-10-19 2024-12-20 深圳市欢创科技股份有限公司 一种光学测距的方法及装置
CN107861113B (zh) * 2017-11-06 2020-01-14 深圳市杉川机器人有限公司 标定方法及装置
CN108345002B (zh) * 2018-02-27 2024-08-23 上海图漾信息科技有限公司 结构光测距装置及方法
CN109615586B (zh) * 2018-05-07 2022-03-11 杭州新瀚光电科技有限公司 红外图像畸变矫正算法
CN108594205A (zh) * 2018-06-21 2018-09-28 深圳市镭神智能系统有限公司 一种基于线激光的激光雷达
EP3660451B1 (fr) * 2018-11-28 2022-04-27 Hexagon Technology Center GmbH Module de stationnement intelligent
CN110781779A (zh) * 2019-10-11 2020-02-11 北京地平线机器人技术研发有限公司 物体位置检测方法、装置、可读存储介质及电子设备
CN110960138A (zh) * 2019-12-30 2020-04-07 科沃斯机器人股份有限公司 结构光模组及自主移动设备
CN112116619B (zh) * 2020-09-16 2022-06-10 昆明理工大学 一种基于结构约束的多线结构光系统条纹中心线提取方法
CN114510015B (zh) * 2020-10-29 2024-10-15 深圳市普森斯科技有限公司 扫地机器人移动方法、电子装置及存储介质
CN112595385B (zh) * 2020-11-25 2025-01-21 创新奇智(南京)科技有限公司 一种目标物高度获取方法以及装置
CN113050113B (zh) * 2021-03-10 2023-08-01 广州南方卫星导航仪器有限公司 一种激光点定位方法和装置
CN114608520B (zh) * 2021-04-29 2023-06-02 北京石头创新科技有限公司 一种测距方法、装置、机器人和存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1354073A (zh) * 2000-11-17 2002-06-19 三星光州电子株式会社 移动式机器人和它的路径调节方法
WO2005098476A1 (fr) * 2004-03-29 2005-10-20 Evolution Robotics, Inc. Procede et appareil d'estimation de position a l'aide de sources de lumiere reflechie
US20060237634A1 (en) * 2005-04-23 2006-10-26 Lg Electronics Inc. Position sensing device for mobile robots and robot cleaner equipped with the same
JP2009063379A (ja) * 2007-09-05 2009-03-26 Casio Comput Co Ltd 距離測定装置及びこの距離測定装置を備えたプロジェクタ
CN102980528A (zh) * 2012-11-21 2013-03-20 上海交通大学 无位姿约束线激光单目视觉三维测量传感器参数标定方法
CN103134469A (zh) * 2011-12-01 2013-06-05 财团法人工业技术研究院 距离感测装置及距离感测方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8996172B2 (en) * 2006-09-01 2015-03-31 Neato Robotics, Inc. Distance sensor system and method
TWI289196B (en) * 2006-10-02 2007-11-01 Ming-Chih Lu Distance measurement system and method
CN101587591B (zh) * 2009-05-27 2010-12-08 北京航空航天大学 基于双参数阈值分割的视觉精确跟踪方法
KR20120007735A (ko) * 2010-07-15 2012-01-25 삼성전기주식회사 거리 측정 모듈 및 이를 포함하는 전자 장치
CN102889864A (zh) * 2011-07-19 2013-01-23 中铝上海铜业有限公司 带卷边部物体塔形检测系统及其检测方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1354073A (zh) * 2000-11-17 2002-06-19 三星光州电子株式会社 移动式机器人和它的路径调节方法
WO2005098476A1 (fr) * 2004-03-29 2005-10-20 Evolution Robotics, Inc. Procede et appareil d'estimation de position a l'aide de sources de lumiere reflechie
US20060237634A1 (en) * 2005-04-23 2006-10-26 Lg Electronics Inc. Position sensing device for mobile robots and robot cleaner equipped with the same
JP2009063379A (ja) * 2007-09-05 2009-03-26 Casio Comput Co Ltd 距離測定装置及びこの距離測定装置を備えたプロジェクタ
CN103134469A (zh) * 2011-12-01 2013-06-05 财团法人工业技术研究院 距离感测装置及距离感测方法
CN102980528A (zh) * 2012-11-21 2013-03-20 上海交通大学 无位姿约束线激光单目视觉三维测量传感器参数标定方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109774197A (zh) * 2018-07-13 2019-05-21 中国航空工业集团公司济南特种结构研究所 一种复合材料曲面铺层激光投影仪位置的确定方法
CN109774197B (zh) * 2018-07-13 2021-05-07 中国航空工业集团公司济南特种结构研究所 一种复合材料曲面铺层激光投影仪位置的确定方法

Also Published As

Publication number Publication date
CN104236521A (zh) 2014-12-24

Similar Documents

Publication Publication Date Title
WO2014198227A1 (fr) Line laser ranging method for a self-moving robot
CN112797915B (zh) 一种线结构光测量系统的标定方法、标定装置、以及系统
US10052766B2 (en) Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
CN103411553B (zh) 多线结构光视觉传感器的快速标定方法
CN101698303B (zh) 一种三维激光和单目视觉间的自动标定方法
CN104331896B (zh) 一种基于深度信息的系统标定方法
CN108020825B (zh) 激光雷达、激光摄像头、视频摄像头的融合标定系统及方法
CN116009559B (zh) 一种输水管道内壁巡检机器人及检测方法
CN113096183B (zh) 一种基于激光雷达与单目相机的障碍物检测与测量方法
CN103837085B (zh) 基于激光跟踪仪逐点标定的目标位移矢量测量装置及方法
CN108986070B (zh) 一种基于高速视频测量的岩石裂缝扩展实验监测方法
TWI517101B (zh) 三維掃描器校正系統及其校正方法
KR102424135B1 (ko) 2개의 카메라로부터의 곡선의 세트의 구조형 광 매칭
CN102927908A (zh) 机器人手眼系统结构光平面参数标定装置及方法
CN116342718B (zh) 一种线激光3d相机的标定方法、装置、存储介质及设备
CN111238368A (zh) 三维扫描方法及装置
CN116579955B (zh) 一种新能源电芯焊缝反光点去噪和点云补全方法及系统
CN103822594A (zh) 一种基于激光传感器和机器人的工件扫描成像方法
CN105717511B (zh) 基于线束激光器和普通摄像头芯片的多点测距方法
CN110866954B (zh) 长度约束下的弹目标高精度姿态测量方法
CN109087360A (zh) 一种机器人相机外参的标定方法
CN112419427A (zh) 用于提高飞行时间相机精度的方法
JP2004309318A (ja) 位置検出方法、その装置及びそのプログラム、並びに、較正情報生成方法
Huang et al. Extrinsic calibration of a multi-beam LiDAR system with improved intrinsic laser parameters using v-shaped planes and infrared images
CN112504125B (zh) 一种基于tof的非接触式长方体体积测量方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14810187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14810187

Country of ref document: EP

Kind code of ref document: A1