
JPH06137828A - Detecting method for position of obstacle - Google Patents

Detecting method for position of obstacle

Info

Publication number
JPH06137828A
JPH06137828A (application number JP4291150A / JP29115092A)
Authority
JP
Japan
Prior art keywords
spot
camera
obstacle
measured
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP4291150A
Other languages
Japanese (ja)
Inventor
Tatsuro Sato
竜郎 佐藤
Hitoshi Nakajima
仁志 中嶋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kajima Corp
Original Assignee
Kajima Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kajima Corp filed Critical Kajima Corp
Priority to JP4291150A priority Critical patent/JPH06137828A/en
Publication of JPH06137828A publication Critical patent/JPH06137828A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

PURPOSE: To stably detect the position of an obstacle in three-dimensional space by photographing a light spot projected onto the obstacle with two cameras arranged at a fixed interval, image-processing the spot to determine its position, and performing this process continuously.

CONSTITUTION: First and second cameras A and B are arranged at a fixed interval, with their visual axes intersecting at a point a reference distance ahead; the width of the camera imaging range on the plane where the axes cross is measured in advance. A laser beam is projected onto the measured object 1 by a laser projector 2 and converged into a spot. The light emitted at the laser spot is captured by cameras A and B, input to an image processing device 3, and binarized; the visual axis of each camera is made to coincide with the central axis of its image plane. The spot position on each binary image is then measured, and the two-dimensional coordinates of the spot position on the measured object 1 are obtained from those positions and the previously set arrangement of cameras A and B. A personal computer 4 computes the three-dimensional coordinates from the two-dimensional coordinates.

Description

【発明の詳細な説明】Detailed Description of the Invention

【0001】[0001]

[Field of Industrial Application] The present invention relates to an obstacle position detection method that detects, without contact, the positions of obstacles around a robot that change over time, one of the autonomous functions essential for construction robots.

【0002】[0002]

[Prior Art] In conventional methods, ultrasonic waves, light waves, or the like are projected onto an obstacle, and the distance to the obstacle is measured from the time elapsed until the reflection is detected.

【0003】[0003]

[Problems to be Solved by the Invention] In the conventional methods, the measured values are unstable because they depend on the reflectance, surface shape, and other properties of the obstacle. Moreover, at a construction site the surrounding environment changes, making adjustment during measurement very difficult.

[0004] An object of the present invention is to provide an obstacle position detection method capable of stable obstacle position detection that does not depend on the reflectance or surface shape of the obstacle and is unaffected by changes in the surrounding environment.

【0005】[0005]

[Means for Solving the Problems] According to the present invention, a sufficiently converged light spot projected onto an obstacle is photographed by first and second cameras arranged at a predetermined separation distance; the photographed images are image-processed to obtain the position of one point on the obstacle, namely the position of the spot; and this processing is performed continuously to obtain the position of the entire obstacle in three-dimensional space.

[0006] Laser light or penlight light is preferably used as the sufficiently converged light.

[0007] Preferably, the processing binarizes the spot image to measure two-dimensional coordinates, and processes those two-dimensional coordinates continuously to calculate three-dimensional coordinates.

【0008】[0008]

[Operation] According to the method of the present invention, the projected, sufficiently converged light spot is photographed and image-processed to obtain the position of the measuring point on the obstacle, so stable position detection is possible.

【0009】[0009]

[Embodiments] Embodiments of the present invention will now be described with reference to the drawings.

[0010] FIG. 1 shows an apparatus for carrying out the present invention. The apparatus comprises a laser projector 2 that projects and scans a laser beam, i.e. light sufficiently converged on the obstacle (the measured object 1); first and second cameras A and B consisting of CCD cameras; an image processing device 3 connected to the first and second cameras A and B; and a personal computer 4 connected to the image processing device 3.

[0011] For the measurement, as shown in FIG. 2, the first and second cameras A and B are arranged in advance at a predetermined separation W so that the camera visual axes a1 and a2 intersect at the reference distance L ahead. The width of the camera imaging range H on the plane where the camera axes cross (hereinafter the reference plane S) is measured beforehand.

[0012] In FIG. 3, the laser projector 2 projects a laser beam onto the measured object 1 so that the beam converges into a spot on the object (step S1).

[0013] Cameras A and B then photograph in the direction of the measured object 1. Each camera is fitted with an optical filter matched to the wavelength of the laser light, and measurement is possible when the cameras capture the emission of the laser spot. The image processing device 3 inputs the images from cameras A and B (steps S2a, S2b) and binarizes them (steps S3a, S3b); the camera visual axes a1 and a2 are made to coincide with the center axes of the respective image planes. Next, the position of the laser spot in each binarized image is measured, and the two-dimensional coordinates of the laser-spot emission position on the measured object 1 are computed from the measured spot positions and the previously set arrangement of cameras A and B (steps S4a, S4b). From these measured two-dimensional coordinates, the personal computer 4 calculates the three-dimensional coordinates (step S5).

[0014] The processing is described in detail below.

[0015] In FIG. 4, the laser projector 2 projects the laser beam toward the measured object 1 only within the effective viewing angle ψ of the first and second cameras A and B.

[0016] Cameras A and B input images of the measured object 1 where the laser beam strikes it. To remove ambient light, each camera is fitted with a filter matched to the wavelength of the laser light.

[0017] The image processing device 3 converts the images from cameras A and B into binary images in which only the portion struck by the laser is white and everything else is black. The laser projection position is then obtained as the centroid of the white-pixel region (the laser projection surface) of each binary image, and is output as two-dimensional coordinates (X axis, Y axis) in units of pixels.
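The binarize-and-centroid step can be sketched as follows. This is a minimal illustration using NumPy on a synthetic image; the threshold value, array sizes, and function name are assumptions for the example, not from the patent:

```python
import numpy as np

def spot_centroid(image, threshold):
    """Binarize the image (laser spot -> white, rest -> black) and return
    the centroid (X, Y) of the white region in pixel units."""
    binary = image > threshold
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None  # this camera did not capture the spot
    return float(xs.mean()), float(ys.mean())

# Synthetic image: dark background, bright 3x3 spot centered at (X=12, Y=7).
img = np.zeros((20, 20), dtype=np.uint8)
img[6:9, 11:14] = 255
print(spot_centroid(img, 128))  # -> (12.0, 7.0)
```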

[0018] From the two-dimensional coordinates of the laser projection position in each of cameras A and B, and from the previously set camera geometry such as the camera separation W and the camera-axis crossing distance, the personal computer 4 calculates the position of the spot relative to the cameras as three-dimensional coordinates in real-world units.

[0019] The measured coordinates are computed as follows. First, the horizontal angles θ1 and θ2 in FIG. 5 are obtained.

[0020] For camera A (hereinafter also point A): on the reference plane S at the reference distance L ahead of point A, let the number of pixels from the screen center point C to the screen edge END be N dots, and let the measured distance from C to END be D. If the measuring point P lies on the line A·P, then the position of P obtained in the image is K1 dots from the center point C, regardless of the distance of P from point A.

[0021] Therefore, the angle θ1 between lines A·C and A·P, and likewise the angle θ2 between lines B·C and B·P (with K2 the corresponding pixel offset in camera B), are obtained as

θ1 = tan⁻¹(K1·D / (N·L)),  θ2 = tan⁻¹(K2·D / (N·L)).
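The pixel-offset-to-angle relation that follows from this geometry, θ = tan⁻¹(K·D / (N·L)), can be evaluated directly. A small sketch; all numeric values are assumptions for illustration, not from the patent:

```python
import math

def view_angle(k_dots, n_dots, d_width, l_ref):
    """Angle between a camera's visual axis and the ray through the spot.
    k_dots: spot offset from the screen center in pixels; n_dots: pixels
    from center to screen edge; d_width: measured distance D from center
    to edge on the reference plane; l_ref: reference distance L."""
    return math.atan(k_dots * d_width / (n_dots * l_ref))

# Assumed numbers: 256 px from center to edge spans 1.0 m on a plane 2.0 m
# ahead; a spot 128 px off-center therefore lies 0.5 m off-axis.
theta1 = view_angle(128, 256, 1.0, 2.0)
print(round(math.degrees(theta1), 2))  # -> 14.04
```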

[0022] Next, referring to FIG. 6, the position coordinates P(Xp, Yp) of the measuring point P are obtained. First, the unit vector n1 of the vector AB is found from

n1 = ((Xb − Xa)/W, (Yb − Ya)/W),

where W = |AB| is the camera separation.

[0023] The vector AE is the unit vector n1 rotated clockwise by the angle α + θ1 and multiplied by L. Writing the components of n1 as (X, Y) and the unit vector of AE as n2 = (X′, Y′),

X′ = X·cos(α + θ1) + Y·sin(α + θ1),  Y′ = −X·sin(α + θ1) + Y·cos(α + θ1).

The coordinates E(Xe, Ye) of point E are therefore

Xe = Xa + L·X′,  Ye = Ya + L·Y′,

and the line A·E is given by

(Y − Ya)(Xe − Xa) = (X − Xa)(Ye − Ya).

[0024] Similarly, in FIG. 7, the unit vector m1 of the vector BA is

m1 = ((Xa − Xb)/W, (Ya − Yb)/W).

The vector BF is the unit vector m1 rotated counterclockwise by the angle β + θ2 and multiplied by L. Writing the components of m1 as (Xm, Ym) and the unit vector of BF as m2 = (Xg, Yg),

Xg = Xm·cos(β + θ2) − Ym·sin(β + θ2),  Yg = Xm·sin(β + θ2) + Ym·cos(β + θ2).

The coordinates F(Xf, Yf) of point F are therefore

Xf = Xb + L·Xg,  Yf = Yb + L·Yg,

and the line B·F is given by

(Y − Yb)(Xf − Xb) = (X − Xb)(Yf − Yb).

[0025] Therefore, in FIG. 8, the position coordinates P(Xp, Yp) of the measuring point P are the intersection of the lines A·E and B·F, and are obtained by solving the simultaneous equations

(Y − Ya)(Xe − Xa) = (X − Xa)(Ye − Ya),
(Y − Yb)(Xf − Xb) = (X − Xb)(Yf − Yb).
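Finding the intersection of the lines A·E and B·F amounts to solving a 2×2 linear system. A minimal sketch; the point coordinates in the example are assumptions for illustration:

```python
def intersect(a, e, b, f):
    """Intersection of line A-E with line B-F in the horizontal plane."""
    (xa, ya), (xe, ye) = a, e
    (xb, yb), (xf, yf) = b, f
    # Express each line as a*x + b*y = c and solve by Cramer's rule.
    a1, b1 = ye - ya, xa - xe
    c1 = a1 * xa + b1 * ya
    a2, b2 = yf - yb, xb - xf
    c2 = a2 * xb + b2 * yb
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None  # the two rays are parallel: no triangulated point
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Assumed coordinates: cameras at A=(0,0) and B=(1,0); rays toward
# E=(1,1) and F=(0,1) cross at the midpoint above the baseline.
print(intersect((0, 0), (1, 1), (1, 0), (0, 1)))  # -> (0.5, 0.5)
```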

[0026] Next, in FIG. 9, the vertical angle φ of the measuring point P is obtained. For camera A: on the reference plane S at the reference distance L ahead of point A, let the number of pixels from the screen center point C to the screen edge END′ be N′ dots, and let the measured distance from C to END′ be D′. If the measuring point P lies on the line A·P, the position of P obtained in the image is K3 dots from the center point C, regardless of the distance from point A. Therefore the angle φ between lines A·C and A·P is

φ = tan⁻¹(K3·D′ / (N′·L)).

[0027] In FIG. 10, the vertical coordinate Pz of the measuring point P is obtained. The horizontal distance L′ from point A to point P is

L′ = √((Xp − Xa)² + (Yp − Ya)²),

so that

Pz = L′·tan φ.

The above processing is performed continuously to obtain the position of the entire measured object 1.

【0028】[0028]

[Effects of the Invention] As described above, according to the present invention, the position of the entire obstacle is obtained from the photographed light spot, so stable position detection can be performed without depending on the reflectance or surface shape of the obstacle and without being affected by changes in the surrounding environment.

[Brief Description of the Drawings]

FIG. 1 is a perspective view showing an example of an apparatus for carrying out the present invention.

FIG. 2 is a plan view illustrating the camera arrangement.

FIG. 3 is a control flowchart.

FIG. 4 is a plan view illustrating the laser projection.

FIG. 5 is a plan view illustrating the horizontal calculation for the measuring point.

FIG. 6 is a plan view illustrating the calculation of the horizontal coordinates of the measuring point from one camera.

FIG. 7 is a plan view illustrating the calculation of the horizontal coordinates of the measuring point from the other camera.

FIG. 8 is a plan view illustrating the coordinate calculation for the measuring point.

FIG. 9 is a side view illustrating the calculation of the vertical angle of the measuring point.

FIG. 10 is a side view illustrating the calculation of the vertical coordinates of the measuring point.

[Explanation of Symbols]

A: first camera; B: second camera; a1, a2: camera visual axes; C: screen center point; D, D′: measured distances; END, END′: screen edges; H: camera imaging range; L: reference distance; P, P′: measuring points; S: reference plane; W: separation distance; ψ: effective viewing angle; θ1, θ2: horizontal angles; φ: vertical angle; 1: measured object; 2: laser projector; 3: image processing device; 4: personal computer

Continuation of the front page: (51) Int.Cl.5 FI H04N 7/18 C

Claims (1)

[Claims]

Claim 1: An obstacle position detection method, characterized in that a sufficiently converged light spot projected onto an obstacle is photographed by first and second cameras arranged at a predetermined separation distance; the photographed images are image-processed to obtain the position of one point on the obstacle, namely the position of the spot; and this processing is performed continuously to obtain the position of the entire obstacle in three-dimensional space.
JP4291150A 1992-10-29 1992-10-29 Detecting method for position of obstacle Pending JPH06137828A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP4291150A JPH06137828A (en) 1992-10-29 1992-10-29 Detecting method for position of obstacle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP4291150A JPH06137828A (en) 1992-10-29 1992-10-29 Detecting method for position of obstacle

Publications (1)

Publication Number Publication Date
JPH06137828A true JPH06137828A (en) 1994-05-20

Family

ID=17765096

Family Applications (1)

Application Number Title Priority Date Filing Date
JP4291150A Pending JPH06137828A (en) 1992-10-29 1992-10-29 Detecting method for position of obstacle

Country Status (1)

Country Link
JP (1) JPH06137828A (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5537982A (en) * 1978-09-11 1980-03-17 Ishikawajima Harima Heavy Ind Co Ltd Solid-shape detector for characteristic test of deformation of curved-surface body


Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
KR100433272B1 (en) * 2001-07-03 2004-05-31 대한민국 An apparatus of detecting 3-D position of fruits and method of detecting 3-D position of it
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US8428778B2 (en) 2002-09-13 2013-04-23 Irobot Corporation Navigational control system for a robotic device
US8854001B2 (en) 2004-01-21 2014-10-07 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
US8378613B2 (en) 2004-01-28 2013-02-19 Irobot Corporation Debris sensor for cleaning apparatus
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US8839477B2 (en) 2007-05-09 2014-09-23 Irobot Corporation Compact autonomous coverage robot
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US8438695B2 (en) 2007-05-09 2013-05-14 Irobot Corporation Autonomous coverage robot sensing
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
WO2017179452A1 (en) * 2016-04-15 2017-10-19 オムロン株式会社 Actuator control system, actuator control method, information-processing program, and recording medium
US11209790B2 (en) 2016-04-15 2021-12-28 Omron Corporation Actuator control system, actuator control method, information processing program, and storage medium
JP2017189856A (en) * 2016-04-15 2017-10-19 オムロン株式会社 Actuator control system, actuator control method, information processing program, and recording medium
CN107957237A (en) * 2016-10-17 2018-04-24 维蒂克影像国际无限责任公司 Laser projector with flash alignment
CN107957237B (en) * 2016-10-17 2021-04-20 维蒂克影像国际无限责任公司 Laser projector with flash alignment
CN106926247A (en) * 2017-01-16 2017-07-07 深圳前海勇艺达机器人有限公司 With the robot looked for something in automatic family
CN107490338A (en) * 2017-04-27 2017-12-19 安徽华脉科技发展有限公司 A kind of Mechanical Parts Size vision detection system
CN111300428A (en) * 2020-03-23 2020-06-19 北京海益同展信息科技有限公司 Robot, robot control method, and storage medium
JP2020110916A (en) * 2020-04-07 2020-07-27 オムロン株式会社 Actuator control system, sensor device, control device, actuator control method, information processing program, and recording medium
CN111805628A (en) * 2020-07-21 2020-10-23 上饶师范学院 An efficient and precise drilling device suitable for furniture panels

Similar Documents

Publication Publication Date Title
JPH06137828A (en) Detecting method for position of obstacle
JP7194015B2 (en) Sensor system and distance measurement method
Nitzan Three-dimensional vision structure for robot applications
JP5480914B2 (en) Point cloud data processing device, point cloud data processing method, and point cloud data processing program
US12067083B2 (en) Detecting displacements and/or defects in a point cloud using cluster-based cloud-to-cloud comparison
US7502504B2 (en) Three-dimensional visual sensor
JP2005324297A (en) robot
JP3690581B2 (en) POSITION DETECTION DEVICE AND METHOD THEREFOR, PLAIN POSITION DETECTION DEVICE AND METHOD THEREOF
JP2980195B2 (en) Method and apparatus for measuring rebar diameter
US20230186437A1 (en) Denoising point clouds
EP4009273A1 (en) Cloud-to-cloud comparison using artificial intelligence-based analysis
JP3696336B2 (en) How to calibrate the camera
JP3696335B2 (en) Method for associating each measurement point of multiple images
JP3343583B2 (en) 3D surface shape estimation method using stereo camera with focal light source
JP3638569B2 (en) 3D measuring device
JPH0875454A (en) Ranging device
Czajewski et al. Development of a laser-based vision system for an underwater vehicle
JP2970835B2 (en) 3D coordinate measuring device
JP3519296B2 (en) Automatic measurement method and automatic measurement device for thermal image
JPS60183509A (en) visual device
Hata et al. 3D vision sensor with multiple CCD cameras
JPH09231371A (en) Image information input device and image information input method
KR100395773B1 (en) Apparatus for measuring coordinate based on optical triangulation using the images
JP3222664B2 (en) Member position / posture measuring method and member joining method
CN116091608B (en) Positioning method and positioning device for underwater target, underwater equipment and storage medium