JPH0797408B2 - Moving object position detection image processing method - Google Patents
- Publication number: JPH0797408B2 (application JP62225814A)
- Authority
- JP
- Japan
- Prior art keywords
- moving object
- data array
- frame difference
- section
- processing method
- Prior art date
- Legal status
- Expired - Lifetime
Landscapes
- Image Analysis (AREA)
Description
DETAILED DESCRIPTION OF THE INVENTION

Industrial Field of Application

The present invention relates to a method for detecting the position of a moving object by image processing.
Prior Art

Some moving-image processing schemes, as shown for example in Japanese Patent Laid-Open No. 61-153774, compute the frame difference between two images captured at different times and thereby render only the moving objects contained in the two images.
Problems to be Solved by the Invention

With the conventional processing method described above, the position of a moving object can be estimated by taking the center of gravity of the rendered moving image, but the error grows inconveniently when the displacement of the object is small or when noise is present in the processing system.
Means for Solving the Problems

The frame difference between two images of a moving object captured at different times is obtained, and the one-dimensional data arrays that result from projection addition of this frame difference image onto the x-axis and the y-axis are binarized. When the number of points in a binarized data array at or above the threshold is at least a specific value, the centroid position of the frame difference image within the binarized data array and the degree of dispersion about that centroid are obtained; when the degree of dispersion is at least a specific value, a new position of the moving object is computed from the centroid position.
Embodiment

An embodiment of the present invention is described below in detail with reference to the drawing. The embodiment illustrates the case where the video signal of an image pickup apparatus is processed by a computer.
As shown in FIG. 1, in section (1) the initial value (x0, y0) of the two-dimensional position of the moving object is entered into the computer.
In section (2), the first image f1(x, y), the earlier-captured of the two images of the moving object taken at different times, is input from the imaging device to the computer.
In section (3), the second image f2(x, y), the later-captured of the two images, is input from the imaging device to the computer. The size of both input images is Nx × Ny.
(4) The computer then carries out the processing of the sections described below.
First, the frame difference between the first image f1(x, y) and the second image f2(x, y) is obtained:

g(x, y) = |f1(x, y) - f2(x, y)|
In section (5), the data arrays Gx(x), Gy(y) obtained by addition onto the x-axis and the y-axis, together with their average densities x̄, ȳ, are computed (the formula images are not reproduced in this copy; the standard projection sums and means are):

Gx(x) = Σy g(x, y), Gy(y) = Σx g(x, y)

x̄ = (1/Nx) Σx Gx(x), ȳ = (1/Ny) Σy Gy(y)
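The projection addition and average densities of section (5) can be sketched as follows, assuming the difference image is an array indexed g[y, x] (the layout convention and function name are my assumptions):

```python
import numpy as np

def project(g):
    # Sum the difference image onto each axis: Gx(x) adds over y,
    # Gy(y) adds over x, giving two 1-D data arrays.
    Gx = g.sum(axis=0)           # length Nx
    Gy = g.sum(axis=1)           # length Ny
    # Average densities of the projected arrays.
    return Gx, Gy, Gx.mean(), Gy.mean()
```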
In section (6), based on the average densities x̄, ȳ and the variances σx, σy of the noise intensity, the binarized data arrays G′x(x), G′y(y) of Gx(x), Gy(y) are obtained:

G′x(x) = 1 if Gx(x) > x̄ + n·σx, otherwise 0

G′y(y) = 1 if Gy(y) > ȳ + n·σy, otherwise 0

where n is an arbitrary constant determined by experiment.
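The binarization of section (6) can be sketched as below. The exact threshold formula is not reproduced in this copy of the patent; a mean-plus-n-sigma threshold consistent with the prose (mean density, noise deviation, experimental constant n) is assumed here:

```python
import numpy as np

def binarize(G, mean, sigma, n=2.0):
    # Mark a point of the projected array as significant (1) when it
    # exceeds the mean density by more than n noise standard deviations.
    # The threshold form mean + n*sigma is an assumption.
    return (G > mean + n * sigma).astype(np.uint8)
```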
The variances σx, σy of the noise intensity are obtained by computing Gx(x), Gy(y) and x̄, ȳ from input images f1(x, y), f2(x, y) that contain no moving object, and evaluating:

σx² = (1/Nx) Σx (Gx(x) - x̄)², σy² = (1/Ny) Σy (Gy(y) - ȳ)²
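The noise calibration can be sketched as follows; the formula image is missing from this copy, so the usual root-mean-square deviation is assumed:

```python
import numpy as np

def noise_sigma(G):
    # Standard deviation of a projected array taken from a frame pair
    # that contains no moving object; used as the noise intensity.
    return float(np.sqrt(np.mean((G - G.mean()) ** 2)))
```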
In section (7), a significance test is performed on G′x(x) and G′y(y); when the following condition, taken from the claim language (the number of above-threshold points must reach a specific value), is not satisfied, processing returns to the image-input section:

Σx G′x(x) ≥ t11 and Σy G′y(y) ≥ t12

where t11 and t12 are specific values determined empirically.
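A sketch of this significance test (the counting form follows the claim's "number of points at or above the threshold"; the function name is mine):

```python
import numpy as np

def significant(G_bin, t):
    # Section (7): require at least t above-threshold points in the
    # binarized data array (its sum counts the 1-entries).
    return int(G_bin.sum()) >= t
```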
(8) When the above condition is satisfied, in the next section the centroid positions xG, yG of the binarized data arrays G′x(x), G′y(y) are obtained:

xG = Σx x·G′x(x) / Σx G′x(x), yG = Σy y·G′y(y) / Σy G′y(y)
In section (9), the degrees of dispersion Vx, Vy about the centroid positions xG, yG are obtained, where t21 and t22 are specific values that depend on the size of the moving object and on the change in its speed.
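The patent's own dispersion formula (involving t21, t22) is not reproduced in this copy. Purely as an illustration of a degree-of-dispersion measure about the centroid, one plausible choice is the weighted variance:

```python
import numpy as np

def dispersion_1d(G_bin, centroid):
    # An illustrative dispersion measure, NOT the patent's formula:
    # variance of the above-threshold points about the centroid.
    idx = np.arange(len(G_bin))
    return float((((idx - centroid) ** 2) * G_bin).sum() / G_bin.sum())
```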
In section (10), a significance test is performed on the dispersion degrees Vx, Vy; when the following condition is not satisfied, processing returns to the image-input section.

Condition: Vx > t31 or Vy > t32

where t31 and t32 are specific values obtained by experiment.
(11) When the above condition is satisfied, the new position (x′0, y′0) of the moving object is obtained in the next section from the centroid positions xG, yG:

x′0 = x0 + 2(xG - x0), y′0 = y0 + 2(yG - y0)

(12) Finally, the data are updated as (x′0, y′0) → (x0, y0) and f2(x, y) → f1(x, y), and processing returns to the image-input section.
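The factor of 2 in the update reflects that the centroid of the frame-difference image lies roughly midway between the old and new object positions, so the new position is found by extrapolation. A sketch (function name mine):

```python
def update_position(x0, y0, xg, yg):
    # The difference-image centroid sits about halfway between the
    # old position and the new one, so extrapolate by a factor of 2.
    return x0 + 2 * (xg - x0), y0 + 2 * (yg - y0)
```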
Effects of the Invention

As described above, according to the present invention the data arrays obtained by adding the frame difference onto the x-axis and the y-axis are binarized and the centroid positions in the x and y directions are computed, while two significance tests are used in combination. The method is therefore insensitive to noise in the processing system and achieves stable position detection even when the displacement of the object is small. Moreover, once the frame difference has been added and projected onto the x-axis and the y-axis, all subsequent processing operates on one-dimensional data arrays, so the processing time is not increased.
FIG. 1 is a flowchart showing an embodiment of the present invention.

f1(x, y): first image; f2(x, y): second image; g(x, y): frame difference; Gx(x), Gy(y): addition data arrays; G′x(x), G′y(y): binarized data arrays; xG, yG: centroid positions; Vx, Vy: degrees of dispersion.
Claims (1)

1. A moving object position detection image processing method, characterized in that a frame difference between two images of a moving object captured at different times is obtained; the one-dimensional data arrays resulting from projection addition of the frame difference image onto the x-axis and the y-axis are binarized; when the number of points in a binarized data array at or above the threshold is at least a specific value, the centroid position of the frame difference image within the binarized data array and the degree of dispersion about the centroid are obtained; and when the degree of dispersion is at least a specific value, a new position of the moving object is obtained using the centroid position.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP62225814A JPH0797408B2 (en) | 1987-09-09 | 1987-09-09 | Moving object position detection image processing method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| JPS6467690A JPS6467690A (en) | 1989-03-14 |
| JPH0797408B2 true JPH0797408B2 (en) | 1995-10-18 |
Family
ID=16835213
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| JP62225814A Expired - Lifetime JPH0797408B2 (en) | 1987-09-09 | 1987-09-09 | Moving object position detection image processing method |
Country Status (1)
| Country | Link |
|---|---|
| JP (1) | JPH0797408B2 (en) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS61201583A (en) * | 1985-03-04 | 1986-09-06 | Toshiba Corp | Dynamic vector detector |
| JPH0814849B2 (en) * | 1985-11-12 | 1996-02-14 | ソニー株式会社 | Motion detection device |
| JPS62145383A (en) * | 1985-12-19 | 1987-06-29 | Toshiba Corp | Position recognition device |
- 1987-09-09: JP application JP62225814A, patent JPH0797408B2, status Expired - Lifetime
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20220301185A1 (en) * | 2021-03-16 | 2022-09-22 | Honda Motor Co., Ltd. | Information processing apparatus, information processing method, and storage medium for estimating movement amount of moving object |
| US12272075B2 (en) * | 2021-03-16 | 2025-04-08 | Honda Motor Co., Ltd. | Information processing apparatus, information processing method, and storage medium for estimating movement amount of moving object |
Also Published As
| Publication number | Publication date |
|---|---|
| JPS6467690A (en) | 1989-03-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP1329850B1 (en) | Apparatus, program and method for detecting both stationary objects and moving objects in an image | |
| US6647131B1 (en) | Motion detection using normal optical flow | |
| JP6620881B2 (en) | Apparatus, system, method, and program for measuring the number of passengers, and apparatus, method, and program for calculating travel distance of vehicle | |
| CN102985957B (en) | Vehicle Surrounding Monitoring Device | |
| JP2006059252A (en) | Method, device and program for detecting movement, and monitoring system for vehicle | |
| JP2021089778A (en) | Information processing apparatus, information processing method, and program | |
| JP7048157B2 (en) | Analytical equipment, analysis method and program | |
| CN115272730A (en) | Method for removing dynamic point for autonomous mobile platform, system and equipment thereof | |
| CN115147683B (en) | Training method of posture estimation network model, posture estimation method and device | |
| JPH0797408B2 (en) | Moving object position detection image processing method | |
| JP3364771B2 (en) | Moving object detection device | |
| WO2010089938A1 (en) | Rotation estimation device, rotation estimation method, and storage medium | |
| JP2002190027A (en) | System and method for measuring speed by image recognition | |
| JPH10123163A (en) | Flow velocity distribution measurement method | |
| WO2021193816A1 (en) | Imaging device, imaging method, and program | |
| WO2022249534A1 (en) | Information processing device, information processing method, and program | |
| JP4595834B2 (en) | Motion vector detection method and apparatus | |
| WO2022107548A1 (en) | Three-dimensional skeleton detection method and three-dimensional skeleton detection device | |
| JP2743509B2 (en) | Method for determining binarization threshold of floc measuring device | |
| JP2007078354A (en) | Length measuring device | |
| JP7605089B2 (en) | KEY POINT CORRECTION DEVICE, KEY POINT CORRECTION METHOD, AND PROGRAM | |
| CN113691731B (en) | Processing method and device and electronic equipment | |
| JPH0829358A (en) | Product inspection method by image processing | |
| JP3334451B2 (en) | Moving object detection device | |
| JP2011053731A (en) | Image processor |