JPH09330415A - Image monitoring method and image monitoring system - Google Patents
Image monitoring method and image monitoring system
- Publication number
- JPH09330415A (application numbers JP8147190A / JP14719096A)
- Authority
- JP
- Japan
- Prior art keywords
- monitoring
- person
- image
- area
- determined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Burglar Alarm Systems (AREA)
- Alarm Systems (AREA)
- Image Analysis (AREA)
Abstract
(57) [Abstract]
[Problem] To provide an image monitoring system that selects a determination algorithm adapted to the monitoring conditions.
[Solution] A person extraction unit 300 analyzes images of a plurality of monitoring areas captured by a camera 100 at a predetermined cycle and extracts a person image for each area. A person monitoring unit 600 takes in the person images sequentially for each monitoring area and calculates movement trajectories. It further derives the monitoring environment of the monitored object from the current time (year, month, day, hour); when the object is a store, the environment is "open" or "closed". For each combination of monitoring environment and monitoring area, a processing mode preset in a monitoring mode table 705 is read out. For example, the same movement trajectory is judged to be a person entering the store in processing mode 1 (monitoring area including the entrance, store open), but an intruder in processing mode 4 (monitoring area not including the entrance, store closed).
Description
[0001]
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image monitoring system, and more particularly to a method of monitoring for intruders by applying a determination process, adapted to the monitoring conditions, to movement trajectories extracted by image processing.
[0002]
2. Description of the Related Art
As a safety-management method, based on image processing, for spaces where an unspecified number of people gather, Japanese Patent Application Laid-Open No. 4-60880 describes grasping a person's behavior from a trajectory of position, moving speed, and so on derived from image data of a monitored area, and monitoring for intruders into prohibited areas. It is also known to change the image-processing threshold between day and night.
[0003]
[Problems to be Solved by the Invention] Even for the same monitored area, it may be desirable to change the monitoring mode depending on the time of day or the season. For example, near the entrance of a store, not only does people's behavior differ between opening hours and after closing, but the algorithm for judging someone an intruder differs as well.
[0004] However, in conventional image monitoring methods, the analysis method for grasping a person's behavior is fixed and specialized for each monitored object, so a dedicated monitoring device must be installed for each monitoring target and purpose. This not only raises system cost but also complicates safety management.
[0005] In view of this state of the art, an object of the present invention is to provide an image monitoring method and an image monitoring system that make determinations from the same movement trajectory using an algorithm adapted to the monitoring conditions. This makes it possible to unify the management of people entering and leaving and the monitoring of unlawful intruders at places where an unspecified number of people come and go.
[0006]
[Means for Solving the Problems] The above object is achieved, in an image monitoring method that monitors a person's behavior toward a monitored object (such as a building) from a movement trajectory extracted by predetermined image processing of images of the object's surroundings captured in time series, by determining from the current time an environment that depends on the object's hours of activity or inactivity and selecting a processing mode adapted to that environment, or by selecting a processing mode adapted to each monitoring area, and judging the behavior of the person on the movement trajectory accordingly. Alternatively, it is achieved by judging the behavior of the person on the movement trajectory online while switching among processing modes set for each combination of monitoring area and monitoring condition.
[0007] Here, when the monitored object is a store, the time-dependent environment is an open state or a closed state. Further, when the monitoring area includes the vicinity of the object's entrance and the open/closed environment is set in the monitoring conditions, a movement trajectory judged to be an "entrant" or "leaver" in the processing mode for the open environment is judged to be an "intruder" in the processing mode for the closed environment.
[0008] That is, when the monitoring area includes the vicinity of the monitored object's entrance, a trajectory that moves toward the entrance and ends near it is judged "entrant" in the processing mode for the open environment and "intruder" in the mode for the closed environment. On the other hand, when the monitoring area covers surroundings other than the entrance vicinity, a trajectory that moves toward the monitored object and ends near the object's preset monitoring line is judged "intruder" in the processing modes for both the open and closed environments.
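The two cases of [0007]-[0008] reduce to a small decision rule. The sketch below is ours, not the patent's implementation; the label strings and the encoding of the trajectory end position are hypothetical.

```python
def judge_trajectory(area_has_entrance, end_position, store_open):
    """Judge a movement trajectory per paragraphs [0007]-[0008].

    end_position: "entrance" if the locus ends near the doorway,
                  "monitoring_line" if it ends near the preset line,
                  anything else otherwise.  (Hypothetical encoding.)
    """
    if area_has_entrance and end_position == "entrance":
        # Same trajectory, different verdict depending on the environment.
        return "entrant" if store_open else "intruder"
    if not area_has_entrance and end_position == "monitoring_line":
        return "intruder"          # same verdict whether open or closed
    return "no_event"

print(judge_trajectory(True,  "entrance", True))          # entrant
print(judge_trajectory(True,  "entrance", False))         # intruder
print(judge_trajectory(False, "monitoring_line", True))   # intruder
```

Note how the entrance-area verdict flips with the environment while the perimeter-area verdict does not, which is exactly the asymmetry the text describes.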
[0009] Further, when the monitoring area covers surroundings other than the vicinity of the entrance, a movement trajectory judged to be a "loiterer" in the processing mode for the open environment is judged to be an "intruder candidate" in the mode for the closed environment.
[0010] That is, when the monitoring area covers surroundings other than the entrance vicinity and the movement trajectory stays close to the monitoring line preset in the area without showing an end position, it is judged "loiterer" in the processing mode for the open environment and "intruder candidate" in the mode for the closed environment.
[0011] When an "intruder candidate" is detected consecutively more than a predetermined number of times, it is judged to be an "intruder".
[0012] An image monitoring system to which the image monitoring method of the present invention is applied comprises a television camera that periodically images the surroundings of a monitored object such as a building, and an image monitoring apparatus having image processing means that takes in the captured images in time series for each monitoring area and performs predetermined image processing, person extraction means that extracts a person from predetermined feature quantities computed by that image processing, and person monitoring means that judges the person's behavior toward the monitored object from the person's movement trajectory. The system further has a monitoring mode table in which processing modes adapted to each combination of a time-dependent monitoring condition and a monitoring area are preset; the person monitoring means judges the behavior of the person on the movement trajectory according to the processing mode selected from the monitoring mode table on the basis of the monitoring condition determined from the current time and the monitoring area of the trajectory.
[0013] The system is also characterized by having monitoring area setting means that sets one or more monitoring areas for each of one or more television cameras.
[0014] Furthermore, when the judgment by the person monitoring means is "intrusion" or "intrusion candidate" with respect to the monitored object, display means displays the judgment result and the corresponding movement trajectory.
[0015] According to the present invention, grasping the persons in each monitoring area and extracting their movement trajectories are performed by the same procedure, and only the processing mode for judging intruders and the like from a trajectory is switched according to the monitoring conditions.
[0016] Consequently, a desired processing mode can be applied flexibly according to the monitoring target and purpose, providing an image monitoring apparatus at low system cost. Moreover, since online monitoring is possible while the processing mode switches automatically with the monitoring conditions, safety management at places where an unspecified number of people come and go can be unified, and the manpower needed for monitoring reduced.
[0017]
BEST MODE FOR CARRYING OUT THE INVENTION
An embodiment of the present invention will now be described in detail with reference to the drawings. FIG. 1 is a block diagram of an image monitoring system according to the embodiment. In this system, an ITV camera 100 photographs the monitored area, and an image input unit 200 takes in the image signal and A/D-converts it.
[0018] The person extraction unit 300 creates a difference image from the A/D-converted input image signal, either as an inter-frame difference or as a background difference between the input image and a reference image (for example, a background image captured at the start of sampling), and computes its density-frequency distribution. Further, from the density information with noise removed from this distribution, it extracts change regions from a binary image obtained with a predetermined threshold (or from the difference image itself), computes the feature quantities of the change regions described later, and thereby extracts persons.
[0019] The monitoring area setting unit 500 registers one or more monitoring areas in the monitoring mode table 705 via the monitoring condition setting unit 700. For each monitoring area 720, the monitoring condition setting unit 700 sets in the table an environmental condition 730 of the monitored object — whether it is in an active state with people present (for example, a store being open) or a dormant state with nobody present (closed, holidays, or outside working hours) — together with a processing mode 740.
[0020] The person monitoring unit 600 refers to the monitoring mode table 705 on the basis of the current monitoring time and the monitoring area and, following the selected processing mode, extracts the movement trajectory of each person extracted by the person extraction unit 300 and judges in which region the person behaves in what way. When it judges a person to be an intruder or an intruder candidate, it stores the images in time series in a data storage unit 1200 and displays them in real time on a display device 1400 via a display control unit 1300.
[0021] FIG. 2 shows a functional block diagram of the image input unit. When taking in the video signal of the target scene from the ITV camera 100, the image input unit 200 uses the image signal captured by one image capture unit Fk 210 as the reference — the background image (for example, the first image sampled at the start of capture) or, for inter-frame differencing, the immediately preceding image — while the image signal captured by the other image capture unit Fi 220 serves as the input image to be processed. The reference image (background or preceding image) and the input image, converted by the A/D converter 230, are sent to the person extraction unit 300.
[0022] FIG. 3 is an explanatory diagram showing an example of area setting by the monitoring area setting unit. For example, for a monitored facility 510, an intrusion line 550 is set manually with a mouse or the like at the boundary between the facility's outer wall and the outdoors, a monitoring area 560 near the outer wall, and a monitoring area 570 near the facility entrance. A plurality of monitoring areas can be set at arbitrary positions, but it is preferable to provide one monitoring area per monitoring mode. Naturally, each area must lie within the field of view of the ITV camera 100 and within the range of the illumination device 800 (for example, LED infrared illumination) used when illuminance is insufficient, such as at night.
[0023] FIG. 4 shows a functional block diagram of the person extraction unit. The difference image calculation unit 310 of the person extraction unit 300 takes in the A/D-converted image signals and computes the per-pixel density difference between images to create a difference image; the density-frequency distribution calculation unit 320 then computes the distribution Hj (j = 0, 1, ..., n) of the difference image. After this, the noise removal unit 330 smooths Hj according to Equation 1.
[0024]
[Equation 1]
Hj = (Hj-1 + 2Hj + Hj+1) / 4
H0 = (2H0 + H1) / 3
Hn = (2Hn + Hn-1) / 3
where j = 1, ..., n-1, and n is the maximum density value.
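Equation 1 is a 1-2-1 weighted smoothing of the histogram. In this sketch the divisors are taken to be the weight sums (4 for interior bins, 3 at the boundaries), which is how we read the garbled printed denominators; treat that as an assumption.

```python
def smooth_histogram(h):
    """1-2-1 smoothing of the density-frequency distribution Hj (Equation 1).
    Interior bins average (H[j-1] + 2*H[j] + H[j+1]) / 4; boundary bins use
    weights (2, 1) / 3.  Divisors assumed equal to the weight sums."""
    n = len(h) - 1
    s = [0.0] * (n + 1)
    s[0] = (2 * h[0] + h[1]) / 3
    s[n] = (2 * h[n] + h[n - 1]) / 3
    for j in range(1, n):
        s[j] = (h[j - 1] + 2 * h[j] + h[j + 1]) / 4
    return s

# A single noisy spike is spread over its neighbours and damped.
print(smooth_histogram([0, 4, 0]))
```

As [0031] notes, this damping is what lets the later threshold search tolerate white noise at the top of the changed-density range.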
[0025] The binarization threshold calculation unit 360 automatically computes a binarization threshold for the difference image from the noise-removed density-frequency distribution Hj. Using this threshold, the binarization processing unit 370 binarizes the difference image from the difference image calculation unit 310. At this stage, regions whose area exceeds a predetermined size unsuited to person detection are excluded as noise, producing the binary image.
[0026] The feature quantity calculation unit 380 computes person-detection features from the difference image produced by the difference image calculation unit 310 and the binary image produced by the binarization processing unit 370, and identifies persons. When the monitored location is dark and the difference image calculation unit 310 cannot extract an effective difference image, the illumination device 800 is turned on. Using an LED infrared (near-infrared) illumination device or the like as the illumination device 800 causes no light pollution and is not noticed by suspicious persons.
[0027] FIG. 5 is an explanatory diagram showing an example of automatic calculation of the binarization threshold. The binarization threshold calculation unit 360 automatically computes the threshold (th) from the maximum density (max) 362 found in the density-frequency distribution Hj smoothed by the noise removal unit 330.
[0028] First, when the changed region is large, the smoothed density-frequency distribution Hj looks like FIG. 5(a). Searching downward in density from the maximum density 362, at least one local maximum 363 exists. Searching further downward from this local maximum 363, there is a local minimum 364 closest to the maximum density. The densities from this local minimum 364 up to the maximum density 362 form the density distribution of the changed region. Accordingly, the density value at the local minimum 364 is taken as the binarization threshold (th). In image-processing terms, this is threshold selection by the mode method.
[0029] However, when the changed region is small — that is, when the frequency counts of the changed densities are low — no local maximum 363 like that of (a) exists, and the distribution instead looks like FIG. 5(b). In this case the binarization threshold (th) 368 is set, by Equation 2, to the maximum density (max) 362 minus a constant C.
[0030]
[Equation 2]
th = max - C (th >= thmin)
Here thmin is a preset lower-limit threshold 366; if the computed threshold (th) 368 falls below thmin, the lower limit 366 is used instead.
[0031] The constant C is the difference between the upper-limit threshold (thmax) and lower-limit threshold (thmin) of white noise in the normal state, computed in advance from normal images of the target scene — or any value that represents the maximum noise fluctuation in the normal state of the target image. For example, for a scene monitoring an outdoor environment in daytime, thmax is about 12 to 14 gray levels and thmin about 4 to 5, so C is about 7 to 10 gray levels. Even when white noise is included at the maximum of the changed portion, it has been reduced by the smoothing of Equation 1, so the threshold can be set accurately even for slight density differences.
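Paragraphs [0028]-[0031] describe two threshold rules: the mode method (take the valley below the highest peak) when the changed region is large, and max - C floored at thmin otherwise. The scan details below are our reading of the text, not the patent's exact procedure; the defaults C = 8 and th_min = 4 are mid-range values from the daytime-outdoor example.

```python
def binarization_threshold(h, C=8, th_min=4):
    """Choose a binarization threshold from a smoothed difference histogram.

    Scan down from the highest occupied density; if a local maximum and then
    a local minimum (the valley of the mode method, [0028]) are found, the
    valley's density is the threshold.  Otherwise fall back to
    (max density - C), floored at th_min (Equation 2, [0029]-[0030]).
    """
    max_d = max(j for j, c in enumerate(h) if c > 0)   # maximum density present
    peak = None
    for j in range(max_d - 1, 0, -1):                  # look for a local maximum
        if h[j] > h[j - 1] and h[j] >= h[j + 1]:
            peak = j
            break
    if peak is not None:
        for j in range(peak - 1, 0, -1):               # nearest local minimum
            if h[j] < h[j - 1] and h[j] <= h[j + 1]:
                return j                               # mode method: the valley
    return max(max_d - C, th_min)                      # Equation 2 fallback

# Bimodal histogram: background peak around density 2, change peak around 6.
print(binarization_threshold([0, 5, 9, 5, 1, 3, 6, 3, 1]))   # valley at 4
```

The fallback branch fires when the change peak is too weak to form a local maximum, exactly the FIG. 5(b) case.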
[0032] FIG. 6 shows a functional block diagram of the binarization processing unit. The binarization unit 374 creates a binary image using the threshold th. The small-area removal unit 376 removes, as noise, regions whose area is too small to be an extraction target, creating a binary image of the change regions only. The regions the binarization processing unit 370 keeps as change regions are those with an area appropriate for a person; for example, at an image resolution of 512 x 512, regions of several tens to 100 pixels or fewer are excluded as noise.
[0033] In the binary image of change regions only, the object to be extracted is often fragmented. The image correction unit 378 dilates these fragments to connect them and then erodes the result to correct the binary image. The numbers of dilations and erosions are equal — about two to three for person extraction.
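The dilate-then-erode correction of [0033] is a morphological closing. The sketch below runs on small binary grids; the 4-neighbour structuring element and the border handling are our simplifications, not details from the patent.

```python
NEIGHBOURS = ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))

def dilate(img):
    """One 4-neighbour binary dilation pass: a pixel becomes 1 if any
    neighbour (or itself) is 1."""
    h, w = len(img), len(img[0])
    return [[int(any(0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                     for dy, dx in NEIGHBOURS))
             for x in range(w)] for y in range(h)]

def erode(img):
    """One 4-neighbour binary erosion pass (pixels outside the frame are
    treated as set, so the image border is not eaten away)."""
    h, w = len(img), len(img[0])
    return [[int(all(not (0 <= y + dy < h and 0 <= x + dx < w) or img[y + dy][x + dx]
                     for dy, dx in NEIGHBOURS))
             for x in range(w)] for y in range(h)]

def correct(img, n=2):
    """Dilate n times to connect fragments, then erode n times ([0033])."""
    for _ in range(n):
        img = dilate(img)
    for _ in range(n):
        img = erode(img)
    return img

print(correct([[1, 0, 1]], n=1))   # [[1, 1, 1]] -- the one-pixel gap is closed
```

Using the same count n for both passes restores the original silhouette size while keeping the newly bridged gaps, which is why the text insists the two counts be equal.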
[0034] FIG. 7 shows a functional block diagram of the feature quantity calculation unit. The density variance is computed from the difference image, and the area, aspect ratio, movement distance, and movement direction from the binary image, in time series at the (i-1)th, ith, and (i+1)th samples.
[0035] In this case, the density variance computed by the density variance calculation unit 381 may be the variance of the difference image from the difference image calculation unit 310, or the variance of the difference image restricted to a mask covering only the portion extracted by binarization in the binarization processing unit 370.
[0036] The area calculation unit 383 computes the area by counting the pixels of the binary image produced by the binarization processing unit 370. The aspect ratio calculation unit 385 projects the binary image in the X direction to accumulate the pixel count at each X position (the X projection distribution) and in the Y direction to accumulate the count at each Y position (the Y projection distribution), and computes their ratio as the aspect ratio.
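The area and aspect-ratio features of [0036] can be sketched from the two projection distributions. Taking the ratio of the occupied extents of the Y and X projections is our simplified reading of "the ratio of the projections"; the patent does not pin down the exact formula.

```python
def area(binary):
    """Pixel count of the binary change region ([0036])."""
    return sum(sum(row) for row in binary)

def projections(binary):
    """X and Y projection distributions: pixel counts per column and per row."""
    xs = [sum(col) for col in zip(*binary)]
    ys = [sum(row) for row in binary]
    return xs, ys

def aspect_ratio(binary):
    """Height-to-width ratio from the occupied spans of the projections
    (a simplified, assumed definition)."""
    xs, ys = projections(binary)
    width = sum(1 for v in xs if v)
    height = sum(1 for v in ys if v)
    return height / width

blob = [[0, 1, 0],
        [0, 1, 0],
        [0, 1, 0],
        [0, 1, 0]]
print(area(blob), aspect_ratio(blob))   # 4 4.0 -- a tall, narrow silhouette
```

A ratio around 4:1 to 6:1 is what [0043] later treats as a full-body adult silhouette.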
[0037] FIG. 8 is an explanatory diagram showing how the movement direction calculation unit computes the direction for the first sample. At the first sample (i = 1), the movement direction calculation unit 388 computes the movement direction from the centroid of the binary image extracted by the person extraction unit 300.
[0038] That is, it finds the edge of the monitoring area rectangle closest to the centroid of the first extracted person. For example, for extracted person 478, the closest edge of monitoring area 570 is 472; a line 474 parallel to this edge is drawn, and movement of person 478 in the direction of the front half-circle 476 is taken as forward.
[0039] Similarly, for extracted person 486 the closest edge is 480; a parallel line 482 is drawn and the direction of the front half-circle 484 is forward. For person 492 the edge is 488, the parallel line 490, and the forward direction the half-circle 491; for person 496 the edge is 493, the parallel line 494, and the forward direction the half-circle 495.
[0040] FIG. 9 is an explanatory diagram showing how the movement direction is computed from the second sample onward. The movement direction calculation unit 388 takes as the movement direction the vector ai-1 from the (i-1)th centroid position 475 to the nearest forward centroid 485. Here, "forward" means that, with θ the angle of the vector ai-1 shown in the figure, the angle of the vector ai lies within the range θ - π/2 to θ + π/2.
[0041] The movement distance calculation unit 387 computes, as the ith distance, the length between the centroid of the binary image extracted by the person extraction unit 300 at the (i-1)th sample and the centroid extracted at the ith sample. The first movement distance is defined as 0.
[0042] FIG. 10 is a flowchart showing the extraction procedure of the person extraction unit. At the ith sample, it is first checked whether the shape is vertically elongated (step 405). If it is (Y), the area is checked as large, medium, or small (step 410). If the area is medium, the density variance is checked (step 415); if the variance is large (Y), the object is judged to be a person (step 420); if not, it is judged to be a disturbance (step 425).
[0043] Here, for a full-body image of an adult, the aspect ratio is about 4:1 to 6:1. The area depends on the camera's field of view; for example, at an image resolution of 512 x 512 with the person visible from head to foot on screen, about 25,000 to 35,000 pixels counts as medium. If step 410 judges the area large (about 35,000 pixels or more under these conditions), the object is judged a disturbance.
[0044] Next, if step 410 judges the area small (about 25,000 pixels or fewer under these conditions), the number of consecutive detections is counted each time this branch is passed, and it is checked whether n consecutive detections have occurred (step 430). After n consecutive passes, the movement direction is checked (step 435); if the direction is consistent (Y), the object is judged to be a person (step 440).
[0045] The consecutive-detection count n in step 430 is, for example, about 2 to 4 for normal walking (n = 3 when the (i-2)th, (i-1)th, and ith samples pass in succession). If step 430 is passed only once, or not consecutively, the object is judged a disturbance (step 425); likewise if the directions in step 435 are not all the same. "Same direction" in step 435 means forward movement as explained with FIG. 8.
[0046] Finally, if the shape is not vertically elongated in step 405, the moving distance is checked (step 445). If the moving distance is small (Y), the moving direction is checked (step 450); if the moving direction is not regular (N), the area is checked (step 455). If the area is small (Y), the number of detections within a predetermined time is counted (step 460); once the object has been detected n times (Y), it is determined to be a person (step 465). Since this person still has a small area after n detections, it is a lingering person who stays near the same place without moving much, and is usually likely to be a worker in that vicinity.
[0047] If the moving distance is large in step 445, the object is determined to be a disturbance (step 470). The moving-distance threshold in step 445 depends on the camera's field of view; for example, at an image resolution of 512×512, in a field of view where a person's whole body appears on screen, a movement of about 50 pixels or less is regarded as small for a person walking normally. If the moving direction in step 450 is regular, the object is determined to be a disturbance. The moving direction in step 450 is evaluated as in step 435; it is regarded as not regular when it differs from the immediately preceding vector or when it shows no periodicity.
[0048] If the area checked in step 455 is not small, the object is likewise determined to be a disturbance. In step 455, an area of about 25,000 pixels or more is regarded as not small, and anything below that as small. In step 460, the number of detections within a predetermined time is counted; if the object is not detected n times, it is determined to be a disturbance in step 470. The predetermined time in step 460 depends on the monitoring target; for example, for detecting a suspicious worker it may be within a few minutes, and it suffices that the procedure of step 460 can be passed several times within this period when one processing cycle takes about 500 milliseconds.
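The classification flow of FIG. 10 (paragraphs [0043]-[0048]) can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the dictionary fields, the counter state, and the boolean helpers (`vertical`, `variance_large`, `same_direction`, `regular_direction`) are assumed names, and the thresholds are the example values given above for a 512×512 image.

```python
def classify(obj, state, n=3, area_small=25_000, area_large=35_000):
    """Return 'person', 'disturbance', or 'pending' for one detected object.

    obj   : per-cycle features of the candidate region (hypothetical fields)
    state : mutable dict carrying detection counters across cycles
    """
    if obj["vertical"]:                               # step 405: vertically long?
        if obj["area"] >= area_large:                 # step 410: large area
            return "disturbance"
        if obj["area"] > area_small:                  # medium area
            # steps 415-425: density variance separates person from noise
            return "person" if obj["variance_large"] else "disturbance"
        # small area: steps 430-440, require n consecutive detections
        state["consec"] = state.get("consec", 0) + 1
        if state["consec"] < n:
            return "pending"
        return "person" if obj["same_direction"] else "disturbance"
    # shape not vertically long: steps 445-465
    if obj["move_dist"] > 50:                         # step 445: moved too far
        return "disturbance"                          # step 470
    if obj["regular_direction"]:                      # step 450: regular motion
        return "disturbance"                          #   (e.g. swaying objects)
    if obj["area"] > area_small:                      # step 455: area not small
        return "disturbance"
    state["stay"] = state.get("stay", 0) + 1          # step 460: count detections
    return "person" if state["stay"] >= n else "pending"   # step 465
```

A lingering worker would thus be reported as a person only after n detections near the same spot, matching the text above.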
[0049] FIG. 11 is a functional block diagram showing the configuration of the person monitoring unit. The movement-locus calculation unit 603 of the person monitoring unit 600 calculates each person's movement locus while accumulating, in time series, the coordinate data of the persons extracted by the person extraction unit 300 for each monitoring area. The calculated loci are numbered per person and passed to the monitoring mode switching unit 613 in monitoring-area order. When the processing of all monitoring areas at the current monitoring timing is completed, an end signal is output.
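The time-series accumulation performed by the movement-locus calculation unit 603 might be organized as below; the class name and the keying by (area, person number) are hypothetical details, shown only to make the data flow concrete.

```python
from collections import defaultdict

class LocusCalculator:
    def __init__(self):
        # (monitoring area, person number) -> list of (x, y) over time
        self.loci = defaultdict(list)

    def add(self, area, person_no, xy):
        """Append one monitoring cycle's coordinates and return the locus."""
        self.loci[(area, person_no)].append(xy)
        return self.loci[(area, person_no)]
```

Each numbered locus would then be handed to the monitoring mode switching unit 613 in monitoring-area order.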
[0050] The monitoring mode switching unit 613 switches the monitoring mode according to the monitoring area, following the processing mode set in the monitoring mode table 705. The monitoring processing unit 623 tracks the movement locus of each person passed from the movement-locus calculation unit 603 with the behavior-analysis algorithm of the selected processing mode, and classifies the person as a visitor, moving person, intruder, and so on. Data from the start of the analysis to the determination is accumulated in the data storage unit 1200, and when an intruder or intruder candidate is detected, the result is output to the display control unit 1300.
[0051] FIG. 12 shows the format of the monitoring mode table. Depending on the input mode of the monitoring condition setting unit 700, an input format is displayed on the screen of the display device 1400, and conditions can be set from the keyboard or the monitoring area setting unit 500. The monitoring level 710 is the determination level for intruders in the gray zone; in this example, it sets the number of consecutive detections m at which an intruder candidate is determined to be an intruder.
[0052] The monitoring regions 720 are set on the screen of the monitoring target by the monitoring area setting unit 500; in this example, a monitoring area 570 including the entrance of the building and a monitoring area 560 around the building not including the entrance are set. In addition, an entrance area 540 is set near the entrance, and a monitoring line 550 is set along the outer wall of the building.
[0053] Beyond this, monitoring areas can be set arbitrarily according to the monitoring target and the purpose of monitoring, such as whether the area is on the front or rear side or near the placement of dangerous or important objects, and also according to the number of cameras.
[0054] The monitoring environment 730 is set from the viewpoint of the operating/idle state or the open/closed state of the monitoring target, and is usually a condition determined by time zone, date, season, and so on. In this example, the monitoring target is a store, and the two states "open" and "closed" are set for each monitoring area.
[0055] The processing mode 740 is the person-monitoring processing mode corresponding to the monitoring region 720 and the monitoring environment 730, selected from a plurality of behavior-analysis algorithms prepared in advance. In this example, modes 1 to 4 are set, differing in the strictness of monitoring and in the response.
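Assuming the example settings of FIG. 12, the monitoring mode table 705 reduces to a simple lookup keyed by (monitoring region, monitoring environment); the area numbers 570/560 and mode numbers 1-4 follow the text above, while the Python representation itself is illustrative.

```python
# (monitoring region 720, monitoring environment 730) -> processing mode 740
MONITORING_MODE_TABLE = {
    (570, "open"):   1,  # area including the entrance, store open
    (570, "closed"): 2,  # area including the entrance, store closed
    (560, "open"):   3,  # building periphery without entrance, store open
    (560, "closed"): 4,  # building periphery without entrance, store closed
}
```

The same locus can thus map to different behavior judgments simply because the table returns a different mode.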
[0056] FIG. 13 is a flowchart showing the monitoring mode switching process. The monitoring mode switching unit 613 is activated upon receiving a monitoring area number and a person's movement locus from the movement-locus calculation unit 603, and repeats the following processing until it receives the end signal (801).
[0057] First, the corresponding monitoring region is identified from the received monitoring area number (802). Next, whether the monitoring environment 730 is open (store) or closed (store) is computed from the current year/month/day/time given by a timer (804). From the monitoring region and the open/closed state, the corresponding processing mode number is then selected from the monitoring mode table 705 (805). This processing mode number and the movement locus (or loci) received from the movement-locus calculation unit 603 are transmitted to the monitoring processing unit 623 (806), and the process returns to step 801.
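Steps 802-806 can be sketched as a lookup driven by the current time; the opening hours below are assumed values for illustration, and the table mirrors FIG. 12.

```python
from datetime import datetime

MODE_TABLE = {(570, "open"): 1, (570, "closed"): 2,
              (560, "open"): 3, (560, "closed"): 4}

def select_mode(area, now, open_hour=9, close_hour=21):
    """Compute open/closed from the timer value (step 804) and select the
    processing mode from the table (step 805)."""
    env = "open" if open_hour <= now.hour < close_hour else "closed"
    return MODE_TABLE[(area, env)]
```

At 23:00, for example, the entrance area 570 would be handled under mode 2, so the same locus that counted as a visitor by day is judged as an intruder.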
[0058] In the above, the same area is given the same processing mode by day or by night, depending only on whether the store is open or closed. However, some areas may require stricter monitoring at night. In such cases, day/night conditions can be added to the open/closed distinction of the monitoring environment 730, yielding a larger variety of processing modes.
[0059] FIG. 14 is a flowchart showing the person behavior determination process according to the processing mode. The description below takes as an example the case where the monitoring target is a store and the monitoring conditions are those of FIG. 12.
[0060] The monitoring processing unit 623 of the person monitoring unit is activated upon receiving a processing mode number and movement-locus data from the monitoring mode switching unit 613, and executes the determination program of one of modes 1 to 4. First, the processing mode is determined and the process branches to one of modes 1 to 4 (1500). Note that the start and end positions of a movement locus are the position where the person first appears in the monitoring area and the position immediately before the person disappears from it.
[0061] Mode 1 applies to the monitoring area 570, which includes the entrance area 540, while the store is open. First, it is checked whether the movement locus is heading toward the entrance area 540 (1510). If so (Y), it is checked whether the end position of the locus is in the entrance area 540 (1520); if it is, the person is determined to be a visitor (1530).
[0062] On the other hand, if the movement locus is not heading toward the entrance area 540, it is checked whether it is heading away from the entrance area 540 (1540). If so (Y), it is checked whether the start position of the locus is in the entrance area 540 (1550); if it is, the person is determined to be leaving (1560). This makes it possible to manage the number of people entering and leaving the store. If the checks in steps 1540 and 1550 are both negative, the person is determined to be a moving person outside the store (1560).
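A minimal sketch of the mode 1 branch (steps 1510-1560); the boolean arguments stand in for the geometric checks against the entrance area 540 and are hypothetical simplifications of the locus tests described above.

```python
def mode1(toward_entrance, end_in_entrance, away_from_entrance, start_in_entrance):
    """Classify one movement locus in the entrance-side area while open."""
    if toward_entrance and end_in_entrance:        # steps 1510, 1520
        return "visitor"                           # step 1530
    if away_from_entrance and start_in_entrance:   # steps 1540, 1550
        return "leaving person"                    # step 1560
    return "moving person outside the store"       # neither check holds
```

Under mode 2 (store closed), the same first branch would instead yield "intruder", which is exactly the mode-dependent reinterpretation the text describes.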
[0063] FIG. 15 shows movement loci determined in mode 1 to belong to a visitor or a leaving person. In the monitoring area 570, a locus with start position 920 and end position 930 is determined to be a visitor, and a locus with start position 960 and end position 970 is determined to be a leaving person.
[0064] Mode 2 applies to the monitoring area 570 including the entrance while the store is closed. First, it is checked whether the movement locus has an end position (1610). If it does (Y), it is checked whether that position is in the entrance area 540 (1620); if it is, the person is determined to be an intruder (1640). This locus corresponds to a visitor in mode 1.
[0065] On the other hand, if the locus has no end position (N), it is checked whether the current position is in or near the entrance area 540 (1650). If so (Y), it is checked whether this state has been detected m times within a predetermined time (1660). If it has been detected m times, the person is determined to be an intruder (1640); if fewer than m times, an intruder candidate (1670). The predetermined time in step 1660 is set, for example, to several tens of seconds up to one or two minutes, depending on the importance of the monitored facility. The value m is set, for example, to about 2 to 4 for facilities such as stores and buildings, and to about 1 to 2 for critical facilities.
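The candidate-to-intruder escalation of steps 1650-1670 can be sketched as a count of detections inside a sliding time window; the window bookkeeping shown here is an assumed implementation detail, not prescribed by the text.

```python
def judge_near_entrance(times, now, m=3, window=60.0):
    """One detection of a person near the entrance area while closed.

    times  : previous detection timestamps in seconds (hypothetical state)
    now    : current timestamp in seconds
    Returns (judgement, updated timestamp list).
    """
    # keep only detections inside the predetermined time, then add this one
    times = [t for t in times if now - t <= window] + [now]
    if len(times) >= m:                      # step 1660: m detections reached
        return "intruder", times             # step 1640
    return "intruder candidate", times       # step 1670
```

With m = 3 and a 60-second window, a person repeatedly seen near the closed entrance is escalated on the third detection.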
[0066] Mode 3 applies to the monitoring area 560, which does not include the entrance, while the store is open. First, it is checked whether the movement locus has an end position (1700). If it does (Y), it is checked whether that position is near the monitoring line 550 (1710); if near, the person is determined to be an intruder (1720), otherwise a moving person (1750).
[0067] On the other hand, if the locus has no end position (N), it is checked whether the current position is close to the monitoring line 550 (1730). If close (Y), the person is determined to be lingering around the building, for example a worker (1740). Alternatively, the person may be determined to be a worker only when the lingering is detected continuously for a certain time or longer. The movement locus of such a lingering person or worker resembles that of an intruder candidate in mode 2; while the store is open, however, a person lingering around the building is not determined to be an intruder or an intruder candidate.
[0068] FIG. 16 shows the movement locus of a moving person under mode 3. When there is an end position 1150 in the monitoring area 560 away from the monitoring line 550, and the start position is not near the entrance area 540 or the monitoring line 550, the person is determined to be a moving person. The same holds in the other modes.
[0069] Mode 4 applies to the monitoring area 560, which does not include the entrance, while the store is closed. First, it is checked whether the movement locus has an end position (1800). If it does (Y), it is checked whether that position is near the monitoring line 550 (1810); if near, the person is determined to be an intruder (1820), otherwise a moving person (1830).
[0070] On the other hand, if the locus has no end position (N), it is checked whether the current position is close to the monitoring line 550 (1840). If close (Y), it is checked whether this state has been detected m/2 times within a predetermined time (1850). If it has been detected m/2 times, the person is determined to be an intruder; if fewer, an intruder candidate (1860).
[0071] The mode 4 algorithm of this example is basically the same as that of mode 2, except that the reference position for determination changes from the entrance area to the monitoring line. However, since authorized persons with business after closing time may linger near the entrance, and since the vicinity of the entrance is generally the most strictly protected, mode 4 uses an algorithm that switches the determination from intruder candidate to intruder in a shorter time.
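The contrast between modes 3 and 4 for the same locus (lingering near the monitoring line with no end position) can be sketched as follows; the halved threshold m/2 implements the faster escalation described above, and the function signature is an assumed simplification.

```python
def judge_near_line(env, detections, m=4):
    """Judge a person lingering near the monitoring line 550 with no end point.

    env        : "open" or "closed" (monitoring environment 730)
    detections : detections so far within the predetermined time
    """
    if env == "open":
        # mode 3, step 1740: lingering near the line is tolerated while open
        return "lingering person (e.g. worker)"
    # mode 4, step 1850: escalate after m/2 detections, half the mode 2 count
    return "intruder" if detections >= m // 2 else "intruder candidate"
```

The same trajectory is thus labeled a worker by day and an intruder candidate, then an intruder, by night.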
[0072] FIG. 17 shows the movement loci of an intruder candidate and of an intruder in mode 4. When a locus in the monitoring area 560 travels from start position 1020 through positions such as 1030 and reaches position 1040 near the monitoring line 550, the person is determined to be an intruder candidate. When the locus then disappears from area 560 via end position 1050, the person is determined to be an intruder. If, instead, the locus does not disappear but lingers or prowls near the monitoring line 550 at positions such as 1060, 1070, and 1080, the person is determined to be an intruder candidate at each detection, and an intruder once the predetermined number of detections is exceeded.
[0073] In the above processing, data such as the movement locus under analysis and the number of detections, together with the final determination result, are stored in the data storage unit 1200. When the determination result is an intruder or an intruder candidate, it is displayed on the display device 1400 via the display control unit 1300.
[0074] FIG. 18 shows an example of an intruder monitoring screen produced by the display control unit. The display device 1400 shows user information 1480 such as a user ID, an abnormality notification 1490 such as abnormality information or an alarm, and an abnormality message 1495 such as "intruder present". In addition, the display control unit 1300 displays the intruder's movement locus on screen, from the start position 1020 to the end position 1050 just before disappearance. Since monitoring is thus accompanied by a real-time screen display, it is easy for the monitoring staff to make a judgment.
[0075] As described above, the image monitoring system of this embodiment sequentially captures grayscale images of the monitored region, extracts persons from the feature quantities obtained through image processing such as binarization, determines each person's behavior from the movement locus with a predetermined algorithm, and thereby manages visitors and monitors illegal intruders. In this behavior determination, the processing mode (determination algorithm) adapted to the monitoring environment and monitoring area is switched automatically in accordance with the monitoring time. The monitoring environment, for example the operating (open) or idle (closed) state of the monitoring target, can be selected automatically based on the current time.
[0076] As a result, different behavior determinations become possible for the same movement locus depending on the monitoring area and the monitoring environment. For instance, when the monitoring target is a store, the locus of a person entering or leaving through the entrance while the store is open is determined to be an intruder while it is closed. Likewise, a person lingering or prowling around the building is determined to be a worker or the like while the store is open, but an intruder candidate or an intruder while it is closed.
[0077] [Effects of the Invention] According to the present invention, since only the determination processing mode is switched automatically according to the monitoring conditions, based on the movement loci of persons extracted by the image processing apparatus, determination algorithms can be adopted for multiple purposes, such as visitor management and intruder monitoring, and at various monitoring levels. This further improves the functionality of the image monitoring system and makes centralized management easier.
[0078] Furthermore, since a plurality of monitoring areas under different conditions can be imaged with one or more cameras and thereafter handled by the same system for image processing and monitoring processing, the system configuration is simple and inexpensive.
[FIG. 1] Functional block diagram of an image monitoring system according to an embodiment of the present invention.
[FIG. 2] Functional block diagram showing the configuration of the image input unit.
[FIG. 3] Explanatory diagram showing an example of monitoring region settings.
[FIG. 4] Functional block diagram showing the configuration of the person extraction unit.
[FIG. 5] Explanatory diagram showing how the binarization threshold is calculated.
[FIG. 6] Functional block diagram showing the configuration of the binarization processing unit.
[FIG. 7] Functional block diagram showing the configuration of the feature quantity calculation unit.
[FIG. 8] Explanatory diagram showing how the moving direction is calculated the first time.
[FIG. 9] Explanatory diagram showing how the moving direction is calculated from the second time onward.
[FIG. 10] Flowchart showing the person extraction procedure.
[FIG. 11] Functional block diagram showing the configuration of the person monitoring unit.
[FIG. 12] Explanatory diagram showing the format and settings of the monitoring mode table.
[FIG. 13] Flowchart showing the monitoring mode switching procedure.
[FIG. 14] Flowchart showing the plurality of processing modes for person behavior determination.
[FIG. 15] Explanatory diagram showing the movement loci of a visitor and a leaving person.
[FIG. 16] Explanatory diagram showing the movement locus of a moving person.
[FIG. 17] Explanatory diagram showing the movement loci of an intruder candidate and an intruder.
[FIG. 18] Explanatory diagram showing an example of the screen display when an intruder is detected.
100 ... ITV camera, 200 ... image input unit, 300 ... person extraction unit, 500 ... monitoring area setting unit, 540 ... entrance area, 550 ... monitoring line, 560 ... monitoring area (building periphery not including the entrance), 570 ... monitoring area (including the entrance), 600 ... person monitoring unit, 603 ... movement-locus calculation unit, 613 ... monitoring mode switching unit, 623 ... monitoring processing unit, 700 ... monitoring condition setting unit, 705 ... monitoring mode table, 720 ... monitoring region, 730 ... monitoring environment, 740 ... processing mode, 1200 ... data storage unit, 1300 ... display control unit, 1400 ... display device.
Continuation of the front page: (51) Int.Cl.6 identification codes — H04N 7/18; G06F 15/62 380
Claims (12)
1. An image monitoring method for monitoring the behavior of a person with respect to a monitoring target, based on the movement locus of the person obtained by capturing images of the surroundings of the monitoring target, such as a building, in time series and extracting the person through predetermined image processing, wherein a time-dependent environment of the monitoring target, such as activity or inactivity, is determined from the current time, a processing mode adapted to that environment is selected, and the behavior of the person on the movement locus is determined.
2. An image monitoring method for monitoring the behavior of a person with respect to a monitoring target, based on the movement locus of the person obtained by capturing images of the surroundings of the monitoring target, such as a building, in time series for each monitoring area and extracting the person through predetermined image processing, wherein a processing mode adapted to each monitoring area is selected and the behavior of the person on the movement locus is determined.
3. An image monitoring method in which images of the surroundings of a monitoring target, such as a building, are captured in time series for each monitoring area, a person is extracted from predetermined feature quantities calculated through image processing such as binarization, and the behavior of the person with respect to the monitoring target is monitored from the person's movement locus, wherein the behavior of the person on the movement locus is determined online while switching between processing modes set in correspondence with the monitoring areas and time-dependent monitoring conditions.
4. The image monitoring method according to claim 3, wherein, when the monitoring area includes the vicinity of an entrance of the monitoring target and an open/closed environment of the monitoring target is set in the monitoring conditions, a movement locus determined to be a "visitor" or "leaving person" in the processing mode corresponding to the open environment is determined to be an "intruder" in the processing mode corresponding to the closed environment.
5. An intruder monitoring method according to claim 3, wherein an open/closed environment of the monitoring target is set in the monitoring conditions; when the monitoring area includes the vicinity of the entrance of the monitoring target, and the movement locus moves toward the entrance with its end position falling near the entrance, the person is determined to be a "visitor" in the processing mode corresponding to the open environment and an "intruder" in the processing mode corresponding to the closed environment; and when the monitoring area includes the periphery of the monitoring target other than the vicinity of the entrance, and the movement locus moves toward the monitoring target with its end position falling near a preset monitoring line of the monitoring target, the person is determined to be an "intruder" in the processing modes corresponding to both the open and closed environments.
When the end point position is near the preset monitoring line of the monitoring target, the processing modes corresponding to the open and closed environments are both determined as "intruder", and the intruder monitoring method is characterized. .
を含み、前記監視条件に監視対象の開/閉の環境が設定
されている場合に、前記開の環境に対応する処理モード
で「滞留者」と判定される移動軌跡は、前記閉の環境に
対応する処理モードで「侵入者候補」と判定されること
を特徴とする画像監視方法。6. The monitoring area according to claim 3, wherein the monitoring area includes a periphery other than the vicinity of the entrance / exit of the monitoring target, and an open / close environment of the monitoring target is set in the monitoring condition, The image monitoring method, wherein the movement locus determined to be "resident" in the processing mode corresponding to the open environment is determined to be "intruder candidate" in the processing mode corresponding to the closed environment.
7. The image monitoring method according to claim 3, 4 or 5, wherein an open/closed environment of the monitoring target is set in the monitoring conditions, and, when the monitoring area includes the periphery of the monitoring target other than the vicinity of the entrance, a movement locus that is close to a monitoring line preset in the monitoring area and shows no end position is determined to be a "lingering person" in the processing mode corresponding to the open environment, and an "intruder candidate" in the processing mode corresponding to the closed environment.
8. The image monitoring method according to claim 6, wherein, when detection of the "intruder candidate" consecutively exceeds a predetermined number of times, the person is determined to be the "intruder".
9. The image monitoring method according to claim 3, wherein, when the monitoring target is a store, the open environment is the state in which the store is open and the closed environment is the state in which the store is closed.
10. An image monitoring system comprising: a television camera that periodically captures images of the surroundings of a monitored object such as a building; and an image monitoring apparatus having image processing means for taking in the captured images in time series for each monitoring area and performing predetermined image processing, person extraction means for extracting a person from predetermined feature quantities calculated through the image processing, and person monitoring means for determining the behavior of the person with respect to the monitored object from the movement locus of the person, the system further comprising a monitoring mode table in which processing modes adapted to combinations of a time-dependent monitoring condition and the monitoring areas are set in advance, wherein the person monitoring means determines the behavior of the person on the movement locus according to the processing mode selected from the monitoring mode table on the basis of the monitoring condition obtained from the current time and the monitoring area of the movement locus.
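Claim 10's monitoring mode table maps a time-derived monitoring condition and a monitoring area to a processing mode. A minimal sketch of that lookup, assuming hypothetical area names, mode numbers, and store opening hours (none of which appear in the patent):

```python
from datetime import datetime

# Hypothetical monitoring mode table: the processing mode is selected by the
# combination of the monitoring environment (derived from the current time)
# and the monitoring area, as described in claim 10.
MONITORING_MODE_TABLE = {
    ("open",   "entrance_area"):  1,  # e.g. count persons entering the store
    ("open",   "perimeter_area"): 2,
    ("closed", "entrance_area"):  3,
    ("closed", "perimeter_area"): 4,  # e.g. treat any presence as intrusion
}

def monitoring_environment(now: datetime,
                           open_hour: int = 9, close_hour: int = 21) -> str:
    """Derive the open/closed environment from the current time (store case)."""
    return "open" if open_hour <= now.hour < close_hour else "closed"

def select_processing_mode(now: datetime, area: str) -> int:
    """Look up the processing mode for the current time and monitoring area."""
    env = monitoring_environment(now)
    return MONITORING_MODE_TABLE[(env, area)]
```

The same movement locus is then judged under whichever mode the table returns, so that, for example, a person crossing the entrance area is an ordinary visitor while the store is open but an intruder after closing.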
11. The image monitoring system according to claim 10, further comprising monitoring area setting means for setting one or more monitoring areas for each of one or more television cameras.
12. The image monitoring system according to claim 10 or 11, further comprising display means for displaying the determination result and the corresponding movement locus when the person monitoring means determines "intrusion" or "intrusion candidate".
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP8147190A JPH09330415A (en) | 1996-06-10 | 1996-06-10 | Image monitoring method and image monitoring system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP8147190A JPH09330415A (en) | 1996-06-10 | 1996-06-10 | Image monitoring method and image monitoring system |
Publications (1)
Publication Number | Publication Date |
---|---|
JPH09330415A true JPH09330415A (en) | 1997-12-22 |
Family
ID=15424609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP8147190A Pending JPH09330415A (en) | 1996-06-10 | 1996-06-10 | Image monitoring method and image monitoring system |
Country Status (1)
Country | Link |
---|---|
JP (1) | JPH09330415A (en) |
Cited By (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8917169B2 (en) | 1993-02-26 | 2014-12-23 | Magna Electronics Inc. | Vehicular vision system |
US8993951B2 (en) | 1996-03-25 | 2015-03-31 | Magna Electronics Inc. | Driver assistance system for a vehicle |
US9436880B2 (en) | 1999-08-12 | 2016-09-06 | Magna Electronics Inc. | Vehicle vision system |
JP2001320698A (en) * | 2000-05-12 | 2001-11-16 | Nippon Signal Co Ltd:The | Image type monitoring method, and image type monitoring device and safety system using it |
JP2002010240A (en) * | 2000-06-21 | 2002-01-11 | Matsushita Electric Ind Co Ltd | Monitoring system |
US9378632B2 (en) | 2000-10-24 | 2016-06-28 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US10347101B2 (en) | 2000-10-24 | 2019-07-09 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US10645350B2 (en) | 2000-10-24 | 2020-05-05 | Avigilon Fortress Corporation | Video analytic rule detection system and method |
US10026285B2 (en) | 2000-10-24 | 2018-07-17 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
JP2002342864A (en) * | 2001-03-16 | 2002-11-29 | Matsushita Electric Works Ltd | Monitoring system |
US9020261B2 (en) | 2001-03-23 | 2015-04-28 | Avigilon Fortress Corporation | Video segmentation using statistical pixel modeling |
JP2002335501A (en) * | 2001-05-10 | 2002-11-22 | Mitsubishi Electric Corp | Portable display device |
JP2003061076A (en) * | 2001-08-10 | 2003-02-28 | Sogo Keibi Hosho Co Ltd | Supervisory device, supervisory method and program for making computer execute the method |
US9892606B2 (en) | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
JP2003296464A (en) * | 2002-04-05 | 2003-10-17 | Dainippon Printing Co Ltd | Medical care monitoring service system |
US9643605B2 (en) | 2002-05-03 | 2017-05-09 | Magna Electronics Inc. | Vision system for vehicle |
US10118618B2 (en) | 2002-05-03 | 2018-11-06 | Magna Electronics Inc. | Vehicular control system using cameras and radar sensor |
US10351135B2 (en) | 2002-05-03 | 2019-07-16 | Magna Electronics Inc. | Vehicular control system using cameras and radar sensor |
US10683008B2 (en) | 2002-05-03 | 2020-06-16 | Magna Electronics Inc. | Vehicular driving assist system using forward-viewing camera |
US9834216B2 (en) | 2002-05-03 | 2017-12-05 | Magna Electronics Inc. | Vehicular control system using cameras and radar sensor |
US11203340B2 (en) | 2002-05-03 | 2021-12-21 | Magna Electronics Inc. | Vehicular vision system using side-viewing camera |
US9555803B2 (en) | 2002-05-03 | 2017-01-31 | Magna Electronics Inc. | Driver assistance system for vehicle |
US9171217B2 (en) | 2002-05-03 | 2015-10-27 | Magna Electronics Inc. | Vision system for vehicle |
JP2004328622A (en) * | 2003-04-28 | 2004-11-18 | Matsushita Electric Ind Co Ltd | Behavior pattern identification device |
US7496212B2 (en) | 2003-05-16 | 2009-02-24 | Hitachi Kokusai Electric Inc. | Change detecting method and apparatus |
JP2005025313A (en) * | 2003-06-30 | 2005-01-27 | Secom Co Ltd | Resident detection system |
JP2005128815A (en) * | 2003-10-24 | 2005-05-19 | Matsushita Electric Ind Co Ltd | Person detection device and person detection method |
JP2005275912A (en) * | 2004-03-25 | 2005-10-06 | Institute Of Physical & Chemical Research | Behavior analysis method and system |
US10462426B2 (en) | 2004-04-15 | 2019-10-29 | Magna Electronics Inc. | Vehicular control system |
US9008369B2 (en) | 2004-04-15 | 2015-04-14 | Magna Electronics Inc. | Vision system for vehicle |
US9609289B2 (en) | 2004-04-15 | 2017-03-28 | Magna Electronics Inc. | Vision system for vehicle |
US10110860B1 (en) | 2004-04-15 | 2018-10-23 | Magna Electronics Inc. | Vehicular control system |
US9948904B2 (en) | 2004-04-15 | 2018-04-17 | Magna Electronics Inc. | Vision system for vehicle |
US10187615B1 (en) | 2004-04-15 | 2019-01-22 | Magna Electronics Inc. | Vehicular control system |
US9736435B2 (en) | 2004-04-15 | 2017-08-15 | Magna Electronics Inc. | Vision system for vehicle |
US9428192B2 (en) | 2004-04-15 | 2016-08-30 | Magna Electronics Inc. | Vision system for vehicle |
US10306190B1 (en) | 2004-04-15 | 2019-05-28 | Magna Electronics Inc. | Vehicular control system |
US10015452B1 (en) | 2004-04-15 | 2018-07-03 | Magna Electronics Inc. | Vehicular control system |
US9191634B2 (en) | 2004-04-15 | 2015-11-17 | Magna Electronics Inc. | Vision system for vehicle |
JP2006146378A (en) * | 2004-11-17 | 2006-06-08 | Hitachi Ltd | Surveillance system using multiple cameras |
JP2006245795A (en) * | 2005-03-01 | 2006-09-14 | Toa Corp | Moving object tracking device and moving object tracking display device |
US7812723B2 (en) | 2005-12-28 | 2010-10-12 | Mitsubishi Electric Corporation | Intruder detection system |
WO2007138811A1 (en) * | 2006-05-31 | 2007-12-06 | Nec Corporation | Device and method for detecting suspicious activity, program, and recording medium |
US7973656B2 (en) | 2006-05-31 | 2011-07-05 | Nec Corporation | Suspicious activity detection apparatus and method, and program and recording medium |
JP2008016897A (en) * | 2006-06-30 | 2008-01-24 | Sony Corp | Monitoring apparatus, monitoring system and monitoring method |
US10071676B2 (en) | 2006-08-11 | 2018-09-11 | Magna Electronics Inc. | Vision system for vehicle |
US10787116B2 (en) | 2006-08-11 | 2020-09-29 | Magna Electronics Inc. | Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera |
US11148583B2 (en) | 2006-08-11 | 2021-10-19 | Magna Electronics Inc. | Vehicular forward viewing image capture system |
US11396257B2 (en) | 2006-08-11 | 2022-07-26 | Magna Electronics Inc. | Vehicular forward viewing image capture system |
US11623559B2 (en) | 2006-08-11 | 2023-04-11 | Magna Electronics Inc. | Vehicular forward viewing image capture system |
US11951900B2 (en) | 2006-08-11 | 2024-04-09 | Magna Electronics Inc. | Vehicular forward viewing image capture system |
JP2008165341A (en) * | 2006-12-27 | 2008-07-17 | Giken Torasutemu Kk | Person movement path recognition apparatus |
WO2008111459A1 (en) * | 2007-03-06 | 2008-09-18 | Kabushiki Kaisha Toshiba | Suspicious behavior detection system and method |
JP2008217602A (en) * | 2007-03-06 | 2008-09-18 | Toshiba Corp | Suspicious behavior detection system and method |
JP2009152748A (en) * | 2007-12-19 | 2009-07-09 | Mitsubishi Electric Corp | Image processing device for abandoned object monitoring |
WO2009096208A1 (en) * | 2008-01-31 | 2009-08-06 | Nec Corporation | Object recognition system, object recognition method, and object recognition program |
JP2009296087A (en) * | 2008-06-03 | 2009-12-17 | Keyence Corp | Area monitoring sensor |
JP2009294974A (en) * | 2008-06-06 | 2009-12-17 | Meidensha Corp | Intruder detection device by image processing |
JP2009295063A (en) * | 2008-06-09 | 2009-12-17 | Meidensha Corp | Intrusion object detection device |
JP2010044641A (en) * | 2008-08-14 | 2010-02-25 | Toshiba Corp | Ultrasonic diagnostic apparatus, ultrasonic image processor and ultrasonic image processing program |
US8858442B2 (en) | 2008-08-14 | 2014-10-14 | Kabushiki Kaisha Toshiba | Ultrasonic diagnostic apparatus and ultrasonic image processing apparatus |
JP2010124186A (en) * | 2008-11-19 | 2010-06-03 | Laurel Bank Mach Co Ltd | Monitoring camera system |
JP2010204719A (en) * | 2009-02-27 | 2010-09-16 | Toshiba Corp | Entrance management system, entrance management device, and entrance management method |
JP2010233185A (en) * | 2009-03-30 | 2010-10-14 | Saxa Inc | Apparatus, system and program for processing image |
JP2011013827A (en) * | 2009-06-30 | 2011-01-20 | Secom Co Ltd | Security system and sensor terminal |
JP2011013828A (en) * | 2009-06-30 | 2011-01-20 | Secom Co Ltd | Security system |
JP2011028585A (en) * | 2009-07-27 | 2011-02-10 | Secom Co Ltd | Security system |
JP2011159041A (en) * | 2010-01-29 | 2011-08-18 | Secom Co Ltd | Security system |
JP2011191987A (en) * | 2010-03-15 | 2011-09-29 | Omron Corp | Reporting apparatus, and reporting system |
JP2012215983A (en) * | 2011-03-31 | 2012-11-08 | Secom Co Ltd | Intrusion detecting device |
JP2012128877A (en) * | 2012-03-19 | 2012-07-05 | Toshiba Corp | Suspicious behavior detection system and method |
JP2014002502A (en) * | 2012-06-18 | 2014-01-09 | Dainippon Printing Co Ltd | Stretched-out hand detector, stretched-out hand detecting method and program |
JP2015018340A (en) * | 2013-07-09 | 2015-01-29 | キヤノン株式会社 | Image processing apparatus and image processing method |
US10148914B2 (en) | 2014-03-03 | 2018-12-04 | Vsk Electronics Nv | Intrusion detection with directional sensing |
US9237315B2 (en) | 2014-03-03 | 2016-01-12 | Vsk Electronics Nv | Intrusion detection with directional sensing |
US10474918B2 (en) | 2015-02-26 | 2019-11-12 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium |
JP2016163075A (en) * | 2015-02-26 | 2016-09-05 | キヤノン株式会社 | Video processing device, video processing method, and program |
JPWO2016147770A1 (en) * | 2015-03-19 | 2017-12-28 | 日本電気株式会社 | Monitoring system and monitoring method |
US10497131B2 (en) | 2015-03-23 | 2019-12-03 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
JPWO2016152196A1 (en) * | 2015-03-23 | 2018-01-11 | 日本電気株式会社 | Monitoring device, monitoring system, monitoring method, and computer-readable recording medium |
US11842499B2 (en) | 2015-03-23 | 2023-12-12 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
US10957052B2 (en) | 2015-03-23 | 2021-03-23 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
JP2019200705A (en) * | 2018-05-18 | 2019-11-21 | Kddi株式会社 | Visitor specification device, visitor specification method, and visitor specification program |
JPWO2020148890A1 (en) * | 2019-01-18 | 2021-10-14 | 日本電気株式会社 | Information processing device |
US11893797B2 (en) | 2019-01-18 | 2024-02-06 | Nec Corporation | Information processing device |
WO2020148890A1 (en) * | 2019-01-18 | 2020-07-23 | 日本電気株式会社 | Information processing device |
US12062239B2 (en) | 2019-01-18 | 2024-08-13 | Nec Corporation | Information processing device |
US12205377B2 (en) | 2019-01-18 | 2025-01-21 | Nec Corporation | Information processing device |
US12272146B2 (en) | 2019-01-18 | 2025-04-08 | Nec Corporation | Information processing device |
US12277774B2 (en) | 2019-01-18 | 2025-04-15 | Nec Corporation | Information processing device |
US12307777B2 (en) | 2019-01-18 | 2025-05-20 | Nec Corporation | Information processing device |
CN113595746B (en) * | 2021-07-16 | 2023-07-11 | 广州市瀚云信息技术有限公司 | Control method and device for power supply of shielding device |
CN113595746A (en) * | 2021-07-16 | 2021-11-02 | 广州市瀚云信息技术有限公司 | Control method and device for power supply of shielding device |
WO2024127621A1 (en) * | 2022-12-16 | 2024-06-20 | オプテックス株式会社 | Security sensor unit, security sensor unit control method, control program, and recording medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JPH09330415A (en) | Image monitoring method and image monitoring system | |
CN110222640B (en) | Method, device and method for identifying suspect in monitoring site and storage medium | |
US7646401B2 (en) | Video-based passback event detection | |
US7542588B2 (en) | System and method for assuring high resolution imaging of distinctive characteristics of a moving object | |
EP1346577B1 (en) | Method and apparatus to select the best video frame to transmit to a remote station for cctv based residential security monitoring | |
WO2018096787A1 (en) | Person's behavior monitoring device and person's behavior monitoring system | |
US20170053191A1 (en) | Image analysis system, image analysis method, and storage medium | |
JP3924171B2 (en) | Monitoring device for identifying monitoring objects | |
US20040240542A1 (en) | Method and apparatus for video frame sequence-based object tracking | |
JP2009017416A (en) | Monitoring device, monitoring method and program | |
JPH0337354B2 (en) | ||
US20180075307A1 (en) | Scan face of video feed | |
JP2006252248A (en) | Trespasser detecting system by image processing | |
JPH09224237A (en) | Image monitoring system | |
NO329869B1 (en) | Method and system for monitoring a predetermined area | |
JP3361399B2 (en) | Obstacle detection method and device | |
CN118470905B (en) | Personnel falling monitoring system and method | |
JP6000762B2 (en) | Image monitoring device | |
JP6785437B2 (en) | Human behavior monitoring device and human behavior monitoring system | |
JPH0737064A (en) | Method and device for detecting intruding object | |
JPH0460880A (en) | Moving body discrimination and analysis controlling system | |
JPH0793558A (en) | Image monitoring device | |
JP2005284652A (en) | Video monitoring method and apparatus using moving vector | |
JPH0869523A (en) | Human body recognition device | |
JP2003009137A (en) | Supervising system |