
JPH09259278A - Image processing device - Google Patents

Image processing device

Info

Publication number
JPH09259278A
Authority
JP
Japan
Prior art keywords
image
target object
illumination
unit
processing apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP6773396A
Other languages
Japanese (ja)
Inventor
Michiyo Moriya
みち代 森家
Taro Imagawa
太郎 今川
Susumu Maruno
進 丸野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP6773396A priority Critical patent/JPH09259278A/en
Publication of JPH09259278A publication Critical patent/JPH09259278A/en
Pending legal-status Critical Current


Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

(57)【要約】 【課題】 自然画像から対象物体を抽出する際に、照明
を用いて背景と対象物体を分離しやすくし、対象物体の
みを精度良く抽出することを目的とする。 【解決手段】 撮像手段2は照明手段1が対象物体とそ
の背景に光を照射したときの画像と通常の画像とを撮像
して各々画像記憶手段3に記憶させる。照明手段1と撮
像手段2の制御は機器制御手段3が行う。画像合成手段
5は画像記憶手段4が記憶する2枚の画像の差分画像を
求め、物体抽出手段6はしきい値処理により物体抽出手
段5で求めた差分画像から対象物体の領域を抽出する。
同時にしきい値処理結果に基づいて機器制御手段3の制
御方法を変更し、より良好な画像が得られるよう調整す
る。このように、しきい値処理を行うことにより、撮像
条件により生じるノイズを抽出せず自然背景下の対象領
域を精度良く抽出できる。
(57) Abstract: When extracting a target object from a natural image, it is an object of the present invention to facilitate the separation of the background and the target object by using illumination and to accurately extract only the target object. An image pickup means (2) picks up an image of a target object and its background illuminated with light and a normal image and stores them in an image storage means (3). The device control means 3 controls the illumination means 1 and the imaging means 2. The image synthesizing unit 5 obtains a difference image between the two images stored in the image storing unit 4, and the object extracting unit 6 extracts the area of the target object from the difference image obtained by the object extracting unit 5 by threshold processing.
At the same time, the control method of the device control means 3 is changed based on the result of the threshold processing, and adjustment is performed so that a better image can be obtained. In this way, by performing the threshold processing, it is possible to accurately extract the target region under the natural background without extracting noise caused by the imaging conditions.

Description

Detailed Description of the Invention

[0001]

[Technical Field of the Invention] The present invention relates to an image processing apparatus having a processing function for extracting an object from an image.

[0002]

[Description of the Related Art] Conventionally, in order to extract only a target object from an image containing both the background and the target object (the target image), a monochromatic background such as black or blue is used and object extraction processing such as threshold processing on luminance or color information, or edge extraction processing, is applied (conventional example 1). As a method of extracting a target object without using a special background, a background image that does not contain the target object is used; the difference between the target image and the background image is computed, and object extraction processing is then applied (conventional example 2).

[0003]

[Problems to Be Solved by the Invention] Conventional example 1 requires a special background, so the places where it can be used are limited. In conventional example 2, changes in the imaging conditions between capturing the background image and capturing the target image have a large influence, so noise caused by the difference in imaging conditions is extracted along with the target object.

[0004] To solve these problems, it is an object of the present invention to make the background and the target object easier to separate by using illumination, and to extract the target object under a natural background with high accuracy.

[0005]

[Means for Solving the Problems] To solve this problem, the present invention comprises illumination means for irradiating a target object and its background with light, imaging means for imaging the target object and the background, device control means for controlling the illumination means and the imaging means, image storage means for storing images captured by the imaging means, image synthesizing means for combining at least two pieces of image information stored by the image storage means, and object extraction means for extracting the target object from the image information obtained by the image synthesizing means.

[0006]

BEST MODE FOR CARRYING OUT THE INVENTION

(First Embodiment) A first embodiment of the present invention will be described below with reference to FIG. 1. FIG. 1 is a block diagram of one embodiment of the present invention. As shown in FIG. 1, the apparatus comprises illumination means 1 for irradiating a target object and its background with light, imaging means 2 for imaging the target object and the background, device control means 3 for controlling the illumination means 1 and the imaging means 2, image storage means 4 for storing at least two images captured by the imaging means 2, image synthesizing means 5 for combining the images stored in the image storage means 4, and object extraction means 6 for extracting the target object from the composite image obtained by the image synthesizing means 5. The device control means 3 comprises an illumination control unit 31 that controls the brightness of the illumination means 1, and an imaging control unit 32 that controls imaging conditions such as the number of images captured per unit time and the shutter speed.

[0007] FIG. 2 is a block diagram of one embodiment of the illumination control unit 31 of FIG. 1. In FIG. 2, the illumination control unit 31 comprises a switch 311 that switches the illumination means 1 on and off, and a switch control unit 312 that controls the switch 311.

[0008] The operation of the image processing apparatus configured as described above will now be described. First, the illumination means 1 irradiates the target object and its background with light, and the imaging means 2 images the illuminated target object and background and sends the image to the image storage means 4. As shown in FIG. 4, the illumination means 1 emits light in substantially the same direction as the direction in which the imaging means 2 views the target object. An example of an image captured by the imaging means 2 at this time is shown in FIG. 5.

[0009] Since the distance from the imaging means 2 to the background is large compared with the distance from the imaging means 2 to the target object, the amount of light that reaches the background from the illumination means 1 is smaller than the amount that reaches the target object; as a result, the target object is illuminated more brightly than the background. Therefore, compared with the normal image shown in FIG. 5(a), an image captured while the illumination means 1 is emitting light emphasizes only the brightness of the target object, as shown in FIG. 5(b).

[0010] The illumination control unit 31 controls the illumination means 1 and the imaging control unit 32 controls the imaging means 2. When the illumination control unit 31 is configured as in FIG. 2, the switch 311 switches the illumination means 1 on and off, and the imaging control unit 32, following the switching of the switch 311, alternately captures an image while light is being emitted (switch ON), that is, an image in which only the brightness of the target object is emphasized as shown in FIG. 5(b), and an image while no light is being emitted (switch OFF), that is, a normal image as shown in FIG. 5(a), and sends them to the image storage means 4.

[0011] The image storage means 4 stores the image captured while light is being emitted and the image captured while it is not, and sends them to the image synthesizing means 5. The image synthesizing means 5 obtains the difference image between the two images sent from the image storage means 4 and sends it to the object extraction means 6. The object extraction means 6 extracts the region of the target object by applying threshold processing with a certain threshold to the difference image sent from the image synthesizing means 5. At the same time, the control methods of the illumination control unit 31 and the imaging control unit 32 are changed based on the result of the threshold processing and adjusted so that a better image is obtained. For example, when the region extracted by the threshold processing is smaller than a certain amount, the illumination is made stronger. In this way, by computing the difference between an image in which the brightness of the target object region is emphasized by illuminating the target object and a normal image, and applying threshold processing to it, only the target region can be extracted accurately without extracting noise caused by the imaging conditions. Furthermore, since two consecutive images are used, the change in the imaging conditions of the background is small and noise can be reduced.

[0012] In this embodiment the switch 311 turns the power supply of the illumination means 1 on and off, but the same effect can be obtained if the switch 311 instead switches the illumination means 1 between high and low brightness.

[0013] (Second Embodiment) A second embodiment of the present invention will be described below with reference to FIG. 3. In the second embodiment, the imaging control unit 32 of the first embodiment is configured as shown in the block diagram of FIG. 3.

[0014] In FIG. 3, the imaging control unit 32 comprises a luminous flux sensor 321 that detects the amount of light (luminous flux) emitted by the illumination means 1, and an imaging instruction unit 322 that determines the timing at which the imaging means 2 images the target object based on the value measured by the luminous flux sensor 321.

[0015] The operation of the image processing apparatus configured as described above will now be described. As in the first embodiment, the illumination means 1 irradiates the target object and its background with light from in front of the target object, and the imaging means 2 images the target object and its background while they are illuminated by the illumination means 1 and sends the image to the image storage means 4. The illumination control unit 31 controls the illumination means 1 and the imaging control unit 32 controls the imaging means 2; when the imaging control unit 32 is configured as shown in FIG. 3, the illumination control unit 31 controls the illumination means 1 so that it always illuminates the target object and the background. In this case, the illumination means 1 is a light source whose luminous flux varies with time, such as a fluorescent lamp.

[0016] The luminous flux sensor 321 detects the amount of light (luminous flux) emitted by the illumination means 1, and the imaging instruction unit 322 determines, according to the flux value detected by the luminous flux sensor 321, the timing at which the imaging means 2 images the target object and the background, and controls the imaging means 2 according to the determined timing. For example, when the light emitted by the illumination means 1 has the characteristic shown in FIG. 6, the imaging instruction unit 322 instructs the imaging means 2 to capture images at the time Tmax, when the luminous flux reaches its maximum value Fmax, and at the time Tmin, when it reaches its minimum value Fmin.

[0017] By controlling the imaging means 2 in this way, an image captured when the luminous flux of the illumination means 1 is large, that is, an image in which the brightness of the target object is strongly emphasized, and an image captured when the amount of emitted light is small, that is, an image in which the brightness of the target object is hardly emphasized, are captured alternately; the image storage means 4 stores one of each of these two kinds of images and sends them to the image synthesizing means 5. The subsequent processing and the irradiation direction of the illumination means 1 (see FIG. 4) are the same as in the first embodiment.

[0018] In this way, by using the characteristics of the illumination means 1 to capture an image in which the brightness of the target object region is strongly emphasized and an image in which it is hardly emphasized, computing their difference, and applying threshold processing, only the target region can be extracted accurately without extracting noise caused by the imaging conditions. Furthermore, as in the first embodiment, since two consecutive images are used, changes in the imaging conditions of the background and the target region are small and noise can be reduced.

[0019] As described above, since both the first and second embodiments use an image in which only the brightness of the target object is emphasized, the target object can be extracted accurately even against a natural background. When the target object is a human body, using non-visible light for the illumination means 1 avoids dazzling the subject, as normal light would, and thus reduces the burden on the subject. Furthermore, since the direction of the light emitted by the illumination means is substantially the same as the direction in which the imaging means views the target object, the illumination means 1 is unlikely to cast a shadow of the target object, which gives the advantage that the use of illumination has no adverse effect.

[0020]

[Effects of the Invention] As described above, according to the present invention, the target object is irradiated with light and the brightness of the target object region is emphasized, which makes the target object easier to separate from the background and allows the target object under a natural background to be extracted accurately.

[Brief Description of the Drawings]

FIG. 1 is a block diagram showing an embodiment of an image processing apparatus according to the present invention.

FIG. 2 is a block diagram showing a first embodiment of the device control means 3 of FIG. 1.

FIG. 3 is a block diagram showing a second embodiment of the device control means 3 of FIG. 1.

FIG. 4 is a diagram showing an example of an implementation environment of the present invention.

FIGS. 5(a) and 5(b) are diagrams showing an example of images captured by the imaging means 2.

FIG. 6 is a diagram showing an example of the temporal change in the amount of light emitted by the illumination means 1.

[Explanation of Symbols]

1 Illumination means; 2 Imaging means; 3 Device control means; 4 Image storage means; 5 Image synthesizing means; 6 Object extraction means; 31 Illumination control unit; 32 Imaging control unit; 311 Switch; 312 Switch control unit; 321 Luminous flux sensor; 322 Imaging instruction unit

Claims (8)

[Claims]
[Claim 1] An image processing apparatus comprising: illumination means for irradiating a target object and a background with light; imaging means for imaging the target object and the background; device control means for controlling the illumination means and the imaging means; image storage means for storing images captured by the imaging means; image synthesizing means for combining at least two pieces of image information stored by the image storage means; and object extraction means for extracting the target object from the image information obtained by the image synthesizing means.
[Claim 2] The image processing apparatus according to claim 1, wherein the device control means sequentially updates the control method of the devices based on the image information obtained by the image synthesizing means.
[Claim 3] The image processing apparatus according to claim 1 or 2, wherein the device control means comprises an imaging control unit that controls the imaging means and an illumination control unit that controls the brightness of the illumination means.
[Claim 4] The image processing apparatus according to claim 3, wherein the illumination control unit comprises a switch for the illumination means and a switch control unit that controls the switch.
[Claim 5] The image processing apparatus according to claim 4, wherein the switch control unit repeatedly turns the switch on and off in accordance with the number of images captured by the imaging means per unit time, and an image captured with the switch on and an image captured with the switch off are sent to the image storage means.
[Claim 6] The image processing apparatus according to claim 3, wherein the imaging control unit comprises a luminous flux sensor that detects the amount of light emitted by the illumination means, and an imaging instruction unit that determines the timing at which the imaging means images the target object based on the value measured by the luminous flux sensor.
[Claim 7] The image processing apparatus according to claim 1 or 2, wherein the image storage means stores an image sent from the imaging means and a background image captured in advance.
[Claim 8] The image processing apparatus according to claim 1 or 2, wherein the light used by the illumination means is non-visible light.
JP6773396A 1996-03-25 1996-03-25 Image processing device Pending JPH09259278A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP6773396A JPH09259278A (en) 1996-03-25 1996-03-25 Image processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP6773396A JPH09259278A (en) 1996-03-25 1996-03-25 Image processing device

Publications (1)

Publication Number Publication Date
JPH09259278A true JPH09259278A (en) 1997-10-03

Family

ID=13353458

Family Applications (1)

Application Number Title Priority Date Filing Date
JP6773396A Pending JPH09259278A (en) 1996-03-25 1996-03-25 Image processing device

Country Status (1)

Country Link
JP (1) JPH09259278A (en)


Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005176031A (en) * 2003-12-12 2005-06-30 Nissan Motor Co Ltd In-vehicle obstacle detection device
JP2006155483A (en) * 2004-12-01 2006-06-15 Tokai Rika Co Ltd Object detection device
JP2008191760A (en) * 2007-02-01 2008-08-21 Toyota Central R&D Labs Inc Object detection device, object detection method, and program
JP2012531686A (en) * 2009-07-03 2012-12-10 シェンジェン タイシャン オンライン テクノロジー カンパニー リミテッド Target detection method and apparatus, and image collection apparatus
US10699155B2 (en) 2012-01-17 2020-06-30 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US12086327B2 (en) 2012-01-17 2024-09-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US9436998B2 (en) 2012-01-17 2016-09-06 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US12260023B2 (en) 2012-01-17 2025-03-25 Ultrahaptics IP Two Limited Systems and methods for machine control
JP2016186793A (en) * 2012-01-17 2016-10-27 リープ モーション, インコーポレーテッドLeap Motion, Inc. Feature improvement by contrast improvement and optical imaging for object detection
US9495613B2 (en) 2012-01-17 2016-11-15 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging using formed difference images
US11994377B2 (en) 2012-01-17 2024-05-28 Ultrahaptics IP Two Limited Systems and methods of locating a control object appendage in three dimensional (3D) space
US11308711B2 (en) 2012-01-17 2022-04-19 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US11782516B2 (en) 2012-01-17 2023-10-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US9652668B2 (en) 2012-01-17 2017-05-16 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9672441B2 (en) 2012-01-17 2017-06-06 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9741136B2 (en) 2012-01-17 2017-08-22 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9767345B2 (en) 2012-01-17 2017-09-19 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US9778752B2 (en) 2012-01-17 2017-10-03 Leap Motion, Inc. Systems and methods for machine control
US9934580B2 (en) 2012-01-17 2018-04-03 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US11720180B2 (en) 2012-01-17 2023-08-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US10366308B2 (en) 2012-01-17 2019-07-30 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10410411B2 (en) 2012-01-17 2019-09-10 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US10565784B2 (en) 2012-01-17 2020-02-18 Ultrahaptics IP Two Limited Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space
US9626591B2 (en) 2012-01-17 2017-04-18 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
JP2015510169A (en) * 2012-01-17 2015-04-02 リープ モーション, インコーポレーテッドLeap Motion, Inc. Feature improvement by contrast improvement and optical imaging for object detection
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
US10609285B2 (en) 2013-01-07 2020-03-31 Ultrahaptics IP Two Limited Power consumption in motion-capture systems
US10097754B2 (en) 2013-01-08 2018-10-09 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US9626015B2 (en) 2013-01-08 2017-04-18 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US12204695B2 (en) 2013-01-15 2025-01-21 Ultrahaptics IP Two Limited Dynamic, free-space user interactions for machine control
US11874970B2 (en) 2013-01-15 2024-01-16 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US12405673B2 (en) 2013-01-15 2025-09-02 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11693115B2 (en) 2013-03-15 2023-07-04 Ultrahaptics IP Two Limited Determining positional information of an object in space
US12306301B2 (en) 2013-03-15 2025-05-20 Ultrahaptics IP Two Limited Determining positional information of an object in space
US10585193B2 (en) 2013-03-15 2020-03-10 Ultrahaptics IP Two Limited Determining positional information of an object in space
US12333081B2 (en) 2013-04-26 2025-06-17 Ultrahaptics IP Two Limited Interacting with a machine using gestures in first and second user-specific virtual planes
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11776208B2 (en) 2013-08-29 2023-10-03 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US12086935B2 (en) 2013-08-29 2024-09-10 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11461966B1 (en) 2013-08-29 2022-10-04 Ultrahaptics IP Two Limited Determining spans and span lengths of a control object in a free space gesture control environment
US12236528B2 (en) 2013-08-29 2025-02-25 Ultrahaptics IP Two Limited Determining spans and span lengths of a control object in a free space gesture control environment
US11282273B2 (en) 2013-08-29 2022-03-22 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US12242312B2 (en) 2013-10-03 2025-03-04 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US12265761B2 (en) 2013-10-31 2025-04-01 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11868687B2 (en) 2013-10-31 2024-01-09 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
US12314478B2 (en) 2014-05-14 2025-05-27 Ultrahaptics IP Two Limited Systems and methods of tracking moving hands and recognizing gestural interactions
US12154238B2 (en) 2014-05-20 2024-11-26 Ultrahaptics IP Two Limited Wearable augmented reality devices with object detection and tracking
US12095969B2 (en) 2014-08-08 2024-09-17 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US12299207B2 (en) 2015-01-16 2025-05-13 Ultrahaptics IP Two Limited Mode switching for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments

Similar Documents

Publication Publication Date Title
JPH09259278A (en) Image processing device
US8743274B2 (en) In-camera based method of detecting defect eye with high accuracy
US7974467B2 (en) Image-taking system
US7539335B2 (en) Image data processor, computer program product, and electronic endoscope system
CN109561817B (en) Electronic endoscope system
US20110141436A1 (en) Fundus camera and ophthalmologic image processing apparatus
WO2007135735A1 (en) Face authentication device, face authentication method, and face authentication program
US20060241498A1 (en) Image data processor, computer program product, and electronic endoscope system
WO2016170604A1 (en) Endoscope device
JP2011054110A (en) Image processing type measuring instrument and image processing measuring method
JP6695021B2 (en) Lighting equipment
JP2012239757A (en) Medical equipment and medical processor
CN107851382B (en) Lamp detection device and lamp detection method
US7822247B2 (en) Endoscope processor, computer program product, endoscope system, and endoscope image playback apparatus
JP5414379B2 (en) Optical wireless communication apparatus, optical wireless communication method, and program
JP2019527913A (en) Apparatus for providing semantic information and method of operating the same
JP2013026656A (en) Photographing device, photographing method, and photographing program
JPH1093859A (en) Image processing unit and image processing method
CN108351684B (en) Operation detection device, operation detection method and image display system
JP3283131B2 (en) Endoscope device
JP2000050115A (en) Photographing method and photographing device
CN115577726A (en) Image forming apparatus and image forming method
JP2006252363A (en) Vehicle surrounding object detection device and vehicle surrounding object detection method
JP6452355B2 (en) Image capturing apparatus, control method therefor, and program
JP2000275357A (en) Method and apparatus for detecting rainfall and snow conditions