
JPH0583248B2 - Google Patents

Info

Publication number
JPH0583248B2
JPH0583248B2 JP3132850A JP13285091A
Authority
JP
Japan
Prior art keywords
eyeball
head
movement
detection
sight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP3132850A
Other languages
Japanese (ja)
Other versions
JPH04357930A (en)
Inventor
Hidetomo Sakaino
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ei Tei Aaru Tsushin Shisutemu Kenkyusho Kk
Original Assignee
Ei Tei Aaru Tsushin Shisutemu Kenkyusho Kk
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ei Tei Aaru Tsushin Shisutemu Kenkyusho Kk filed Critical Ei Tei Aaru Tsushin Shisutemu Kenkyusho Kk
Priority to JP3132850A priority Critical patent/JPH04357930A/en
Publication of JPH04357930A publication Critical patent/JPH04357930A/en
Publication of JPH0583248B2 publication Critical patent/JPH0583248B2/ja
Granted legal-status Critical Current


Landscapes

  • Eye Examination Apparatus (AREA)
  • Position Input By Displaying (AREA)

Description

[Detailed Description of the Invention]

[0001]

[Industrial Field of Application] The present invention relates to a line-of-sight direction detection device, and in particular to a line-of-sight direction detection device for use in fields that apply information about the direction in which a person looks, that is, the line-of-sight direction, and in interlocking devices.

[0002]

[Prior Art] In technical fields requiring man-machine interfaces or so-called telexistence — capturing information at a remote location by remote-controlling a camera — an important problem is to transmit information on the direction of a person's line of sight to the various control systems and to design a line-of-sight detector that tracks the viewed direction. In particular, stable and sensitive tracking of the line of sight is strongly desired; that is, an optimal filter must be designed.

[0003] FIG. 6 shows an example of a device that detects the line-of-sight direction by a conventional method. Referring to FIG. 6, a self-propelled robot 1 has a camera 2 with six degrees of freedom (left-right, up-down, and left-right rotation) and a manipulator 3, and is guided remotely by commands from a command system 4. In the command system 4, the movements 5 of a person's eyes and head are detected by a head orientation detection unit 7 and an eyeball orientation detection unit 8 mounted on glasses 6, hand movements are detected by an operation unit 9, and the respective detection outputs are supplied to a transmission unit 10. These detection outputs are transmitted from the transmission unit 10 to the self-propelled robot 1 by radio or the like. The self-propelled robot 1 travels according to the detection output of the operation unit 9, the manipulator 3 operates, and images in the direction detected by the detection units 7 and 8 are captured by the camera 2 and transmitted to the command system 4. The captured images are displayed on a monitor 11. Ideally, a stably continuous scene is reproduced, as shown at 12.

[0004] FIG. 7 shows an example of a scene actually reproduced by a conventional method in accordance with eye and head movements. On the monitor 11 shown in FIG. 6, the high-frequency noise contained in the detection outputs of the head orientation detection unit 7 and the eyeball orientation detection unit 8 is removed by low-pass filters, and the detection outputs are displayed as the image of the particular direction being steadily viewed. However, because of the involuntary movements contained in eye movement and their unstable behavior, a scene unrelated to the viewer's intention is reproduced. A stable image can be reproduced by considerably lowering the detection sensitivity of the head orientation detection unit 7 and the eyeball orientation detection unit 8, but a time lag is then perceived between the direction one wants to view and the reproduced image.

[0005] FIG. 8 shows the algorithm of the head and eye detection system using the head orientation detection unit and the eyeball orientation detection unit shown in FIG. 6. Referring to FIG. 8, a detection unit 21 includes the head orientation detection unit 7 and the eyeball orientation detection unit 8 shown in FIG. 6, and detects, as displacement information, the head orientation coordinates (Hx, Hy, Hz) and the eyeball orientation coordinates (Ex, Ey). The noise contained in this displacement information is removed by a filter unit 22, so that each item of displacement information becomes Hx → Hx1, Hy → Hy1, Hz → Hz1, Ex → Ex1, Ey → Ey1. A line-of-sight direction calculation unit 23 combines the filtered head and eye displacement information: since the eye displacement is a displacement relative to the head, and the head displacement is relative to the screen frame, the line-of-sight direction is calculated in the screen frame in a unified way. Each component of the line of sight is transmitted to the self-propelled robot 1 via a transmission unit 24 included in the transmission unit 10 shown in FIG. 6.
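The combination step described above — eye displacement measured relative to the head, head displacement measured relative to the screen frame — can be sketched as follows. The function name and the simple additive small-angle model are assumptions for illustration; the patent states only that the two displacements are combined "in a unified way" in the screen frame.

```python
def gaze_direction(head, eye):
    """Combine filtered head and eye displacements into one gaze
    direction in the screen coordinate frame.

    head: (Hx1, Hy1, Hz1) -- head orientation relative to the screen
    eye:  (Ex1, Ey1)      -- eye orientation relative to the head

    Assumes small angles, so the head-relative eye angles can simply
    be added to the head angles to re-express the gaze in the screen
    frame (an illustrative simplification, not the patent's formula).
    """
    hx, hy, hz = head
    ex, ey = eye
    return (hx + ex, hy + ey)

# Example: head turned 5 deg right, eye 2 deg right and 1 deg up.
print(gaze_direction((5.0, 0.0, 0.0), (2.0, 1.0)))  # (7.0, 1.0)
```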

[0006] FIG. 9 illustrates examples of the waveforms detected from head movement and eye movement. The detection output of the head orientation detection unit 7 for the observer 5 shown in FIG. 6 is the waveform 31 shown in FIG. 9a, and the detection output of the eyeball orientation detection unit 8 is the waveform 34 shown in FIG. 9b. Passing these detection outputs through low-pass filters 32 and 35 yields the smoother waveforms 33 and 36. The filtered head orientation waveform 33 becomes considerably smooth, but there is still room for smoothing the eye orientation waveform 36. To achieve stable and sensitive tracking, filtering that takes into account the behavior of the eyes and the head is required.
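A minimal stand-in for the low-pass filters 32 and 35 is a first-order exponential smoother. The smoothing constant `alpha` is a hypothetical parameter: smaller values smooth more but add more lag, which is exactly the sensitivity-versus-delay trade-off described in paragraph [0004].

```python
def low_pass(samples, alpha=0.2):
    """First-order (exponential) low-pass filter: a sketch of the
    role played by filters 32 and 35 in FIG. 9, not the patent's
    actual filter design."""
    out = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)   # y[n] = y[n-1] + alpha*(x[n] - y[n-1])
        out.append(y)
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]  # high-frequency jitter
print(low_pass(noisy))                  # a much flatter sequence
```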

[0007]

[Problems to Be Solved by the Invention] As described above, because the conventional line-of-sight detection method simply provides the low-pass filters 32 and 35 shown in FIG. 9, the unstable eye trajectory signal is smoothed to some extent, but the error between the direction the person actually observes and the direction output by the detector is large; moreover, because the high-frequency components are removed, the sensitivity is lowered and visual discomfort arises.

[0008] It is therefore a principal object of the present invention to provide a line-of-sight direction detection device incorporating a line-of-sight detection algorithm that includes an optimal filter, based on experimentally obtained knowledge of the motility and coordination of the human eye and head.

[0009]

[Means for Solving the Problems] The present invention is a line-of-sight direction detection device comprising: head orientation detection means for detecting the amount and direction of head movement; eyeball orientation detection means for detecting the amount and direction of eye movement; first filter means for removing minute change components contained in the displacement components of the detection outputs of the head orientation detection means and the eyeball orientation detection means; head movement state determination means for determining the movement state of the head according to the head orientation displacement component contained in the output of the first filter means; eyeball direction calculation means for calculating the change in the eye trajectory and the direction of eye movement according to the eye orientation displacement component contained in the output of the first filter means; second filter means for removing, for each direction of eye movement, the unnecessary components caused by involuntary movement according to the eye movement direction calculated by the eyeball direction calculation means; and movement information calculation means for calculating line-of-sight movement information according to the output of the head movement state determination means and the output of the second filter means.

[0010]

[Function] In the line-of-sight direction detection device according to the present invention, the first filter means removes the minute change components contained in the detected head orientation and eye orientation outputs; the movement state of the head is determined from the head orientation displacement component; the change in the eye trajectory and the direction of eye movement are calculated from the eye orientation displacement component; the second filter means removes the unnecessary components caused by involuntary movement according to the direction of eye movement; and the line-of-sight movement information is calculated from the stable head movement state and the eye movement direction.

[0011]

[Embodiment] FIG. 1 illustrates the movements on which the line-of-sight direction detection device according to the present invention operates. FIG. 1 shows eye movement 41 and head movement 42, i.e., the motility of the eyes and head when an observer looks in the horizontal, diagonal, and vertical directions. At the points 44 and 45 where the direction of the line of sight changes, a time delay 43 occurs between the eye movement and the head movement. On the other hand, when the line of sight is moving in a constant direction, the eye movement contains saccadic components and therefore shows unstable behavior, whereas the head movement is known to be comparatively smooth 46. In particular, the instability of the eye differs depending on the direction of movement.

[0012] FIG. 2 illustrates the difference in the instability of head and eye movements due to differences in posture. The head movements 47 and 49 and the eye movements 48 and 50 are detected when an observer is instructed to fixate on a single point. It can be seen that the behavior clearly differs between the head movement 47 and eye movement 48 in a sitting posture and the head movement 49 and eye movement 50 in a standing posture, and that the eye does not stay on a single point.

[0013] FIG. 3 shows the configuration of an embodiment of the present invention, and FIG. 4 is a flowchart for explaining the operation of the determination unit shown in FIG. 3.

[0014] First, referring to FIG. 3, a detection unit 51 includes a head orientation detection unit 52 and an eyeball orientation detection unit 53. The head orientation detection unit 52 detects the head orientation components (Hx, Hy, Hz) and supplies them to a filter unit 54. The filter unit 54 removes the noise components from the head orientation components (Hx, Hy, Hz) and supplies the result to a displacement detection unit 56. The displacement detection unit 56 calculates, for each of the head orientation components (Hx, Hy, Hz), the change of position per unit time, i.e., the displacements ΔHx, ΔHy, ΔHz, and from these displacement components calculates the velocities ΔVhx = ΔHx/Δt, ΔVhy = ΔHy/Δt, ΔVhz = ΔHz/Δt.

[0015] Similarly, the eyeball orientation detection unit 53 detects the eye orientation components (Ex, Ey) and supplies them to a filter unit 55. The filter unit 55 removes the noise components from the eye orientation components (Ex, Ey) and supplies the result to a displacement detection unit 57. The displacement detection unit 57 calculates, for each of the eye orientation components (Ex, Ey), the change of position per unit time, i.e., the displacements ΔEx and ΔEy, and from these calculates the velocities ΔVex = ΔEx/Δt and ΔVey = ΔEy/Δt and the movement direction Δte = tan⁻¹(ΔEy/ΔEx).
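The displacement, velocity, and direction computations of paragraphs [0014] and [0015] can be sketched for the eye components as follows. Variable names are assumptions, and `atan2` is used in place of tan⁻¹ so the quadrant of the direction angle is preserved.

```python
import math

def eye_velocity_and_direction(ex0, ey0, ex1, ey1, dt):
    """From two successive eye orientation samples (Ex, Ey), compute
    the per-unit-time displacements, the velocities dVex and dVey,
    and the movement direction dte = atan2(dEy, dEx), following
    paragraph [0015].  A sketch under assumed naming, not the
    patent's implementation."""
    dex, dey = ex1 - ex0, ey1 - ey0        # displacements dEx, dEy
    dvex, dvey = dex / dt, dey / dt        # velocities dVex, dVey
    dte = math.atan2(dey, dex)             # direction angle dte
    return dvex, dvey, dte

# Eye moves from (0, 0) to (1, 1) degrees in 0.1 s: a diagonal
# movement at 10 deg/s on each axis, direction ~45 degrees.
dvex, dvey, dte = eye_velocity_and_direction(0.0, 0.0, 1.0, 1.0, 0.1)
print(dvex, dvey, math.degrees(dte))  # ≈ 10.0 10.0 45.0
```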

[0016] A determination unit 58 sequentially selects head or eye movement information. Referring to FIG. 4, the operation of the determination unit 58 is as follows. In step (abbreviated SP in the figure) SP1, it is determined from the detected change in head velocity whether the head is stationary or moving. When the head is determined to be moving, it is determined in step SP2, from the change in eye direction, whether the trajectory of the eye movement has changed. If the eye movement trajectory has changed, the direction of the eye is detected in step SP3 from the eye velocity change information. In step SP4, filters with different pass bands for the horizontal, vertical, and diagonal directions are selected one by one according to the detected direction of eye movement, and the noise contained in the component of the movement direction is removed. In step SP5, the direction of head movement and the direction of eye movement are compared: if the two directions are parallel, head movement information is selected in step SP7; if not, eye movement information is selected in step SP6. In step SP8, the line-of-sight direction calculation unit 59 shown in FIG. 3 sequentially selects the head or eye movement and calculates the direction of the line of sight.
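The SP1-SP8 decision flow of FIG. 4 can be sketched as below. The thresholds `speed_eps` and `angle_eps`, the band classification, and the return values are all illustrative assumptions; the patent names the steps but gives no numeric parameters.

```python
import math

def select_movement(head_speed, eye_dir_prev, eye_dir, head_dir,
                    speed_eps=1e-3, angle_eps=math.radians(5)):
    """Sketch of the determination unit 58: returns which movement
    information ("head" or "eye") drives the gaze calculation."""
    # SP1: is the head stationary or moving?
    if head_speed < speed_eps:
        return "eye"                       # head still: follow the eye
    # SP2: has the eye movement trajectory changed?
    if abs(eye_dir - eye_dir_prev) > angle_eps:
        # SP3/SP4: classify the new direction and pick a band-limited
        # filter for it (the filtering itself would happen here).
        band = ("horizontal" if abs(math.sin(eye_dir)) < 0.5 else
                "vertical" if abs(math.cos(eye_dir)) < 0.5 else
                "diagonal")
    # SP5: are head and eye moving in parallel directions?
    if abs(eye_dir - head_dir) < angle_eps:
        return "head"                      # SP7: head info is stabler
    return "eye"                           # SP6: otherwise use the eye

print(select_movement(0.0, 0.0, 0.0, 0.0))  # eye  (head still)
print(select_movement(1.0, 0.0, 0.0, 0.0))  # head (parallel movement)
print(select_movement(1.0, 0.0, 1.0, 0.0))  # eye  (directions diverge)
```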

[0017] FIG. 5 shows the results of a detection experiment with a line-of-sight direction detection device incorporating the line-of-sight detection algorithm according to the present invention. When the schematic drawing 61 of a house displayed on the display shown in FIG. 5a is observed, the head trajectory 62 becomes smooth while the eye trajectory 63 remains unstable; in other words, the linearity of the eye movement is poor. In contrast, when the trajectory of the line of sight is detected using the present invention, as shown in FIG. 5b, the line of sight 63 becomes smooth. For comparison, FIG. 5c shows the result obtained with the conventional algorithm: because of the unstable behavior of the eye movement in particular, the combined value of the head and eye trajectories, i.e., the line-of-sight trajectory 64, remains disturbed. In the present invention, by contrast, when the line of sight 63 moves as shown in FIG. 5b, the device switches to the head trajectory, which shows stable behavior; it uses the change information of the eye trajectory for changes in the direction of the line of sight; and it adaptively switches among filters with different pass bands according to the direction of the line of sight. It can therefore extract stable and sensitive line-of-sight direction information that conventional line-of-sight direction detection could not attain.
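The adaptive switching among direction-dependent filter bands can be sketched as follows. The patent states only that filters with different pass bands are used for horizontal, vertical, and diagonal movement; the band boundaries and the smoothing constants in `FILTER_BANK` are illustrative assumptions.

```python
import math

# Hypothetical filter bank: one smoothing constant per movement
# direction (smaller alpha = heavier smoothing).
FILTER_BANK = {"horizontal": 0.5, "diagonal": 0.3, "vertical": 0.2}

def direction_band(dte):
    """Classify a movement direction angle (radians) into one of the
    three bands used to pick a filter; 30-degree sectors around the
    axes are an assumed boundary."""
    deg = abs(math.degrees(dte)) % 180
    if deg < 30 or deg > 150:
        return "horizontal"
    if 60 < deg < 120:
        return "vertical"
    return "diagonal"

def smooth(prev, sample, dte):
    """One step of the adaptively switched first-order filter."""
    alpha = FILTER_BANK[direction_band(dte)]
    return prev + alpha * (sample - prev)

print(direction_band(0.0))           # horizontal
print(direction_band(math.pi / 2))   # vertical
print(direction_band(math.pi / 4))   # diagonal
print(smooth(0.0, 1.0, 0.0))         # 0.5
```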

[0018]

[Effects of the Invention] As described above, according to the present invention, the filters applied to the head and eye movement information from the detectors are switched adaptively on the basis of information such as the direction, speed, and start of each movement, so a tracking system that is stable and sensitive even against unstable eye behavior including involuntary movements can be realized. Consequently, when an image of the viewed direction is reproduced, the detection error can be greatly reduced compared with conventional tracking.

[Brief Description of the Drawings]

[Fig. 1] A diagram for explaining the operation of the line-of-sight direction detection device according to the present invention.

[Fig. 2] A diagram for explaining the difference in the instability of head and eye movements due to differences in posture.

[Fig. 3] A diagram showing the configuration of an embodiment of the present invention.

[Fig. 4] A flowchart for explaining the operation of the determination unit shown in FIG. 3.

[Fig. 5] A diagram showing the results of a detection experiment with a line-of-sight direction detection device incorporating the line-of-sight detection algorithm.

[Fig. 6] A diagram showing an example of a device that detects the line-of-sight direction by a conventional method.

[Fig. 7] A diagram showing an example of a scene actually reproduced by a conventional method in accordance with eye and head movements.

[Fig. 8] A diagram for explaining the problems in linking the direction of the line of sight with the acquired scene using a conventional line-of-sight direction detection algorithm.

[Fig. 9] A diagram for explaining examples of the waveforms detected from head movement and eye movement.

[Explanation of Symbols]

51 Detection unit, 52 Head orientation detection unit, 53 Eyeball orientation detection unit, 54, 55 Filter units, 58 Determination unit, 59 Line-of-sight direction calculation unit.

Claims (1)

[Claims] [Claim 1] A line-of-sight direction detection device comprising: head orientation detection means for detecting the amount and direction of head movement; eyeball orientation detection means for detecting the amount and direction of eye movement; first filter means for removing minute change components contained in the displacement components of the detection outputs of said head orientation detection means and said eyeball orientation detection means; head movement state determination means for determining the movement state of the head according to the head orientation displacement component contained in the output of said first filter means; eyeball direction calculation means for calculating the change in the eye trajectory and the direction of eye movement according to the eye orientation displacement component contained in the output of said first filter means; second filter means for removing, for each direction of eye movement, the unnecessary components caused by involuntary movement according to the eye movement direction calculated by said eyeball direction calculation means; and movement information calculation means for calculating line-of-sight movement information according to the determination output of said head movement state determination means and the output of said second filter means.
JP3132850A 1991-06-04 1991-06-04 Eye direction detector Granted JPH04357930A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3132850A JPH04357930A (en) 1991-06-04 1991-06-04 Eye direction detector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3132850A JPH04357930A (en) 1991-06-04 1991-06-04 Eye direction detector

Publications (2)

Publication Number Publication Date
JPH04357930A JPH04357930A (en) 1992-12-10
JPH0583248B2 true JPH0583248B2 (en) 1993-11-25

Family

ID=15090983

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3132850A Granted JPH04357930A (en) 1991-06-04 1991-06-04 Eye direction detector

Country Status (1)

Country Link
JP (1) JPH04357930A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10276987A (en) * 1997-04-03 1998-10-20 Sony Corp Eye position detection device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07116119A (en) * 1993-10-21 1995-05-09 Nippon Telegr & Teleph Corp <Ntt> Chaotic state measuring method and measuring apparatus, and health state judging method and judging apparatus
JPH08179239A (en) * 1994-12-21 1996-07-12 Canon Inc Head mounted display device and drive control method thereof
US20150185831A1 (en) * 2013-12-26 2015-07-02 Dinu Petre Madau Switching between gaze tracking and head tracking


Also Published As

Publication number Publication date
JPH04357930A (en) 1992-12-10

Similar Documents

Publication Publication Date Title
US20250093678A1 (en) Systems and methods for three-dimensional visualization during robotic surgery
US20250366947A1 (en) Surgical virtual reality user interface
EP3479202B1 (en) Augmenting virtual reality content with real world content
CA2747814C (en) Hands-free pointer system
CN101690165B (en) Control method based on a voluntary ocular signal, particularly for filming
US20210069894A1 (en) Remote control system, information processing method, and non-transitory computer-readable recording medium
CN204741528U (en) Stereo immersive somatosensory intelligent controller
KR20030007728A (en) Method for assisting an automated video tracking system in reacquiring a target
CN102958464A (en) Automated surgical system with improved controller
KR20170062439A (en) Control device, control method, and program
JPWO2021059359A1 (en) Animation production system
US12272006B2 (en) Rotational navigation in a virtual environment with a visual reference
JP2003510864A (en) Method and system for time / motion compensation for a head mounted display
JPH0583248B2 (en)
CN106474751A (en) For watching Mechatronic Systems and its control method of film
JP3201589B2 (en) Remote visual presentation device with gaze function
CN105828021A (en) Specialized robot image acquisition control method and system based on augmented reality technology
WO2021220407A1 (en) Head-mounted display device and display control method
JP4640191B2 (en) Surveillance camera system and surveillance camera direction control method
JPH0583249B2 (en)
JP7719133B2 (en) Information processing device, information processing system, information processing method, and program
CN112753066A (en) Image display control device, method, and program
JP2022184456A (en) Image processing device, image processing method, and program
JP2025078533A (en) Information processing device, information processing device system, information processing device control method, and program
KR20250046154A (en) Method and apparatus for controlling multi-view video rendering based on user viewing intent

Legal Events

Date Code Title Description
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 19940621

LAPS Cancellation because of no payment of annual fees