
WO2007125794A1 - Data measuring device and data measuring method - Google Patents


Info

Publication number
WO2007125794A1
WO2007125794A1 (PCT/JP2007/058430)
Authority
WO
WIPO (PCT)
Prior art keywords
data
pupil
image
template
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2007/058430
Other languages
French (fr)
Japanese (ja)
Inventor
Shin-Ichiroh Kitoh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc filed Critical Konica Minolta Inc
Publication of WO2007125794A1 publication Critical patent/WO2007125794A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11: Objective types for measuring interpupillary distance or diameter of pupils
    • A61B3/112: Objective types for measuring diameter of pupils

Definitions

  • the present invention relates to a data measuring device and a data measuring method, and more particularly to a data measuring device and a data measuring method for measuring data related to a pupil of a living body.
  • apparatus for measuring data related to the pupil of a living body has been proposed for purposes such as medical diagnosis.
  • as such data measuring devices, measuring devices have been proposed that are provided with various measuring means and diagnostic means using data relating to the pupil of a living body.
  • Patent Document 1 describes a diagnostic device that detects the amount of change in a body part, such as eye blinks and pupil diameter, from a moving image of a living body and performs diagnosis based on the detected change.
  • Patent Document 2 describes a person-state detection device in which a person's face is photographed using a CCD camera and infrared light, and the person's state is determined by extracting the position or shape of the person's pupil from the photographed image.
  • Patent Document 3 describes a pupil light-reaction measuring instrument for relaxation evaluation, in which a change in the pupil area (pupil diameter) of a living body when a stimulus is applied by infrared light is detected with a goggle-shaped measuring tool equipped with an infrared light source and an infrared CCD camera, and the relaxation of the living body is evaluated from the detection result.
  • Patent Document 1 Japanese Patent Laid-Open No. 7-124126
  • Patent Document 2 Japanese Patent Laid-Open No. 7-249197
  • Patent Document 3 Japanese Patent Laid-Open No. 2005-143684
  • however, the device of Patent Document 3 requires a special measuring tool combining an infrared light source and an infrared CCD camera.
  • template matching is known as one method for extracting parts such as eyes, pupils, and mouths from biological images such as human face images.
  • in this method, a representative image called a template is prepared, the template is compared with the target image data while its position is scanned over the image, and the position of the best match is taken as the location of the part to be extracted.
  • however, the template is typically representative and fixed, and its size and content (shape, color, etc.) are not changed during scanning. Therefore, when the template does not match the image being inspected (for example, when a bare-skin eye is used as the template but a made-up eye is to be inspected, or when a dark-eye template is used to extract a blue-eye region), the detection accuracy falls.
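The fixed-template matching described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the array values and the names `ssd` and `match_template` are assumptions.

```python
import numpy as np

def ssd(patch, template):
    """Sum of squared differences between an image patch and the template."""
    d = patch.astype(float) - template.astype(float)
    return float(np.sum(d * d))

def match_template(image, template):
    """Scan the template over the image and return the top-left corner
    of the best (minimum-difference) match."""
    th, tw = template.shape
    ih, iw = image.shape
    best, best_pos = None, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            score = ssd(image[y:y + th, x:x + tw], template)
            if best is None or score < best:
                best, best_pos = score, (y, x)
    return best_pos

# Toy example: a dark 3x3 "pupil" embedded in a brighter field.
image = np.full((10, 10), 200, dtype=np.uint8)
image[4:7, 5:8] = 30
template = np.full((3, 3), 30, dtype=np.uint8)
pos = match_template(image, template)  # top-left of the dark region
```

Because the template here is fixed, the same limitation applies as in the text: if the pupil in the image is larger, smaller, or of a different pixel value than the template, the minimum-difference position degrades.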
  • when the inspection target is a moving image, the positional relationship between the subject and the camera changes three-dimensionally from frame to frame, so the size of the detection target in the moving image data changes and a fixed template cannot follow the change.
  • moreover, the target itself changes constantly with the amount of light and with physiological changes. It is therefore difficult to perform template matching with high accuracy on an object such as a pupil, which changes by itself, using the conventional fixed-template method.
  • an object of the present invention is to provide a data measuring apparatus and a data measuring method that improve detection accuracy by updating template information using information obtained from the detection result of the target in each frame image of moving image data.
  • the invention according to claim 1 comprises an image capturing unit that photographs a living body having an eye region and inputs a first image, and the image capturing unit subsequently photographs the living body and inputs a second image.
  • the invention according to claim 2 is the data measuring device according to claim 1, wherein the data relating to the pupil are a pixel value at the pupil center and pupil diameter data.
  • the invention according to claim 3 is the data measuring device according to claim 1 or 2, wherein the data processing/analysis unit performs the processing for each frame.
  • the invention according to claim 4 is the data measuring device according to any one of claims 1 to 3, wherein the image capturing unit is provided with a distance sensor or a displacement sensor for measuring the distance or displacement from the living body, and the data processing/analysis unit calculates an absolute value of the data related to the pupil using the distance or displacement from the living body as a parameter.
  • the invention according to claim 5 is the data measuring device according to claim 4, wherein the distance sensor or the displacement sensor includes a plurality of imaging elements and the three-dimensional shape of the living body is measured.
  • the invention described in claim 6 is the data measuring device described in claim 5, wherein the image capturing unit is a visible camera.
  • the invention described in claim 7 is a data measuring method in which data relating to the pupil is also detected from the second eye region using the updated template.
  • the invention according to claim 8 is the data measuring method according to claim 7, wherein the data relating to the pupil are a pixel value at the pupil center and pupil diameter data.
  • FIG. 1 is a block diagram showing an overall structure of a data measurement device according to the present embodiment.
  • FIG. 2 is a conceptual diagram of processing for extracting an eye region from face image data.
  • FIG. 3 is a conceptual diagram showing pupil template creation processing.
  • FIG. 4 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 5 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 6 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 7 is a graph showing the transition of the minimum pixel value in each frame.
  • FIG. 8 is a conceptual diagram showing extraction processing of pupil diameter data using pixel values.
  • FIG. 9 is a graph showing the transition of diameter data in each frame.
  • FIG. 11 is a flowchart showing a method for extracting pupil and diameter data by template matching.
  • FIG. 12 is a flowchart showing a method for updating a pupil template.
  • FIG. 1 is a block configuration diagram of the data measuring apparatus 1 according to the present embodiment.
  • an external device 2 is connected to the data measuring device 1 via a network over which the two can communicate, and the measurement results of the data measuring device 1 can be transmitted to the external device 2.
  • the network in the present embodiment is not particularly limited, provided it is a communication network capable of data communication.
  • for example, the Internet, a LAN (Local Area Network), a WAN (Wide Area Network), a telephone line network, an ISDN (Integrated Services Digital Network) network, a CATV (Cable Television) line, an optical communication line, and the like can be used.
  • the external device 2 is constituted by a personal computer or the like, and is preferably installed in a place where some kind of consulting and diagnosis can be received.
  • the external device 2 may be configured as an Internet site from which consulting information can be obtained, or as a mobile terminal carried by a consultant, doctor, or clerk.
  • the external device 2 may be configured as a data server for a home health management system.
  • the data measuring device 1 includes a control unit 3, an external communication unit 4, an image capturing unit 5, an illumination unit 6, a user interface unit 7, an I/O unit 8, a memory unit 9, a data processing/analysis unit 10, a parameter setting/management unit 11, a data storage unit 12, and a display unit 13.
  • the control unit 3 includes a CPU and a RAM, and drives and controls each component of the data measurement device 1. Since the data measuring device 1 of the present embodiment also handles moving images, it is desirable that the control unit 3 be configured with a chip that can control the operation as fast as possible.
  • the external communication unit 4 is configured to be able to perform information communication with the external device 2 by wired or wireless communication means.
  • the data measuring apparatus 1 of the present embodiment handles moving image data, and therefore preferably has a communication mode capable of high-speed transmission as much as possible.
  • the image capturing unit 5 captures a moving image of the area around the subject's eyes and consists of, for example, a CCD camera, a digital still camera, a CMOS camera, a video camera, or a camera module attached to a mobile phone or the like. The image capturing unit 5 may perform either color or monochrome shooting; the following description assumes monochrome shooting.
  • although an infrared camera can be used as the image capturing unit 5, it is desirable to photograph with a visible camera in the present embodiment.
  • as the visible camera, it is desirable to use a camera with high sensitivity in the red region, in view of the contrast of the captured image.
  • the sensitivity in the red region can be increased relatively by installing an optical filter with high transmittance in that band. Alternatively, a camera from which the infrared cut filter has been removed may be used.
  • the image capturing unit 5 of the present embodiment is provided with a distance sensor or a displacement sensor for detecting the distance or displacement between the image capturing unit 5 and the subject.
  • an existing distance sensor or displacement sensor can be used.
  • the image capturing unit 5 associates the distance or displacement between the image capturing unit 5 and the subject with each captured frame and stores it in the data storage unit 12 or the parameter setting/management unit 11.
  • the distance and displacement from the subject can also be obtained by configuring the image capturing unit 5 as a twin-lens camera and measuring the three-dimensional shape of facial features (the area between the eyebrows, the nose, the other eye) from the images captured by each camera.
  • in this case, the absolute distance per pixel can be obtained without using a special sensor, which makes it possible to detect the absolute value of data relating to the pupil without providing a distance sensor or a displacement sensor.
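As a rough sketch of how a twin-lens setup yields absolute scale, the standard pinhole-stereo relation can be used. The focal length, baseline, and disparity figures below are illustrative assumptions, not values from the patent.

```python
def stereo_distance(focal_px, baseline_mm, disparity_px):
    """Pinhole-stereo depth: Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px

def mm_per_pixel(distance_mm, focal_px):
    """Length on the subject covered by one pixel at that distance."""
    return distance_mm / focal_px

# Example: 800 px focal length, 60 mm baseline, and a 96 px disparity on a
# facial landmark (e.g. the bridge of the nose) seen by both cameras.
z = stereo_distance(800.0, 60.0, 96.0)   # subject distance in mm
scale = mm_per_pixel(z, 800.0)           # mm covered by one pixel
```

With `scale` known, any pixel measurement on the face, including the pupil diameter, can be converted to an absolute length without a separate distance sensor.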
  • the illumination unit 6 is an arbitrary component of the data measuring device 1, and can irradiate around the eyes of a subject with a light source when the surrounding environment is dark at the time of photographing.
  • as the light source, visible light from white to incandescent (color temperature around 3000 K) can be used, and an infrared light source is also possible. Further, by diffusing the illumination light, direct reflection of light on the surface of the subject's eyes can be mitigated and the accuracy of image processing can be improved. It is also desirable that the illumination unit 6 have a mechanism for varying the light intensity of the light source.
  • the illumination unit 6 has a mechanism capable of irradiating illumination light for stimulus addition.
  • as the illumination light for applying the stimulus, it is preferable to use strobe (flash) light, which is visible light with stable intensity and irradiation time; in this case the intensity of the irradiated light is kept constant over time. As a result, a stimulus can be applied to the subject's eye at the time of photographing by the image capturing unit 5 and the pupil response can be measured. It is also desirable that the strobe light allow adjustment of the emission timing and emission duration.
  • the user interface unit 7 is composed of a keyboard, a mouse, a trackball, and the like.
  • the user interface unit 7 allows a user to input an instruction, and can transmit the status and request of the data measurement device 1 to the user.
  • the touch panel may be configured integrally with the display unit 13.
  • the I/O unit 8 is configured so that portable recording media such as a CF card, an SD card, or a USB memory can be connected.
  • a port, such as Ethernet (registered trademark), for accessing the external device 2 can also be connected.
  • the memory unit 9 is composed of RAM, ROM, DIMM, or the like; data required by the data processing/analysis unit 10 and other components is transferred from the data storage unit 12 and elsewhere so that the data measuring device 1 operates quickly and stably.
  • the memory unit 9 of this embodiment needs to have a capacity sufficient to execute moving image processing in real time without dropping frames.
  • the data processing/analysis unit 10 measures a time-series change in data related to the pupil of the living body by analyzing the moving image captured by the image capturing unit 5.
  • the data processing / analysis unit 10 extracts the eye region from the face image data.
  • for eye region extraction, conventional methods using eye template matching or the positional relationships among features around the eye can be applied.
  • alternatively, the eye area in an arbitrary frame may be set manually and used as the eye area in the other frames; in that case, the eye area should be set wide enough that the eye and iris do not protrude from it in other frames.
  • next, the data processing/analysis unit 10 creates a pupil template.
  • the pupil template is created using data such as the size of the pupil, the average pixel value of the pupil, and the average pixel value of the iris, which are obtained by analyzing moving images of a plurality of people's eye regions in advance.
  • the size of the template is set so that the pupil can be accommodated, and a circle corresponding to the pupil is arranged at the center.
  • a pupil template can be created by setting the pixel values on and inside the circle to the average pupil pixel value of the plurality of people, and the pixel values outside the circle to the average iris pixel value of the plurality of people.
  • the pupil template may be obtained from image data when the pupil is contracted. As a result, the detection accuracy of the pupil center coordinates is increased even for image data in which the pupil is contracted.
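The template construction described above (a central circle at the average pupil pixel value, surrounded by the average iris pixel value) might be sketched as follows; the sizes and pixel values are illustrative assumptions.

```python
import numpy as np

def make_pupil_template(size, pupil_radius, pupil_value, iris_value):
    """Square patch with a centered circle at pupil_value on an iris_value
    background, mirroring the template described in the text."""
    yy, xx = np.mgrid[0:size, 0:size]
    c = (size - 1) / 2.0
    inside = (yy - c) ** 2 + (xx - c) ** 2 <= pupil_radius ** 2
    return np.where(inside, pupil_value, iris_value).astype(np.uint8)

# e.g. average pupil value 40 and average iris value 120 obtained from
# prior analysis of several subjects' eye regions
tpl = make_pupil_template(size=15, pupil_radius=4, pupil_value=40, iris_value=120)
```

Starting from a contracted-pupil radius, as the text suggests, keeps the circle small enough to fit inside the pupil even in frames where it has constricted.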
  • the data processing/analysis unit 10 extracts the pupil by performing pupil template matching for each frame of the moving image on the eye region extracted from the face image data. The position at which the difference between the eye-region image data of each frame and the pupil template is minimized is determined to be the pupil center, whereby the pupil center coordinates are detected.
  • the search area of the next frame may be narrowed down using the pupil center coordinates extracted in the previous frame, on the assumption that the movement of the subject (pupil) between frames is small.
  • the size of the search area for the next frame can be set separately; for example, the range "pupil center coordinates of the previous frame ± the set range" can be searched in that frame.
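A sketch of restricting the match to the previous frame's neighborhood; `prev_topleft` and `search_range` are illustrative names, and the toy frame simulates a pupil that moved by one pixel since the last frame.

```python
import numpy as np

def match_in_window(image, template, prev_topleft, search_range):
    """Brute-force SSD match, but only within prev_topleft ± search_range."""
    th, tw = template.shape
    ih, iw = image.shape
    py, px = prev_topleft
    best, best_pos = None, None
    for y in range(max(0, py - search_range), min(ih - th, py + search_range) + 1):
        for x in range(max(0, px - search_range), min(iw - tw, px + search_range) + 1):
            d = image[y:y + th, x:x + tw].astype(float) - template.astype(float)
            score = float((d * d).sum())
            if best is None or score < best:
                best, best_pos = score, (y, x)
    return best_pos

frame = np.full((12, 12), 200, dtype=np.uint8)
frame[5:8, 6:9] = 30                      # pupil moved slightly since last frame
template = np.full((3, 3), 30, dtype=np.uint8)
pos = match_in_window(frame, template, prev_topleft=(4, 5), search_range=2)
```

Restricting the window both speeds up per-frame matching and reduces the chance of locking onto a distant dark region such as an eyebrow.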
  • the feature amounts obtained from the difference between the image data of each frame and the template are as shown in the graphs of FIGS. 4 to 6.
  • FIG. 7 is a graph showing the transition of the minimum value of the difference from the template in each frame when light stimulation is given to the pupil three times at a certain interval using the illumination light for stimulus addition.
  • as shown in FIG. 7, the minimum value appears as a discontinuous value when the subject blinks.
  • similarly, in graphs showing the transition of the pupil center coordinates and the like, discontinuous values appear when the subject blinks or closes the eyes.
  • the data processing/analysis unit 10 extracts, from the moving image data, the image data of portions in which transitions of the pixel values, pupil center coordinates, and the like are discontinuous, and excludes them as image data captured while the subject was blinking or had the eyes closed.
  • for example, a predetermined threshold is set separately for the differential amount (amount of change) in the graph of FIG. 7, and a "blink" can be determined when the differential amount is equal to or greater than the threshold.
  • alternatively, the graph of the minimum value in FIG. 7 can be smoothed several times, the difference between the values before and after smoothing can be taken for each frame, and the image data of frames in which this difference is large can be excluded as image data captured while blinking or with the eyes closed.
  • in the above, a light stimulus is used as the stimulus applied to the subject, but the stimulus may instead be sound, smell, or touch.
  • for example, plosive or explosive sounds, incense, and aromas are possible.
  • the subject's stress may be alleviated or the subject may be stressed.
  • hot springs, footbaths, hot air, and cold air can be considered.
  • the "blink" data obtained when extracting the image data when the subject blinks or when the eyes are closed is the power of diverting the data to other analyzes as “blink count” data.
  • the data may be transferred to the external device 2. It is also possible to extract changes at the time of stimulus response as data by searching for the start point, apex and end point position of the change in pixel values and pupil center coordinates at the time of stimulus response.
  • the start position of the pupil response due to light stimulation can be detected by, for example, storing the light emission timing of stimulation light in association with moving image data and using the light emission timing data during image processing.
  • the search of the start point, the apex point, and the end point position of the change at the time of stimulus response can be performed using other known methods.
  • next, for each frame of the moving image, the data processing/analysis unit 10 searches for the edge of the pupil in at least one of the left, right, up, down, and diagonal directions, starting from the pupil center coordinates obtained by template matching, and extracts pupil diameter data.
  • using the pixel value at the pupil center coordinates and a separately set threshold, a search position can be determined to be outside the pupil when its pixel value falls outside the range from 0 to "pixel value at the pupil center + threshold".
  • pupil diameter data can then be obtained from the pupil end points found in this way.
  • alternatively, the search for the pupil edge may be performed by calculating the average and standard deviation of the pixel values around the pupil center and regarding pixels within a range such as "average + 3 × standard deviation" as belonging to the pupil.
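The threshold-based edge search can be sketched as a walk outward from the pupil center; the pixel values and threshold below are illustrative assumptions.

```python
import numpy as np

def pupil_radius(image, center, direction, threshold):
    """Number of steps from the center until a pixel leaves the range
    [0, center value + threshold], i.e. falls outside the pupil."""
    cy, cx = center
    dy, dx = direction
    limit = image[cy, cx] + threshold
    steps = 0
    y, x = cy + dy, cx + dx
    while 0 <= y < image.shape[0] and 0 <= x < image.shape[1] and image[y, x] <= limit:
        steps += 1
        y, x = y + dy, x + dx
    return steps

# Toy eye region: dark pupil (value 30) of radius 3 in a brighter iris (120).
eye = np.full((11, 11), 120, dtype=np.uint8)
yy, xx = np.mgrid[0:11, 0:11]
eye[(yy - 5) ** 2 + (xx - 5) ** 2 <= 9] = 30

right = pupil_radius(eye, (5, 5), (0, 1), threshold=20)
left = pupil_radius(eye, (5, 5), (0, -1), threshold=20)
diameter_px = right + left + 1   # include the center pixel itself
```

Running the same walk in the vertical and diagonal directions gives the additional diameter measurements that can be averaged or kept as separate features.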
  • the diameter data may be averaged into a single feature value as shown in FIG. 9, or the horizontal diameter data and the vertical diameter data may be treated as separate feature values.
  • Fig. 9 shows the pupil diameter data in each frame. The pupil diameter data changes depending on the stimulus response.
  • next, the data processing/analysis unit 10 updates the pupil template based on the pixel value at the pupil center and the pupil diameter data obtained by processing the immediately preceding frame, and applies the updated template to pupil extraction in the next frame.
  • the data used for updating the pupil template may be data obtained by averaging not only the data obtained in the immediately preceding frame but also the data obtained in all the previous frames.
  • the pupil template is customized for each subject, so that the pupil extraction accuracy is improved.
  • when the change caused by updating becomes sufficiently small, the data processing/analysis unit 10 considers that a template suited to the subject has been created and can terminate the template updating thereafter. As a result, unnecessary template creation steps are omitted and processing becomes faster.
  • when updating the pupil template, the data processing/analysis unit 10 does not use data from the immediately preceding frame if that frame is image data captured while blinking or with the eyes closed. Likewise, when averaging the data of all frames, image data captured while blinking or with the eyes closed is not used.
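One way to realize the per-subject update is to rebuild the circular template from the latest measured center pixel value and diameter, keeping the old template for frames excluded as blinks. The function and parameter names below are assumptions for illustration.

```python
import numpy as np

def update_template(center_value, diameter_px, iris_value, margin=3):
    """Rebuild a circular pupil template from the latest measurements."""
    radius = diameter_px / 2.0
    size = int(np.ceil(diameter_px)) + 2 * margin
    yy, xx = np.mgrid[0:size, 0:size]
    c = (size - 1) / 2.0
    inside = (yy - c) ** 2 + (xx - c) ** 2 <= radius ** 2
    return np.where(inside, center_value, iris_value).astype(np.uint8)

def maybe_update(template, measurement, blink):
    """Keep the old template for frames excluded as blinks / closed eyes."""
    if blink:
        return template
    return update_template(*measurement)

tpl = update_template(center_value=35, diameter_px=8, iris_value=110)
tpl2 = maybe_update(tpl, (40, 9, 115), blink=True)   # blink: template unchanged
```

Feeding running averages of the measurements into `update_template`, instead of the single previous frame, corresponds to the all-frame averaging variant mentioned in the text.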
  • next, the data processing/analysis unit 10 performs a correction that makes the data related to the pupil in each frame relative to a predetermined reference value.
  • the pupil diameter data obtained by the above processing are relative values in pixel units within one frame and are not uniform between frames. Therefore, the data processing/analysis unit 10 obtains, in each frame, the distance between a plurality of points whose positional relationship does not change over time, such as the outer and inner corners of the eye, and corrects the pupil diameter data according to the relative distance between these points in each frame, so that the pupil diameter data are unified between frames. For example, as shown in FIG. 10, the distance between the outer and inner corners of the eye in a reference frame is taken as D, the corresponding distance Dn is obtained in the n-th frame, and the correction amount Dn/D is obtained.
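Applying the correction amount Dn/D amounts to dividing the frame-n pixel measurement by Dn/D (equivalently, multiplying by D/Dn); a minimal sketch with illustrative distances:

```python
def corrected_diameter(diameter_px_n, d_ref, d_n):
    """Scale a frame-n pixel measurement by D / Dn so that pupil diameter
    data are comparable between frames despite head movement."""
    return diameter_px_n * d_ref / d_n

# Reference frame: eye corners 80 px apart. Frame n: the subject moved
# closer, the corners now span 100 px, and the pupil measures 25 px.
d = corrected_diameter(25.0, d_ref=80.0, d_n=100.0)   # diameter at reference scale
```

Because the eye-corner distance is fixed on the face, the ratio cancels out the apparent size change caused by the subject moving toward or away from the camera.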
  • next, the data processing/analysis unit 10 measures the absolute value of the pupil diameter data. That is, the pupil diameter data obtained by the above processing are relative values in pixel units, and their absolute values vary with the distance between the image capturing unit 5 and the subject. Therefore, the data processing/analysis unit 10 calculates the absolute distance per pixel using, as parameters, the distance and displacement between the image capturing unit 5 and the subject measured and stored in each frame by the distance sensor or displacement sensor of the image capturing unit 5.
  • alternatively, a reference object (a patch, sticker, or the like) of known size and length may be placed near the subject, and the absolute length per pixel may be calculated from the size of the object in the image.
  • it is likewise possible to photograph the subject together with something whose length can be measured, such as a scaled sticker, and calculate the absolute length per pixel in the same way.
  • the length and size of features around the eyes, such as the subject's eyelashes, the nose, and the distance from the outer to the inner corner of the eye, may also be measured separately and given as reference parameters.
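The reference-object variant reduces to a single scale factor; the sticker width and pixel spans below are illustrative assumptions.

```python
def mm_per_pixel(ref_width_mm, ref_width_px):
    """Absolute length per pixel from an object of known size in the image."""
    return ref_width_mm / ref_width_px

def pupil_diameter_mm(diameter_px, scale_mm_per_px):
    """Convert a pixel diameter to millimetres."""
    return diameter_px * scale_mm_per_px

# A 10 mm sticker spans 40 px in the image; the pupil spans 16 px.
scale = mm_per_pixel(10.0, 40.0)
diameter = pupil_diameter_mm(16.0, scale)
```

The same conversion applies when the known length is a pre-measured facial feature, such as the eye-corner distance, given as a reference parameter.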
  • the parameter setting/management unit 11 can set the parameters necessary for the processing and control of the data measuring device 1, and manages the set parameters.
  • the data storage unit 12 is configured by an HDD or the like, and manages and holds image data input from the outside, moving image data captured by the image capturing unit 5, moving image data processed by the data processing/analysis unit 10, and temporary data generated during image processing.
  • the display unit 13 may be a CRT, liquid crystal, organic EL, plasma, or projection display.
  • the display unit 13 also displays information on the state of each component of the data measuring device 1 and information provided from the external device 2. A configuration such as a touch panel that also serves as the user interface unit 7 may be adopted.
  • the control unit 3, external communication unit 4, I/O unit 8, memory unit 9, data processing/analysis unit 10, parameter setting/management unit 11, data storage unit 12, and display unit 13 can be configured as a general personal computer, and the data measuring device 1 can be configured by attaching the image capturing unit 5, the illumination unit 6, and the user interface unit 7 to it.
  • the illumination unit 6 illuminates the subject's eyes and the image photographing unit 5 photographs the subject's eyes with a moving image. At this time, the illumination unit 6 may irradiate illumination light for stimulus addition.
  • the data processing / analysis unit 10 extracts an eye area from the face image data photographed by the image photographing unit 5.
  • next, the data processing/analysis unit 10 creates a pupil template.
  • the pupil template is created using data such as pupil size, pupil pixel value average, and iris pixel value average obtained in advance by analyzing moving images of a plurality of human eye regions.
  • next, the data processing/analysis unit 10 extracts the pupil by performing pupil template matching for each frame of the moving image on the eye region extracted from the face image data.
  • Figure 11 shows a flowchart of template matching and pupil extraction methods.
  • when face image data is input to the data processing/analysis unit 10 (step S1), the data processing/analysis unit 10 extracts the eye region (step S2) and performs template matching using the created pupil template (step S3). The position at which the difference between the eye-region image data of each frame and the pupil template is minimized is determined to be the pupil center, whereby the pupil center coordinates are detected (step S4). At this time, the data processing/analysis unit 10 extracts the image data of portions of the moving image data in which transitions of the pixel values, pupil center coordinates, and the like are discontinuous, and excludes them as image data captured while blinking or with the eyes closed.
  • pupil diameter data is then extracted by searching for the edge of the pupil in at least one of the left, right, up, down, and diagonal directions for each frame, starting from the pupil center coordinates obtained by template matching (step S5). It is then determined whether there is a next frame (step S6); if there is, steps S1 to S5 are repeated, and if not, the process ends.
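The loop of steps S1 to S6 can be sketched as a small driver in which the per-step functions are placeholders standing in for the processing described above.

```python
def process_video(frames, extract_eye, match_pupil, measure_diameter):
    """Run eye extraction, template matching, and diameter measurement on
    every frame (steps S1 to S6), collecting per-frame results."""
    results = []
    for frame in frames:                 # S1: next face image / S6: loop test
        eye = extract_eye(frame)         # S2: eye region
        center = match_pupil(eye)        # S3, S4: template match -> pupil center
        results.append((center, measure_diameter(eye, center)))  # S5: diameter
    return results

# Tiny stand-ins so the driver is runnable end to end.
out = process_video(
    frames=[0, 1, 2],
    extract_eye=lambda f: f,
    match_pupil=lambda eye: (5, 5),
    measure_diameter=lambda eye, c: 8 + eye,
)
```

In the full method of FIG. 12, a template-update step (S16) would be inserted into this loop between frames.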
  • FIG. 12 shows a flowchart of a method for updating the pupil template.
  • when the first frame of the moving image is input to the data processing/analysis unit 10 (step S11), the data processing/analysis unit 10 extracts the eye region (step S12) and performs template matching using the created pupil template (step S13). The pupil center coordinates are thereby detected (step S14), and pupil diameter data is extracted (step S15). Next, the data processing/analysis unit 10 updates the pupil template based on the data obtained by processing the immediately preceding frame or on the averaged data of all previous frames (step S16).
  • at this time, the data processing/analysis unit 10 determines whether the difference between the templates before and after the update is equal to or less than a separately set threshold; once the difference falls to or below the threshold, template matching can subsequently be performed without further updating the template.
  • the update of the template will be described in detail.
  • for the update, the pupil diameter data obtained by pupil extraction, the pixel value obtained by averaging the pixel values around the pupil center, and the pixel value outside the pupil are used.
  • the pixel value outside the pupil is calculated by averaging the pixel values in a predetermined area outside the region determined to be the pupil, that is, outside the pupil region calculated from the pupil diameter data.
  • the area to be averaged is, for example, the annular region between the pupil circle and a concentric circle lying a predetermined number of pixels or a predetermined distance outside the pupil diameter.
  • the predetermined number of pixels or distance is set by the parameter setting unit of the image processing apparatus.
  • alternatively, the region obtained by removing the pupil region from the non-white part of the eye (that is, the iris region) may be used as the extra-pupil area, and the pixel values in that region averaged.
  • when the next frame is input to the data processing/analysis unit 10 (step S17), the data processing/analysis unit 10 extracts the eye region (step S18) and performs template matching again, this time using the updated pupil template (step S19). Using the updated template, the pupil center coordinates are detected (step S20) and pupil diameter data is extracted (step S21). It is then determined whether there is a next frame (step S22); if there is, the pupil template is updated again (step S16), and if not, the process ends.
  • next, the data processing/analysis unit 10 performs a correction so that the pupil diameter data are unified between frames, since the pupil diameter data are only relative values between frames.
  • the data processing/analysis unit 10 then measures the absolute value of the pupil diameter data. That is, the absolute distance per pixel is calculated using, as parameters, the distance and displacement between the image capturing unit 5 and the subject in each frame, measured and stored by the distance sensor or displacement sensor of the image capturing unit 5.
  • the data processing/analysis unit 10 transmits the measurement result to the external device 2.
  • the pupil template is updated using the data of frames for which data measurement has already been performed, so that the template is customized for each living body, improving the pupil extraction accuracy.
  • although the measured data is only a relative value in pixel units within one frame, the absolute distance per pixel can be calculated by using the distance or displacement from the living body as a parameter, making it possible to obtain the absolute value of the data related to the pupil.
  • the apparatus configuration can be simplified.
  • the data measuring device 1 of the present embodiment can be used to determine the degree of concentration and fatigue of subjects during personal computer work. It can also be used to determine the subject's psychological stress level by adding stimuli other than light stimuli (sound, aroma, incense, footbath, etc.). Furthermore, it can be used to measure the degree of emotion and attention of a subject who is watching a movie, a TV program, or news.
  • according to the data measuring device and the data measuring method of the present invention, it is possible to measure data relating to the pupil of the living body with high accuracy without contacting the living body.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

Provided are a data measuring device and a data measuring method capable of accurately measuring data on a pupil of a human body in a non-contact manner. The data measuring device (1) includes: an image capturing unit (5) for capturing a moving image around an eye of the human body; and a data analysis unit (10) for extracting the pupil by template matching for each frame of the moving image captured by the image capturing unit (5) and measuring the temporal change of the data on the pupil. The data analysis unit (10) updates the template for each frame by using the data on the pupil of a frame for which data measurement has already been performed. The data on the pupil consists of pixel values around the pupil and pupil diameter data.

Description

Specification

Data Measuring Device and Data Measuring Method

Technical Field

[0001] The present invention relates to a data measuring device and a data measuring method, and more particularly to a data measuring device and a data measuring method for measuring data related to a pupil of a living body.

Background Art

[0002] Conventionally, devices for measuring data related to the pupil of a living body have been proposed for diagnostic purposes such as medical diagnosis. As such data measuring devices, measuring devices provided with various measuring means, and with diagnostic means using data relating to the pupil of a living body, have been proposed.

[0003] For example, Patent Document 1 describes a diagnostic device that detects the amount of change in a body part, such as eye blinks and pupil diameter, from a moving image of a living body and, from the detection result, diagnoses the physiological state of the living body for treatment purposes.

[0004] Patent Document 2 describes a person state detection device in which a person's face is photographed using a CCD camera and infrared light, and the person's state is determined by extracting the pupil position or pupil shape from the photographed image.

[0005] Patent Document 3 describes a pupil light-reflex measuring instrument for evaluating relaxation, in which a goggle-shaped measuring tool equipped with an infrared light source and an infrared CCD camera detects the change in the pupil area (pupil diameter) of a living body when a stimulus is applied by infrared light, and the relaxation of the living body is evaluated from the detection result.

Patent Document 1: Japanese Patent Laid-Open No. 7-124126
Patent Document 2: Japanese Patent Laid-Open No. 7-249197
Patent Document 3: Japanese Patent Laid-Open No. 2005-143684

Disclosure of the Invention

Problems to be Solved by the Invention

[0006] However, although the invention described in Patent Document 1 detects the amount of change in a body part from a moving image of a living body, it does not describe specific means for acquiring biometric data from the moving image.

[0007] In the invention described in Patent Document 2, the pupil position or pupil shape is extracted from the photographed image by binarization processing, which is susceptible to noise, so there is a problem that the state of the person cannot be determined precisely.

[0008] The invention described in Patent Document 3 requires special equipment such as an infrared light source and an infrared CCD camera. In addition, the subject must wear goggles during measurement, which places a physiological and psychological burden on the living body.

Furthermore, the template matching method is one method for extracting parts such as eyes, pupils, and mouths from biological images such as human face images. In the template matching method, a representative image of the part to be extracted, called a template, is prepared; the template position is scanned over the image data to be searched while the two are compared, and the location where they best match is detected as the location of the part to be extracted. Usually, however, the template is a fixed, representative one, and its size and content (shape, color, etc.) are not changed during scanning. Therefore, if the template deviates from the image under inspection (for example, using a bare-skin eye as a template to inspect a made-up eye) or does not match it (for example, using a dark-eye template to extract a blue-eyed part), the detection accuracy falls. When the inspection target is a moving image, the positional relationship between the subject and the camera changes three-dimensionally from frame to frame, so the size of the detection target in the moving image data changes, and the template cannot follow that change. When the pupil is the detection target, the pupil itself is constantly changing due to light intensity and physiological changes, independently of the movement of the camera or the subject. Therefore, it is difficult to perform highly accurate template matching on a target that itself changes, such as the pupil, with the conventional method of fixing the template.

[0009] An object of the present invention is to provide a data measuring device and a data measuring method that improve detection accuracy by updating template information using information obtained from the detection result of the target in a frame image of moving image data.

Means for Solving the Problem

[0010] In order to solve the above problem, the invention described in claim 1 is a data measuring device comprising:

an image capturing unit for capturing a first image of a living body having an eye region; and

a data processing/analysis unit that extracts a first eye region from the first image, detects a pupil from the first eye region with a predetermined template relating to the pupil, measures data relating to the detected pupil, and updates the predetermined template with the measured data,

wherein, when the image capturing unit captures the living body and inputs a second image, the data processing/analysis unit extracts a second eye region from the second image and detects a pupil from the second eye region with the updated template.

[0011] The invention described in claim 2 is the data measuring device described in claim 1, wherein the data relating to the pupil is a pixel value at the pupil center and pupil diameter data.

[0012] The invention described in claim 3 is the data measuring device described in claim 1 or claim 2, wherein the data processing/analysis unit acquires, for each frame, the distances between a plurality of points whose positional relationship does not change in time series, and corrects the data relating to the pupil for each frame according to the relative relationship of the distances between the plurality of points, so that the data relating to the pupil of each frame becomes a relative value with respect to a predetermined reference value.

[0013] The invention described in claim 4 is the data measuring device described in any one of claims 1 to 3, wherein the image capturing unit includes a distance sensor or a displacement sensor for measuring the distance or displacement from the living body, and the data analysis unit calculates the absolute value of the data relating to the pupil using the distance or displacement from the living body as a parameter.

[0014] The invention described in claim 5 is the data measuring device described in claim 4, wherein the distance sensor or displacement sensor is composed of a plurality of imaging elements and measures the three-dimensional shape of the living body.

[0015] The invention described in claim 6 is the data measuring device described in claim 5, wherein the image capturing unit is a visible camera.

[0016] The invention described in claim 7 is a data measuring method comprising the steps of:

capturing a first image of a living body having an eye region;

extracting a first eye region from the first image;

detecting a pupil from the first eye region with a predetermined template relating to the pupil;

measuring data relating to the detected pupil;

updating the predetermined template with the measured data;

capturing the living body and inputting a second image;

extracting a second eye region from the second image; and

detecting a pupil from the second eye region with the updated template.

[0017] The invention described in claim 8 is the data measuring method described in claim 7, wherein the data relating to the pupil is a pixel value at the pupil center and pupil diameter data.

Effects of the Invention

[0018] According to the invention described in claim 1 or claim 7, the pupil template can be customized for each living body, improving the pupil extraction accuracy.

[0019] According to the invention described in claim 2 or claim 8, a pupil template can be created or updated.

[0020] According to the invention described in claim 3, the data relating to the pupil can be unified across all frames.

According to the invention described in claim 4, the absolute value of the data relating to the pupil can be calculated.

[0021] According to the invention described in claim 5, the apparatus configuration can be simplified.

[0022] According to the invention described in claim 6, the apparatus configuration can be simplified and the manufacturing cost can be reduced.

Brief Description of the Drawings

[0023]
[FIG. 1] A block diagram showing the overall structure of a data measuring device according to the present embodiment.
[FIG. 2] A conceptual diagram of processing for extracting an eye region from face image data.
[FIG. 3] A conceptual diagram showing pupil template creation processing.
[FIG. 4] A graph showing the transition of pupil pixel values and iris pixel values in each frame.
[FIG. 5] A graph showing the transition of pupil pixel values and iris pixel values in each frame.
[FIG. 6] A graph showing the transition of pupil pixel values and iris pixel values in each frame.
[FIG. 7] A graph showing the transition of the minimum pixel value in each frame.
[FIG. 8] A conceptual diagram showing extraction processing of pupil diameter data using pixel values.
[FIG. 9] A graph showing the transition of diameter data in each frame.
[FIG. 10] A conceptual diagram showing processing for correcting diameter data using the distances between a plurality of points in each frame.
[FIG. 11] A flowchart showing a method for extracting pupil and diameter data by template matching.
[FIG. 12] A flowchart showing a method for updating a pupil template.

Explanation of Reference Numerals

1 Data measuring device
2 External device
3 Control unit
4 External communication unit
5 Image capturing unit
6 Illumination unit
7 User interface unit
8 I/O unit
9 Memory unit
10 Data processing/analysis unit
11 Parameter setting/management unit
12 Data storage unit
13 Display unit

BEST MODE FOR CARRYING OUT THE INVENTION

[0025] Hereinafter, embodiments of the present invention will be described with reference to the drawings.

[0026] FIG. 1 is a block diagram of the data measuring device 1 according to the present embodiment. As shown in FIG. 1, an external device 2 is connected to the data measuring device 1 via a network over which the two can communicate, so that measurement results from the data measuring device 1 can be transmitted to the external device 2.

[0027] The network in the present embodiment is not particularly limited as long as it is a communication network capable of data communication; for example, it can include the Internet, a LAN (Local Area Network), a WAN (Wide Area Network), a telephone line network, an ISDN (Integrated Services Digital Network) line network, a CATV (Cable Television) line, an optical communication line, and the like. The network may also be configured to allow wireless as well as wired communication.

The external device 2 is constituted by a personal computer or the like, and is preferably installed in a place where some form of consulting or diagnosis can be received. The external device 2 may also be configured as an Internet site from which consulting information can be obtained, or as a mobile terminal of a consultant, doctor, or store clerk. The external device 2 may also be configured as a data server for a home health management system.

[0028] As shown in FIG. 1, the data measuring device 1 includes a control unit 3, an external communication unit 4, an image capturing unit 5, an illumination unit 6, a user interface unit 7, an I/O unit 8, a memory unit 9, a data processing/analysis unit 10, a parameter setting/management unit 11, a data storage unit 12, and a display unit 13.

[0029] The control unit 3 includes a CPU and a RAM, and drives and controls each component of the data measuring device 1. Since the data measuring device 1 of the present embodiment also handles moving images, the control unit 3 is desirably configured with a chip capable of operation and control at as high a speed as possible.

[0030] The external communication unit 4 is configured to perform information communication with the external device 2 by wired or wireless communication means. Since the data measuring device 1 of the present embodiment handles moving image data, a communication mode capable of transmission at as high a speed as possible is desirable.

[0031] The image capturing unit 5 captures a moving image of the area around the subject's eyes, and is constituted by a CCD camera, a digital still camera, a CMOS camera, a video camera, a camera module attached to a mobile phone, or another camera. The image capturing unit 5 may perform either color or monochrome imaging; the following description assumes monochrome imaging.

[0032] Although an infrared camera can be used as the image capturing unit 5, imaging with a visible camera is desirable in the present embodiment. When a visible camera is used, it is desirable to use a camera with high sensitivity in the red region, emphasizing the contrast of the captured image. The sensitivity of the red region can be relatively increased, for example, by attaching an optical filter with high transmittance in that band. A camera provided with an infrared cut filter may also be used with the filter removed.

[0033] The image capturing unit 5 of the present embodiment is also configured with a distance sensor or a displacement sensor for detecting the distance or displacement between the image capturing unit 5 and the subject. An existing distance sensor or displacement sensor can be used. The image capturing unit 5 associates the distance or displacement between the image capturing unit 5 and the subject with each captured frame, and stores it in the data storage unit 12 or the parameter setting/management unit 11.

[0034] The distance or displacement from the subject can also be acquired by configuring the camera serving as the image capturing unit 5 with two lenses and measuring the three-dimensional shape using the images of the regions (between the eyebrows, the nose, the other eye) captured by each camera. In addition, by measuring the distance between two or more characteristic points, the absolute distance per pixel can be obtained without using a special sensor. This makes it possible to detect the absolute value of the data relating to the pupil without providing a distance sensor or a displacement sensor.

[0035] The illumination unit 6 is an optional component of the data measuring device 1, and can illuminate the area around the subject's eyes with a light source when, for example, the surrounding environment is dark at the time of imaging. As the light source, visible light from white to incandescent color (color temperature of about 3000 K) can be used. An infrared light source can also be used. By diffusely illuminating the light with the illumination unit 6, direct reflection of light on the surface of the subject's eyes is mitigated, and the accuracy of image processing can be increased. The illumination unit 6 desirably has a mechanism that can vary the light intensity of the light source.

[0036] The illumination unit 6 also has a mechanism capable of emitting illumination light for applying a stimulus. As the illumination light for applying a stimulus, strobe (flash) light, which is visible light with stable intensity and irradiation time, may be used. In this case, the intensity of the irradiated light is kept constant over time. This makes it possible to apply a stimulus to the subject's eyes during imaging by the image capturing unit 5 and to measure the pupil response. It is also desirable that the emission timing and emission duration of the strobe (flash) light be adjustable.

[0037] The user interface unit 7 is composed of a keyboard, a mouse, a trackball, and the like; it allows the user to input instructions and allows the status and requests of the data measuring device 1 to be conveyed to the user. Since a device configuration with little burden on the user is desirable, a touch panel may be configured integrally with the display unit 13.

[0038] The I/O unit 8 is configured so that portable devices such as a CF card, an SD card, and a USB memory card can be connected. A port for accessing the external device 2, such as Ethernet (registered trademark), can also be connected. This makes it possible to input the various parameters and data necessary for setting the operation of the data measuring device 1, and to transmit measurement results to the external device 2.

[0039] The memory unit 9 is composed of RAM, ROM, DIMM, and the like; by transferring data required by the data processing/analysis unit 10 and other units from the data storage unit 12 and temporarily storing it, the data measuring device 1 can be operated at high speed and stably. The memory unit 9 of the present embodiment needs a capacity sufficient to execute moving image processing in real time without dropping frames.

[0040] The data processing/analysis unit 10 measures the time-series change of data relating to the pupil of the living body by analyzing the moving image captured by the image capturing unit 5.

[0041] First, as shown in FIG. 2, the data processing/analysis unit 10 extracts the eye region from the face image data. For eye region extraction, conventional methods such as eye template matching or methods using the positional relationship of the eye periphery can be applied. At this time, the eye region in an arbitrary frame may be set manually and used as the eye region in the other frames. The eye region should be set somewhat wide so that the eyes and iris do not protrude from it in other frames.

[0042] The data processing/analysis unit 10 also creates a pupil template. The pupil template is created using data such as the pupil size, the average pupil pixel value, and the average iris pixel value obtained in advance by analyzing moving images of the eye regions of a plurality of people. For example, as shown in FIG. 3, the template size is set so that the pupil fits within it, and a circle corresponding to the pupil is placed at the center. A pupil template can then be created by setting the pixel values on and inside the circle to the average pupil pixel value of the plurality of people, and the pixel values outside the circle to the average iris pixel value of the plurality of people.
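The FIG. 3 construction can be sketched as below. This is an illustrative approximation only: the margin around the disc, the square shape, and the function name are assumptions not specified in the text.

```python
import numpy as np

def make_pupil_template(pupil_radius_px, pupil_mean, iris_mean, margin_px=2):
    """Build a square template: a pupil-valued disc on an iris-valued background.

    The disc (circumference and interior) takes the averaged pupil pixel
    value; everything outside the circle takes the averaged iris pixel value.
    """
    size = 2 * (pupil_radius_px + margin_px) + 1
    c = size // 2
    yy, xx = np.ogrid[:size, :size]
    template = np.full((size, size), float(iris_mean))
    disc = (yy - c) ** 2 + (xx - c) ** 2 <= pupil_radius_px ** 2
    template[disc] = float(pupil_mean)
    return template
```

Updating the template each frame then amounts to rebuilding it with the latest measured radius and pixel averages for the current subject.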

[0043] The pupil template should be obtained from image data taken while the pupil is constricted. This increases the detection accuracy of the pupil center coordinates even for image data in which the pupil is constricted.

[0044] The data processing/analysis unit 10 extracts the pupil by performing pupil template matching, for each frame of the moving image, on the eye region extracted from the face image data. At this time, the position where the difference between the image data of the eye region of each frame and the pupil template is minimized is judged to be the pupil center. In this way, the pupil center coordinates can be detected.

In the pupil template matching, under the assumption that the moving distance of the subject (pupil) between frames is small, the search area of the next frame may be narrowed down using the pupil center coordinates extracted in the previous frame. For example, the size of the search area of the next frame can be set separately, and in that frame, the range "pupil center coordinates ± set range" of the previous frame can be searched.
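The matching step above can be sketched with a brute-force sum-of-squared-differences scan, optionally restricted to a window around the previous frame's center. This is a hedged sketch, not the patented algorithm: the SSD criterion and the `search_range` parameter are assumed for illustration.

```python
import numpy as np

def match_template_ssd(eye, template, prev_center=None, search_range=5):
    """Return the top-left template position minimizing the SSD difference.

    If prev_center is given, only positions within +/- search_range of it are
    scanned, narrowing the per-frame search as described in the text.
    """
    th, tw = template.shape
    H, W = eye.shape
    ys = range(H - th + 1)
    xs = range(W - tw + 1)
    if prev_center is not None:
        py, px = prev_center
        ys = range(max(0, py - search_range), min(H - th + 1, py + search_range + 1))
        xs = range(max(0, px - search_range), min(W - tw + 1, px + search_range + 1))
    best, best_pos = None, None
    for y in ys:
        for x in xs:
            diff = eye[y:y + th, x:x + tw] - template
            ssd = float((diff * diff).sum())
            if best is None or ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos
```

The windowed variant reduces the scan area substantially when frame-to-frame pupil motion is small, which is exactly the assumption the text states.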

[0045] Since distortion of the optical system of the image capturing unit 5 affects the accuracy of template matching, it is desirable to photograph a grid chart before measurement and acquire image data for distortion correction.

[0046] Here, when the subject's eyes are open, the feature quantity of the difference between the image data of each frame and the template behaves as in the graphs shown in FIGS. 4 and 5. On the other hand, when the subject's eyes are closed, the feature quantity of the difference between the image data of each frame and the template behaves as in the graph shown in FIG. 6. Thus, when the subject blinks or closes the eyes during imaging, the minimum value of the difference from the template is unrelated to the pupil position, and these data must be excluded from the moving image data.

[0047] FIG. 7 is a graph showing the transition of the minimum value of the difference from the template in each frame when a light stimulus is applied to the pupil three times at a certain period by the illumination light for applying a stimulus. As shown in FIG. 7, the minimum value appears as a discontinuous value due to blinking of the subject. Although FIG. 7 shows the transition of the minimum value, in a graph showing the transition of the pupil center coordinates as well, the pupil center coordinates appear as discontinuous values when the subject blinks or closes the eyes.

[0048] そこで、データ処理・解析部 10は、動画像データのうち、画素値や瞳孔中心座標などの変移が不連続となった部分の画像データを抽出し、被験者が瞬きをしたときや目を閉じたときの画像データとして排除するようになっている。例えば、図 7のグラフの微分量 (変動量)に対して所定の閾値を別途設定し、その微分量が所定の閾値以上となった場合は「瞬き」と判断することが可能である。また、図 7の最小値の変移を示すグラフを複数回平滑化処理して、フレームごとに平滑化前と平滑化後の最小値の差をとり、差が大きくなるときの画像データを瞬きや目を閉じたときの画像データとして排除することも可能である。  [0048] Therefore, the data processing/analysis unit 10 extracts, from the moving image data, the image data of the portions where transitions such as pixel values or pupil center coordinates become discontinuous, and excludes them as image data captured while the subject was blinking or had the eyes closed. For example, a predetermined threshold can be set separately for the derivative (amount of change) of the graph in FIG. 7, and a frame can be judged to be a "blink" when the derivative is equal to or greater than that threshold. Alternatively, the graph of the minimum-value transition in FIG. 7 can be smoothed several times, the difference between the pre-smoothing and post-smoothing minimum values taken for each frame, and the image data where this difference becomes large excluded as blink or closed-eye image data.
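The derivative-threshold blink test above can be sketched as follows. A minimal sketch under stated assumptions: the series of per-frame template-difference minima is given as a list, and the threshold value is a tunable parameter, not one fixed by the description.

```python
def blink_frames(minima, deriv_threshold):
    """Flag frames whose frame-to-frame change in the per-frame template
    difference minimum is at or above a threshold; such discontinuities
    are treated as blinks or eye closure and excluded from analysis."""
    flags = [False] * len(minima)
    for i in range(1, len(minima)):
        if abs(minima[i] - minima[i - 1]) >= deriv_threshold:
            flags[i] = True
    return flags

series = [1.0, 1.1, 1.0, 9.5, 9.4, 1.2, 1.1]   # spike = blink
print(blink_frames(series, 3.0))
# [False, False, False, True, False, True, False]
```

The same flags could be produced by the smoothing-difference variant mentioned in the text; only the discontinuity test differs.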

[0049] なお、上記の例では、被験者に付加する刺激として光刺激を用いたが、被験者に付加する刺激は、音、匂い、臭い又は被験者に触れるものであってもよい。例えば、破裂音、爆音、御香、アロマなどが考えられる。その他、被験者のストレスを緩和したり、逆に被験者にストレスを与えたりするものでもよい。例えば、温泉、足湯、熱風、冷風などが考えられる。  [0049] In the above example, a light stimulus is used as the stimulus applied to the subject, but the stimulus may also be a sound, a scent, an odor, or something that touches the subject. For example, plosive sounds, explosive sounds, incense, and aromas are conceivable. The stimulus may also be something that relieves the subject's stress or, conversely, applies stress to the subject, such as a hot spring, a footbath, hot air, or cold air.

[0050] また、被験者が瞬きをしたときや目を閉じたときの画像データを抽出する際に得られた「瞬き」のデータは、別途「瞬き回数」のデータとして他の解析に転用するか、外部装置 2に転送してもよい。また、刺激応答時の画素値や瞳孔中心座標の変化の始点、頂点及び終点位置を探索することにより、刺激応答時の変化をデータとして抽出することも可能である。光刺激による瞳孔応答の始点位置は、例えば、刺激光の発光タイミングと動画データとを関連づけてデータ保存し、画像処理時に発光タイミングのデータを用いることにより検出することができる。また、刺激応答時の変化の始点、頂点及び終点位置の探索は、その他公知の手法を用いて行うことができる。  [0050] The "blink" data obtained when extracting the image data of frames in which the subject blinked or closed the eyes may be reused in other analyses as separate "blink count" data, or transferred to the external device 2. It is also possible to extract the change at the time of a stimulus response as data by searching for the start, peak, and end positions of the change in pixel value or pupil center coordinates. The start position of the pupil response to a light stimulus can be detected, for example, by storing the emission timing of the stimulus light in association with the moving image data and using the emission-timing data during image processing. The start, peak, and end positions of the change at the time of a stimulus response can also be searched for using other known methods.
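One simple way to search for the start, peak, and end of a stimulus response can be sketched as follows. This is only an illustrative sketch (the text notes that any known method, or the stored stimulus emission timing, may be used instead); the baseline-deviation threshold is an assumption.

```python
def response_points(series, threshold):
    """Find start, peak, and end indices of a stimulus response in a
    pupil time series: start = first sample deviating from the initial
    value by more than `threshold`, peak = largest deviation, end =
    first return to within `threshold` after the peak."""
    base = series[0]
    start = next((i for i, v in enumerate(series) if abs(v - base) > threshold), None)
    if start is None:
        return None                     # no response detected
    peak = max(range(start, len(series)), key=lambda i: abs(series[i] - base))
    end = next((i for i in range(peak, len(series))
                if abs(series[i] - base) <= threshold), len(series) - 1)
    return start, peak, end

pupil = [4.0, 4.0, 3.2, 2.5, 3.1, 3.9, 4.0]   # constriction after a light pulse
print(response_points(pupil, 0.5))            # (2, 3, 5)
```

Using the stimulus emission timing as the start, as the text suggests, removes the need for the baseline-deviation test entirely.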

[0051] また、データ処理・解析部 10は、瞳孔のテンプレートのマッチングにより得た瞳孔中心座標を始点として、動画像のフレームごとに、左右、上下、斜め方向の少なくとも一方向に瞳孔の端部を探索することにより、瞳孔の直径データを抽出するようになっている。 [0052] この際、図 8に示すように、瞳孔中心座標の画素値と、別途設定した閾値とを用いて、探索位置の画素値が 0から「瞳孔中心座標の画素値+閾値」の範囲外であれば瞳孔外と判定することができる。こうして得られた瞳孔の端点から瞳孔の直径データを求めることができる。また、瞳孔の直径データの抽出は、瞳孔の端部を探索する際に、瞳孔中心周辺の画素値から平均値と標準偏差量を計算し、「平均値+標準偏差の 3倍」などの範囲を設けて瞳孔内とみなすことなどによって行ってもよい。この直径データは、横方向の直径又は縦方向の直径データなどにより別々の特徴量としてもよく、図 9に示すように、平均化して 1つの特徴量としてもよい。図 9では、各フレームにおける瞳孔の直径データを示すものであり、刺激応答時により瞳孔の直径データが変移している様子が表れている。  [0051] Starting from the pupil center coordinates obtained by template matching, the data processing/analysis unit 10 extracts pupil diameter data by searching for the edge of the pupil in at least one of the horizontal, vertical, and diagonal directions for each frame of the moving image. [0052] At this time, as shown in FIG. 8, using the pixel value at the pupil center coordinates and a separately set threshold, a search position can be judged to be outside the pupil if its pixel value falls outside the range from 0 to "pixel value at the pupil center coordinates + threshold". The pupil diameter data can then be obtained from the pupil end points found in this way. Alternatively, when searching for the pupil edge, the mean and standard deviation of the pixel values around the pupil center may be computed and a range such as "mean + 3 standard deviations" used to decide whether a pixel is inside the pupil. The diameter data may be kept as separate feature amounts, such as horizontal and vertical diameters, or averaged into a single feature amount as shown in FIG. 9. FIG. 9 shows the pupil diameter data in each frame, in which the diameter can be seen changing in response to the stimulus.
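The horizontal scan with the out-of-pupil test described in [0052] can be sketched as follows. A minimal sketch under assumptions: the image is a 2-D array of gray values with a dark pupil, the center is given as (row, col), and the threshold is a separately set parameter as in the text.

```python
def pupil_diameter(image, center, threshold):
    """Scan left and right from the detected pupil center along one row
    and stop when a pixel value leaves [0, center_value + threshold]
    (the out-of-pupil test).  Returns the horizontal diameter in pixels."""
    cy, cx = center
    row = image[cy]
    limit = row[cx] + threshold
    left = cx
    while left - 1 >= 0 and 0 <= row[left - 1] <= limit:
        left -= 1
    right = cx
    while right + 1 < len(row) and 0 <= row[right + 1] <= limit:
        right += 1
    return right - left + 1

eye = [[200, 200, 40, 35, 38, 200, 200]]   # dark pupil pixels near 40
print(pupil_diameter(eye, (0, 3), 20))     # 3
```

The vertical and diagonal scans mentioned in the text are the same procedure along different directions; the per-direction results can then be kept separate or averaged, as described.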

[0053] また、データ処理・解析部 10は、動画像のフレームごとに、直前のフレームの処理で得られた瞳孔中心の画素値及び瞳孔の直径データに基づき、瞳孔のテンプレートを更新して、次のフレームの瞳孔抽出に適用するようになっている。また、瞳孔のテンプレートの更新に用いるデータは、直前のフレームで得られたデータのみならず、直前までの全フレームで得られたデータを平均化したデータであってもよい。  [0053] For each frame of the moving image, the data processing/analysis unit 10 updates the pupil template based on the pupil-center pixel value and pupil diameter data obtained in the processing of the immediately preceding frame, and applies it to the pupil extraction of the next frame. The data used for updating the pupil template may be not only the data obtained in the immediately preceding frame but also data obtained by averaging the data of all preceding frames.

[0054] このように瞳孔のテンプレートを更新することにより、被験者ごとに瞳孔のテンプレー トがカスタマイズされることから、瞳孔の抽出精度が向上する。  [0054] By updating the pupil template in this way, the pupil template is customized for each subject, so that the pupil extraction accuracy is improved.

[0055] また、データ処理・解析部 10は、更新前後のテンプレートの差分が別途設定した所定の閾値以下となった場合は、その被験者に適合するテンプレートが作成されたものとみなして、以後はテンプレートの更新を終了することもできる。これにより、不要なテンプレート作成の工程を省略して処理の迅速化を図ることができる。  [0055] When the difference between the templates before and after an update becomes equal to or less than a separately set threshold, the data processing/analysis unit 10 may regard a template suited to that subject as having been created and stop updating the template thereafter. This omits unnecessary template-creation steps and speeds up the processing.
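The update-with-termination check in [0053] and [0055] can be sketched as follows. A minimal sketch, assuming the template is represented as a flat list of pixel values and using mean absolute difference as the before/after template difference; the actual difference measure and stop threshold are set separately, as the text says.

```python
def update_template(template, new_template, stop_threshold):
    """Update the pupil template from the latest frame's data, but stop
    updating once the mean absolute difference between the old and new
    template falls to or below `stop_threshold` (a subject-adapted
    template is then considered complete)."""
    diff = sum(abs(a - b) for a, b in zip(template, new_template)) / len(template)
    if diff <= stop_threshold:
        return template, True      # converged: keep template, stop updating
    return new_template, False     # still adapting: adopt the new template

tmpl, converged = update_template([40, 42, 41], [40, 42, 42], 0.5)
print(converged)   # True: mean difference 1/3 <= 0.5
```

The alternative mentioned in [0053], averaging over all preceding frames, would simply replace `new_template` with a running mean of the per-frame templates.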

[0056] なお、データ処理・解析部 10は、瞳孔のテンプレート更新において、直前のフレームが瞬き又は目を閉じた際の画像データである場合は、更新時のデータに使用しないようにする。また、全フレームのデータを平均化する際にも、瞬き又は目を閉じた際の画像データは使用しないようにする。  [0056] When updating the pupil template, the data processing/analysis unit 10 does not use the data of the immediately preceding frame if that frame is image data captured while the subject was blinking or had the eyes closed. Likewise, when averaging the data of all frames, blink and closed-eye image data are not used.

[0057] また、データ処理・解析部 10は、各フレームの瞳孔に関するデータを所定の基準値に対する相対値とするための補正を行うようになっている。すなわち、上記の処理で得られる瞳孔の直径データは 1フレーム内で画素単位の相対値として得られたデータであり、各フレーム間で統一された値ではない。そこで、データ処理・解析部 10は、各フレーム内において目尻と目頭など時系列的に位置関係が変わらない複数点間の距離を取得し、各フレーム間でこの複数点間の距離の相対関係に応じて瞳孔の直径データを補正することにより、瞳孔の直径データを各フレーム間で統一された値とするようになっている。例えば、図 10に示すように、目尻と目頭の距離を Dとし、第 nフレームにおいて目尻と目頭の距離 Dnを取得して補正量 Dn/Dを求め、この補正量 Dn/Dを利用して第 nフレームにおける瞳孔の直径データを補正することにより、この瞳孔の直径データを他のフレームと統一された値とすることが可能である。  [0057] The data processing/analysis unit 10 also performs a correction that converts the pupil data of each frame into a value relative to a predetermined reference. That is, the pupil diameter data obtained by the above processing is a relative value in pixel units within one frame and is not unified across frames. The data processing/analysis unit 10 therefore acquires, in each frame, the distance between multiple points whose positional relationship does not change over time, such as the outer and inner corners of the eye, and corrects the pupil diameter data according to the relative relationship of this distance between frames, so that the diameter data becomes a value unified across frames. For example, as shown in FIG. 10, with the eye-corner distance taken as D, the distance Dn between the corners in the n-th frame is acquired to obtain the correction factor Dn/D, and correcting the n-th frame's pupil diameter data with this factor makes it consistent with the other frames.
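The Dn/D correction above reduces to one line of arithmetic; a minimal sketch (function and argument names are illustrative):

```python
def normalize_diameter(diameter_px, corner_dist_n, corner_dist_ref):
    """Correct frame n's pupil diameter (in pixels) by the ratio of the
    reference eye-corner distance D to that frame's measured distance Dn,
    so that diameters become comparable across frames."""
    return diameter_px * (corner_dist_ref / corner_dist_n)

# Frame n was imaged 10% closer, so Dn = 1.1 * D; undo that scaling:
print(normalize_diameter(44.0, 110.0, 100.0))  # 40.0
```

Any point pair whose separation is fixed over time works as the reference; the eye corners are simply convenient because they are visible in every frame of an eye-region video.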

[0058] また、データ処理・解析部 10は、瞳孔の直径データの絶対値を測定するようになっている。すなわち、上記の処理で得られる瞳孔の直径データは画素単位の相対値であり、画像撮影部 5と被験者との距離などにより直径データの絶対値は異なる。そこで、データ処理・解析部 10は、画像撮影部 5としての距離センサ又は変位センサにより計測・保存した各フレームにおける画像撮影部 5と被験者との距離や変位をパラメータとして、1画素あたりの絶対距離を算出するようになっている。  [0058] The data processing/analysis unit 10 also measures the absolute value of the pupil diameter data. That is, the pupil diameter data obtained by the above processing is a relative value in pixel units, and its absolute value varies with, for example, the distance between the image capturing unit 5 and the subject. The data processing/analysis unit 10 therefore calculates the absolute distance per pixel using, as parameters, the distance and displacement between the image capturing unit 5 and the subject in each frame, measured and stored by a distance sensor or displacement sensor serving as part of the image capturing unit 5.

[0059] また、直径データの絶対値の測定では、被験者の周辺に大きさ寸法や長さ寸法が既知の参照物体 (パッチやシールなど)を置き、その物体の画像上の大きさから 1画素あたりの長さの絶対値を算出してもよい。また、目盛り付きシールなど被験者の長さ寸法を計測できるものを用いて撮影し、同様に 1画素あたりの長さの絶対値を計算してもよい。また、被験者のまつげ、鼻、目尻から目頭の距離など、目周辺の部位の長さ寸法や大きさ寸法を別途計測し、参照パラメータとして与えてもよい。  [0059] In measuring the absolute value of the diameter data, a reference object of known size or length (a patch, a sticker, etc.) may be placed near the subject, and the absolute length per pixel calculated from the size of that object in the image. Alternatively, something that can measure a length on the subject, such as a scaled sticker, may be included in the image and the absolute length per pixel calculated in the same way. The length or size of parts around the eye, such as the subject's eyelashes, nose, or the distance from the outer to the inner corner of the eye, may also be measured separately and given as a reference parameter.
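The reference-object scaling described above amounts to a simple ratio; a minimal sketch (the 10 mm patch size in the example is an assumption for illustration):

```python
def mm_per_pixel(ref_length_mm, ref_length_px):
    """Absolute scale from a reference object of known physical size
    placed near the subject (e.g. a scaled sticker in the image)."""
    return ref_length_mm / ref_length_px

def diameter_mm(diameter_px, ref_length_mm, ref_length_px):
    """Convert a pupil diameter measured in pixels to millimetres."""
    return diameter_px * mm_per_pixel(ref_length_mm, ref_length_px)

# A 10 mm patch spans 50 px, so each pixel is 0.2 mm;
# a 20 px pupil is then 4.0 mm across.
print(diameter_mm(20, 10.0, 50))  # 4.0
```

The distance-sensor route of [0058] produces the same per-pixel scale from the camera geometry instead of from an object in the scene.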

[0060] パラメータ設定 ·管理部 11は、データ測定装置 1の処理や制御のために必要なパ ラメータを設定することができるように構成されており、設定されたパラメータを管理す るようになっている。  [0060] The parameter setting / management unit 11 is configured to be able to set parameters necessary for processing and control of the data measuring device 1, and manages the set parameters. ing.

[0061] データ蓄積部 12は、 HDDなどにより構成され、外部から入力された画像データ、 画像撮影部 5により撮影された動画像データ、データ処理 ·解析部 10による画像処 理が行われた動画像データ又は画像処理途中のテンポラリデータなどを管理して保 持するようになっている。  [0061] The data storage unit 12 is configured by an HDD or the like, and includes image data input from the outside, moving image data captured by the image capturing unit 5, and a moving image that has been subjected to image processing by the data processing / analyzing unit 10. It manages and holds image data or temporary data during image processing.

[0062] 表示部 13は、CRT、液晶、有機 EL、プラズマ又は投影方式などのディスプレイから構成されており、データ処理・解析部 10で画像処理中の動画像データ又はデータ蓄積部 12で保持された動画像データなどを表示するほか、データ測定装置 1の各構成部分の状態に関する情報や、外部装置 2から与えられた情報などを表示するようになっている。なお、タッチパネルとするなどユーザインターフェイス部 7と機能を兼ねる構成とすることも可能である。  [0062] The display unit 13 is composed of a display such as a CRT, liquid crystal, organic EL, plasma, or projection display, and displays the moving image data being processed by the data processing/analysis unit 10 or held in the data storage unit 12, as well as information on the state of each component of the data measuring device 1 and information given from the external device 2. It may also be configured to double as the user interface unit 7, for example as a touch panel.

[0063] なお、上記各構成部分のうち、制御部 3、外部通信部 4、I/O部 8、メモリ部 9、データ処理・解析部 10、パラメータ設定・管理部 11、データ蓄積部 12及び表示部 13は一般的なパーソナルコンピュータとして構成することが可能であり、それに画像撮影部 5、照明部 6及びユーザインターフェイス部 7を取り付けることにより、データ測定装置 1を構成することが可能である。  [0063] Of the above components, the control unit 3, external communication unit 4, I/O unit 8, memory unit 9, data processing/analysis unit 10, parameter setting/management unit 11, data storage unit 12, and display unit 13 can be configured as a general personal computer, and the data measuring device 1 can be constructed by attaching the image capturing unit 5, the illumination unit 6, and the user interface unit 7 to it.

次に、上述のデータ測定装置 1を使用した本発明のデータ測定方法について、図 11 及び図 12のフローチャートを参照して説明する。  Next, the data measurement method of the present invention using the above-described data measurement apparatus 1 will be described with reference to the flowcharts of FIG. 11 and FIG.

まず、周辺環境が暗い場合は照明部 6で被験者の目周辺を照射し、画像撮影部 5に より被験者の目周辺を動画で撮影する。この際、照明部 6により刺激付加用の照明光 を照射してもよい。  First, when the surrounding environment is dark, the illumination unit 6 illuminates the subject's eyes and the image photographing unit 5 photographs the subject's eyes with a moving image. At this time, the illumination unit 6 may irradiate illumination light for stimulus addition.

[0064] 次に、データ処理 '解析部 10は、図 2に示すように、画像撮影部 5により撮影した顔 画像データから目領域を抽出する。  Next, as shown in FIG. 2, the data processing / analysis unit 10 extracts an eye area from the face image data photographed by the image photographing unit 5.

[0065] 続いて、データ処理 '解析部 10は、瞳孔のテンプレートを作成する。瞳孔のテンプ レートは、予め複数人の目領域の動画像を解析して得られた瞳孔の大きさ、瞳孔の 画素値平均、虹彩の画素値平均などのデータを用いて作成する。  Subsequently, the data processing 'analysis unit 10 creates a pupil template. The pupil template is created using data such as pupil size, pupil pixel value average, and iris pixel value average obtained in advance by analyzing moving images of a plurality of human eye regions.

[0066] 続いて、データ処理・解析部 10は、顔画像データから抽出した目領域について、動画像のフレームごとに瞳孔のテンプレートマッチングを行うことにより瞳孔を抽出する。図 11に、テンプレートマッチング及び瞳孔の抽出方法についてのフローチャートを示す。  [0066] Subsequently, the data processing/analysis unit 10 extracts the pupil from the eye region extracted from the face image data by performing pupil template matching for each frame of the moving image. FIG. 11 shows a flowchart of the template matching and pupil extraction method.

[0067] 図 11に示すように、データ処理・解析部 10に顔画像データが入力されると (ステップ S1)、データ処理・解析部 10は目領域を抽出して (ステップ S2)、作成した瞳孔のテンプレートによりテンプレートマッチングを行う (ステップ S3)。そして、各フレームの目領域の画像データと瞳孔のテンプレートとの差分が最小となるときが瞳孔の中心と判断することにより、瞳孔中心座標を検出する (ステップ S4)。この際、データ処理・解析部 10は、動画像データのうち、画素値や瞳孔中心座標などの変移が不連続となった部分の画像データを抽出し、被験者が瞬きをしたときや目を閉じたときの画像データとして排除する。続いて、瞳孔のテンプレートのマッチングにより得た瞳孔中心座標を始点として、動画像のフレームごとに、左右、上下、斜め方向の少なくとも一方向に瞳孔の端部を探索することにより、瞳孔の直径データを抽出する (ステップ S5)。続いて、次フレームがあるか否かの判断を行い (ステップ S6)、次フレームがある場合はステップ S1〜ステップ S5を繰り返し、次フレームがない場合は処理を終了する。  [0067] As shown in FIG. 11, when face image data is input to the data processing/analysis unit 10 (step S1), the data processing/analysis unit 10 extracts the eye region (step S2) and performs template matching with the created pupil template (step S3). The pupil center coordinates are then detected by judging that the position where the difference between each frame's eye-region image data and the pupil template is smallest is the pupil center (step S4). At this time, the data processing/analysis unit 10 extracts the image data of the portions of the moving image data where transitions such as pixel values or pupil center coordinates become discontinuous, and excludes them as image data captured while the subject was blinking or had the eyes closed. Subsequently, starting from the pupil center coordinates obtained by the template matching, the pupil diameter data is extracted by searching for the edge of the pupil in at least one of the horizontal, vertical, and diagonal directions for each frame of the moving image (step S5). It is then determined whether there is a next frame (step S6); if there is, steps S1 to S5 are repeated, and if not, the processing ends.

[0068] また、データ処理 '解析部 10は、動画像のフレームごとに、瞳孔のテンプレートを更 新する。図 12に、瞳孔のテンプレートの更新方法についてのフローチャートを示す。  [0068] Further, the data processing 'analysis unit 10 updates the pupil template for each frame of the moving image. FIG. 12 shows a flowchart of a method for updating the pupil template.

[0069] 図 12に示すように、データ処理 '解析部 10に動画像の先頭フレームが入力される と (ステップ S11)、データ処理 ·解析部 10は目領域を抽出して (ステップ S12)、作成 した瞳孔のテンプレートによりテンプレートマッチングを行う(ステップ S 13)。これによ り瞳孔中心座標を検出し (ステップ S14)、瞳孔の直径データを抽出する (ステップ S1 5)。次に、データ処理 '解析部 10は、直前フレームの処理で得られたデータ又は全 フレームの平均データに基づき、瞳孔のテンプレートを更新する(ステップ S16)。  [0069] As shown in FIG. 12, when the first frame of the moving image is input to the data processing 'analysis unit 10 (step S11), the data processing / analysis unit 10 extracts the eye region (step S12), Template matching is performed using the created pupil template (step S13). Thereby, pupil center coordinates are detected (step S14), and pupil diameter data is extracted (step S15). Next, the data processing / analysis unit 10 updates the pupil template based on the data obtained by the processing of the immediately preceding frame or the average data of all the frames (step S16).

[0070] この際、データ処理・解析部 10は更新前後のテンプレートの差分が別途設定した所定の閾値以下となったか否かの判断を行い、その結果、所定の閾値より大きい場合はテンプレートを更新し、所定の閾値以下となっている場合は、以後はテンプレートの更新を行わずにテンプレートマッチングを行うこともできる。  [0070] At this time, the data processing/analysis unit 10 determines whether the difference between the templates before and after the update has become equal to or less than a separately set threshold. If the difference is larger than the threshold, the template is updated; once it is equal to or less than the threshold, template matching can thereafter be performed without further template updates.

ここで、テンプレートの更新について詳述すれば、テンプレートを更新するときは、前述したように瞳孔抽出により取得された瞳孔の直径データ、瞳孔中心周辺の画素値を平均化して算出した瞳孔中心の画素値、瞳孔外の画素値を用いる。  Here, the template update will be described in detail. When updating the template, the pupil diameter data obtained by the pupil extraction described above, the pupil-center pixel value calculated by averaging the pixel values around the pupil center, and the pixel value outside the pupil are used.

瞳孔外の画素値とは、瞳孔外と判定されたエリア、すなわち、瞳孔の直径データから計算される瞳孔内エリアの外側所定領域の画素値を平均化して算出する。平均化するときの領域は、瞳孔の直径部を端部としてそこから所定の画素数または距離分外側の同心円で囲まれた部分の範囲とする。所定の画素数または距離は、画像処理装置のパラメータ設定部で設定する。または、目の白目エリア以外の部分から瞳孔内エリア部分を削除した部分を瞳孔外エリア (結果的に虹彩のエリアとなる)としてそのエリア内の画素値を平均化したものを用いてもよい。  The outside-pupil pixel value is calculated by averaging the pixel values of a predetermined region outside the area judged to be the pupil, that is, outside the in-pupil area computed from the pupil diameter data. The region to be averaged is the band bounded by the pupil diameter at its inner edge and a concentric circle a predetermined number of pixels or distance further out. The predetermined number of pixels or distance is set in the parameter setting unit of the image processing apparatus. Alternatively, the part of the eye other than the white of the eye, with the in-pupil area removed (which in effect is the iris area), may be taken as the outside-pupil area and the average of the pixel values within it used.

[0071] 続いて、データ処理・解析部 10に次フレームが入力されると (ステップ S17)、データ処理・解析部 10は目領域を抽出し (ステップ S18)、作成した瞳孔のテンプレートにより再びテンプレートマッチングを行う (ステップ S19)。そして、更新されたテンプレートを用いて瞳孔中心座標の検出 (ステップ S20)、瞳孔の直径データの抽出 (ステップ S21)を行う。続いて、次フレームがあるか否かの判断を行い (ステップ S22)、次フレームがある場合は再び瞳孔のテンプレートを更新し (ステップ S16)、次フレームがない場合は処理を終了する。  [0071] Subsequently, when the next frame is input to the data processing/analysis unit 10 (step S17), the data processing/analysis unit 10 extracts the eye region (step S18) and performs template matching again with the pupil template (step S19). Using the updated template, the pupil center coordinates are detected (step S20) and the pupil diameter data is extracted (step S21). It is then determined whether there is a next frame (step S22); if there is, the pupil template is updated again (step S16), and if not, the processing ends.
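The concentric-ring averaging used for the outside-pupil pixel value can be sketched as follows. A minimal sketch, assuming a 2-D array image and an inclusive outer boundary; the band width corresponds to the pixel count set in the parameter setting unit.

```python
def outside_pupil_mean(image, center, radius, band):
    """Average pixel value in the concentric ring just outside the pupil:
    pixels whose squared distance from the center lies in
    (radius^2, (radius + band)^2]."""
    cy, cx = center
    total, count = 0.0, 0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            d2 = (y - cy) ** 2 + (x - cx) ** 2
            if radius ** 2 < d2 <= (radius + band) ** 2:
                total += v
                count += 1
    return total / count if count else 0.0

img = [[100] * 7 for _ in range(7)]           # uniform iris value
print(outside_pupil_mean(img, (3, 3), 1, 2))  # 100.0
```

The alternative in the text, averaging over the whole non-white-of-eye area minus the in-pupil area, simply swaps the ring mask for an iris-region mask.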

[0072] 続いて、データ処理 '解析部 10は、瞳孔の直径データを各フレーム間で統一され たデータとする補正を行う。すなわち、瞳孔の直径データを各フレーム間の相対値と する。  [0072] Subsequently, the data processing 'analysis unit 10 performs correction so that pupil diameter data is unified between the frames. That is, the diameter data of the pupil is a relative value between the frames.

[0073] また、データ処理・解析部 10は、瞳孔の直径データの絶対値を測定する。すなわち、画像撮影部 5としての距離センサ又は変位センサにより計測・保存した各フレームにおける画像撮影部 5と被験者との距離や変位をパラメータとして、1画素あたりの絶対距離を算出する。  [0073] The data processing/analysis unit 10 also measures the absolute value of the pupil diameter data. That is, the absolute distance per pixel is calculated using, as parameters, the distance and displacement between the image capturing unit 5 and the subject in each frame, measured and stored by the distance sensor or displacement sensor serving as part of the image capturing unit 5.

[0074] 次に、データ処理 ·解析部 10は、外部装置 2に測定結果を送信する。  Next, the data processing / analyzing unit 10 transmits the measurement result to the external device 2.

[0075] このように本実施形態に係るデータ測定装置 1及びデータ測定方法によれば、既にデータ測定が行われたフレームのデータを用いて瞳孔のテンプレートを更新することにより、生体ごとに瞳孔のテンプレートがカスタマイズされ、瞳孔の抽出精度を向上させることが可能となる。  [0075] As described above, according to the data measuring device 1 and data measuring method of this embodiment, updating the pupil template using the data of frames already measured customizes the pupil template for each living body, making it possible to improve the pupil extraction accuracy.

[0076] また、瞳孔中心の画素値及び瞳孔の直径データを用いて瞳孔のテンプレートを作 成又は更新することが可能となる。  In addition, it is possible to create or update a pupil template using the pixel value at the center of the pupil and the diameter data of the pupil.

[0077] また、動画像のフレームごとに瞳孔に関するデータを測定した場合、そのデータは 1フレーム内の相対値にすぎないが、時系列的に位置関係が変わらない複数点間の距離の相対関係に応じて各フレームのデータを補正することにより、瞳孔に関するデータを全フレーム間で統一された値にすることが可能となる。  [0077] When data on the pupil is measured for each frame of the moving image, the data is merely a relative value within one frame; however, by correcting each frame's data according to the relative relationship of the distance between multiple points whose positional relationship does not change over time, the pupil data can be made a value unified across all frames.

[0078] また、動画像の各フレーム内で瞳孔に関するデータを測定した場合、そのデータは画素単位による 1フレーム内の相対値にすぎないが、生体との距離又は変位をパラメータとすることにより、1画素あたりの絶対距離を算出して、瞳孔に関するデータの絶対値を算出することが可能となる。  [0078] Likewise, when data on the pupil is measured within each frame of the moving image, the data is merely a relative value within one frame in pixel units; however, by using the distance or displacement from the living body as a parameter, the absolute distance per pixel can be calculated and the absolute value of the pupil data obtained.

[0079] また、距離センサ又は変位センサなどを用いることなく画像データのみを用いて生体の三次元形状を測定し、瞳孔に関するデータの絶対値を算出することができるため、装置構成を単純化することが可能となる。  [0079] In addition, since the three-dimensional shape of the living body can be measured using only image data, without a distance sensor or displacement sensor, and the absolute value of the pupil data calculated from it, the apparatus configuration can be simplified.

[0080] また、撮影時に生体に光を照射することにより、画像撮影部による撮影を正確に行 い瞳孔に関するデータの測定精度を向上させることが可能となる。 Further, by irradiating the living body with light at the time of imaging, it is possible to accurately perform imaging by the image capturing unit and improve the measurement accuracy of data related to the pupil.

[0081] また、可視光源及び可視カメラを用いて生体の撮影を行うことから、赤外光源や赤外カメラなどの特殊な機器を用いることなく、装置構成を単純化して製造コストを低廉化することが可能となる。  [0081] In addition, since the living body is imaged using a visible light source and a visible camera, the apparatus configuration is kept simple without special equipment such as an infrared light source or infrared camera, which makes it possible to reduce the manufacturing cost.

[0082] 本実施形態のデータ測定装置 1は、パーソナルコンピュータ作業時の被験者の集中度や疲労度を判定するために使用することが可能である。また、光刺激以外の刺激 (音、アロマ、御香、足湯など)を付加することにより、被験者の心理・ストレス度を判定するために使用することも可能である。更に、映画やテレビ番組、ニュースなどを鑑賞中の被験者の感動度や注目度を計測するために使用することも可能である。  [0082] The data measuring device 1 of this embodiment can be used to determine a subject's concentration level and fatigue level during personal computer work. By adding stimuli other than light stimuli (sound, aroma, incense, a footbath, etc.), it can also be used to determine the subject's psychological state and stress level. Furthermore, it can be used to measure the degree of emotional response and attention of a subject watching a movie, television program, news, or the like.

[0083] 以上詳細に説明したように、本発明のデータ測定装置及びデータ測定方法によれ ば、生体に非接触で生体の瞳孔に関するデータを高精度に測定することが可能とな る。 As described in detail above, according to the data measuring device and the data measuring method of the present invention, it is possible to measure data relating to the pupil of the living body with high accuracy without contacting the living body.

Claims

請求の範囲

[1] 目領域を有する生体を撮影して第 1の画像を取得する画像撮影部と、前記第 1の画像から第 1の目領域を抽出し、瞳孔に関する所定テンプレートにより前記第 1の目領域から瞳孔を検出し、検出された瞳孔に関するデータを測定し、測定したデータによって所定テンプレートを更新するデータ処理・解析部を有し、前記画像撮影部が前記生体を撮影して第 2の画像を入力すると、前記データ処理・解析部は、第 2の画像から第 2の目領域を抽出し、更新されたテンプレートにより第 2の目領域から瞳孔を検出することを特徴とするデータ測定装置。  [1] A data measuring device comprising: an image capturing unit which captures a living body having an eye region to acquire a first image; and a data processing/analysis unit which extracts a first eye region from the first image, detects a pupil from the first eye region with a predetermined template relating to the pupil, measures data relating to the detected pupil, and updates the predetermined template with the measured data, wherein, when the image capturing unit captures the living body and inputs a second image, the data processing/analysis unit extracts a second eye region from the second image and detects the pupil from the second eye region with the updated template.

[2] 前記瞳孔に関するデータは瞳孔中心の画素値及び瞳孔の直径データであることを特徴とする請求の範囲第 1項に記載のデータ測定装置。  [2] The data measuring device according to claim 1, wherein the data relating to the pupil is a pixel value at the pupil center and pupil diameter data.

[3] 前記データ処理・解析部は前記フレームごとに時系列的に位置関係が変わらない複数点間の距離を取得し、前記複数点間の距離の相対関係に応じて前記フレームごとに前記瞳孔に関するデータを補正することにより、前記フレームの各々の前記瞳孔に関するデータを所定の基準値に対する相対値とすることを特徴とする請求の範囲第 1項または第 2項に記載のデータ測定装置。  [3] The data measuring device according to claim 1 or 2, wherein the data processing/analysis unit acquires, for each frame, the distance between multiple points whose positional relationship does not change over time, and corrects the pupil data for each frame according to the relative relationship of the distance between the multiple points, thereby making the pupil data of each of the frames a relative value with respect to a predetermined reference value.

[4] 前記画像撮影部は生体との距離又は変位を測定する距離センサ又は変位センサを備えており、前記データ処理・解析部は前記生体との距離又は変位をパラメータとして、前記瞳孔に関するデータの絶対値を算出することを特徴とする請求の範囲第 1項〜第 3項のいずれか 1項に記載のデータ測定装置。  [4] The data measuring device according to any one of claims 1 to 3, wherein the image capturing unit includes a distance sensor or a displacement sensor for measuring the distance or displacement from the living body, and the data processing/analysis unit calculates the absolute value of the pupil data using the distance or displacement from the living body as a parameter.

[5] 前記距離センサ又は変位センサは複数の撮像素子により構成され、生体の三次元形状を測定することを特徴とする請求の範囲第 4項に記載のデータ測定装置。  [5] The data measuring device according to claim 4, wherein the distance sensor or displacement sensor is constituted by a plurality of image sensors and measures the three-dimensional shape of the living body.

[6] 前記画像撮影部は可視カメラであることを特徴とする請求の範囲第 5項に記載のデータ測定装置。  [6] The data measuring device according to claim 5, wherein the image capturing unit is a visible camera.

[7] 目領域を有する生体を撮影して第 1の画像を取得する工程と、前記第 1の画像から第 1の目領域を抽出する工程と、瞳孔に関する所定テンプレートにより前記第 1の目領域から瞳孔を検出する工程と、検出された瞳孔に関するデータを測定する工程と、測定したデータによって所定テンプレートを更新する工程と、前記生体を撮影して第 2の画像を入力する工程と、第 2の画像から第 2の目領域を抽出する工程と、更新されたテンプレートにより第 2の目領域から瞳孔を検出する工程と、を有することを特徴とするデータ測定方法。  [7] A data measuring method comprising the steps of: capturing a living body having an eye region to acquire a first image; extracting a first eye region from the first image; detecting a pupil from the first eye region with a predetermined template relating to the pupil; measuring data relating to the detected pupil; updating the predetermined template with the measured data; capturing the living body to input a second image; extracting a second eye region from the second image; and detecting the pupil from the second eye region with the updated template.

[8] 前記瞳孔に関するデータは瞳孔中心の画素値及び瞳孔の直径データであることを特徴とする請求の範囲第 7項に記載のデータ測定方法。  [8] The data measuring method according to claim 7, wherein the data relating to the pupil is a pixel value at the pupil center and pupil diameter data.
PCT/JP2007/058430 2006-04-27 2007-04-18 Data measuring device and data measuring method Ceased WO2007125794A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-123776 2006-04-27
JP2006123776 2006-04-27

Publications (1)

Publication Number Publication Date
WO2007125794A1 true WO2007125794A1 (en) 2007-11-08

Family

ID=38655323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/058430 Ceased WO2007125794A1 (en) 2006-04-27 2007-04-18 Data measuring device and data measuring method

Country Status (1)

Country Link
WO (1) WO2007125794A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009106382A (en) * 2007-10-26 2009-05-21 Naoji Kitajima Acoustic pupillary reaction test system
JP2014515291A (en) * 2011-05-20 2014-06-30 アイフルエンス,インコーポレイテッド System and method for measuring head, eye, eyelid and pupil response
CN104114079A (en) * 2011-10-24 2014-10-22 Iriss医疗科技有限公司 System and method for identifying eye conditions
CN104173063A (en) * 2014-09-01 2014-12-03 北京工业大学 Visual attention detection method and system
WO2015075894A1 (en) * 2013-11-19 2015-05-28 日本電気株式会社 Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method
CN105310703A (en) * 2014-07-02 2016-02-10 北京邮电大学 Method for obtaining subjective satisfaction on basis of pupil diameter data of user
JP2016159050A (en) * 2015-03-04 2016-09-05 富士通株式会社 Pupil diameter measuring device, pupil diameter measuring method and its program
CN112331003A (en) * 2021-01-06 2021-02-05 湖南贝尔安亲云教育有限公司 Exercise generation method and system based on differential teaching
CN112450875A (en) * 2020-12-21 2021-03-09 江苏省肿瘤医院 Accurate pupil measuring apparatu

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08105724A (en) * 1994-10-05 1996-04-23 Fujitsu Ltd Three-dimensional shape measuring device and three-dimensional shape measuring method
JP2002056394A (en) * 2000-08-09 2002-02-20 Matsushita Electric Ind Co Ltd Eye position detection method and eye position detection device
WO2004012142A1 (en) * 2002-07-26 2004-02-05 Mitsubishi Denki Kabushiki Kaisha Image processing apparatus
JP2005078311A (en) * 2003-08-29 2005-03-24 Fujitsu Ltd Facial part tracking device, eye state determination device, and computer program
JP2005348832A (en) * 2004-06-08 2005-12-22 National Univ Corp Shizuoka Univ Real-time pupil position detection system
WO2006013803A1 (en) * 2004-08-03 2006-02-09 Matsushita Electric Industrial Co., Ltd. Imaging device and imaging method
JP2006099718A (en) * 2004-08-30 2006-04-13 Toyama Prefecture Personal authentication apparatus


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009106382A (en) * 2007-10-26 2009-05-21 Naoji Kitajima Acoustic pupillary reaction test system
JP2014515291A (en) * 2011-05-20 2014-06-30 アイフルエンス,インコーポレイテッド System and method for measuring head, eye, eyelid and pupil response
US9931069B2 (en) 2011-05-20 2018-04-03 Google Llc Systems and methods for measuring reactions of head, eyes, eyelids and pupils
CN104114079A (en) * 2011-10-24 2014-10-22 Iriss Medical Technologies Ltd System and method for identifying eye conditions
JP2014530730A (en) * 2011-10-24 2014-11-20 Iriss Medical Technologies Ltd System and method for identifying eye conditions
WO2015075894A1 (en) * 2013-11-19 2015-05-28 NEC Corporation Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method
CN105310703B (en) * 2014-07-02 2018-01-19 Beijing University of Posts and Telecommunications Method for obtaining subjective satisfaction based on user pupil diameter data
CN105310703A (en) * 2014-07-02 2016-02-10 Beijing University of Posts and Telecommunications Method for obtaining subjective satisfaction on basis of pupil diameter data of user
CN104173063B (en) * 2014-09-01 2015-08-12 Beijing University of Technology Visual attention detection method and system
WO2016033950A1 (en) * 2014-09-01 2016-03-10 Beijing University of Technology Visual focus detection method and system
CN104173063A (en) * 2014-09-01 2014-12-03 Beijing University of Technology Visual attention detection method and system
JP2016159050A (en) * 2015-03-04 2016-09-05 Fujitsu Limited Pupil diameter measuring device, pupil diameter measuring method, and program therefor
CN112450875A (en) * 2020-12-21 2021-03-09 Jiangsu Cancer Hospital Accurate pupil measuring apparatus
CN112331003A (en) * 2021-01-06 2021-02-05 Hunan Bei'er Anqin Cloud Education Co., Ltd. Exercise generation method and system based on differentiated teaching
CN112331003B (en) * 2021-01-06 2021-03-23 Hunan Bei'er Anqin Cloud Education Co., Ltd. Exercise generation method and system based on differentiated teaching

Similar Documents

Publication Publication Date Title
WO2007125794A1 (en) Data measuring device and data measuring method
KR101998595B1 (en) Method and Apparatus for jaundice diagnosis based on an image
US10945637B2 (en) Image based jaundice diagnosing method and apparatus and image based jaundice diagnosis assisting apparatus
JP7178423B2 (en) Guidance method and system for teledental imaging
TWI669103B (en) Information processing device, information processing method and program
US20140316235A1 (en) Skin imaging and applications
Jongerius et al. Eye-tracking glasses in face-to-face interactions: Manual versus automated assessment of areas-of-interest
WO2018095994A1 (en) Method and system for classifying optic nerve head
KR20150107565A (en) Electronic apparatus for providing health status information, method for controlling the same, and computer-readable storage medium
KR20140079864A (en) System and method for identifying eye conditions
US20170311872A1 (en) Organ image capture device and method for capturing organ image
US20180184899A1 (en) System and method for detection and monitoring of a physical condition of a user
JPWO2016067892A1 (en) Health level output device, health level output system and program
CN115317304A (en) Intelligent massage method and system based on physiological characteristic detection
JP5698293B2 (en) Portable medical image display terminal and operating method thereof
KR100874186B1 (en) Method and apparatus enabling subjects to photograph their own tongue-diagnosis images
JP2016198140A (en) Organ image capturing device
US20240386546A1 (en) Analysis of fundus autofluorescence images
TW201441944A (en) Optical apparatus and operating method thereof
CN114240934B (en) Image data analysis method and system for acromegaly
JP5474663B2 (en) Measles support system
Paul et al. Fundus Imaging Based Affordable Eye Care.
CN112674714A (en) Mobile phone image examination optometry method combining filter and peripheral equipment
EP4333695A1 (en) System for assisting with provision of diagnostic information
US12209902B1 (en) Environment classification based on light assessment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 07741866
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 07741866
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: JP