WO2025182923A1 - Ophthalmic device, control method therefor, program, and recording medium
- Publication number: WO2025182923A1 (PCT/JP2025/006402)
- Authority: WIPO (PCT)
- Prior art keywords: cross, unit, vascular, image, blood vessel
- Legal status: Pending
Classifications
- A61B3/00 — Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10 — Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/12 — Objective types for looking at the eye fundus, e.g. ophthalmoscopes
- A61B3/14 — Arrangements specially adapted for eye photography
Description
- This disclosure relates to an ophthalmic device, a control method therefor, a program, and a recording medium.
- Imaging modalities are used in ophthalmology.
- Representative examples include fundus cameras, scanning laser ophthalmoscopy (SLO), slit lamp microscopes, and optical coherence tomography (OCT).
- Structural imaging using OCT is a technique that represents the spatial distribution of OCT signal intensity, which changes depending on the structure of the test object, as an image. Images generated using this technique are called OCT intensity images or simply intensity images.
- OCT blood flow measurement is a Doppler measurement that uses OCT to determine blood flow dynamics, and is also called Doppler OCT.
- OCT blood flow measurement is a technique that repeatedly scans the cross section of a blood vessel with OCT measurement light to collect a data set, and determines the Doppler signal due to blood flow from the difference in this data set, as well as the angle (Doppler angle) between the blood vessel and the OCT measurement light, thereby determining the magnitude of retinal blood flow velocity. Furthermore, the blood flow volume can be calculated by multiplying the obtained blood flow velocity by the cross-sectional area of the blood vessel.
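- As a concrete illustration of the relationship just described, the following minimal sketch converts a measured Doppler phase shift and a Doppler angle into a blood flow velocity, and multiplies that velocity by the vessel cross-sectional area to obtain a blood flow volume. The formula is the standard Doppler OCT relation; the function names, refractive index, and numeric values are illustrative assumptions, not taken from this disclosure.

```python
import numpy as np

def doppler_velocity(delta_phi, wavelength, delta_t, doppler_angle, n_medium=1.38):
    """Doppler phase shift -> flow velocity along the vessel axis.

    v = (delta_phi * wavelength) / (4 * pi * n * delta_t * cos(theta))
    delta_phi     : phase difference between repeated A-scans [rad]
    wavelength    : center wavelength of the OCT light source [m]
    delta_t       : time between the two A-scans [s]
    doppler_angle : angle between measurement beam and vessel axis [rad]
    n_medium      : assumed refractive index of the tissue
    """
    return (delta_phi * wavelength) / (4.0 * np.pi * n_medium * delta_t * np.cos(doppler_angle))

def flow_volume(velocity, vessel_diameter):
    """Blood flow volume = velocity x vessel cross-sectional area."""
    area = np.pi * (vessel_diameter / 2.0) ** 2
    return velocity * area

# Illustrative numbers only.
v = doppler_velocity(delta_phi=0.8, wavelength=840e-9, delta_t=1e-4,
                     doppler_angle=np.deg2rad(80.0))
q = flow_volume(v, vessel_diameter=100e-6)            # 100 um retinal vessel
print(f"velocity ~ {v * 1e3:.2f} mm/s, flow ~ {q * 1e9 * 60:.2f} uL/min")
```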
- OCT blood flow measurement is typically applied to blood vessels in the fundus, and particularly retinal blood vessels. However, there have also been reports of OCT blood flow measurement of choroidal blood vessels.
- the purpose of this disclosure is to improve fundus hemodynamic measurement using Doppler OCT.
- an ophthalmologic apparatus includes a scanning unit, a cross-sectional image generating unit, a vascular region identifying unit, a vascular region matching unit, and a vascular map generating unit.
- the scanning unit is configured to collect data by applying an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye.
- the cross-sectional image generating unit is configured to generate multiple cross-sectional images corresponding to multiple cross sections of the fundus based on the data collected by the scanning unit.
- the vascular region identifying unit is configured to identify multiple vascular region groups corresponding to the multiple cross-sectional images by detecting vascular region groups from each of the multiple cross-sectional images generated.
- the vascular region matching unit is configured to match vascular regions corresponding to different cross sections of the same blood vessel among the multiple vascular region groups identified from the multiple cross-sectional images.
- the vascular map generating unit is configured to generate a vascular map indicating the distribution of blood vessels based on the results of the vascular region matching.
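- The division of labor among the units listed above can be pictured with the following minimal pipeline sketch. The function names, the naive threshold-based detection, the nearest-neighbor matching, and the data shapes are assumptions made for illustration only and do not reproduce this disclosure's implementation.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VesselRegion:
    cross_section: int     # index of the cross section the region was found in
    center_xz: tuple       # (x, z) position of the region within that section

def generate_cross_sections(raw_data: np.ndarray) -> list[np.ndarray]:
    """Cross-sectional image generating unit: one B-scan image per cross section."""
    return [frame for frame in raw_data]

def detect_vessel_regions(images: list[np.ndarray], thr: float = 0.5) -> list[list[VesselRegion]]:
    """Vascular region identifying unit: naive thresholding per cross section."""
    groups = []
    for i, img in enumerate(images):
        ys, xs = np.nonzero(img > thr)
        regions = [VesselRegion(i, (float(xs.mean()), float(ys.mean())))] if xs.size else []
        groups.append(regions)
    return groups

def match_vessel_regions(groups):
    """Vascular region matching unit: link regions of the same vessel across
    consecutive sections (here: nearest x position, purely for illustration)."""
    chains = []
    for a, b in zip(groups[:-1], groups[1:]):
        for ra in a:
            if b:
                rb = min(b, key=lambda r: abs(r.center_xz[0] - ra.center_xz[0]))
                chains.append((ra, rb))
    return chains

def build_vascular_map(chains):
    """Vascular map generating unit: collect matched positions as a simple point map."""
    return [(r.cross_section, r.center_xz) for pair in chains for r in pair]

# Synthetic stand-in for data collected with a flyback-free OCT scan pattern.
raw = np.random.rand(4, 64, 64)
vmap = build_vascular_map(match_vessel_regions(detect_vessel_regions(generate_cross_sections(raw))))
```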
- Brief description of the drawings: the drawings include schematic diagrams illustrating configurations of an ophthalmologic apparatus according to non-limiting embodiments, schematic diagrams for explaining the operation of such an apparatus, and flowcharts illustrating its operation.
- This disclosure describes embodiments of ophthalmic devices (e.g., ophthalmic blood flow measuring devices, ophthalmic imaging devices, etc.), embodiments of methods for controlling ophthalmic devices, embodiments of programs, and embodiments of recording media.
- However, the categories of embodiments of the present disclosure are not limited to these.
- Embodiments of the present disclosure can be employed to solve the variety of problems that arise when measuring fundus hemodynamics using Doppler OCT.
- Some embodiments of the present disclosure are intended to improve the quality of processes and tasks related to fundus hemodynamic measurements.
- processes and tasks of interest include, but are not limited to, alignment of the device optical system with the fundus, estimation of Doppler angles, and search for blood vessels to which measurements are to be applied.
- Some embodiments provide a novel alignment method that utilizes the positions of fundus blood vessels obtained from OCT images.
- Some embodiments provide a novel method for generating orientation information (Doppler angle, its suitability, etc.) of fundus blood vessels. Furthermore, some embodiments use the generated orientation information to facilitate and reduce the effort required to specify blood flow measurement positions (blood vessels to be measured, cross sections to be measured, etc.). Furthermore, some embodiments improve the precision and accuracy of specifying blood flow measurement positions. Furthermore, some embodiments make it possible to predict blood flow measurement positions to a certain extent. This makes it possible to limit the measurement application area and analysis application area before actually applying fundus hemodynamic measurement, further facilitating and reducing the effort required to specify blood flow measurement positions.
- Some embodiments of the present disclosure also address the following problem:
- Although the phase signal obtained by fundus hemodynamic measurement has a good contrast mechanism, when the detected signal strength is low the measurement quality can be degraded.
- When measuring veins, which have weaker pulsation than arteries, or when measurement is performed at an unsuitable Doppler angle, the detected signal strength can also be reduced, resulting in poor measurement quality.
- fundus blood vessels run and are distributed in a complex three-dimensional manner. The actual blood vessel course can be determined by performing fundus imaging. Conventionally, this information has not been utilized.
- the ophthalmic apparatus according to the embodiment has a function of performing OCT blood flow measurement and a function of processing data obtained by the OCT blood flow measurement.
- the ophthalmic device of the aspect primarily described in this disclosure functions as an OCT device capable of performing OCT blood flow measurement (OCT scanning and image generation processing). Some other aspects of the ophthalmic device may not be capable of performing at least some of the processing involved in OCT blood flow measurement.
- the OCT method may be any method, for example, spectral domain OCT or swept-source OCT.
- Spectral domain OCT is a method in which light from a low-coherence light source is split into measurement light and reference light, and return light from the test object is superimposed on the reference light to generate interference light. The spectral distribution of this interference light is detected with a spectrometer, and the detected spectral distribution is subjected to processing such as Fourier transform to construct an image.
- Swept-source OCT is a method in which light from a tunable light source is split into measurement light and reference light, and return light from the test object is superimposed on the reference light to generate interference light. This interference light is detected by a photodetector while the wavelength of the light source is swept, and the detected spectral distribution is subjected to processing such as Fourier transform to construct an image.
- In other words, spectral domain OCT is an OCT method that acquires the spectral distribution by spatial division, while swept-source OCT is an OCT method that acquires the spectral distribution by time division.
- Other OCT methods such as time domain OCT, may also be used.
- the ophthalmic device of the embodiment primarily described in this disclosure functions as a fundus camera capable of photographing the fundus.
- Other embodiments of the ophthalmic device may function as any ophthalmic imaging modality, such as an SLO, a slit lamp microscope, or a surgical microscope, in addition to or instead of functioning as a fundus camera.
- Unless otherwise specified, no distinction is made between image data and the "image" that is the visual information based on it. Furthermore, unless otherwise specified, no distinction is made between a part or tissue of the subject's eye and its image (image data).
- the ophthalmologic apparatus does not need to have a fundus imaging function.
- Such an ophthalmologic apparatus has the function of acquiring a front image of the fundus from a storage device or recording medium.
- A non-limiting example of such a storage device is a medical image archiving system (medical image filing system).
- Typical examples of such recording media are hard disk drives and optical disks.
- the circuitry or processing circuitry may include any of a general purpose processor, a special purpose processor, an integrated circuit, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), a programmable logic device (e.g., a Simple Programmable Logic Device (SPLD), a Complex Programmable Logic Device (CPLD), a Field Programmable Gate Array (FPGA)), conventional circuitry, and any combination thereof, configured and/or programmed to perform at least some of the disclosed functions.
- a circuit configuration, unit, means, or similar term refers to hardware that performs at least some of the disclosed functions or that is programmed to perform at least some of the disclosed functions.
- the hardware may be hardware disclosed herein or known hardware that is programmed and/or configured to perform at least some of the described functions. If the hardware is a processor, which can be considered a type of circuit configuration, the circuit configuration, unit, means, or similar term refers to a combination of hardware and software, and the software is used to configure the hardware and/or processor.
- the configuration of an exemplary ophthalmic device is shown in Figures 1 to 4.
- the ophthalmic device 1 in this example includes a fundus camera unit 2, an OCT unit 100, and an arithmetic and control unit 200.
- the fundus camera unit 2 is equipped with elements of a fundus camera capable of photographing the fundus and the anterior segment, and elements of an OCT scanner.
- the OCT unit 100 is equipped with elements of an OCT scanner.
- the arithmetic and control unit 200 includes one or more processors configured to perform various processes (calculation, analysis, control, etc.).
- the fundus camera unit 2 includes an optical system for photographing the fundus Ef (and the anterior segment) of the subject's eye E.
- the digital image acquired by the fundus camera unit 2 is typically a front image.
- the fundus camera unit 2 can, for example, acquire observation images by video capture using continuous near-infrared light as illumination light, and can acquire photographed images by capture using visible flash light as illumination light.
- the fundus camera unit 2 comprises an illumination optical system 10 and an imaging optical system 30.
- the illumination optical system 10 irradiates illumination light onto the subject's eye E.
- the imaging optical system 30 detects the return light of the illumination light irradiated onto the subject's eye E. In other words, the imaging optical system 30 photographs the subject's eye E illuminated by the illumination light.
- the OCT measurement light provided by the OCT unit 100 is guided to the subject's eye E via an optical path within the fundus camera unit 2.
- the return light of the OCT measurement light applied to the subject's eye E is guided to the OCT unit 100 via an optical path within the fundus camera unit 2.
- the observation illumination light output from the observation light source 11 of the illumination optical system 10 is reflected by the concave mirror 12, passes through the condenser lens 13, and passes through the visible cut filter 14 to become near-infrared light. It is then focused near the imaging light source 15, reflected by the mirror 16, and passes through the relay lens system 17, relay lens 18, aperture 19, and relay lens system 20 before reaching the perforated mirror 21. There it is reflected by the mirror portion surrounding the central hole of the perforated mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the subject's eye E (fundus Ef).
- the return light of the observation illumination light projected onto the subject's eye E is refracted by the objective lens 22, passes through the dichroic mirror 46, passes through the central hole in the perforated mirror 21, passes through the dichroic mirror 55, passes through the photographing focusing lens 31, is reflected by the mirror 32, passes through the half mirror 33A, is reflected by the dichroic mirror 33, and is imaged on the light-receiving surface of the image sensor 35 by the imaging lens 34.
- the image sensor 35 detects the return light at regular time intervals (frame rate).
- the focus of the photographing optical system 30 is adjusted according to the photographing area.
- the imaging illumination light output from the imaging light source 15 travels the same path as the observation illumination light and is projected onto the fundus Ef.
- the return light of the imaging illumination light from the subject's eye E travels the same path as the return light of the observation illumination light and is guided to the dichroic mirror 33, passes through the dichroic mirror 33, is reflected by the mirror 36, and is imaged by the imaging lens 37 on the light-receiving surface of the image sensor 38.
- the liquid crystal display (LCD) 39 displays a fixation target (fixation target image) for guiding and fixing the gaze.
- the light beam output from the LCD display 39 is reflected by the half mirror 33A, reflected by the mirror 32, passes through the photographic focusing lens 31 and dichroic mirror 55, passes through the central hole in the perforated mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus Ef. This allows the subject to visually recognize the fixation target.
- the alignment optical system 50 generates an alignment index for aligning the ophthalmic apparatus 1 with the subject's eye E.
- the alignment light output from the light-emitting diode (LED) 51 passes through the diaphragm 52, the diaphragm 53, and the relay lens 54, is reflected by the dichroic mirror 55, passes through the central hole in the perforated mirror 21, transmits through the dichroic mirror 46, and is projected onto the subject's eye E via the objective lens 22.
- the return light of the alignment light from the subject's eye E is guided to the image sensor 35 along the same path as the return light of the observation illumination light.
- Manual alignment and automatic alignment can be performed by referring to the received light image (alignment index image).
- the focusing optical system 60 generates a split index used to adjust the focus of the subject's eye E.
- the focusing optical system 60 moves along the optical path (illumination optical path) of the illumination optical system 10 in conjunction with the movement of the photographing focusing lens 31 along the optical path (photographing optical path) of the photographing optical system 30.
- the reflecting rod 67 can be inserted into and removed from the illumination optical path. When adjusting the focus, the reflective surface of the reflecting rod 67 is positioned at an angle in the illumination optical path.
- the focusing light output from the LED 61 passes through the relay lens 62, is separated into two beams of light by the split index plate 63, passes through the two-hole diaphragm 64, is reflected by the mirror 65, is first imaged and reflected by the condenser lens 66 on the reflective surface of the reflecting rod 67, passes through the relay lens 20, is reflected by the perforated mirror 21, passes through the dichroic mirror 46, and is projected onto the subject's eye E via the objective lens 22.
- the return light of the focusing light from the subject's eye E is guided to the image sensor 35 along the same path as the return light of the alignment light.
- Manual focusing and autofocusing can be performed by referring to the received light image (split target image).
- a diopter correction lens 70 (plus lens) and a diopter correction lens 71 (minus lens) can each be selectively inserted into the imaging optical path between the perforated mirror 21 and the dichroic mirror 55.
- the dichroic mirror 46 combines the optical path for imaging by the fundus camera unit 2 with the optical path for OCT (measurement arm).
- the dichroic mirror 46 reflects light in the wavelength band for OCT and transmits light in the wavelength band for imaging by the fundus camera unit 2.
- the measurement arm is equipped with, in order from the OCT unit 100 side, a collimator lens unit 40, a retroreflector 41, a dispersion compensation element 42, an OCT focusing lens 43, an optical scanner 44, and a relay lens 45.
- the retroreflector 41 is movable along the optical path of the OCT measurement light incident on it and is used to correct the optical path length according to the axial length and adjust the interference state.
- the dispersion compensation element 42 is used for dispersion compensation between the measurement arm and the reference arm.
- the OCT focusing lens 43 is movable along the measurement arm and is used to adjust the focus of the measurement arm. Focus adjustment of the ophthalmologic apparatus 1 is performed by coordinating the movement of the imaging focusing lens 31, the movement of the focusing optical system 60, and the movement of the OCT focusing lens 43.
- the optical scanner 44 is positioned at a position substantially conjugate with the pupil of the subject's eye E through alignment, and changes the direction of travel of the OCT measurement light.
- the optical scanner 44 is, for example, a galvano scanner capable of two-dimensional scanning.
- the OCT unit 100 will now be described.
- the OCT unit 100 shown in Figure 2 is equipped with a spectral domain OCT optical system.
- This OCT optical system includes an interference optical system.
- This interference optical system splits light from a low-coherence light source (broadband light source) into measurement light LS (OCT measurement light) and reference light LR, and generates interference light LC by superimposing the return light of the measurement light LS projected onto the subject's eye E on the reference light LR.
- the generated interference light LC is detected by the spectroscope 130. This provides a signal indicating the spectral distribution of the interference light LC. This detection signal is sent to the arithmetic and control unit 200.
- the light source unit 101 outputs broadband low-coherence light L0.
- the light source unit 101 includes an optical output device such as a superluminescent diode (SLD), LED, or semiconductor optical amplifier (SOA).
- the low-coherence light L0 output from the light source unit 101 is guided by optical fiber 102 to polarization controller 103, where its polarization state is adjusted, and then guided by optical fiber 104 to fiber coupler 105, where it is split into measurement light LS and reference light LR.
- the measurement light LS is guided by the measurement arm, and the reference light LR is guided by the reference arm.
- the reference light LR is guided by optical fiber 110 to collimator 111, where it is converted into a parallel beam. It then passes through optical path length compensation element 112, which compensates for the optical distance between the measurement arm and reference arm, and dispersion compensation element 113, which compensates for dispersion between the measurement arm and reference arm, before being guided to retroreflector 114. Retroreflector 114 is movable along the optical path of the reference light LR incident thereon, and is used to correct the optical path length according to the axial length and adjust the interference state. After passing through retroreflector 114, the reference light LR passes through dispersion compensation element 113 and optical path length compensation element 112, and is converted from a parallel beam into a focused beam by collimator 116. It is then guided via optical fiber 117 to polarization controller 118, where its polarization state is adjusted, and then through optical fiber 119 to attenuator 120, where its light intensity is adjusted. It then reaches fiber coupler 122 via optical fiber 121.
- the measurement light LS is guided through the optical fiber 127 to the collimator lens unit 40, where it is converted into a parallel beam of light, passes through the retroreflector 41, dispersion compensation member 42, OCT focusing lens 43, optical scanner 44, and relay lens 45, is reflected by the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the subject's eye E.
- the measurement light LS is scattered and reflected at various depth positions in the subject's eye E.
- the return light of the measurement light LS from the subject's eye E travels backward through the measurement arm, is guided to the fiber coupler 105, and reaches the fiber coupler 122 via the optical fiber 128.
- Fiber coupler 122 generates interference light LC by superimposing measurement light LS incident via optical fiber 128 and reference light LR incident via optical fiber 121.
- the generated interference light LC is guided to spectrometer 130 via optical fiber 129.
- spectrometer 130 converts the incident interference light LC into a parallel beam using a collimator lens, resolves the parallel beam of interference light LC into multiple spectral components using a diffraction grating, and projects the multiple spectral components generated by the diffraction grating onto an image sensor via a lens.
- This image sensor is, for example, a line sensor, and detects the multiple spectral components of the interference light LC to generate an electrical signal (detection signal).
- the generated detection signal contains information on the spectral distribution of the interference light LC and is sent to the arithmetic and control unit 200.
- While the OCT unit 100 in Figure 2 described above employs the spectral domain OCT method, swept-source OCT may be employed instead.
- When swept-source OCT is employed, the light source unit 101 includes, for example, a tunable light source (e.g., a near-infrared tunable laser) that rapidly changes the wavelength of the emitted light.
- the interference light LC generated by superimposing the measurement light LS and the reference light LR is split at a predetermined splitting ratio (e.g., 1:1) to generate a pair of interference light beams, which are then detected by a photodetector.
- This photodetector includes a balanced photodiode.
- the balanced photodiode includes a pair of photodetectors that respectively detect the pair of interference light beams, and outputs the difference between the pair of detection signals obtained by the pair of photodetectors.
- the photodetector sends this difference signal to a data acquisition system (DAQ).
- a clock is supplied to the data acquisition system from the light source unit 101. This clock is generated in the light source unit 101 in synchronization with the output timing of each wavelength swept within a predetermined wavelength range by the tunable light source. For example, the light source unit 101 splits light of each output wavelength to generate two split lights, optically delays one of the split lights, then combines the two split lights, detects the resulting combined light, and generates a clock based on the detection signal.
- the data collection system uses the clock provided by the light source unit 101 to sample the detection signal (differential signal) input from the photodetector. The data obtained by this sampling is provided for processing such as image generation.
- optical path length changing elements are provided in both the measurement arm and the reference arm, but only one of them may be provided.
- the optical path length changing elements are not limited to retroreflectors.
- the optical path length changing element of the reference arm may be a movable reflecting member (reference mirror).
- the ophthalmic device according to the present disclosure includes an element configured to relatively change the measurement arm length and the reference arm length (i.e., an element configured to change the optical path length difference between the measurement arm and the reference arm), and this element can be used to move the coherence gate position.
- the arithmetic and control unit 200 performs various processes, such as controlling each part of the ophthalmic apparatus 1, various calculations, and various analyses. For example, the arithmetic and control unit 200 calculates the reflection intensity profile of a line (A-line) extending in the depth direction (z-direction) at each projection position of the measurement light LS by performing signal processing such as Fourier transform on the spectral distribution (interference signal, interferogram) acquired by the spectroscope 130. Furthermore, the arithmetic and control unit 200 generates image data by imaging the reflection intensity profile of each A-line. The arithmetic and control unit 200 may perform the same calculations as in image generation using conventional spectral domain OCT.
- the arithmetic and control unit 200 includes, for example, a processor, RAM, ROM, a hard disk drive, a communication interface, etc. Various computer programs are stored in storage devices such as hard disk drives.
- the arithmetic and control unit 200 may also include an operation device, an input device, a display device, etc.
- the user interface 240 shown in Figure 3 will now be described.
- the user interface 240 has a display unit 241 and an operation unit 242.
- the display unit 241 includes, for example, the display device 3 of Figure 1.
- the operation unit 242 includes various operation devices and input devices.
- the user interface 240 may include a touch panel. In some exemplary embodiments, at least a portion of the user interface is provided as a peripheral device connected to the ophthalmologic apparatus 1.
- the moving mechanism 150 shown in Figure 3 will now be described.
- the moving mechanism 150 is configured to move the optical system of the ophthalmologic apparatus 1.
- the moving mechanism 150 for example, moves at least the fundus camera unit 2 three-dimensionally.
- the data input/output unit 290 shown in Figure 3 will be described.
- the data input/output unit 290 inputs data to the ophthalmic device 1 and outputs data from the ophthalmic device 1.
- a non-limiting embodiment of the data input/output unit 290 has a function for communicating with an external device (not shown), for example.
- in this case, the data input/output unit 290 has a communication interface according to the connection form with the external device.
- the external device may be, for example, any ophthalmic device.
- the external device may also be any information processing device, such as a Hospital Information System (HIS) server, a DICOM (Digital Imaging and Communication in Medicine) server, a doctor's terminal, a mobile terminal, a personal terminal, or a cloud server.
- Some exemplary embodiments of the data input/output unit 290 include a device that reads information from a recording medium (data reader) and a device that writes information to a recording medium (data writer).
- the embodiments of the data input/output unit 290 are not limited to these.
- the processing system (arithmetic and control system) of the ophthalmic apparatus 1 will now be described.
- An example configuration of the processing system is shown in Figures 3 and 4.
- the control unit 210 and data processing unit 230 are provided in the arithmetic and control unit 200.
- the control unit 210 includes a processor and controls each component of the ophthalmic apparatus 1.
- the control unit 210 includes a main control unit 211 and a memory unit 212.
- the main control unit 211 includes a processor and is configured to control each component of the ophthalmic apparatus 1 (including the components shown in Figures 1 to 3).
- the main control unit 211 may also be configured to control apparatuses, devices, and systems connected to the ophthalmic apparatus 1.
- the functions of the main control unit 211 are realized, for example, by cooperation between hardware including circuits and control software.
- the memory unit 212 stores various types of data.
- the memory unit 212 includes storage devices such as hard disk drives and solid state drives.
- the main controller 211 controls the imaging focusing driver (not shown) to synchronously move the imaging focusing lens 31 and the focus optical system 60.
- the main controller 211 controls the retroreflector (RR) driver 41A to move the retroreflector 41 of the measurement arm.
- the main controller 211 controls the OCT focusing driver 43A to move the OCT focusing lens 43 of the measurement arm.
- the main controller 211 controls the optical scanner 44 to deflect the measurement light LS according to a preset scan pattern.
- the main controller 211 controls the retroreflector (RR) driver 114A to move the retroreflector 114 of the reference arm.
- the main controller 211 controls the movement mechanism 150 to move the optical system (e.g., the fundus camera unit 2 and OCT unit 100).
- the data processing unit 230 performs various types of data processing.
- the data processing unit 230 applies various types of processing to images (fundus images, anterior segment images, etc.) acquired by the fundus camera unit 2.
- the data processing unit 230 also applies various types of processing to images acquired using OCT scanning (OCT images).
- the data processing unit 230 includes a processor.
- the data processing unit 230 is realized, for example, by the cooperation of hardware including circuits and data processing software.
- the data processing unit 230 includes an image generation unit 220.
- the image generation unit 220 processes data collected by applying an OCT scan to the fundus Ef of the subject's eye E to generate OCT image data.
- the image generation unit 220 includes a processor.
- the functions of the image generation unit 220 are realized, for example, by cooperation between hardware including circuits and image generation software.
- the image generation unit 220 is configured to perform a process of generating an OCT intensity image that represents the intensity of the interference signal as visual information, and a process of generating a phase image that represents the phase information of the interference signal as visual information.
- a process of generating an intensity image will be described.
- a non-limiting example of the process of generating a phase image will be described later, along with an explanation of the theoretical aspects of OCT blood flow measurement.
- the image generation unit 220 generates an intensity image based on the data (interference signal) acquired by the spectrometer 130. Similar to conventional spectral-domain OCT, this intensity image generation process includes signal processing such as A/D conversion, denoising, filtering, and fast Fourier transform (FFT). The fast Fourier transform converts the interference signal acquired by the spectrometer 130 into an A-line profile (a reflection intensity profile along the z direction). The A-line profile is visualized by applying imaging processing (a process that assigns pixel values to reflection intensity values) to the A-line profile. This results in A-scan image data.
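- A minimal numerical sketch of this chain (interference spectrum → FFT → A-line reflection intensity profile → pixel values) is given below. The preprocessing is deliberately simplified and the parameter values are assumptions; a real spectral-domain pipeline would also include steps such as wavenumber resampling and dispersion correction that are omitted here.

```python
import numpy as np

def a_line_profile(spectrum: np.ndarray) -> np.ndarray:
    """Convert one spectral interference signal into a reflection intensity profile."""
    spectrum = spectrum - spectrum.mean()                # remove the DC term
    spectrum = spectrum * np.hanning(spectrum.size)      # simple apodization
    depth_complex = np.fft.fft(spectrum)                 # FFT along the wavenumber axis
    half = depth_complex[: spectrum.size // 2]           # keep positive depths only
    return np.abs(half)

def to_pixels(profile: np.ndarray) -> np.ndarray:
    """Imaging step: map intensity (log scale) to 8-bit pixel values."""
    log_p = 20.0 * np.log10(profile + 1e-12)
    log_p = np.clip(log_p, log_p.max() - 60.0, log_p.max())   # 60 dB display range
    return np.uint8(255 * (log_p - log_p.min()) / np.ptp(log_p))

# One synthetic A-line: a single reflector produces a cosine fringe over the spectrum.
k = np.linspace(0, 2 * np.pi, 2048)
fringe = 1.0 + 0.3 * np.cos(120 * k)                     # 120 -> depth index of the reflector
a_scan = to_pixels(a_line_profile(fringe))
```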
- by arranging multiple A-scan image data in accordance with the scan pattern, a cross-sectional image (e.g., B-scan image data, circle scan image data, etc.) corresponding to that scan pattern is constructed. If an OCT method other than spectral domain OCT is used, the cross-sectional image generation unit 221 performs known processing appropriate for that method.
- the intensity image may be a dataset including a group of A-scan image data obtained by visualizing the reflection intensity profile of multiple A-lines arranged in the area where the OCT scan was applied.
- the intensity image may be a dataset including a group of A-scan image data and their position information (coordinates).
- the intensity image may be stack data constructed by embedding multiple B-scan images in a single three-dimensional coordinate system, that is, a dataset including multiple B-scan images and their position information.
- the intensity image may be volume data (voxel data) generated by applying a voxelization process to the stack data.
- Stack data and volume data are non-limiting examples of three-dimensional image data in which pixel coordinates are defined using a three-dimensional coordinate system. The process of generating the three-dimensional image data is performed by the image generation unit 220.
- the image generation unit 220 can process three-dimensional image data.
- the image generation unit 220 can generate new image data by applying rendering to the three-dimensional image data.
- Rendering techniques include volume rendering, surface rendering, multi-planar reconstruction (MPR), maximum intensity projection (MIP), minimum intensity projection (MinIP), and average intensity projection (AIP).
- the image generation unit 220 can construct projection data by integrating (projecting) the three-dimensional image data in the z direction.
- the image generation unit 220 can construct a shadowgram by integrating (projecting) a portion of the three-dimensional image data (three-dimensional partial image data) in the z direction.
- the three-dimensional partial image data is extracted from the three-dimensional image data using any image segmentation method.
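- The projection and shadowgram operations described above can be written as simple integrations of the volume along the z axis, as in the hedged sketch below. The slab boundaries stand in for the output of an image segmentation step and are arbitrary values chosen for illustration.

```python
import numpy as np

def projection(volume: np.ndarray) -> np.ndarray:
    """Project the whole volume along z (axis 0) to obtain a front (en face) image."""
    return volume.sum(axis=0)

def shadowgram(volume: np.ndarray, z_top: int, z_bottom: int) -> np.ndarray:
    """Project only a partial depth range [z_top, z_bottom), e.g. a segmented layer."""
    return volume[z_top:z_bottom].sum(axis=0)

vol = np.random.rand(256, 128, 128)               # (z, x, y) voxel data
en_face = projection(vol)
slab = shadowgram(vol, z_top=40, z_bottom=80)     # slab chosen arbitrarily for illustration
```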
- the ophthalmic device 1 can apply OCT blood flow measurement to the fundus Ef.
- Here, we will explain the theoretical aspects of OCT blood flow measurement, as well as some non-limiting aspects of OCT blood flow measurement.
- a non-limiting embodiment of blood flow measurement applies two types of scans (main scan and supplementary scan) to the fundus Ef.
- the main scan repeatedly scans a region of interest (cross section of interest) that intersects the blood vessel of interest in the fundus Ef at a position of interest with the measurement light LS to obtain phase image data.
- the supplementary scan scans a predetermined cross section (supplementary cross section) with the measurement light LS to estimate the inclination of the blood vessel of interest in the cross section of interest.
- the supplementary cross section may be, for example, a cross section (first supplementary cross section) that intersects the blood vessel of interest and is located near the cross section of interest.
- the supplementary cross section may be a cross section (second supplementary cross section) that intersects the cross section of interest and is aligned with the blood vessel of interest.
- the inclination of the blood vessel of interest is the angle between the measurement light LS projected onto the cross section of interest and the blood vessel of interest, which is the Doppler angle in Doppler OCT.
- An example in which the first supplemental cross section is applied is shown in Figure 5A.
- one cross section of interest C0 located near the optic disc Da of the fundus Ef, and two supplemental cross sections C1 and C2 located nearby are set to intersect with the blood vessel of interest Db.
- One of the two supplemental cross sections C1 and C2 is located upstream of the blood vessel of interest Db relative to the cross section of interest C0, and the other is located downstream.
- the cross section of interest C0 and the supplemental cross sections C1 and C2 are oriented, for example, approximately perpendicular to the running direction of the blood vessel of interest Db.
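- For concreteness, the placement of the cross section of interest C0 and the two supplementary cross sections C1 and C2 of Figure 5A can be computed from a point on the blood vessel of interest and its local running direction, as in the sketch below. The scan-line length, the inter-section distance, and the coordinate values are arbitrary illustrative assumptions.

```python
import numpy as np

def place_sections(vessel_point: np.ndarray, vessel_dir: np.ndarray,
                   length: float = 0.4, d: float = 0.1):
    """Return endpoints (fundus xy coordinates, e.g. mm) of the cross section of
    interest C0 and supplementary cross sections C1/C2 set across a vessel.

    vessel_point : (x, y) point on the blood vessel of interest
    vessel_dir   : vector giving the local vessel running direction
    length       : scan-line length; d : inter-section distance (both assumed)
    """
    vessel_dir = vessel_dir / np.linalg.norm(vessel_dir)
    normal = np.array([-vessel_dir[1], vessel_dir[0]])    # perpendicular to the vessel
    def line(center):
        return center - 0.5 * length * normal, center + 0.5 * length * normal
    c0 = line(vessel_point)
    c1 = line(vessel_point - d * vessel_dir)              # upstream supplementary section
    c2 = line(vessel_point + d * vessel_dir)              # downstream supplementary section
    return c0, c1, c2

c0, c1, c2 = place_sections(np.array([1.2, -0.5]), np.array([0.8, 0.6]))
```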
- Figure 5B shows an example of when the second supplementary cross section is applied.
- a cross section of interest C0 similar to the example shown in Figure 5A is set so as to be approximately perpendicular to the blood vessel of interest Db
- a supplementary cross section Cp is set so as to be approximately perpendicular to the cross section of interest C0.
- the supplementary cross section Cp is set along the blood vessel of interest Db.
- the supplementary cross section Cp may be set so as to pass through the central axis of the blood vessel of interest Db at the position of the cross section of interest C0.
- the time for performing the main scan may be a fixed, predetermined time, or a time set for each subject or test. This fixed time has traditionally been set to a time (e.g., 2 seconds) that is sufficiently longer than a standard cardiac cycle. Furthermore, the time set for each subject or test has traditionally been determined by referring to data from a biosignal detector such as an electrocardiograph.
- the image generation unit 220 includes a cross-sectional image generation unit 221 and a phase image generation unit 222.
- the cross-sectional image generation unit 221 includes a processor, and its functions are realized, for example, by cooperation between hardware including circuits and cross-sectional image generation software.
- the phase image generation unit 222 includes a processor, and its functions are realized, for example, by cooperation between hardware including circuits and phase image generation software.
- the cross-sectional image generation unit 221 generates an intensity image based on data collected by OCT scanning of the fundus oculi Ef.
- the intensity image generation process may be similar to conventional image generation methods used in spectral domain OCT.
- the cross-sectional image generation unit 221 generates cross-sectional images (main cross-sectional images) representing time-series changes in the morphology of the cross-section of interest based on interference signals obtained by the spectroscope 130 during main scanning of the cross-section of interest of the fundus oculi Ef.
- the ophthalmic device 1 applies repeated scans to the cross-section of interest C0.
- This repeated scan includes multiple B-scans for the cross-section of interest C0.
- the interference signals generated sequentially by the spectroscope 130 in the multiple B-scans are input sequentially to the cross-sectional image generation unit 221.
- the cross-sectional image generation unit 221 generates one main cross-sectional image corresponding to the cross-section of interest C0 based on the interference signals corresponding to each B-scan.
- the cross-sectional image generation unit 221 repeats this process the number of times the B-scan is repeated in the main scanning, thereby generating a series of main cross-sectional images in chronological order. In this way, the cross-sectional image generation unit 221 generates multiple intensity images corresponding to multiple B-scans based on the data set collected by repeated scanning of the main scanning.
- the image quality of the principal cross-sectional images can be improved by dividing a series of principal cross-sectional images obtained by main scanning into multiple groups, and applying image synthesis (e.g., averaging) to the principal cross-sectional images included in each group to generate multiple synthesized images.
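- One way to realize the grouping-and-averaging just described is sketched below; the group size and the use of a plain mean (rather than, say, registration followed by robust averaging) are simplifying assumptions.

```python
import numpy as np

def averaged_frames(frames: np.ndarray, group_size: int) -> np.ndarray:
    """Split a time series of B-scan images into groups and average each group.

    frames : array of shape (n_frames, height, width), ordered in time
    returns: array of shape (n_groups, height, width)
    """
    n_groups = frames.shape[0] // group_size
    usable = frames[: n_groups * group_size]
    grouped = usable.reshape(n_groups, group_size, *frames.shape[1:])
    return grouped.mean(axis=1)

series = np.random.rand(120, 256, 256)            # e.g. repeated B-scans of C0
composites = averaged_frames(series, group_size=8)
```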
- the cross-sectional image generation unit 221 generates a cross-sectional image (supplementary cross-sectional image) representing the morphology of the supplementary cross-section based on the interference signal obtained by the spectroscope 130 during supplementary scanning of the supplementary cross-section of the fundus oculi Ef.
- the process of generating the supplementary cross-sectional image is performed in the same manner as the process of generating the main cross-sectional image.
- the supplementary cross-sectional image may be one cross-sectional image, or two or more cross-sectional images.
- the image quality of the supplementary cross-sectional image can be improved by scanning the supplementary cross-section multiple times to generate multiple cross-sectional images, and then applying image synthesis to these cross-sectional images to generate a synthesized image.
- When the supplementary cross-sections C1 and C2 illustrated in FIG. 5A are applied, the cross-sectional image generation unit 221 generates a supplementary cross-sectional image corresponding to the supplementary cross-section C1 and a supplementary cross-sectional image corresponding to the supplementary cross-section C2.
- When the supplementary cross-section Cp illustrated in FIG. 5B is applied, the cross-sectional image generation unit 221 generates a supplementary cross-sectional image corresponding to the supplementary cross-section Cp.
- the phase image generation unit 222 generates a phase image that represents the time-series changes in phase difference in the cross section of interest based on the interference signal obtained by the spectroscope 130 during the main scan.
- the interference signal used to generate the phase image may be the same as the interference signal used to generate the main cross section image by the cross section image generation unit 221.
- a natural positional correspondence is defined between the pixels of the main cross section image and the pixels of the phase image, making it easy to align the main cross section image and the phase image.
- the main cross section image and the phase image may be generated from different interference signals. In this case, for example, it is possible to align the main cross section image and the phase image using a known image registration method.
- the phase image in this example is obtained by calculating the phase difference between adjacent A-line complex signals (i.e., signals corresponding to adjacent scanning points).
- In other words, the phase image in this example is generated based on the time-series changes in the pixel values (brightness values) of each pixel of the principal cross-sectional image.
- For each pixel of the principal cross-sectional image, the phase image generation unit 222 considers a graph showing the time-series changes in the brightness value of that pixel, and obtains the phase difference Δφ between two time points t1 and t2 (= t1 + Δt) separated by a predetermined time interval Δt.
- This phase difference Δφ is then defined as the phase difference Δφ(t1) at time point t1 (or, more generally, at any point in time between t1 and t2).
- By performing this process for each of many preset time points, the time-series changes in the phase difference at that pixel can be obtained.
- the time-series changes in the phase difference can be obtained by making the time interval ⁇ t sufficiently small to ensure phase correlation.
- in order to ensure this phase correlation, the scanning (main scanning) with the measurement light LS is performed as oversampling, with the time interval Δt set to a value smaller than the time corresponding to the resolution of the cross-sectional image.
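- Read literally, the per-pixel procedure amounts to differencing the phase of successive frames separated by Δt. The sketch below expresses this with complex-valued OCT frames and the angle of the conjugate product; the disclosure describes the same quantity via per-pixel brightness time series, so this representation is an assumption for illustration only.

```python
import numpy as np

def phase_difference_series(complex_frames: np.ndarray) -> np.ndarray:
    """Per-pixel phase difference between frames separated by one sampling interval dt.

    complex_frames : (n_frames, height, width) complex OCT signal of the cross section
    returns        : (n_frames - 1, height, width) phase differences, wrapped to (-pi, pi]
    """
    # angle of the conjugate product = phase(t + dt) - phase(t), wrapped to (-pi, pi]
    return np.angle(complex_frames[1:] * np.conj(complex_frames[:-1]))

# Synthetic example: every pixel advances in phase steadily (a "flow" pixel).
t = np.arange(100)[:, None, None]
frames = np.exp(1j * 0.3 * t) * np.ones((100, 32, 32))
dphi = phase_difference_series(frames)            # ~0.3 rad everywhere
```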
- a phase image is an image obtained by visually representing the phase difference value for each pixel at each time point (imaging process).
- This imaging process includes, for example, processing that represents the phase difference value using predetermined display parameters (e.g., display color, brightness, etc.).
- Some forms of imaging process can use different display colors to indicate an increase in phase over time and a decrease in phase over time. For example, an increase in phase over time can be represented by red, and a decrease by blue.
- some forms of imaging process can represent the magnitude of phase change (amount of phase change) as the intensity of the display color.
- Some forms of imaging process described herein make it possible to visualize the direction and magnitude of blood flow.
- a phase image is generated by performing such imaging process on each pixel.
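- A hedged sketch of such an imaging process is given below: increases in phase are mapped to red, decreases to blue, and the display intensity is scaled by the magnitude of the phase change. The scaling constant and the RGB layout are arbitrary choices, not prescribed by this disclosure.

```python
import numpy as np

def phase_to_rgb(dphi: np.ndarray, max_abs: float = np.pi) -> np.ndarray:
    """Map a 2-D array of phase differences to an RGB image.

    Positive phase change -> red channel, negative -> blue channel,
    channel intensity proportional to |dphi| (clipped at max_abs).
    """
    mag = np.clip(np.abs(dphi) / max_abs, 0.0, 1.0)
    rgb = np.zeros(dphi.shape + (3,))
    rgb[..., 0] = np.where(dphi > 0, mag, 0.0)    # red: phase increase over time
    rgb[..., 2] = np.where(dphi < 0, mag, 0.0)    # blue: phase decrease over time
    return np.uint8(255 * rgb)

frame = phase_to_rgb(np.random.uniform(-np.pi, np.pi, (64, 64)))
```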
- the data processing unit 230 includes, as exemplary elements for determining hemodynamic information, a vascular region identification unit 231 and a hemodynamic information generation unit 232.
- the hemodynamic information generation unit 232 may include a Doppler angle calculation unit 233, a blood flow velocity calculation unit 234, a vascular diameter calculation unit 235, and a blood flow amount calculation unit 236.
- the vascular region identification unit 231 includes, for example, a processor operable according to a vascular region identification program.
- the hemodynamic information generation unit 232 includes, for example, a processor operable according to a hemodynamic information generation program.
- the Doppler angle calculation unit 233 includes, for example, a processor operable according to a Doppler angle calculation program.
- the blood flow velocity calculation unit 234 includes, for example, a processor operable according to a blood flow velocity calculation program.
- the vascular diameter calculation unit 235 includes, for example, a processor operable according to a vascular diameter calculation program.
- the blood flow volume calculation unit 236 includes, for example, a processor operable according to a blood flow volume calculation program.
- the vascular region identification unit 231 analyzes an OCT image of the fundus and identifies an image region (vascular region) corresponding to a blood vessel in this OCT image.
- the vascular region identification unit 231 also analyzes a front image of the fundus (e.g., an observed image or photographed image acquired by the fundus camera unit 2) and identifies an image region (vascular region) corresponding to a blood vessel in this front image.
- the vascular region identification process performed by the vascular region identification unit 231 may be image processing using any image segmentation, and is performed, for example, by analyzing pixel values in the target image (e.g., threshold processing).
- the vascular region identification unit 231 identifies a vascular region corresponding to the blood vessel of interest Db from each of the principal cross-sectional image, supplementary cross-sectional image, and phase image.
- in some cases, the principal and supplementary cross-sectional images have sufficient resolution to be analyzed in the vascular region identification process, but the phase image does not have enough resolution to identify the boundaries of the vascular region. Even in such cases, since hemodynamic information is generated based on the phase image, it is necessary to identify the vascular region in the phase image with high accuracy. To achieve this, the process described below can be used, for example.
- the vascular region identification unit 231 can perform a process of analyzing the principal cross-sectional image to identify the vascular region, and a process of identifying the image region in the phase image that corresponds to the vascular region in the principal cross-sectional image based on the positional correspondence relationship.
- the image region in the phase image is used as the vascular region in this phase image. This allows the vascular region in the phase image to be determined with high accuracy.
- the vascular region in the phase image can be determined by using the results of image registration (described above) between the principal cross-sectional image and the phase image instead of the natural positional correspondence relationship described above.
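- A minimal sketch of this two-step approach is shown below: a simple intensity threshold marks the vascular region in the principal cross-sectional image, and the resulting mask is reused on a phase image of identical geometry. The darker-than-threshold rule is an assumption for illustration; a real implementation would typically use a more robust segmentation.

```python
import numpy as np

def vessel_mask_from_intensity(main_image: np.ndarray, k: float = 1.5) -> np.ndarray:
    """Identify the vascular region in the principal cross-sectional image.

    Here a pixel is marked as 'vessel' when it is darker than (mean - k * std).
    Purely illustrative rule; any image segmentation method could be used.
    """
    thr = main_image.mean() - k * main_image.std()
    return main_image < thr

def vessel_region_in_phase_image(main_image: np.ndarray, phase_image: np.ndarray):
    """Transfer the mask to the phase image via their pixelwise correspondence."""
    mask = vessel_mask_from_intensity(main_image)
    return phase_image[mask], mask                 # phase values inside the vessel

main = np.random.rand(256, 256)
phase = np.random.uniform(-np.pi, np.pi, (256, 256))
vessel_phase, mask = vessel_region_in_phase_image(main, phase)
```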
- the hemodynamic information generation unit 232 generates information indicating the hemodynamics of blood flow in the fundus blood vessels (hemodynamic information).
- the hemodynamic information may be information on any parameter (hemodynamic parameter) indicating the fundus hemodynamics. While this disclosure describes blood flow velocity and blood volume, hemodynamic parameters are not limited to these.
- the hemodynamic information generation unit 232 generates hemodynamic information related to the blood vessel of interest Db. As described above, some embodiments of the hemodynamic information generation unit 232 include a Doppler angle calculation unit 233, a blood flow velocity calculation unit 234, a blood vessel diameter calculation unit 235, and a blood flow rate calculation unit 236.
- the Doppler angle calculation unit 233 calculates an estimated value of the tilt of the blood vessel of interest based on the data of the supplementary cross section (cross-sectional data, supplementary cross-sectional image) collected by the supplementary scan.
- the calculated value may be, for example, a value based on the measured value of the tilt of the blood vessel of interest in the cross section of interest, or an approximate value thereof.
- the tilt of the blood vessel of interest is a parameter equivalent to the Doppler angle in Doppler OCT.
- Since the Doppler angle is the angle between the incident direction of the measurement light LS in the main scan for the cross section of interest and the direction of the axis of the blood vessel of interest (i.e., the tilt of the blood vessel of interest), the tilt of the blood vessel of interest is equivalent to the Doppler angle.
- the Doppler angle calculation unit 233 can calculate the gradient of the blood vessel of interest Db in the cross section of interest C0 based on the positional relationship among the cross section of interest C0, the supplementary cross section C1, and the supplementary cross section C2, and on the vascular region identification results obtained by the vascular region identification unit 231.
- the method for calculating the gradient of the blood vessel of interest Db will be described with reference to Figure 6A.
- the symbols G0, G1, and G2 respectively indicate the principal cross-sectional image of the cross-section of interest C0, the supplementary cross-sectional image of the supplementary cross-section C1, and the supplementary cross-sectional image of the supplementary cross-section C2.
- the symbols V0, V1, and V2 respectively indicate the vascular region in the principal cross-sectional image G0, the vascular region in the supplementary cross-sectional image G1, and the vascular region in the supplementary cross-sectional image G2.
- the z coordinate axis shown in Figure 6A substantially coincides with the incident direction of the measurement light LS.
- the distance between the principal cross-sectional image G0 (cross-section of interest C0) and the supplementary cross-sectional image G1 (supplementary cross-section C1) is indicated by d
- the distance between the principal cross-sectional image G0 (cross-section of interest C0) and the supplementary cross-sectional image G2 (supplementary cross-section C2) is also indicated by d.
- the distance between adjacent cross-sectional images, i.e., the distance between adjacent cross-sections, is called the inter-section distance.
- the Doppler angle calculation unit 233 can calculate the gradient A of the blood vessel of interest Db in the cross section of interest C0 based on the positional relationship between the three blood vessel regions V0, V1, and V2. This positional relationship can be determined, for example, by connecting the three blood vessel regions V0, V1, and V2. As a specific example, the Doppler angle calculation unit 233 can identify the characteristic positions of each of the three blood vessel regions V0, V1, and V2 and connect these characteristic positions. This characteristic position may be, for example, one of the center position, center of gravity position, top (position with the smallest z coordinate value), and bottom (position with the largest z coordinate value). The characteristic positions may be connected by any method, such as connecting them with a line segment or an approximate curve (spline curve, Bezier curve, etc.).
- the Doppler angle calculation unit 233 calculates the gradient A of the blood vessel of interest Db in the cross section of interest C0 based on the connecting lines connecting the characteristic positions identified from the three vascular regions V0, V1, and V2. If the connecting lines are line segments, the Doppler angle calculation unit 233 can calculate the gradient A based on the gradient of a first line segment connecting the characteristic position of the cross section of interest C0 to the characteristic position of the supplementary cross section C1, and the gradient of a second line segment connecting the characteristic position of the cross section of interest C0 to the characteristic position of the supplementary cross section C2. A non-limiting example of this calculation process may be calculating the average gradient of the two line segments.
- if the connecting line is an approximate curve, the Doppler angle calculation unit 233 can calculate the gradient A as the gradient of that curve at the position where it intersects the cross section of interest C0.
- the inter-section distance d is used, for example, when embedding the cross-sectional images G0 to G2 in an xyz coordinate system to find the connecting lines.
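- As a non-limiting illustration of the line-segment approach described above, the following sketch estimates the tilt of the blood vessel of interest on the cross section of interest from three vascular regions; the helper names, the choice of the center of gravity as the characteristic position, and the pixel-pitch parameters are assumptions introduced here for illustration only.

```python
import numpy as np

def centroid(vessel_mask: np.ndarray) -> np.ndarray:
    """Center of gravity (x, z) of a binary vessel region in a cross-sectional image."""
    zs, xs = np.nonzero(vessel_mask)
    return np.array([xs.mean(), zs.mean()])

def doppler_angle_from_three_sections(v1_mask, v0_mask, v2_mask,
                                      d_um: float, pixel_um: tuple) -> float:
    """Estimate the Doppler angle (degrees) of the vessel of interest on the cross
    section of interest C0 from the vascular regions V1, V0, V2 identified in the
    supplementary image G1, the principal image G0, and the supplementary image G2.

    d_um     : inter-section distance between adjacent cross sections
    pixel_um : (x_pitch_um, z_pitch_um) pixel spacing within each cross-sectional image
    """
    px, pz = pixel_um
    # Characteristic positions embedded in an xyz coordinate system:
    # C1 at y = -d, C0 at y = 0, C2 at y = +d.
    c1 = centroid(v1_mask) * (px, pz)
    c0 = centroid(v0_mask) * (px, pz)
    c2 = centroid(v2_mask) * (px, pz)
    p1 = np.array([c1[0], -d_um, c1[1]])
    p0 = np.array([c0[0], 0.0,   c0[1]])
    p2 = np.array([c2[0], +d_um, c2[1]])

    # First and second line segments, then the average of their directions.
    seg1 = p0 - p1
    seg2 = p2 - p0
    mean_dir = (seg1 / np.linalg.norm(seg1) + seg2 / np.linalg.norm(seg2)) / 2.0

    # Doppler angle = angle between the vessel axis and the z axis
    # (the z axis approximates the incident direction of the measurement light).
    z_axis = np.array([0.0, 0.0, 1.0])
    cos_t = abs(np.dot(mean_dir, z_axis)) / np.linalg.norm(mean_dir)
    return float(np.degrees(np.arccos(cos_t)))
```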
- the gradient may be calculated by considering two cross sections.
- the gradient A of the blood vessel Db of interest in the cross section C0 may be calculated as the gradient of the first line segment or the gradient of the second line segment.
- the gradient A of the blood vessel Db of interest in the cross section C0 may be calculated based on two supplementary cross section images G1 and G2.
- the Doppler angle calculation unit 233 can analyze the supplementary cross section image corresponding to the supplementary cross section Cp to calculate an approximate value of the gradient of the blood vessel of interest Db on the cross section of interest C0.
- the method for approximating the gradient of the blood vessel of interest Db will be explained with reference to Figure 6B.
- the symbol Gp indicates the supplementary cross-sectional image at the supplementary cross-section Cp.
- the symbol A indicates the gradient of the blood vessel of interest Db at the cross-section of interest C0, similar to the example shown in Figure 6A.
- the Doppler angle calculation unit 233 can analyze the supplementary cross-sectional image Gp to identify an image region corresponding to a specific tissue of the fundus oculi Ef.
- the Doppler angle calculation unit 233 can identify an image region (internal limiting membrane region) M corresponding to the internal limiting membrane (ILM), which is a superficial tissue of the retina.
- the Doppler angle calculation unit 233 calculates the gradient A_app of the internal limiting membrane region M on the cross section of interest C0.
- the gradient A_app of the internal limiting membrane region M on the cross section of interest C0 is used as an approximation of the gradient A of the blood vessel of interest Db on the cross section of interest C0.
- the slope A shown in Figures 6A and 6B is a vector representing the orientation of the blood vessel of interest Db, and its value may be defined arbitrarily.
- the value of the slope A can be defined as the angle (Doppler angle) formed between the slope (vector) A and the z-axis.
- the slope A_app shown in Figure 6B is a vector representing the orientation of the internal limiting membrane region M, and its value may be defined arbitrarily.
- the value of the slope A_app can be defined as the angle (approximate value of the Doppler angle) formed between the slope (vector) A_app and the z-axis.
- the orientation of the z-axis substantially coincides with the incident direction of the measurement light LS.
- the Doppler angle calculation unit 233 can analyze the supplementary cross-sectional image Gp shown in FIG. 6B to identify an image region corresponding to the blood vessel of interest Db and determine the gradient of that image region at a position corresponding to the cross section of interest C0. At this time, the Doppler angle calculation unit 233 can, for example, curve-approximate the boundary or central axis of the image region corresponding to the blood vessel of interest Db, and determine the gradient of that approximate curve at a position corresponding to the cross section of interest C0. A similar curve approximation can also be applied to the image region corresponding to a specific tissue of the fundus Ef described above (for example, the internal limiting membrane region M).
- the processing performed by the Doppler angle calculation unit 233 is not limited to the above example, and may be any processing that can obtain an estimated value of the inclination of the blood vessel Db of interest (e.g., the inclination value of the blood vessel Db itself, its approximate value, etc.) based on cross-sectional data collected by applying an OCT scan to a cross section of the fundus Ef.
- the blood flow velocity calculation unit 234 calculates the blood flow velocity of blood flowing through the blood vessel Db at the cross section of interest C0 based on information on the time series change in phase difference obtained as a phase image.
- the calculated information may be the value of the blood flow velocity at a specific time point (blood flow velocity value), or the time series change in the blood flow velocity value (blood flow velocity change information).
- the blood flow velocity value may be a value at a specific cardiac phase selected from the cardiac cycle (for example, the R wave phase).
- the period for which the blood flow velocity change information is defined may be the entire period during which the main scan is applied to the cross section of interest C0, or a selected portion of that period.
- the blood flow velocity calculation unit 234 may calculate a statistical value of the blood flow velocity during the measurement period.
- This statistical value may be, for example, any of the mean value, standard deviation, variance, median, mode, maximum value, minimum value, local maximum value, and local minimum value. However, it is not limited to these.
- the change in blood flow velocity can be visualized to generate visual information (for example, a graph, histogram, etc.).
- the blood flow velocity calculation is based, for example, on the Doppler relation Δf = 2·n·v·cos θ / λ, in which:
- Δf represents the Doppler shift experienced by the scattered light of the measurement light LS
- n denotes the refractive index of the medium
- v denotes the flow velocity of the medium (blood flow velocity)
- θ represents the angle between the incident direction of the measurement light LS and the flow vector of the medium
- λ indicates the center wavelength of the measurement light LS.
- n and λ are known, Δf is obtained from the time-series change in the phase difference, and θ is the Doppler angle (obtained from the slope A or the approximate value A_app); the relation can therefore be solved for the blood flow velocity v.
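- A minimal sketch of this relation, assuming the standard Doppler OCT formula Δf = 2·n·v·cos θ / λ; the default wavelength and refractive index below are placeholder values, not values from the disclosure.

```python
import math

def blood_flow_velocity(delta_f_hz: float, doppler_angle_deg: float,
                        center_wavelength_m: float = 1.05e-6,
                        refractive_index: float = 1.38) -> float:
    """Solve the Doppler relation for the blood flow velocity v (m/s).

    delta_f_hz        : Doppler shift obtained from the time-series phase difference
    doppler_angle_deg : angle between the measurement light and the flow vector
                        (the slope A, or its approximation A_app)
    """
    theta = math.radians(doppler_angle_deg)
    return delta_f_hz * center_wavelength_m / (2.0 * refractive_index * math.cos(theta))
```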
- the blood vessel diameter calculation unit 235 calculates the diameter of the blood vessel of interest Db in the cross section of interest C0. Examples of this calculation method include a first calculation method using a frontal fundus image and a second calculation method using a cross section image.
- in the first calculation method, an image of the area of the fundus Ef including the position of the cross section of interest C0 is captured in advance.
- the resulting frontal fundus image may be, for example, a frame of an observed image, a captured image (color image, fluorescent contrast image), or an OCT angiography image (motion contrast image).
- the blood vessel diameter calculation unit 235 sets the scale in the frontal fundus image based on various factors that determine the relationship between the scale in the image and the scale in real space, such as the imaging angle of view (imaging magnification, scan dimensions), working distance, and information about the ocular optical system.
- the blood vessel diameter calculation unit 235 can calculate the diameter of the blood vessel Db of interest in the cross section C0 of interest, i.e., the diameter of the blood vessel region V0, based on the scale set in the frontal fundus image and the pixels in the blood vessel region V0.
- in the second calculation method, a cross-sectional image of the cross-section of interest C0 is typically used.
- This cross-sectional image may be a principal cross-sectional image or another cross-sectional image.
- the scale of the cross-sectional image is determined based on the OCT measurement conditions, etc.
- the cross-section of interest C0 is scanned as shown in FIG. 5A or 5B.
- the length of the cross-section of interest C0 is determined based on various factors that determine the relationship between the scale on the image and the scale in real space, such as the scan dimensions, working distance, and information about the ocular optical system.
- the blood vessel diameter calculation unit 235 can calculate the diameter of the blood vessel of interest Db in the cross-section of interest C0 by performing a process to determine the pixel pitch based on the length of the cross-section of interest C0 and a process similar to that of the first calculation method.
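- A sketch of the second calculation method under simple assumptions (the physical length of the cross section of interest is known and the vessel region is available as a binary mask); the function name and units are illustrative.

```python
import numpy as np

def vessel_diameter_um(vessel_mask: np.ndarray, section_length_um: float) -> float:
    """Approximate the diameter of the vessel of interest in the cross section of interest.

    vessel_mask       : binary mask (z rows x lateral columns) of the vessel region V0
    section_length_um : physical length of the cross section of interest C0
    """
    n_columns = vessel_mask.shape[1]
    pixel_pitch_um = section_length_um / n_columns            # lateral pixel pitch
    # Lateral extent of the vessel region, converted to real-space units.
    columns_with_vessel = np.any(vessel_mask, axis=0).sum()
    return columns_with_vessel * pixel_pitch_um
```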
- the blood flow rate calculation unit 236 calculates the blood flow rate in the blood vessel of interest Db based on the blood flow velocity calculated by the blood flow velocity calculation unit 234 and the blood vessel diameter calculated by the blood vessel diameter calculation unit 235.
- the blood flow calculation unit 236 calculates the blood flow Q by substituting the blood vessel diameter value w calculated by the blood vessel diameter calculation unit 235 and the maximum value Vm based on the blood flow velocity value calculated by the blood flow velocity calculation unit 234 into a predetermined formula; for example, assuming laminar (Hagen-Poiseuille) flow in the blood vessel, Q = π·w²·Vm / 8.
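- A minimal sketch of this calculation under the laminar-flow assumption discussed below (mean velocity taken as Vm/2); the function name and unit conversions are illustrative assumptions, not the device's prescribed formula.

```python
import math

def blood_flow_rate(vessel_diameter_um: float, max_velocity_mm_s: float) -> float:
    """Blood flow Q [uL/min], assuming parabolic (laminar) flow so that the mean
    velocity over the vessel cross section is half the maximum value Vm."""
    radius_mm = (vessel_diameter_um / 1000.0) / 2.0
    area_mm2 = math.pi * radius_mm ** 2
    mean_velocity_mm_s = max_velocity_mm_s / 2.0
    q_mm3_per_s = area_mm2 * mean_velocity_mm_s               # mm^3/s = uL/s
    return q_mm3_per_s * 60.0                                  # uL/min
```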
- the types of parameters calculated by the hemodynamic information generation unit 232 are not limited to the several parameters mentioned above.
- the hemodynamic information generation unit 232 can calculate parameters obtained by relative measurement in addition to, or instead of, parameters obtained by absolute measurement such as blood flow velocity and blood flow rate.
- the flow of blood within blood vessels can be considered essentially laminar, with the flow velocity decreasing as the blood approaches the vessel wall due to the frictional drag from the vessel wall, and being greatest at the center of the vessel.
- Laminar flow is a parabolic flow, and the diastolic and systolic waveforms in the cardiac cycle can be read from the pulse wave. This makes it possible to determine the characteristics of these waveforms. For example, parameters relating to deviations from a specific waveform (presence or absence of deviation, degree, frequency, etc.) can be determined.
- the blood flow velocity in veins is not necessarily constant, and slight changes (pulsations) are observed. It is also possible to calculate parameters that indicate these minute pulsations (absolute velocity parameters, relative velocity parameters, etc.).
- Non-limiting aspects of the ophthalmic device: several non-limiting aspects realized by applying the ophthalmic apparatus 1 having the hardware aspects, software aspects, and functional aspects described above will be described. In the following description, matters related to the ophthalmic apparatus 1 will be referred to and used as appropriate. Any matter related to the ophthalmic apparatus 1 can be at least partially combined with each aspect, and two or more aspects can be at least partially combined with each other.
- FIG. 7 shows the configuration of an ophthalmic device 1000 according to one non-limiting embodiment.
- the ophthalmic device 1000 includes a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, and a vascular map generating unit 1050.
- the scanning unit 1010 is configured to collect data by applying an OCT scan to the fundus Ef of the subject's eye E.
- a non-limiting aspect of the ophthalmic apparatus 1 can achieve the functions of the scanning unit 1010 using the fundus camera unit 2 and the OCT unit 100.
- the scanning unit 1010 can apply OCT scanning to the fundus oculi Ef using a scan pattern that does not involve a flyback.
- a flyback is an operation that moves the scan position to a predetermined initial position without collecting data. For example, in a raster scan consisting of multiple B-scans arranged parallel to each other and oriented in the same scanning direction, a flyback is an operation that moves the scan position (the target position for projecting the measurement light LS) from the end position of one B-scan to the start position of the next B-scan.
- a scan pattern that involves a flyback results in a longer period of time during which data collection is stopped, making it relatively disadvantageous for processing or operations that require speed.
- Figures 8A to 8C show an example of a scan pattern that does not involve a flyback.
- the scan pattern in this example is a concentric pattern that is a combination of two circular patterns. Note that the number of circular patterns included in the concentric pattern may be any number, and may be three or more.
- Figure 8A shows the fundus Ef.
- Reference numeral 1200 denotes the optic disc
- reference numeral 1200a denotes the central position of the optic disc 1200 (optic disc center).
- Reference numerals 1201 and 1202 denote two circular patterns. The centers of the two circular patterns 1201 and 1202 are both the optic disc center 1200a.
- the radius of circular pattern 1201 is R1, and the radius of circular pattern 1202 is R2.
- radius R2 is larger than radius R1.
- the region of the fundus Ef to which circular pattern 1201 is applied is a cylindrical side surface 1201b defined by a central axis 1201a passing through the optic disc center 1200a and radius R1.
- the region of the fundus Ef to which circular pattern 1202 is applied is a cylindrical side surface 1202b defined by a central axis 1202a passing through the optic disc center 1200a and radius R2.
- the central axis 1202a coincides with the central axis 1201a.
- Figure 8C shows an example of an OCT scan using a scan pattern consisting of two circular patterns 1201 and 1202.
- the OCT scan is performed by performing a circle scan along circular pattern 1202 immediately after a circle scan along circular pattern 1201. More specifically, in this example, the OCT scan first starts a counterclockwise circle scan 1203a along circular pattern 1201 from position 1201c on circular pattern 1201. This circle scan 1203a is performed once or more than once. The circle scan 1203a along circular pattern 1201 ends at position 1201c.
- the OCT scan is performed by performing an operation 1203b (changing the orientation of the optical scanner 44) to move the scan target position from position 1201c on circular pattern 1201 to position 1202c on circular pattern 1202 immediately after circle scan 1203a.
- the OCT scan begins a counterclockwise circle scan 1203c along the circular pattern 1202 from position 1202c on the circular pattern 1202.
- This circle scan 1203c is performed once or more than once.
- data may be collected while the scan target movement 1203b is being performed.
- in some aspects, the scan target movement 1203b itself may be performed as a scan (i.e., with data collection).
- the scan patterns shown in Figures 8A to 8C are one example of a scan pattern that includes multiple patterns (two circular patterns 1201 and 1202) that do not each involve a flyback.
- Other scan patterns that have such features include a scan pattern that combines multiple closed curves having the same or different shapes, a scan pattern that combines multiple polygons having the same or different shapes, and a raster scan that combines multiple B-scans that are oriented alternately.
- the scan pattern shown in Figures 8A to 8C is one example of a scan pattern that includes a concentric pattern, which is a combination of multiple concentrically arranged patterns.
- Other scan patterns with this configuration include a scan pattern that combines multiple closed curves that are concentrically arranged and have the same shape, and a scan pattern that combines multiple polygons that are concentrically arranged and have the same shape.
- the multiple cross sections of the fundus Ef scanned with this scan pattern include multiple concentric cross sections that respectively correspond to the multiple concentrically arranged patterns that make up the scan pattern.
- two cylindrical side surfaces 1201b and 1202b correspond to such multiple concentric cross sections.
- the scan pattern without flyback is not limited to the above examples and may be any pattern.
- a spiral scan pattern, a Lissajous curve scan pattern, or similar scan patterns may be employed.
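- As a non-limiting illustration, the following sketch generates a sequence of scan target positions for a two-circle concentric pattern in which the only interruption is the single movement between the circles (corresponding to 1203b), with no per-line flyback; all parameter names and defaults are assumptions for illustration.

```python
import numpy as np

def concentric_circle_scan(center_xy, r1_mm: float, r2_mm: float,
                           points_per_circle: int = 1024,
                           repetitions: int = 1) -> np.ndarray:
    """Return an (N, 2) array of scan target positions: circle scan on the inner
    circular pattern, then circle scan on the outer circular pattern, without flyback."""
    cx, cy = center_xy
    theta = np.linspace(0.0, 2.0 * np.pi, points_per_circle, endpoint=False)

    def circle(radius: float) -> np.ndarray:
        pts = np.stack([cx + radius * np.cos(theta),
                        cy + radius * np.sin(theta)], axis=1)
        return np.tile(pts, (repetitions, 1))                  # repeat the circle scan

    inner = circle(r1_mm)    # circular pattern 1201
    outer = circle(r2_mm)    # circular pattern 1202
    # The transition inner[-1] -> outer[0] corresponds to the scan target movement 1203b.
    return np.concatenate([inner, outer], axis=0)
```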
- the cross-sectional image generating unit 1020 is configured to generate a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus Ef based on data collected from the fundus Ef by the scanning unit 1010.
- the cross-sectional image generating unit 1020 is realized by cooperation between hardware including circuits and cross-sectional image generating software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the cross-sectional image generating unit 1020 using the data processing unit 230 (image generating unit 220).
- when a scan pattern that is a combination of multiple patterns that do not involve a flyback is applied, the cross-sectional image generation unit 1020 generates a cross-sectional image corresponding to the cross section to which each of the multiple patterns is applied.
- the scan unit 1010 collects data from multiple concentric cross sections corresponding to the multiple patterns.
- the cross-sectional image generation unit 1020 generates cross-sectional images corresponding to each concentric cross section based on the data collected from the multiple concentric cross sections.
- the scanning unit 1010 collects data from the cylinder side surface 1201b, which corresponds to the circular pattern 1201, and collects data from the cylinder side surface 1202b, which corresponds to the circular pattern 1202.
- the cross-sectional image generating unit 1020 generates a cross-sectional image depicting the cylinder side surface 1201b based on the data collected from the cylinder side surface 1201b, and generates a cross-sectional image depicting the cylinder side surface 1202b based on the data collected from the cylinder side surface 1202b.
- the cross-sectional image generating unit 1020 can generate multiple cross-sectional images corresponding to the multiple scans and synthesize these multiple cross-sectional images to generate a composite cross-sectional image.
- the synthesis of multiple cross-sectional images is, for example, averaging.
- the averaging reduces random noise such as speckle noise.
- the scanning unit 1010 performs a circle scan using a circular pattern 1201 multiple times to collect multiple pieces of data from the cylinder's side surface 1201b, and performs a circle scan using a circular pattern 1202 multiple times to collect multiple pieces of data from the cylinder's side surface 1202b.
- the cross-sectional image generating unit 1020 generates multiple cross-sectional images depicting the cylinder's side surface 1201b based on each of the multiple pieces of data collected from the cylinder's side surface 1201b, and generates multiple cross-sectional images depicting the cylinder's side surface 1202b based on each of the multiple pieces of data collected from the cylinder's side surface 1202b.
- the cross-sectional image generating unit 1020 synthesizes the multiple cross-sectional images of the cylinder's side surface 1201b to generate a composite cross-sectional image, and synthesizes the multiple cross-sectional images of the cylinder's side surface 1202b to generate a composite cross-sectional image.
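- A minimal sketch of the compositing step as simple pixel-wise averaging of repeated cross-sectional images of the same cross section; in practice, inter-frame registration would typically precede the averaging.

```python
import numpy as np

def composite_cross_section(images: list[np.ndarray]) -> np.ndarray:
    """Average repeated cross-sectional images of the same cross section to
    suppress random noise such as speckle noise."""
    stack = np.stack([img.astype(np.float64) for img in images], axis=0)
    return stack.mean(axis=0)
```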
- the ophthalmologic device 1000 may select a cross-sectional image suitable for subsequent processing (e.g., identifying a vascular region) from among the multiple cross-sectional images generated.
- evaluation of each cross-sectional image may be performed using any method.
- Portilla-Simoncelli Statistics (PSS), an image texture feature based on human visual perception, may be used as the evaluation value. In this case, a cross-sectional image with a large PSS value is selected.
- the vascular region identification unit 1030 is configured to analyze each cross-sectional image generated by the cross-sectional image generation unit 1020 and detect vascular region groups. This identifies multiple vascular region groups corresponding to the multiple cross-sectional images generated by the cross-sectional image generation unit 1020.
- a vascular region group includes one or more vascular regions.
- a vascular region is an image region corresponding to a cross section of a blood vessel.
- the image analysis method for detecting vascular regions from cross-sectional images may be any method.
- this image analysis may be any image segmentation method.
- the image segmentation method performed by the vascular region identification unit 1030 may be, for example, either or both of an image segmentation method using a machine learning algorithm (machine learning model) and an image segmentation method using a non-machine learning algorithm.
- Non-limiting examples of image segmentation methods applicable to the vascular region identification unit 1030 include thresholding-based methods, edge detection-based methods, region-based methods, clustering-based methods, convolutional neural network (CNN)-based methods, transformer-based methods, generative adversarial network (GAN)-based methods, self-supervised learning (SSL)-based methods, graph-based methods, etc.
- the image segmentation performed by the vascular region identification unit 1030 may be a combination of two or more methods.
- the vascular region identification unit 1030 applies projection (image accumulation) in the A-scan direction (z direction) of the OCT scan applied to the fundus Ef by the scan unit 1010 to each cross-sectional image generated by the cross-sectional image generation unit 1020. This generates multiple projection images corresponding to the multiple cross-sectional images generated by the cross-sectional image generation unit 1020. Furthermore, the vascular region identification unit 1030 detects a vascular region group from the cross-sectional image that is the original image of the projection image (i.e., the cross-sectional image corresponding to the projection image) based on the brightness distribution in each generated projection image. Generally, the brightness of pixels corresponding to blood vessels is lower than the brightness of pixels corresponding to tissue other than blood vessels.
- the vascular region detection of this embodiment may include processing (e.g., threshold processing, binarization, etc.) to detect low-brightness regions in the projection image. Furthermore, this embodiment may combine a brightness-based image segmentation method with an image segmentation method based on an index other than brightness.
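- A hedged sketch of the projection-based detection: the intensity image is projected along the A-scan (z) direction, and low-brightness columns are treated as vessel candidates. The threshold choice and return format are illustrative assumptions.

```python
import numpy as np

def detect_vessel_columns(cross_section: np.ndarray, k: float = 1.0) -> list[tuple[int, int]]:
    """Detect candidate vascular regions from one cross-sectional (intensity) image.

    The image (z rows x lateral columns) is projected along the A-scan direction;
    columns whose projected brightness falls below mean - k*std are marked as
    vessel candidates, and contiguous runs of such columns are returned as
    (start_column, end_column) intervals.
    """
    profile = cross_section.mean(axis=0)                       # 1D projection image
    threshold = profile.mean() - k * profile.std()
    low = profile < threshold                                  # vessels appear dark
    regions, start = [], None
    for i, flag in enumerate(low):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            regions.append((start, i - 1))
            start = None
    if start is not None:
        regions.append((start, len(low) - 1))
    return regions
```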
- the vascular region identification unit 1030 is realized by the cooperation of hardware including circuitry and vascular region identification software.
- the non-limiting aspect of the ophthalmologic apparatus 1 can perform analysis processing to detect vascular regions from OCT cross-sectional images using the vascular region identification function realized using the data processing unit 230 (vascular region identification unit 231).
- the vascular region matching unit 1040 is configured to match vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups identified from multiple cross-sectional images by the vascular region identification unit 1030.
- in the following description, two adjacent cross-sectional images will be referred to as the first cross-sectional image and the second cross-sectional image.
- the cross-section of the fundus Ef corresponding to the first cross-sectional image will be referred to as the first cross-section
- the cross-section of the fundus Ef corresponding to the second cross-sectional image will be referred to as the second cross-section.
- the vascular region group identified from the first cross-sectional image will be referred to as the first vascular region group
- the vascular region group identified from the second cross-sectional image will be referred to as the second vascular region group.
- the vascular region association unit 1040 determines the position of each vascular region (referred to as the first vascular region) in the first cross section image and the position of each vascular region (referred to as the second vascular region) in the second cross section image.
- the position of a vascular region may be the position of a feature point in the vascular region (e.g., the center position, the center of gravity position, the upper end position, the lower end position, etc.).
- the position of a vascular region may be information indicating its position in the cross section image in which it was detected (i.e., coordinates in a two-dimensional coordinate system representing the pixel position of the cross section image), or information indicating its position in a three-dimensional image including both the first cross section and the second cross section (coordinates in a three-dimensional coordinate system representing the position in the three-dimensional region), or other information that can be used to define the position.
- the vascular region association unit 1040 can identify pairs of first and second vascular regions corresponding to different cross sections of the same blood vessel by comparing the position of each vascular region in the first cross-sectional image with the position of each vascular region in the second cross-sectional image, and associate these with each other. For example, for any first vascular region belonging to a first vascular region group, the vascular region association unit 1040 may be configured to compare the position of the first vascular region with the position of each second vascular region belonging to a second vascular region group, identify the second vascular region that is the shortest distance from the first vascular region, and associate the identified second vascular region with the first vascular region.
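- A sketch of the shortest-distance association described above: each first vascular region is paired with the nearest second vascular region, assuming the feature positions are given in a common coordinate system; names are hypothetical.

```python
import numpy as np

def match_vascular_regions(first_positions: np.ndarray,
                           second_positions: np.ndarray) -> list[tuple[int, int]]:
    """Pair vascular regions of the first and second cross sections by minimum distance.

    first_positions  : (N1, k) feature positions of the first vascular region group
    second_positions : (N2, k) feature positions of the second vascular region group
    Returns a list of (first_index, second_index) pairs.
    """
    pairs = []
    for i, p in enumerate(first_positions):
        dists = np.linalg.norm(second_positions - p, axis=1)
        pairs.append((i, int(dists.argmin())))
    return pairs
```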
- when the distance between the first cross section and the second cross section is sufficiently small, it can be assumed that cross sections of the same blood vessel appear at mutually close positions in the two cross-sectional images; the vascular region matching unit 1040 can be configured to perform the vascular region matching process taking such an assumption into account.
- the criteria for determining whether the distance between the first and second cross sections is sufficiently small may be determined arbitrarily, and may take into consideration, for example, the position of the area to which the OCT scan is applied on the fundus Ef, the state of the fundus blood vessels, etc. Furthermore, when OCT blood flow measurement is performed, the distance between the two supplementary cross sections C1 and C2 shown in FIG. 5A (supplementary cross section images G1 and G2 shown in FIG. 6A) is generally set to be sufficiently small, so in embodiments intended for similar applications, the distance between the first and second cross sections can be assumed to be sufficiently small.
- the vascular region association unit 1040 in a non-limiting embodiment may be configured to, for example, determine the positional relationship between the first cross section and the second cross section, and the positional relationship between each vascular region belonging to the first vascular region group and each vascular region belonging to the second vascular region group, by referring to separately acquired images (frontal fundus images and/or three-dimensional fundus images). Furthermore, the vascular region association unit 1040 may be configured to identify pairs of first and second vascular regions corresponding to different cross sections of the same blood vessel by referring to the obtained positional relationship, and associate these with each other.
- the vascular region matching unit 1040 is realized by the cooperation of hardware including circuits and vascular region matching software.
- the non-limiting aspect of the ophthalmologic apparatus 1 can perform analytical processing for matching between vascular regions using the vascular region matching function realized using the data processing unit 230.
- the vascular map generation unit 1050 is configured to generate information (vascular map) indicating the distribution of blood vessels based on the results of the vascular region matching obtained by the vascular region matching unit 1040.
- the range of vascular distribution represented by the vascular map includes at least a portion of the area of the fundus Ef defined by the multiple cross sections corresponding to the multiple cross-sectional images generated by the cross-sectional image generation unit 1020.
- the vascular map is information that indicates at least the positions (coordinates) of blood vessels (the same blood vessels mentioned above) corresponding to multiple vascular regions that have been associated with each other by the vascular region association unit 1040, and is information that indicates the distribution state of blood vessels in the fundus Ef.
- the vascular map may be visual information obtained by visualizing vascular positions, or may be unvisualized vascular position data.
- the visual information may be provided for image processing, image analysis, display, printing, etc.
- the vascular position data may be provided for data processing, data analysis, imaging (visualization), etc.
- the vascular map generation unit 1050 is realized by the cooperation of hardware including circuits and vascular map generation software.
- the non-limiting aspect of the ophthalmic device 1 can perform processing for generating a vascular map using a vascular map generation function realized using the data processing unit 230.
- Figure 9 shows an example of the operation of the ophthalmic device 1000.
- for the scan pattern used in this operational example, see Figures 8A to 8C.
- In step S1, the scanning unit 1010 applies an OCT scan using a scan pattern without flyback to the fundus Ef of the subject's eye E to collect data.
- the OCT scan may be a combination of a circle scan 1203a along the circular pattern 1201, scan target movement 1203b, and a circle scan 1203c along the circular pattern 1202, as shown in FIG. 8C. This allows data to be collected from the cylinder side surface 1201b along the circular pattern 1201, and data to be collected from the cylinder side surface 1202b along the circular pattern 1202.
- In step S2, the cross-sectional image generation unit 1020 generates multiple cross-sectional images corresponding to multiple cross sections of the fundus oculi Ef based on the data collected in step S1.
- the cross-sectional image generation unit 1020 generates a cross-sectional image 1301 representing the cylinder side surface 1201b based on data collected from the cylinder side surface 1201b along the circular pattern 1201 in step S1 (see FIG. 10A). Furthermore, the cross-sectional image generation unit 1020 generates a cross-sectional image 1302 representing the cylinder side surface 1202b based on data collected from the cylinder side surface 1202b along the circular pattern 1202 in step S1 (see FIG. 10A).
- In step S3, the vascular region identification unit 1030 detects vascular region groups from each of the multiple cross-sectional images generated in step S2. This identifies multiple vascular region groups corresponding to each of the multiple cross-sectional images.
- the vascular region identification unit 1030 detects a vascular region group consisting of vascular regions 1301a, 1301b, 1301c, and 1301d from the cross-sectional image 1301 generated in step S2 (see Figure 10A). Furthermore, the vascular region identification unit 1030 detects a vascular region group consisting of vascular regions 1302a, 1302b, 1302c, and 1302d from the cross-sectional image 1302 generated in step S2 (see Figure 10A).
- In step S4, the vascular region association unit 1040 associates vascular regions corresponding to different cross sections of the same blood vessel among the multiple vascular region groups identified in step S3.
- the vascular region matching unit 1040 identifies pairs of vascular regions corresponding to different cross sections of the same blood vessel between the vascular region group 1301a, 1301b, 1301c, and 1301d corresponding to the cross-sectional image 1301 and the vascular region group 1302a, 1302b, 1302c, and 1302d corresponding to the cross-sectional image 1302.
- the vascular region association unit 1040 associates the two vascular regions belonging to each pair. This results in association 1303a for the first pair, association 1303b for the second pair, association 1303c for the third pair, and association 1303d for the fourth pair.
- In step S5, the vascular map generation unit 1050 generates a vascular map based on the vascular region matching results obtained in step S4.
- the vascular map generation unit 1050 determines a first connection region connecting vascular regions 1301a and 1302a belonging to the first pair, a second connection region connecting vascular regions 1301b and 1302b belonging to the second pair, a third connection region connecting vascular regions 1301c and 1302c belonging to the third pair, and a fourth connection region connecting vascular regions 1301d and 1302d belonging to the fourth pair.
- vascular region 1301a is the region on cylinder side surface 1201b along circular pattern 1201
- vascular region 1302a is the region on cylinder side surface 1202b along circular pattern 1202.
- the first connection region connects vascular region 1301a and vascular region 1302a.
- the first connection region has two ends, the first end being located in vascular region 1301a (cylinder side surface 1201b, circular pattern 1201) and the second end being located in vascular region 1302a (cylinder side surface 1202b, circular pattern 1202).
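- A minimal sketch of building the connection regions of a vascular map: each matched pair of vascular regions is connected by a straight segment between their feature positions (e.g., their centers), sampled at a few intermediate points. The representation and helper names are assumptions for illustration.

```python
import numpy as np

def connection_region(pos_inner: np.ndarray, pos_outer: np.ndarray,
                      n_samples: int = 16) -> np.ndarray:
    """Points of a connection region whose first end lies in the vascular region on the
    inner circular pattern and whose second end lies in the region on the outer pattern."""
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    return (1.0 - t) * pos_inner + t * pos_outer

def vascular_map(matched_pairs: list[tuple[np.ndarray, np.ndarray]]) -> list[np.ndarray]:
    """Assemble the vascular map as a list of connection regions, one per matched pair."""
    return [connection_region(p_in, p_out) for p_in, p_out in matched_pairs]
```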
- FIGS. 10B and 10C show two non-limiting examples of vascular maps.
- the vascular map 1310 in FIG. 10B represents a two-dimensional vascular distribution when viewed from a frontal perspective. Objects shown with dashed lines are included to aid in explanation and may not be visualized in the vascular map 1310. At least the circular patterns 1201 and 1202 may be visualized.
- Reference numeral 1311 denotes the optic disc.
- Reference numerals 1312a, 1312b, 1312c, and 1312d each denote a blood vessel.
- Reference symbol 1313a indicates a first connection region connecting two vascular regions 1301a and 1302a belonging to the first pair.
- the first connection region 1313a corresponds to a portion of the blood vessel 1312a.
- Reference symbol 1313b indicates a second connection region connecting two vascular regions 1301b and 1302b belonging to the second pair.
- the second connection region 1313b corresponds to a portion of the blood vessel 1312b.
- Reference symbol 1313c indicates a third connection region connecting two vascular regions 1301c and 1302c belonging to the third pair.
- the third connection region 1313c corresponds to a portion of the blood vessel 1312c.
- Reference symbol 1313d indicates a fourth connection region connecting two vascular regions 1301d and 1302d belonging to the fourth pair.
- the fourth connection region 1313d corresponds to a portion of the blood vessel 1312d.
- a frontal fundus image can be used to visualize the optic disc 1311 and/or blood vessels 1312a-1312d.
- This frontal fundus image may be, for example, a fundus camera image, an SLO image, or an OCT frontal image (projection image, an annular image, etc.).
- the vascular map 1320 in Figure 10C represents a three-dimensional distribution of blood vessels.
- the vascular map 1320 may be three-dimensional data defined in a three-dimensional coordinate system, three-dimensional image data (volume data, stack data, etc.) that visualizes this three-dimensional data, or a rendered image of this three-dimensional image data (volume rendering image, etc.). Objects indicated by dashed lines are shown to aid in explanation and do not need to be visualized in the vascular map 1320. Note that the cylinder side surfaces 1201b and 1202b, as well as the circular patterns 1201 and 1202 (not shown), may be visualized.
- Reference numeral 1321 denotes the optic disc.
- Reference numerals 1322a, 1322b, 1322c, and 1322d each denote a blood vessel.
- Reference symbol 1323a indicates a first connection region connecting two vascular regions 1301a and 1302a belonging to the first pair.
- the first connection region 1323a corresponds to a portion of the blood vessel 1322a.
- Reference symbol 1323b indicates a second connection region connecting two vascular regions 1301b and 1302b belonging to the second pair.
- the second connection region 1323b corresponds to a portion of the blood vessel 1322b.
- Reference symbol 1323c indicates a third connection region connecting two vascular regions 1301c and 1302c belonging to the third pair.
- the third connection region 1323c corresponds to a portion of the blood vessel 1322c.
- Reference symbol 1323d indicates a fourth connection region connecting two vascular regions 1301d and 1302d belonging to the fourth pair.
- the fourth connection region 1323d corresponds to a portion of the blood vessel 1322d.
- the ophthalmic device 1000 capable of executing the processing procedures of this operational example collects data from the fundus Ef by OCT scanning using a scan pattern without flyback, generates multiple cross-sectional images corresponding to multiple cross sections, identifies multiple groups of vascular regions corresponding to the multiple cross-sectional images, and generates a vascular map by matching vascular regions corresponding to different cross sections of the same blood vessel.
- One advantage of the ophthalmic device 1000 is that it can quickly generate a vascular map by using a scan pattern without flyback.
- this is advantageous, for example, for preparatory steps of OCT blood flow measurement (fundus blood flow dynamics measurement), such as Doppler angle estimation, alignment, and the search for blood vessels of interest.
- FIG. 11 shows the configuration of an ophthalmic device 1400 according to one non-limiting embodiment.
- Elements of the ophthalmic device 1400 that have the same names and symbols as elements of the ophthalmic device 1000 in FIG. 7 may have the same configuration and function as the corresponding elements in the ophthalmic device 1000, unless otherwise specified. However, this does not exclude the adoption of modified means, equivalent means, alternative means, etc. for the corresponding elements.
- the ophthalmic device 1400 has similar elements to the ophthalmic device 1000, including a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, and a vascular map generating unit 1050.
- the ophthalmic device 1400 also has a scanning control unit 1060, a blood vessel designation unit 1070, a hemodynamic information generating unit 1080, a display control unit 1090, and a display device 1100.
- the scan control unit 1060 is configured to control the scan unit 1010.
- the scan control unit 1060 can control the scan unit 1010 to apply repeated scans of OCT blood flow measurement to a pre-specified blood vessel of interest in the fundus Ef.
- the blood vessel of interest may be designated automatically or manually. Automatic designation of the blood vessel of interest is performed, for example, by the blood vessel designation unit 1070 described below. Manual designation is performed, for example, by combining the display of an image of the fundus Ef using a user interface (the above-mentioned user interface 240) with a position designation operation on this displayed image.
- the scan control unit 1060 is realized by cooperation between hardware including circuits and scan control software.
- the ophthalmologic apparatus 1 can realize the functions of the scan control unit 1060 using the control unit 240 (main control unit 211).
- the vessel of interest designation unit 1070 is configured to analyze the vessel map 1051 generated by the vessel map generation unit 1050 and designate a vessel of interest to which OCT blood flow measurement is to be applied. As described above, the vessel map 1051 generated by the vessel map generation unit 1050 visualizes the distribution of multiple blood vessels in the fundus Ef.
- the vessel of interest designation unit 1070 may be configured to select a vessel of interest from among the multiple blood vessels presented in the vessel map 1051. The number of vessels of interest selected may be any number greater than or equal to one.
- the vessel of interest designated by the vessel of interest designation unit 1070 is treated as, for example, the vessel of interest Db shown in FIG. 5A or 5B.
- the vessel of interest designation unit 1070 selects the vessel of interest based on preset criteria. These criteria are referred to as selection criteria.
- the selection criteria may include an index related to the direction of the vessel. Examples of such vessel direction indices include the magnitude of the Doppler angle in OCT blood flow measurement, the suitability of the Doppler angle for OCT blood flow measurement, etc.
- the preferred value for the Doppler angle in OCT blood flow measurement is approximately 80 degrees, and in actual measurements, the search for the blood vessel and cross section of interest is performed with a target range of 77 to 83 degrees.
- if the Doppler angle is as small as 75 degrees, phase wrapping becomes more likely to occur, so it is considered desirable to set the lower limit of the target range to approximately 77 degrees.
- if the Doppler angle is as large as 85 degrees, the strength of the detected Doppler signal decreases, so it is considered desirable to set the upper limit of the target range to approximately 83 degrees.
- this target range is a non-limiting example, and a different target range may be used.
- the preferred range of Doppler angles in OCT blood flow measurement is typically set to between 77 degrees and 83 degrees. This range is referred to as the acceptable Doppler angle range.
- the ophthalmic device 1400 (for example, any element of the vascular map generation unit 1050, the vessel of interest designation unit 1070, and the hemodynamic information generation unit 1080) can estimate the Doppler angle of each blood vessel presented in the vascular map 1051.
- for this calculation method, please refer to the Doppler angle calculation unit 233 of the ophthalmic device 1 described above.
- the vessel of interest designation unit 1070 compares the Doppler angle of each blood vessel displayed on the blood vessel map 1051 with the Doppler angle tolerance range. If the Doppler angle value of a certain blood vessel falls within the tolerance range, the vessel of interest designation unit 1070 determines that the Doppler angle of that blood vessel is good. If there are two or more blood vessels with good Doppler angles and only one blood vessel of interest is designated, the vessel of interest designation unit 1070 may be configured to select the blood vessel with the Doppler angle closest to the optimal value (typically 80 degrees). In some embodiments, each blood vessel with a good Doppler angle may be designated as a blood vessel of interest. In some embodiments, a predetermined number of blood vessels of interest may be selected from a plurality of blood vessels with good Doppler angles.
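- A sketch of this selection logic, using the tolerance range of 77 to 83 degrees and the optimal value of approximately 80 degrees stated above; the input mapping from vessel identifiers to estimated Doppler angles is a hypothetical interface.

```python
def select_vessels_of_interest(doppler_angles_deg: dict,
                               lower: float = 77.0, upper: float = 83.0,
                               optimal: float = 80.0, max_count: int = 1) -> list:
    """Return up to max_count vessel identifiers whose Doppler angle lies within the
    acceptable range, preferring angles closest to the optimal value."""
    acceptable = {vid: ang for vid, ang in doppler_angles_deg.items()
                  if lower <= ang <= upper}
    ranked = sorted(acceptable, key=lambda vid: abs(acceptable[vid] - optimal))
    return ranked[:max_count]
```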
- the selection criteria may include other indices.
- examples of such other indices include blood vessel type (e.g., artery, vein), size (blood vessel diameter), tortuosity, and position (e.g., position relative to the optic disc).
- the blood vessel of interest designation unit 1070 can select a blood vessel of interest by considering two or more types of indices in stages or in parallel.
- the vessel of interest designation unit 1070 may be configured to designate a vessel selected by the user from among the multiple blood vessels presented in the vessel map 1051 as the vessel of interest.
- the ophthalmic device 1400 displays the vessel map 1051 (a visualized image of the vessel map 1051). The user can select a vessel of interest by referring to the displayed vessel map 1051. The operation of inputting the selected vessel of interest is performed using a user interface (e.g., the user interface 240 of the ophthalmic device 1) not shown.
- the vessel of interest designation unit 1070 is realized by the cooperation of hardware including circuitry and software for designating the vessel of interest.
- the non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the vessel of interest designation unit 1070 using the data processing unit 230.
- the hemodynamic information generation unit 1080 is configured to generate hemodynamic information for the blood vessel of interest designated by the blood vessel of interest designation unit 1070 based on data collected by the scanning unit 1010 when the scanning unit 1010 applies a scan for OCT blood flow measurement to the fundus Ef.
- the hemodynamic information generation unit 1080 may have a configuration similar to that of the data processing unit 230 of the ophthalmologic apparatus 1. Specifically, the hemodynamic information generation unit 1080 may include the image generation unit 220, vascular region identification unit 231, and hemodynamic information generation unit 232 shown in FIG. 4.
- the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement to the blood vessel of interest designated by the blood vessel of interest designation unit 1070. As shown in Figures 5A to 6B, the repeated scans are applied to a specific cross section (cross section of interest) of the blood vessel of interest. In this embodiment, the cross section of interest may be set at any position in the blood vessel displayed on the blood vessel map 1051.
- the scan control unit 1060 sets a cross section at an arbitrary position in the first connection region 1313a.
- the cross section of interest may be set at a position equidistant from the two circular patterns 1201 and 1202 in the first connection region 1313a (i.e., a position midway between the two blood vessel regions 1301a and 1302a connected by the first connection region 1313a).
- This cross section of interest corresponds to the cross section of interest C0 shown in FIGS. 5A and 6A, and the two blood vessel regions 1301a and 1302a correspond to the two supplementary cross sections C1 and C2 (two supplementary cross section images G1 and G2).
- the data collected by the scanning unit 1010 through repeated scans of the cross section of interest of the blood vessel of interest is referred to as OCT blood flow measurement data 1075.
- the blood flow dynamics information generating unit 1080 generates blood flow dynamics information 1085 of the blood vessel of interest designated by the blood vessel of interest designating unit 1070 based on the OCT blood flow measurement data 1075.
- the hemodynamic information generation unit 1080 is realized by the cooperation of hardware including circuits and hemodynamic information generation software.
- the ophthalmologic apparatus 1 in a non-limiting embodiment can realize the functions of the hemodynamic information generation unit 1080 using the data processing unit 230.
- the display device 1100 is any type of display.
- the display device 1100 may be, for example, the display unit 241 of the ophthalmic device 1 (the display device 3).
- while the display device 1100 of this embodiment is a component of the ophthalmic device 1400, in other embodiments the display device may be a peripheral device (external device) of the ophthalmic device.
- the display control unit 1090 is configured to control the display device 1100 to display information.
- the display control unit 1090 is realized by cooperation between hardware including circuits and display control software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the display control unit 1090 using the control unit 240 (main control unit 211).
- the display control unit 1090 causes the display device 1100 to display information generated by the ophthalmic device 1400.
- examples of information generated by the ophthalmic device 1400 include an image of the subject's eye E, information on the direction of blood vessels in the fundus Ef (Doppler angle, evaluation information, etc.), a blood vessel map, and information obtained by processing any of these (correction, analysis, evaluation, etc.).
- the display control unit 1090 can also cause the display device 1100 to display information (e.g., images) acquired by the ophthalmic device 1400 from outside.
- Evaluation information is information obtained by evaluating the Doppler angle of hemodynamic measurement.
- the hemodynamic information generation unit 1080 obtains information indicating the magnitude of the Doppler angle (Doppler angle information) using the method described above. Furthermore, the hemodynamic information generation unit 1080 applies a predetermined evaluation process to this Doppler angle information. This evaluation process, for example, evaluates the suitability of the magnitude of the Doppler angle in OCT blood flow measurement.
- the preferred range (target range) of the Doppler angle in OCT blood flow measurement may be set between 77 degrees and 83 degrees, with the most preferred value being set at approximately 80 degrees.
- the hemodynamic information generation unit 1080 compares the Doppler angle value of the target blood vessel with the target range. If the Doppler angle value falls within the target range, the hemodynamic information generation unit 1080 determines that the Doppler angle of the blood vessel is preferred and generates evaluation information indicating this determination result. On the other hand, if the Doppler angle of the blood vessel does not fall within the target range, the hemodynamic information generation unit 1080 generates evaluation information indicating that the Doppler angle of the blood vessel is not preferred.
- the target range may be divided into multiple sections to allow for more detailed evaluation of cases where the Doppler angle is preferred.
- the range outside the target range may be divided into multiple sections to allow for more detailed evaluation of cases where the Doppler angle is not preferred.
- the display control unit 1090 can display the front fundus image and the vascular map 1051 on the display device 1100.
- the display control unit 1090 can display the front fundus image on the display device 1100, and can also display the vascular map 1051 (an image obtained by visualizing this) on the front fundus image.
- the registration between the frontal fundus image and the vascular map 1051 may be performed by a registration unit included in, for example, the hemodynamic information generation unit 1080 or the display control unit 1090.
- the registration unit is realized by cooperation between hardware including a circuit and registration software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can perform registration between the frontal fundus image and the vascular map 1051 using a registration function realized using the data processing unit 230. Some non-limiting examples of this registration are described below.
- in a non-limiting first example, the registration unit is configured to perform registration between the vascular map 1051 and the frontal fundus image using the blood vessels displayed in the vascular map 1051 as landmarks.
- Each blood vessel displayed in the vascular map 1051 is located in the area between the circular pattern 1201 and the circular pattern 1202, and the two circular patterns 1201 and 1202 are centered on the optic disc center 1200a and have predetermined radii R1 and R2, respectively.
- when the vascular map 1051 is three-dimensional data such as the vascular map 1320 in FIG. 10C, projection may be applied to the vascular map 1051 to generate two-dimensional data (a frontal image such as the vascular map 1310 in FIG. 10B), and registration may be performed between this two-dimensional data and the frontal fundus image.
- in a non-limiting second example, the registration unit is configured to perform registration between the vascular map 1051 and the frontal fundus image using both ends of each of the multiple blood vessels displayed on the vascular map 1051 as multiple landmarks.
- the both ends of each blood vessel displayed on the vascular map 1051 are a first end located on the circular pattern 1201 and a second end located on the circular pattern 1202.
- because the positions of these landmarks are restricted to the two circular patterns 1201 and 1202, the search range in the registration can be limited.
- the processing procedure when the vascular map 1051 is three-dimensional data may also be the same as in the first example.
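- A hedged sketch of landmark-based registration: a similarity transform (scale, rotation, translation) is estimated by a least-squares (Umeyama-style) fit from vessel-end landmarks in the vascular map to corresponding positions detected in the frontal fundus image; the correspondence search itself is not shown, and the function name is an assumption.

```python
import numpy as np

def fit_similarity_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares similarity transform mapping src landmarks to dst landmarks.

    src, dst : (N, 2) arrays of corresponding 2D landmark positions
    Returns (scale, rotation_matrix, translation) such that dst ~ scale * R @ src + t.
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    u, s, vt = np.linalg.svd(dst_c.T @ src_c)
    d = np.sign(np.linalg.det(u @ vt))          # guard against reflections
    rot = u @ np.diag([1.0, d]) @ vt
    scale = (s * [1.0, d]).sum() / (src_c ** 2).sum()
    trans = dst_mean - scale * rot @ src_mean
    return scale, rot, trans
```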
- a non-limiting third example uses at least one cross-sectional image from the multiple cross-sectional images used to generate the vascular map 1051.
- the registration unit in this example generates a circular image by applying projection to the cross-sectional image.
- the cross-sectional image is an OCT intensity image, in which blood vessels are depicted at low brightness. Even in a circular image generated as a projection of such a cross-sectional image, the brightness of pixels corresponding to blood vessels (vascular pixels) is lower than the brightness of pixels corresponding to other tissues.
- registration can be performed between the cross-sectional image, which is the original image of the circular image, and the frontal fundus image. Furthermore, registration can be performed between the vascular map 1051 generated from this cross-sectional image and the frontal fundus image.
- the ophthalmic device 1400 of this embodiment can estimate the Doppler angle of each blood vessel presented in the vascular map 1051. Furthermore, the ophthalmic device 1400 can generate a Doppler angle map by associating the position information of each blood vessel in the vascular map 1051 with the estimated Doppler angle. Furthermore, the ophthalmic device 1400 can generate evaluation information by applying an evaluation process to the estimated Doppler angle of each blood vessel in the vascular map 1051, and generate a Doppler angle evaluation map by associating the position information of each blood vessel with the evaluation information.
- Doppler angle maps and Doppler angle evaluation maps contain information indicating the distribution of blood vessel orientation. By visualizing such maps, visual information (orientation distribution image) representing the distribution of blood vessel orientation can be generated.
- the display control unit 1090 can display the orientation distribution image on the display device 1100.
- the display control unit 1090 may also display the orientation distribution image on the frontal fundus image. Registration between the frontal fundus image and the orientation distribution image may be performed by the registration unit described above.
- Figure 12 shows one example of the operation of the ophthalmic device 1400. Steps S11 to S15 may be performed in the same manner as steps S1 to S5 in Figure 9, respectively.
- In step S16, the vessel of interest designation unit 1070 generates orientation information (Doppler angle, suitability, etc.) for each blood vessel presented in the vessel map 1051 generated in step S15.
- In step S17, the vessel of interest designation unit 1070 designates a vessel of interest from among the multiple blood vessels presented in the vessel map 1051, based on the orientation information of each blood vessel generated in step S16. Note that, in addition to or instead of the orientation information, the vessel of interest may be designated with reference to other selection criteria. Information on the designated vessel of interest is sent to the scan control unit 1060.
- In step S18, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessel of interest designated in step S17.
- the blood vessels in the fundus Ef corresponding to the specified blood vessels of interest are searched for and identified, for example, by referring to an infrared observation image (real-time moving image) of the fundus Ef.
- alignment between the infrared observation image and the blood vessel map 1051 may be performed using a process similar to the registration described above.
- the scan control unit 1060 may control the scan unit 1010 to perform a repetitive scan (the aforementioned main scan) on the blood vessel of interest, as well as to perform a scan (the aforementioned supplementary scan) to newly determine the Doppler angle.
- the OCT blood flow measurement data 1075 collected by the repetitive scan (and the data collected by the supplementary scan) is sent to the hemodynamic information generation unit 1080.
- In step S19, the hemodynamic information generation unit 1080 generates hemodynamic information 1085 for the blood vessel of interest designated by the blood vessel of interest designation unit 1070, based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S18 (and the data collected by the supplementary scan).
- For example, the hemodynamic information generation unit 1080 calculates the blood flow velocity in the blood vessel of interest based on the OCT blood flow measurement data 1075 collected from the blood vessel of interest in step S18 and the orientation information (Doppler angle) of the blood vessel of interest calculated in step S16.
- Alternatively, the hemodynamic information generation unit 1080 calculates the Doppler angle of the blood vessel of interest based on the data collected in the supplementary scan of step S18, and calculates the blood flow velocity in the blood vessel of interest based on this Doppler angle and the OCT blood flow measurement data 1075 collected in the main scan of step S18.
- the hemodynamic information generation unit 1080 may perform a process of calculating the vascular diameter of the blood vessel of interest and a process of calculating the blood flow volume based on this vascular diameter and the blood flow velocity.
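- For reference, the sketch below uses the commonly cited phase-resolved Doppler OCT relation v = Δφ·λ0 / (4π·n·τ·|cos θ|) for the velocity and the product of mean velocity and circular cross-sectional area for the flow volume; the exact formulas and parameter handling used by the hemodynamic information generation unit 1080 may differ, so this is a sketch under stated assumptions rather than the device's implementation.

```python
import math

def blood_flow_velocity(phase_shift_rad: float, center_wavelength_m: float,
                        refractive_index: float, interscan_time_s: float,
                        doppler_angle_deg: float) -> float:
    """Convert a measured phase shift into an absolute flow velocity (m/s)
    using v = dphi * lambda0 / (4 * pi * n * tau * |cos(theta)|)."""
    cos_theta = abs(math.cos(math.radians(doppler_angle_deg)))
    return (phase_shift_rad * center_wavelength_m) / (
        4.0 * math.pi * refractive_index * interscan_time_s * cos_theta)

def blood_flow_volume(mean_velocity_m_s: float, vessel_diameter_m: float) -> float:
    """Flow volume (m^3/s) as mean velocity times the circular cross-sectional area."""
    radius = vessel_diameter_m / 2.0
    return mean_velocity_m_s * math.pi * radius ** 2
```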
- the display control unit 1090 causes the display device 1100 to display the hemodynamic information (blood flow velocity, blood flow volume, blood vessel diameter, Doppler angle, etc.) generated in step S19.
- the display control unit 1090 may also cause the display device 1100 to display any information obtained in this operation example (images, calculation results, analysis results, etc.).
- the hemodynamic information generation unit 1080 may apply evaluation processing to the Doppler angle to generate evaluation information, and the display control unit 1090 may then cause the display device 1100 to display this evaluation information.
- the ophthalmic device 1400 has the functions and advantages described below in addition to the functions and advantages of the ophthalmic device 1000.
- the ophthalmic device 1400 is capable of measuring the hemodynamics of the fundus. Furthermore, the ophthalmic device 1400 can automatically designate a blood vessel of interest based on a vascular map, or can assist in the task of designating a blood vessel of interest based on a vascular map. In addition, the ophthalmic device 1400 can automatically perform hemodynamic measurement of a designated blood vessel of interest using a vascular map. This makes it easier and less labor-intensive to designate the position at which hemodynamic measurement is applied, and improves the precision and accuracy of this task. Therefore, the ophthalmic device 1400 of this embodiment enables easier, less labor-intensive, and faster examinations, contributing to improved examination quality.
- the ophthalmic device 1400 provides a novel method of generating orientation information for fundus blood vessels by obtaining information on the course of blood vessels from images obtained by fundus imaging.
- the orientation information generated by the ophthalmic device 1400 contributes to simplification and labor savings in the task of specifying blood flow measurement positions, and contributes to improved precision and accuracy of the task.
- the orientation information generated by the ophthalmic device 1400 provides the magnitude of the Doppler angle and its evaluation results, making it possible to predict the optimal blood flow measurement position.
- the ophthalmic device 1400 can provide visual information showing various types of information, such as vascular maps, fundus images, Doppler angles, and evaluation information. This allows the user to understand information about the subject's eye (fundus).
- Figure 13 shows the configuration of an ophthalmic device 1500 according to one non-limiting embodiment.
- Elements of the ophthalmic device 1500 that have the same names and symbols as elements of the ophthalmic device 1400 in Figure 11 may have the same configuration and function as the corresponding elements of the ophthalmic device 1400, unless otherwise specified. However, this does not exclude the use of modified, equivalent, or alternative means for the corresponding elements.
- the ophthalmic device 1500 has the same elements as the ophthalmic device 1400: a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, a vascular map generating unit 1050, a scan control unit 1060, and a blood flow dynamics information generating unit 1080.
- the ophthalmic device 1500 also has an observation image generating unit 1110, a movement control unit 1120, and a movement mechanism 1130.
- the observation image generation unit 1110 is configured to generate an infrared observation image (real-time moving image) of the fundus oculi Ef.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the observation image generation unit 1110 using the fundus camera unit 2.
- the movement mechanism 1130 is configured to move the scanning unit 1010.
- the ophthalmologic apparatus 1 can achieve the functions of the movement mechanism 1130 using the movement mechanism 150.
- the movement control unit 1120 is configured to control the movement mechanism 1130.
- the movement control unit 1120 is realized by cooperation between hardware including circuits and movement control software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the movement control unit 1120 using the control unit 210 (main control unit 211) and the data processing unit 230.
- the vascular map 1051 generated by the vascular map generation unit 1050 is input to the movement control unit 1120. Furthermore, the infrared observation image generated by the observation image generation unit 1110 is input to the movement control unit 1120 in real time.
- the movement control unit 1120 compares the vascular map 1051 with the infrared observation image (a frame of a real-time moving image), calculates the deviation between the vascular map 1051 and the infrared observation image, and controls the movement mechanism 1130 based on this deviation. Calculation of the deviation between the vascular map 1051 and the infrared observation image may be performed, for example, using a method similar to the registration described above.
- the movement control unit 1120 controls the movement mechanism 1130 so as to cancel out the calculated deviation.
- the movement control unit 1120 calculates the deviation of each frame of the infrared observation image (or each frame obtained through thinning processing) relative to the vascular map 1051, and compares each sequentially calculated deviation with a predetermined threshold value. The movement control unit 1120 repeats this series of processes until the deviation becomes smaller than the threshold value. This aligns the infrared observation image with the vascular map 1051, thereby realizing a new alignment method that uses the vascular map 1051. Note that, in parallel with this series of processes, OCT scanning and the data processing for generating the vascular map 1051 may be performed repeatedly, so that alignment is performed using a vascular map 1051 updated in real time together with the infrared observation image (real-time moving image).
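- One possible realization of this deviation-based control loop is sketched below, using phase correlation as an example deviation estimator between the vascular map and each infrared frame; `move_scan_unit` stands for a hypothetical interface to the movement mechanism 1130, and the pixel threshold is an illustrative assumption.

```python
import numpy as np

def estimate_shift(reference: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Estimate the (dy, dx) translation of `frame` relative to `reference` by
    phase correlation (one simple registration approach among many)."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    shift = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    # Wrap shifts larger than half the image size to negative displacements.
    for axis, size in enumerate(corr.shape):
        if shift[axis] > size // 2:
            shift[axis] -= size
    return shift

def align_until_stable(vascular_map_img, frame_stream, move_scan_unit, threshold_px=2.0):
    """Repeatedly measure the deviation of each incoming infrared frame from the
    vascular map and command the movement mechanism to cancel it, until the
    deviation falls below the threshold."""
    for frame in frame_stream:
        shift = estimate_shift(vascular_map_img, frame)
        if np.linalg.norm(shift) < threshold_px:
            return True   # aligned; repeated scans for blood flow measurement may start
        move_scan_unit(-shift)  # move so as to cancel the calculated deviation
    return False
```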
- FIG. 14 shows one example of the operation of the ophthalmic device 1500. Steps S31 to S35 may be performed in the same manner as steps S1 to S5 in Figure 9, respectively.
- In step S36, the ophthalmic device 1500 starts alignment. Specifically, the ophthalmic device 1500 starts generating an infrared observation image of the fundus oculi Ef using the observation image generation unit 1110, and starts operation of the movement control unit 1120 and the movement mechanism 1130.
- In step S37, the movement control unit 1120 calculates the deviation of the infrared observation image relative to the vascular map 1051 generated in step S35 and compares the calculated deviation with a threshold value. This process is repeated until the deviation becomes smaller than the threshold value (step S38: No). When the deviation of the infrared observation image relative to the vascular map 1051 becomes smaller than the threshold value (step S38: Yes), the processing procedure proceeds to step S39.
- In step S39, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessels of interest in the fundus oculi Ef.
- the blood vessels of interest may be designated in the same manner as in the ophthalmic apparatus 1400 (blood vessel designation unit 1070) described above.
- In step S40, the hemodynamic information generation unit 1080 generates hemodynamic information for the blood vessel of interest based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S39.
- the ophthalmic device 1500 can provide a novel alignment method that utilizes a vascular map that indicates the position of fundus blood vessels obtained from OCT images.
- Figure 15 shows the configuration of an ophthalmic device 1600 according to one non-limiting embodiment.
- Elements of the ophthalmic device 1600 that have the same names and are assigned the same reference numerals as elements of the ophthalmic device 1400 in Figure 11 may have the same configuration and function as the corresponding elements of the ophthalmic device 1400, unless otherwise specified. However, this does not exclude the adoption of modified, equivalent, or alternative means for the corresponding elements.
- the ophthalmic device 1600 has the same elements as the ophthalmic device 1400, including a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, a vascular map generating unit 1050, a scan control unit 1060, and a blood flow dynamics information generating unit 1080.
- the ophthalmic device 1600 has a movement mechanism 1130, a movement control unit 1140, and a projection image generating unit 1150.
- the movement mechanism 1130 has the same configuration and functions as the movement mechanism 1130 of the ophthalmic device 1500 in FIG. 13.
- the vascular map 1051 is three-dimensional data such as the vascular map 1320 in Figure 10C.
- the projection image generation unit 1150 generates a projection image by applying a projection in the A-scan direction (z direction) of an OCT scan of the fundus Ef to the vascular map 1051.
- This projection is not limited to being applied to the vascular map 1051; it may also be applied to the multiple cross-sectional images 1021, which are the original images of the vascular map 1051.
- When the vascular map 1051 is two-dimensional data such as the vascular map 1310 in Figure 10B, there is no need to provide a projection image generation unit 1150.
- a projection image generation unit 1150 may be provided that is configured to operate when the vascular map 1051 is three-dimensional data and not operate when the vascular map 1051 is two-dimensional data.
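- Assuming the three-dimensional vascular map is held as a binary volume, the projection in the A-scan direction described above could be as simple as the following sketch; the axis order and the choice of a maximum projection are illustrative assumptions.

```python
import numpy as np

def project_vascular_map(vascular_map_3d: np.ndarray) -> np.ndarray:
    """Collapse a three-dimensional vascular map along the A-scan (z) direction
    to obtain a two-dimensional projection image. For a binary map, any voxel
    marked as vessel along an A-line makes the projected pixel a vessel pixel;
    for an intensity-like map a mean or minimum projection could be used instead."""
    # Assumes axis order (x, y, z) with z being the A-scan direction.
    return vascular_map_3d.max(axis=2)
```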
- the projection image generation unit 1150 is realized by the cooperation of hardware including circuits and projection image generation software.
- the ophthalmologic apparatus 1 in a non-limiting embodiment can realize the functions of the projection image generation unit 1150 using the data processing unit 230.
- the movement control unit 1140 is configured to control the movement mechanism 1130.
- the movement control unit 1140 is realized by cooperation between hardware including circuits and movement control software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the movement control unit 1140 using the control unit 210 (main control unit 211) and the data processing unit 230.
- the projection image generated from the vascular map 1051 by the projection image generation unit 1150 is input to the movement control unit 1140.
- the vascular map 1051 is repeatedly generated, and a projection image is generated from each vascular map 1051 (or each vascular map 1051 obtained through thinning processing).
- Figure 16 shows one example of the operation of the ophthalmic device 1600. Steps S51 to S55 may be executed in the same manner as steps S1 to S5 in Figure 9, respectively. Steps S51 to S55 are executed repeatedly. As a result, vascular maps 1051 are generated sequentially.
- the time interval for generating the vascular map 1051 (i.e., the time interval for applying a scan to the fundus Ef) may be constant.
- the blood vessel map 1051 generated sequentially is input sequentially to the projection image generation unit 1150.
- In step S56, the projection image generation unit 1150 applies the projection to each sequentially input vascular map 1051 to generate a projection image.
- the projection images that are sequentially generated in response to the input of the vascular map 1051 are sequentially input to the movement control unit 1140.
- In step S57, the movement control unit 1140 compares the two input projection images, calculates the deviation between these projection images, and compares the calculated deviation with a threshold value. This process is repeated until the deviation becomes smaller than the threshold value (step S58: No). When the deviation becomes smaller than the threshold value (step S58: Yes), the processing procedure proceeds to step S59.
- the deviation becoming smaller than the threshold value means that the movement of the subject's eye E is sufficiently small and the position of the subject's eye E is stable.
- step S57 includes, for example, a step of detecting a connection area corresponding to a blood vessel from two consecutive projection images, a step of using the detected connection area as a landmark to calculate the positional error between the two projection images, and a step of comparing this positional error (deviation) with a threshold value.
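- A minimal sketch of such a landmark-based deviation calculation is shown below, using connected-component centroids of the vessel regions as landmarks; comparing the centroid clouds as a whole is a simplification of the per-landmark matching described above, and the function names and threshold handling are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def landmark_centroids(projection: np.ndarray, threshold: float) -> np.ndarray:
    """Detect connected regions corresponding to blood vessels in a projection
    image and return their centroids as an (N, 2) array of (row, col)."""
    labels, count = ndimage.label(projection > threshold)
    if count == 0:
        return np.empty((0, 2))
    return np.array(ndimage.center_of_mass(projection, labels, range(1, count + 1)))

def projection_deviation(prev_proj: np.ndarray, curr_proj: np.ndarray, threshold: float) -> float:
    """Estimate the positional deviation between two consecutive projection
    images from their vessel landmarks (here, the shift of the landmark
    centroid cloud)."""
    prev_pts = landmark_centroids(prev_proj, threshold)
    curr_pts = landmark_centroids(curr_proj, threshold)
    if len(prev_pts) == 0 or len(curr_pts) == 0:
        return float("inf")
    return float(np.linalg.norm(curr_pts.mean(axis=0) - prev_pts.mean(axis=0)))
```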
- alignment evaluation may be performed based on the distribution of landmarks in a single projection image.
- the positions of feature points of the fundus oculi Ef can be estimated based on the distribution of landmarks in a single projection image.
- the scan in step S51 is a circle scan centered on the optic nerve head, and the landmarks in a single projection image are arranged on a circle. Therefore, the center of the circle formed by connecting these landmarks corresponds to the center of the optic nerve head.
- the movement control unit 1140 can control the movement mechanism 1130 so that the center of the circle detected in this manner is positioned at the center position of the projection image. More generally, the movement control unit 1140 can control the movement mechanism 1130 so that the center of the circle detected is positioned at a predetermined position in the projection image.
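- The circle center can be estimated, for example, with a least-squares circle fit over the landmark positions, as in the following sketch; the fitting method, function names, and coordinate conventions are illustrative assumptions.

```python
import numpy as np

def fit_circle_center(points: np.ndarray) -> np.ndarray:
    """Least-squares (Kasa) circle fit: returns the (row, col) center of the
    circle that best fits the landmark points. With a circle scan centered on
    the optic nerve head, this center approximates the optic nerve head
    position in the projection image."""
    y, x = points[:, 0], points[:, 1]
    a_mat = np.column_stack([2.0 * x, 2.0 * y, np.ones(len(points))])
    b_vec = x ** 2 + y ** 2
    cx, cy, _ = np.linalg.lstsq(a_mat, b_vec, rcond=None)[0]
    return np.array([cy, cx])

def correction_to_target(points: np.ndarray, target_yx: np.ndarray) -> np.ndarray:
    """Displacement that would bring the detected circle center to a
    predetermined position (e.g., the center of the projection image); the
    movement control unit could translate this into a command for the
    movement mechanism."""
    return target_yx - fit_circle_center(points)
```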
- In step S59, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessels of interest in the fundus oculi Ef.
- the blood vessels of interest may be designated in the same manner as in the ophthalmic apparatus 1400 (blood vessel designation unit 1070) described above.
- In step S60, the hemodynamic information generation unit 1080 generates hemodynamic information for the blood vessel of interest based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S59.
- the ophthalmic device 1600 can provide a novel alignment method that utilizes a vascular map that indicates the position of fundus blood vessels obtained from OCT images.
- FIG. 17 shows the configuration of an ophthalmic device 2000 according to one non-limiting embodiment.
- the ophthalmic device 2000 includes a scanning unit 2010, a three-dimensional image generating unit 2020, a cross-sectional image extracting unit 2030, a vascular region identifying unit 2040, a vascular region matching unit 2050, and a vascular map generating unit 2060.
- any of the matters described or suggested in this disclosure, such as the matters relating to the ophthalmic device 1000 in FIG. 7, the matters relating to the ophthalmic device 1400 in FIG. 11, the matters relating to the ophthalmic device 1500 in FIG. 13, and the matters relating to the ophthalmic device 1600 in FIG. 15, can be combined with the ophthalmic device 2000 of this embodiment.
- the scanning unit 2010 collects data by applying an OCT scan to a three-dimensional region of the fundus Ef.
- Non-limiting examples of this OCT scan include a raster scan and a Lissajous scan.
- a non-limiting aspect of the ophthalmic apparatus 1 can realize the functions of the scanning unit 2010 using the fundus camera unit 2 and the OCT unit 100.
- the three-dimensional image generating unit 2020 is configured to generate a three-dimensional image of the fundus oculi Ef based on data collected from a three-dimensional region of the fundus oculi Ef by the scanning unit 2010.
- the generated three-dimensional image may be, for example, stack data or volume data.
- the three-dimensional image generating unit 2020 is realized by cooperation between hardware including circuits and three-dimensional image generating software.
- a non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the three-dimensional image generating unit 2020 using the data processing unit 230 (image generating unit 220).
- the cross-sectional image extraction unit 2030 extracts cross-sectional images from the three-dimensional image generated by the three-dimensional image generation unit 2020.
- the cross-sectional image extraction unit 2030 extracts multiple cross-sectional images corresponding to multiple cross sections of the fundus oculi Ef from the three-dimensional image.
- the multiple cross sections correspond to a scan pattern without a flyback.
- the scan pattern without a flyback may be, for example, any of the scan patterns described above regarding the scan unit 1010 of the ophthalmic device 1000, such as the concentric circular scan shown in Figures 8A to 8C.
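- As a non-limiting sketch, a cross-sectional image corresponding to a circular scan (i.e., the side surface of a cylinder) could be resampled from the three-dimensional image as follows; the axis order, interpolation settings, and function names are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_circle_cross_section(volume: np.ndarray, center_xy: tuple,
                                 radius_px: float, n_samples: int = 1024) -> np.ndarray:
    """Resample a 3D OCT volume (indexed z, y, x) along a circle of the given
    radius at every depth, yielding an image of shape (depth, n_samples) that
    corresponds to the cross-sectional image a circle scan would produce."""
    depth = volume.shape[0]
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    cx, cy = center_xy
    xs = cx + radius_px * np.cos(angles)
    ys = cy + radius_px * np.sin(angles)
    cross_section = np.empty((depth, n_samples), dtype=float)
    for z in range(depth):
        # Bilinear interpolation of the en-face slice along the circle.
        cross_section[z] = map_coordinates(volume[z], np.vstack([ys, xs]), order=1)
    return cross_section
```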
- the cross-sectional image extraction unit 2030 is realized by cooperation between hardware including circuits and cross-sectional image extraction software.
- a non-limiting aspect of the ophthalmic device 1 can realize the functions of the cross-sectional image extraction unit 2030 using the data processing unit 230.
- the vascular region identification unit 2040 is configured to detect vascular region groups from each of the multiple cross-sectional images extracted from the three-dimensional image by the cross-sectional image extraction unit 2030, and thereby identify multiple vascular region groups corresponding to these multiple cross-sectional images.
- the vascular region identification unit 2040 has the same functions as the vascular region identification unit 1030 of the ophthalmic device 1000 described above.
- the vascular region identification unit 2040 is realized by cooperation between hardware including circuits and vascular region identification software.
- a non-limiting aspect of the ophthalmic device 1 can realize the functions of the vascular region identification unit 2040 using the data processing unit 230 (vascular region identification unit 231).
- the vascular region matching unit 2050 is configured to match vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups identified from multiple cross-sectional images by the vascular region identification unit 2040.
- the vascular region matching unit 2050 has the same functions as the vascular region matching unit 1040 of the ophthalmic device 1000 described above.
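- One simple matching criterion is the angular position of each vascular region along its scan circle, as sketched below; regions of the same vessel are expected to lie at nearby angles on the two concentric circles. The actual matching may use additional features (depth, diameter, brightness), and the names and tolerance used here are illustrative assumptions.

```python
import numpy as np

def match_vascular_regions(inner_angles_deg: np.ndarray, outer_angles_deg: np.ndarray,
                           max_diff_deg: float = 10.0) -> list:
    """Greedily match vascular regions detected on two concentric circular cross
    sections by their angular positions along the scan circles. Returns index
    pairs (inner_index, outer_index) for regions judged to belong to the same
    blood vessel."""
    pairs = []
    used_outer = set()
    for i, a_in in enumerate(inner_angles_deg):
        # Angular differences wrapped into the range [0, 180] degrees.
        diffs = np.abs((outer_angles_deg - a_in + 180.0) % 360.0 - 180.0)
        for j in np.argsort(diffs):
            if diffs[j] <= max_diff_deg and int(j) not in used_outer:
                pairs.append((i, int(j)))
                used_outer.add(int(j))
                break
    return pairs
```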
- the vascular region matching unit 2050 is realized by cooperation between hardware including circuits and vascular region matching software.
- a non-limiting aspect of the ophthalmic device 1 can achieve the functions of the vascular region matching unit 2050 using the data processing unit 230.
- the vascular map generation unit 2060 is configured to generate a vascular map showing the distribution of blood vessels based on the vascular region matching results obtained by the vascular region matching unit 2050.
- the vascular map generation unit 2060 has the same functions as the vascular map generation unit 1050 of the ophthalmic device 1000 described above.
- the vascular map generation unit 2060 is realized by cooperation between hardware including circuits and vascular map generation software.
- the ophthalmic device 1 of a non-limiting aspect can realize the functions of the vascular map generation unit 2060 using the data processing unit 230.
- Figure 18 shows an example of the operation of the ophthalmic device 2000.
- In step S71, the scanning unit 2010 applies OCT scanning to a three-dimensional region of the fundus Ef of the subject's eye E to collect data.
- the collected data is sent to the three-dimensional image generating unit 2020.
- In step S72, the three-dimensional image generation unit 2020 generates a three-dimensional image of the fundus oculi Ef based on the data collected from the three-dimensional region of the fundus oculi Ef in step S71.
- the generated three-dimensional image is sent to the cross-sectional image extraction unit 2030.
- In step S73, the cross-sectional image extraction unit 2030 extracts, from the three-dimensional image generated in step S72, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus oculi Ef corresponding to a scan pattern without flyback.
- For example, the cross-sectional image extraction unit 2030 extracts, from the three-dimensional image generated in step S72, two cross-sectional images corresponding to two concentric circle scans.
- the extracted cross-sectional images are sent to the vascular region identification unit 2040.
- In step S74, the vascular region identification unit 2040 detects vascular region groups from each of the multiple cross-sectional images extracted from the 3D image in step S73. This identifies multiple vascular region groups corresponding to the multiple cross-sectional images. The identified multiple vascular region groups are sent to the vascular region matching unit 2050.
- In step S75, the vascular region matching unit 2050 matches vascular regions corresponding to different cross sections of the same blood vessel among the multiple vascular region groups identified in step S74.
- The results of this vascular region matching process are sent to the vascular map generation unit 2060.
- In step S76, the vascular map generation unit 2060 generates a vascular map based on the vascular region matching results obtained in step S75.
- the ophthalmic device 2000 capable of executing the processing procedures of this operational example generates a 3D OCT image of the fundus of the subject's eye, extracts from the 3D OCT image multiple cross-sectional images corresponding to multiple cross sections corresponding to a scan pattern without flyback, identifies multiple groups of vascular regions corresponding to the multiple cross-sectional images, and generates a vascular map by associating vascular regions corresponding to different cross sections of the same blood vessel.
- This vascular map can be used in preparatory steps (e.g., Doppler angle estimation, alignment, search for blood vessels of interest, etc.) in OCT blood flow measurement (fundus blood flow dynamics measurement). Therefore, the ophthalmic device 2000 of this embodiment contributes to improving the quality of OCT blood flow measurement.
- various features can be combined with the ophthalmic device 2000 of this embodiment.
- the features related to the ophthalmic device 1000 in FIG. 7, the features related to the ophthalmic device 1400 in FIG. 11, the features related to the ophthalmic device 1500 in FIG. 13, and the features related to the ophthalmic device 1600 in FIG. 15 can be combined with the ophthalmic device 2000.
- An ophthalmic device obtained by such a combination exhibits the functions and effects of the combined features, and further exhibits synergistic functions and effects between the combined features and the ophthalmic device 2000.
- an ophthalmic device obtained by combining multiple features with the ophthalmic device 2000 exhibits synergistic functions and effects between two or more of those features, as well as synergistic functions and effects between two or more of those features and the ophthalmic device 2000.
- the present disclosure also provides embodiments in categories other than ophthalmic devices.
- the present disclosure may provide an embodiment of a method for controlling an ophthalmic device, an embodiment of a method for controlling an ophthalmic information processing device, an embodiment of a program for causing a computer to execute each step of any of the methods, and an embodiment of a computer-readable non-transitory recording medium on which any of the programs is recorded.
- the recording medium may take any form.
- the recording medium may be any of a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
- Some embodiments are methods for controlling an ophthalmic device having a scanning unit and a processor that performs OCT scans.
- the method according to this embodiment causes the processor to perform scan control, cross-sectional image generation processing, vascular region identification processing, vascular region correspondence processing, and vascular map generation processing.
- the scan control controls the scanning unit to collect data by applying an OCT scan using a scan pattern that does not involve flyback to the fundus of the subject's eye.
- the cross-sectional image generation processing generates multiple cross-sectional images corresponding to multiple cross sections of the fundus based on data collected from the fundus under scan control.
- the vascular region identification processing identifies multiple vascular region groups corresponding to the multiple cross-sectional images by detecting vascular region groups from each of the multiple cross-sectional images generated.
- the vascular region correspondence processing associates vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups corresponding to the multiple cross sections.
- the vascular map generation processing generates a vascular map showing the distribution of blood vessels based on the results of the vascular region correspondence processing. The method according to this embodiment enables the ophthalmic device to perform the processing steps shown in FIG. 9 described above.
- Some embodiments provide a program that causes a computer to execute the method described above, and a computer-readable non-transitory recording medium on which the program is recorded. This non-transitory recording medium may be in any form, including, for example, a magnetic disk, optical disk, magneto-optical disk, and semiconductor memory. Any of the features described in this disclosure may be combined with the method, program, and recording medium according to this embodiment.
- Some embodiments are methods for controlling an ophthalmic device having a scanning unit and a processor that performs OCT scanning.
- the method of this embodiment causes the processor to perform scan control, 3D image generation processing, cross-sectional image extraction processing, vascular region identification processing, vascular region correspondence processing, and vascular map generation processing.
- the scan control controls the scanning unit to apply an OCT scan to a 3D region of the fundus of the subject's eye to collect data.
- the 3D image generation processing generates a 3D image of the fundus based on data collected from the 3D region of the fundus under the scan control.
- the cross-sectional image extraction processing extracts, from the 3D image generated by the 3D image generation processing, multiple cross-sectional images corresponding to multiple cross sections of the fundus corresponding to a scan pattern without flyback.
- the vascular region identification processing detects vascular region groups from each of the multiple cross-sectional images extracted from the 3D image, thereby identifying multiple vascular region groups corresponding to these multiple cross-sectional images.
- the vascular region correspondence processing associates vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups corresponding to multiple cross sections.
- the vascular map generation process generates a vascular map showing the distribution of blood vessels based on the results of the vascular region association process.
- a scanning unit that applies an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data
- a cross-sectional image generating unit that generates a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data collected by the scanning unit
- a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images
- a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups
- a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
- the scan pattern includes a plurality of patterns each of which does not involve a flyback.
- the scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically, the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns;
- the scanning unit applies an optical coherence tomography scan using the concentric pattern to the fundus to collect data;
- the cross-sectional image generation unit generates the plurality of cross-sectional images corresponding to the plurality of concentric cross sections, respectively, based on the data collected by the optical coherence tomography scan using the concentric pattern;
- the concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically, the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged and correspond to the plurality of circular patterns, respectively;
- the scanning unit applies an optical coherence tomography scan using the concentric circle pattern to the fundus to collect data;
- the cross-sectional image generating unit generates the plurality of cross-sectional images corresponding to the plurality of cylindrical side surfaces, respectively, based on the data collected by the optical coherence tomography scan using the concentric circle pattern.
- a scanning unit that applies optical coherence tomography scanning to a three-dimensional region of the fundus of the subject's eye to collect data
- a three-dimensional image generating unit that generates a three-dimensional image of the fundus based on the data collected by the scanning unit
- a cross-sectional image extracting unit that extracts, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback
- a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images
- a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups
- a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
- the scan pattern includes a plurality of patterns each of which does not involve a flyback.
- the scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically, the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns; the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of concentric cross sections from the three-dimensional image;
- the concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically, the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged and correspond to the plurality of circular patterns, respectively; the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of cylinder side surfaces, respectively, from the three-dimensional image;
- The ophthalmic device according to 7 above.
- the vascular region specifying unit generates a plurality of projection images corresponding to the plurality of cross-sectional images by applying a projection of the optical coherence tomography scan in an A-scan direction to each of the plurality of cross-sectional images, and, for each of the plurality of projection images, detects a blood vessel region group from the corresponding cross-sectional image based on a luminance distribution in the projection image. The ophthalmic device according to any one of 1 to 8 above.
- a scan control unit that controls the scanning unit so as to apply repeated scans to a pre-specified blood vessel of interest in the fundus; and a hemodynamic information generating unit that generates hemodynamic information in the blood vessel of interest based on data collected by the repeated scans. The ophthalmic device according to any one of 1 to 9 above.
- a blood vessel of interest designation unit that analyzes the blood vessel map and designates the blood vessel of interest.
- the plurality of cross-sectional images include a first cross-sectional image and a second cross-sectional image
- the vascular region specifying unit detects a first vascular region from the first cross-sectional image and detects a second vascular region from the second cross-sectional image; the vascular region associating unit associates the first vascular region and the second vascular region with each other as different cross sections of the blood vessel of interest;
- the hemodynamic information generating unit generates first position information indicating a position of the first vascular region, generates second position information indicating a position of the second vascular region, and generates Doppler angle information indicating the magnitude of the Doppler angle in the optical coherence tomography blood flow measurement for the blood vessel of interest based on the first position information and the second position information. The ophthalmic device according to 10 or 11 above.
- the hemodynamic information generating unit generates the hemodynamic information in the blood vessel of interest based on the data collected by the repeated scans and the Doppler angle information. The ophthalmic device according to 12 above.
- the hemodynamic information generating unit generates evaluation information by applying an evaluation process to the magnitude of the Doppler angle, and a display control unit is further provided to display the evaluation information on a display device. The ophthalmic device according to 12 or 13 above.
- an observation image generating unit that generates an infrared observation image of the fundus; a movement mechanism that moves the scanning unit; and a movement control unit that controls the movement mechanism based on a deviation of the infrared observation image relative to the blood vessel map, wherein, when the deviation becomes smaller than a preset threshold, the scan control unit executes the control of the scanning unit. The ophthalmic device according to any one of 10 to 14 above.
- the vascular map generating unit generates a new vascular map based on new data newly collected from the fundus by the scanning unit; a projection image generating unit applies a projection of the optical coherence tomography scan in an A-scan direction to the vascular map to generate a first projection image, and applies the projection to the new vascular map to generate a second projection image; a movement mechanism moves the scanning unit; and a movement control unit controls the movement mechanism based on a deviation between the first projection image and the second projection image. The ophthalmic device according to any one of 10 to 14 above.
- A method for controlling an ophthalmic apparatus having a scanning unit and a processor that performs an optical coherence tomography scan, the method causing the processor to execute: scan control that controls the scanning unit to apply an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data; a cross-sectional image generation process for generating a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data; a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images; a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups; and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
- A method for controlling an ophthalmic apparatus having a scanning unit and a processor that performs an optical coherence tomography scan, the method causing the processor to execute: scan control that controls the scanning unit to collect data by applying an optical coherence tomography scan to a three-dimensional region of the fundus of the subject's eye; a three-dimensional image generation process for generating a three-dimensional image of the fundus based on the data; a cross-sectional image extraction process for extracting, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback; a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images; a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups; and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
- 1 Ophthalmic apparatus, 1010 Scan unit, 1020 Cross-sectional image generating unit, 1030 Blood vessel region identifying unit, 1040 Blood vessel region matching unit, 1050 Blood vessel map generating unit
Description
(CROSS-REFERENCE TO RELATED APPLICATIONS) This application claims priority to U.S. Provisional Patent Application No. 63/557,720, entitled "OPHTHALMIC OPTICAL COHERENCE TOMOGRAPHY," filed February 26, 2024, the entire contents of which are incorporated herein by reference.
In some embodiments, an ophthalmologic apparatus includes a scanning unit, a cross-sectional image generating unit, a vascular region identifying unit, a vascular region matching unit, and a vascular map generating unit. The scanning unit is configured to collect data by applying an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye. The cross-sectional image generating unit is configured to generate multiple cross-sectional images corresponding to multiple cross sections of the fundus based on the data collected by the scanning unit. The vascular region identifying unit is configured to identify multiple vascular region groups corresponding to the multiple cross-sectional images by detecting vascular region groups from each of the multiple cross-sectional images generated. The vascular region matching unit is configured to match vascular regions corresponding to different cross sections of the same blood vessel among the multiple vascular region groups identified from the multiple cross-sectional images. The vascular map generating unit is configured to generate a vascular map indicating the distribution of blood vessels based on the results of the vascular region matching.
According to some embodiments, it is possible to improve fundus hemodynamic measurements using Doppler OCT.
Several non-limiting embodiments of the present disclosure will be described. This disclosure describes embodiments of ophthalmic devices (e.g., ophthalmic blood flow measuring devices, ophthalmic imaging devices, etc.), embodiments of methods for controlling ophthalmic devices, embodiments of programs, and embodiments of recording media. However, the categories of embodiments of the present disclosure are not limited to these.
Embodiments of the present disclosure can be employed to solve problems that arise when measuring fundus hemodynamics using Doppler OCT. There are a variety of problems that arise when measuring fundus hemodynamics using Doppler OCT.
Some embodiments of the present disclosure are intended to improve the quality of processes and tasks related to fundus hemodynamic measurements. Examples of processes and tasks of interest include, but are not limited to, alignment of the device optical system with the fundus, estimation of Doppler angles, and search for blood vessels to which measurements are to be applied.
Conventionally, alignment has been performed by referencing infrared observation images (real-time moving images) of the fundus. Some embodiments provide a novel alignment method that utilizes the positions of fundus blood vessels obtained from OCT images.
Furthermore, in the past, users would refer to an infrared observation image of the fundus or a previously acquired fundus photograph to determine the blood vessels and cross sections to which blood flow measurement should be applied. Some embodiments provide a novel method for generating orientation information (Doppler angle, its suitability, etc.) of fundus blood vessels. Furthermore, some embodiments use the generated orientation information to facilitate and reduce the effort required to specify blood flow measurement positions (blood vessels to be measured, cross sections to be measured, etc.). Furthermore, some embodiments improve the precision and accuracy of specifying blood flow measurement positions. Furthermore, some embodiments make it possible to predict blood flow measurement positions to a certain extent. This makes it possible to limit the measurement application area and analysis application area before actually applying fundus hemodynamic measurement, further facilitating and reducing the effort required to specify blood flow measurement positions.
Some embodiments of the present disclosure also address the following problem: The phase signal obtained by fundus hemodynamic measurement has a good contrast mechanism. However, when measurement is performed during the diastolic phase, when pulsation is relatively weak, the detected signal strength is low, which can result in poor measurement quality. When measurement is performed on veins, which have a weaker pulsation than arteries, or when measurement is performed at an unsuitable Doppler angle, the detected signal strength can also be reduced, resulting in poor measurement quality. Furthermore, fundus blood vessels run and are distributed in a complex three-dimensional manner. The actual blood vessel course can be determined by performing fundus imaging. Conventionally, this information has not been utilized. Some embodiments address this problem by generating a map of blood vessels suitable for fundus hemodynamic measurement.
It should be clear to those skilled in the art that the problems that can be addressed using the technology disclosed herein are not limited to the examples given above.
<Embodiment of Ophthalmic Apparatus>
Some non-limiting aspects of the ophthalmic apparatus according to the embodiment will be described below. The ophthalmic apparatus according to the embodiment has a function of performing OCT blood flow measurement and a function of processing data obtained by the OCT blood flow measurement.
The ophthalmic device of the aspect primarily described in this disclosure functions as an OCT device capable of performing OCT blood flow measurement (OCT scanning and image generation processing). Some other aspects of the ophthalmic device may not be capable of performing at least some of the processing involved in OCT blood flow measurement.
The OCT method may be any method, for example, spectral domain OCT or swept-source OCT. Spectral domain OCT is a method in which light from a low-coherence light source is split into measurement light and reference light, and return light of the measurement light from the test object is superimposed on the reference light to generate interference light. The spectral distribution of this interference light is detected with a spectrometer, and the detected spectral distribution is subjected to processing such as Fourier transform to construct an image. Swept-source OCT is a method in which light from a tunable light source is split into measurement light and reference light, and return light of the measurement light from the test object is superimposed on the reference light to generate interference light. The interference light is detected with a photodetector (such as a balanced photodiode), and the detection data collected in response to wavelength sweeping and measurement light scanning is subjected to processing such as Fourier transform to construct an image. In other words, spectral domain OCT is an OCT method that acquires spectral distributions using spatial division, while swept-source OCT is an OCT method that acquires spectral distributions using time division. Other OCT methods, such as time domain OCT, may also be used.
The ophthalmic device of the embodiment primarily described in this disclosure functions as a fundus camera capable of photographing the fundus. Other embodiments of the ophthalmic device may function as any ophthalmic imaging modality, such as an SLO, a slit lamp microscope, or a surgical microscope, in addition to or instead of functioning as a fundus camera.
In this disclosure, unless otherwise specified, no distinction is made between "image data" and the "image" that is the visual information based on it. Furthermore, unless otherwise specified, no distinction is made between a part or tissue of the subject's eye and its image (image data).
In some exemplary embodiments, the ophthalmologic apparatus does not need to have a fundus imaging function. Such an ophthalmologic apparatus has the function of acquiring a front image of the fundus from a storage device or recording medium. A typical example of a storage device is a medical image archiving system (medical image filing system). Typical examples of recording media are a hard disk drive and an optical disk.
At least a portion of the functionality of the elements of the embodiments of the present disclosure is implemented using circuitry or processing circuitry. The circuitry or processing circuitry may include any of a general purpose processor, a special purpose processor, an integrated circuit, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), a programmable logic device (e.g., an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), an FPGA (Field Programmable Gate Array)), conventional circuitry, and any combination thereof, configured and/or programmed to perform at least some of the disclosed functions. A processor is regarded as processing circuitry or circuitry that includes transistors and/or other circuitry. In this disclosure, a circuit configuration, unit, means, or similar term refers to hardware that performs at least some of the disclosed functions, or hardware that is programmed to perform at least some of the disclosed functions. The hardware may be hardware disclosed herein, or known hardware that is programmed and/or configured to perform at least some of the described functions. If the hardware is a processor, which can be considered a type of circuit configuration, the circuit configuration, unit, means, or similar term refers to a combination of hardware and software, and the software is used to configure the hardware and/or the processor.
例示的な眼科装置の構成を図1~図4に示す。本例の眼科装置1は、眼底カメラユニット2と、OCTユニット100と、演算制御ユニット200とを含む。眼底カメラユニット2には、眼底撮影及び前眼部撮影が可能な眼底カメラの要素と、OCTスキャナーの要素とが設けられている。OCTユニット100には、OCTスキャナーの要素が設けられている。演算制御ユニット200は、各種の処理(演算、解析、制御など)を実行するように構成された1つ以上のプロセッサを含んでいる。 The configuration of an exemplary ophthalmic device is shown in Figures 1 to 4. The ophthalmic device 1 in this example includes a fundus camera unit 2, an OCT unit 100, and an arithmetic and control unit 200. The fundus camera unit 2 is equipped with elements of a fundus camera capable of photographing the fundus and the anterior segment, and elements of an OCT scanner. The OCT unit 100 is equipped with elements of an OCT scanner. The arithmetic and control unit 200 includes one or more processors configured to perform various processes (calculation, analysis, control, etc.).
The fundus camera unit 2 will now be described. The fundus camera unit 2 includes an optical system for photographing the fundus Ef (and the anterior segment) of the subject's eye E. The digital image acquired by the fundus camera unit 2 is typically a front image. The fundus camera unit 2 can, for example, acquire an observation image by moving-image capture using continuous near-infrared light as the illumination light, and can acquire a photographed image by imaging using visible flash light as the illumination light.
The fundus camera unit 2 includes an illumination optical system 10 and a photographing optical system 30. The illumination optical system 10 irradiates the subject's eye E with illumination light. The photographing optical system 30 detects the return light of the illumination light irradiated onto the subject's eye E. In other words, the photographing optical system 30 photographs the subject's eye E being illuminated by the illumination light. The OCT measurement light provided from the OCT unit 100 is guided to the subject's eye E through an optical path within the fundus camera unit 2. The return light of the OCT measurement light applied to the subject's eye E is guided to the OCT unit 100 through an optical path within the fundus camera unit 2.
The observation illumination light output from the observation light source 11 of the illumination optical system 10 is reflected by the concave mirror 12, passes through the condenser lens 13, is transmitted through the visible cut filter 14 to become near-infrared light, is once focused in the vicinity of the photographing light source 15, is reflected by the mirror 16, and passes through the relay lens system 17, the relay lens 18, the diaphragm 19, and the relay lens system 20 to be guided to the perforated mirror 21. There, it is reflected by the mirror portion surrounding the central aperture of the perforated mirror 21, is transmitted through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the subject's eye E (fundus Ef). The return light of the observation illumination light projected onto the subject's eye E is refracted by the objective lens 22, is transmitted through the dichroic mirror 46, passes through the central aperture of the perforated mirror 21, is transmitted through the dichroic mirror 55, passes through the photographing focusing lens 31, is reflected by the mirror 32, is transmitted through the half mirror 33A, is reflected by the dichroic mirror 33, and is imaged on the light-receiving surface of the image sensor 35 by the imaging lens 34. The image sensor 35 detects the return light at a fixed time interval (frame rate). The focus of the photographing optical system 30 is adjusted according to the imaging site.
The photographing illumination light output from the photographing light source 15 travels along the same path as the observation illumination light and is projected onto the fundus Ef. The return light of the photographing illumination light from the subject's eye E travels along the same path as the return light of the observation illumination light, is guided to the dichroic mirror 33, is transmitted through the dichroic mirror 33, is reflected by the mirror 36, and is imaged on the light-receiving surface of the image sensor 38 by the imaging lens 37.
The liquid crystal display (LCD) 39 displays a fixation target (fixation target image) for guiding and fixing the line of sight. The light beam output from the LCD 39 is reflected by the half mirror 33A, is reflected by the mirror 32, passes through the photographing focusing lens 31 and the dichroic mirror 55, passes through the central aperture of the perforated mirror 21, is transmitted through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus Ef. This allows the subject to visually recognize the fixation target.
The alignment optical system 50 generates an alignment index for aligning the ophthalmic device 1 with respect to the subject's eye E. The alignment light output from the light-emitting diode (LED) 51 passes through the diaphragm 52, the diaphragm 53, and the relay lens 54, is reflected by the dichroic mirror 55, passes through the central aperture of the perforated mirror 21, is transmitted through the dichroic mirror 46, and is projected onto the subject's eye E via the objective lens 22. The return light of the alignment light from the subject's eye E is guided to the image sensor 35 along the same path as the return light of the observation illumination light. Manual alignment and auto alignment can be performed by referring to the received-light image (alignment index image).
The focus optical system 60 generates a split index used for focus adjustment with respect to the subject's eye E. In conjunction with the movement of the photographing focusing lens 31 along the optical path of the photographing optical system 30 (photographing optical path), the focus optical system 60 is moved along the optical path of the illumination optical system 10 (illumination optical path). The reflecting rod 67 can be inserted into and removed from the illumination optical path. When focus adjustment is performed, the reflective surface of the reflecting rod 67 is placed at an angle in the illumination optical path. The focus light output from the LED 61 passes through the relay lens 62, is split into two light beams by the split index plate 63, passes through the two-hole diaphragm 64, is reflected by the mirror 65, is once imaged on and reflected by the reflective surface of the reflecting rod 67 via the condenser lens 66, passes through the relay lens 20, is reflected by the perforated mirror 21, is transmitted through the dichroic mirror 46, and is projected onto the subject's eye E via the objective lens 22. The return light of the focus light from the subject's eye E is guided to the image sensor 35 along the same path as the return light of the alignment light. Manual focusing and auto focusing can be performed by referring to the received-light image (split index image).
When the subject's eye E is highly hyperopic, a diopter correction lens 70 (plus lens) is placed in the photographing optical path between the perforated mirror 21 and the dichroic mirror 55. On the other hand, when the subject's eye E is highly myopic, a diopter correction lens 71 (minus lens) is placed there.
The dichroic mirror 46 combines the optical path for imaging by the fundus camera unit 2 and the optical path for OCT (measurement arm). The dichroic mirror 46 reflects light in the wavelength band used for OCT and transmits light in the wavelength band used for imaging by the fundus camera unit 2. The measurement arm is provided with, in order from the OCT unit 100 side, a collimator lens unit 40, a retroreflector 41, a dispersion compensation member 42, an OCT focusing lens 43, an optical scanner 44, and a relay lens 45. The retroreflector 41 is movable along the optical path of the OCT measurement light incident on it, and is used for optical path length correction according to the axial length of the eye and for adjustment of the interference state. The dispersion compensation member 42 is used for dispersion compensation between the measurement arm and the reference arm. The OCT focusing lens 43 is movable along the measurement arm and is used for focus adjustment of the measurement arm. Focus adjustment of the ophthalmic device 1 is performed by coordinating the movement of the photographing focusing lens 31, the movement of the focus optical system 60, and the movement of the OCT focusing lens 43. The optical scanner 44 is placed, by alignment, at a position substantially conjugate to the pupil of the subject's eye E, and changes the traveling direction of the OCT measurement light. The optical scanner 44 is, for example, a galvano scanner capable of two-dimensional scanning.
The OCT unit 100 will now be described. The OCT unit 100 shown in Figure 2 is provided with a spectral-domain OCT optical system. This OCT optical system includes an interference optical system. The interference optical system splits light from a low-coherence light source (broadband light source) into measurement light LS (OCT measurement light) and reference light LR, and generates interference light LC by superimposing the return light of the measurement light LS projected onto the subject's eye E on the reference light LR. The generated interference light LC is detected by the spectrometer 130. This provides a signal representing the spectral distribution of the interference light LC. This detection signal is sent to the arithmetic and control unit 200.
The light source unit 101 outputs broadband low-coherence light L0. The light source unit 101 includes a light output device such as a superluminescent diode (SLD), an LED, or a semiconductor optical amplifier (SOA). The low-coherence light L0 output from the light source unit 101 is guided by the optical fiber 102 to the polarization controller 103, where its polarization state is adjusted, and is then guided by the optical fiber 104 to the fiber coupler 105, where it is split into the measurement light LS and the reference light LR. The measurement light LS is guided by the measurement arm, and the reference light LR is guided by the reference arm.
The reference light LR is guided by the optical fiber 110 to the collimator 111 and converted into a parallel light beam. It then passes through the optical path length correction member 112, which compensates for the optical distance between the measurement arm and the reference arm, and the dispersion compensation member 113, which compensates for dispersion between the measurement arm and the reference arm, and is guided to the retroreflector 114. The retroreflector 114 is movable along the optical path of the reference light LR incident on it, and is used for optical path length correction according to the axial length of the eye and for adjustment of the interference state. The reference light LR that has passed through the retroreflector 114 passes through the dispersion compensation member 113 and the optical path length correction member 112, is converted from a parallel light beam into a convergent light beam by the collimator 116, is guided through the optical fiber 117 to the polarization controller 118, where its polarization state is adjusted, is guided through the optical fiber 119 to the attenuator 120, where its light amount is adjusted, and reaches the fiber coupler 122 via the optical fiber 121.
Meanwhile, the measurement light LS is guided through the optical fiber 127 to the collimator lens unit 40 and converted into a parallel light beam. It passes through the retroreflector 41, the dispersion compensation member 42, the OCT focusing lens 43, the optical scanner 44, and the relay lens 45, is reflected by the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the subject's eye E. The measurement light LS is scattered and reflected at various depth positions in the subject's eye E. The return light of the measurement light LS from the subject's eye E travels back along the measurement arm, is guided to the fiber coupler 105, and reaches the fiber coupler 122 via the optical fiber 128.
The fiber coupler 122 superimposes the measurement light LS incident via the optical fiber 128 and the reference light LR incident via the optical fiber 121 to generate the interference light LC. The generated interference light LC is guided to the spectrometer 130 through the optical fiber 129. In a non-limiting example, the spectrometer 130 converts the incident interference light LC into a parallel light beam with a collimator lens, resolves the parallel interference light LC into a plurality of spectral components with a diffraction grating, and projects the plurality of spectral components generated by the diffraction grating onto an image sensor with a lens. This image sensor is, for example, a line sensor, and detects the plurality of spectral components of the interference light LC to generate an electrical signal (detection signal). The generated detection signal contains information on the spectral distribution of the interference light LC and is sent to the arithmetic and control unit 200.
The OCT unit 100 of Figure 2 described above employs the spectral-domain OCT method. When the swept-source OCT method is used, the light source unit 101 includes a wavelength-tunable light source (for example, a near-infrared tunable laser) that rapidly changes the wavelength of the emitted light. In the optical system of the swept-source OCT method, the interference light LC generated by superimposing the measurement light LS and the reference light LR is split at a predetermined splitting ratio (for example, 1:1) to generate a pair of interference light beams, and this pair of interference light beams is detected by a photodetector. The photodetector includes a balanced photodiode. The balanced photodiode includes a pair of photodetectors that respectively detect the pair of interference light beams, and outputs the difference between the pair of detection signals obtained by the pair of photodetectors. The photodetector sends this difference signal to a data acquisition system (DAQ). A clock is supplied to the data acquisition system from the light source unit 101. This clock is generated in the light source unit 101 in synchronization with the output timing of each wavelength swept within a predetermined wavelength range by the wavelength-tunable light source. For example, the light source unit 101 splits light of each output wavelength to generate two branched light beams, optically delays one of them, combines the two branched light beams, detects the resulting combined light, and generates the clock based on the detection signal. The data acquisition system uses the clock provided from the light source unit 101 to sample the detection signal (difference signal) input from the photodetector. The data obtained by this sampling is provided for processing such as image generation.
In the example shown in Figures 1 and 2, optical path length changing elements (the retroreflectors 41 and 114) are provided in both the measurement arm and the reference arm, but only one of them may be provided. Furthermore, the optical path length changing element is not limited to a retroreflector. For example, the optical path length changing element of the reference arm may be a movable reflecting member (reference mirror). More generally, the ophthalmic device according to the present disclosure includes an element configured to relatively change the measurement arm length and the reference arm length (that is, an element configured to change the optical path length difference between the measurement arm and the reference arm), and this element can be used to move the coherence gate position.
The arithmetic and control unit 200 will now be described. The arithmetic and control unit 200 executes various processes such as control of each part of the ophthalmic device 1, various calculations, and various analyses. For example, the arithmetic and control unit 200 calculates the reflection intensity profile along the line (A-line) extending in the depth direction (z direction) at each projection position of the measurement light LS by applying signal processing such as a Fourier transform to the spectral distribution (interference signal, interferogram) acquired by the spectrometer 130. Furthermore, the arithmetic and control unit 200 generates image data by imaging the reflection intensity profile of each A-line. The arithmetic processing for this purpose may be the same as in conventional spectral-domain OCT image generation. The arithmetic and control unit 200 includes, for example, a processor, a RAM, a ROM, a hard disk drive, and a communication interface. Various computer programs are stored in a storage device such as the hard disk drive. The arithmetic and control unit 200 may also include an operation device, an input device, a display device, and the like.
The user interface 240 shown in Figure 3 will now be described. The user interface 240 has a display unit 241 and an operation unit 242. The display unit 241 includes, for example, the display device 3 of Figure 1. The operation unit 242 includes various operation devices and input devices. The user interface 240 may include a touch panel. In some exemplary aspects, at least part of the user interface is provided as a peripheral device connected to the ophthalmic device 1.
The movement mechanism 150 shown in Figure 3 will now be described. The movement mechanism 150 is configured to move the optical system of the ophthalmic device 1. The movement mechanism 150, for example, moves at least the fundus camera unit 2 three-dimensionally.
The data input/output unit 290 shown in Figure 3 will now be described. The data input/output unit 290 performs input of data to the ophthalmic device 1 and output of data from the ophthalmic device 1. In a non-limiting aspect, the data input/output unit 290 has, for example, a function for communicating with an external device (not shown). In this case, the data input/output unit 290 is provided with a communication interface corresponding to the form of connection with the external device. The external device may be, for example, any ophthalmic device. The external device may also be any information processing device, such as a hospital information system (HIS) server, a DICOM (Digital Imaging and Communications in Medicine) server, a doctor's terminal, a mobile terminal, a personal terminal, or a cloud server. In some exemplary aspects, the data input/output unit 290 includes a device that reads information from a recording medium (data reader) and a device that writes information to a recording medium (data writer). The aspects of the data input/output unit 290 are not limited to these.
The processing system (arithmetic and control system) of the ophthalmic device 1 will now be described. Example configurations of the processing system are shown in Figures 3 and 4. The control unit 210 and the data processing unit 230 are provided in the arithmetic and control unit 200.
The control unit 210 includes a processor and controls each part of the ophthalmic device 1. The control unit 210 includes a main control unit 211 and a storage unit 212. The main control unit 211 includes a processor and is configured to control each element of the ophthalmic device 1 (including the elements shown in Figures 1 to 3). The main control unit 211 may also be configured to be capable of controlling apparatuses, devices, and systems connected to the ophthalmic device 1. The functions of the main control unit 211 are realized, for example, by cooperation between hardware including circuits and control software. The storage unit 212 stores various types of data. The storage unit 212 includes storage devices such as a hard disk drive and a solid state drive.
Several controls executed by the main control unit 211 will now be described. The main control unit 211 moves the photographing focusing lens 31 and the focus optical system 60 synchronously by controlling a photographing focusing driver (not shown). The main control unit 211 moves the retroreflector 41 of the measurement arm by controlling the retroreflector (RR) driver 41A. The main control unit 211 moves the OCT focusing lens 43 of the measurement arm by controlling the OCT focusing driver 43A. The main control unit 211 deflects the measurement light LS according to a preset scan pattern by controlling the optical scanner 44. The main control unit 211 moves the retroreflector 114 of the reference arm by controlling the retroreflector (RR) driver 114A. The main control unit 211 moves the optical system (for example, the fundus camera unit 2 and the OCT unit 100) by controlling the movement mechanism 150.
The data processing unit 230 executes various types of data processing. For example, the data processing unit 230 applies various types of processing to images acquired by the fundus camera unit 2 (fundus images, anterior segment images, and the like). The data processing unit 230 also applies various types of processing to images acquired using OCT scanning (OCT images). The data processing unit 230 includes a processor. The data processing unit 230 is realized, for example, by cooperation between hardware including circuits and data processing software.
The data processing unit 230 includes an image generation unit 220. The image generation unit 220 processes data collected by applying an OCT scan to the fundus Ef of the subject's eye E to generate OCT image data. The image generation unit 220 includes a processor. The functions of the image generation unit 220 are realized, for example, by cooperation between hardware including circuits and image generation software.
The image generation unit 220 is configured to execute a process of generating an OCT intensity image, which expresses the intensity of the interference signal as visual information, and a process of generating a phase image, which expresses the phase information of the interference signal as visual information. A non-limiting example of the process of generating an intensity image is described next. A non-limiting example of the process of generating a phase image is described later, together with an explanation of the theoretical aspects of OCT blood flow measurement.
The image generation unit 220 generates an intensity image based on the data (interference signal) acquired by the spectrometer 130. As in conventional spectral-domain OCT, this intensity image generation process includes signal processing such as A/D conversion, denoising, filtering, and a fast Fourier transform (FFT). The fast Fourier transform converts the interference signal obtained by the spectrometer 130 into an A-line profile (a reflection intensity profile along the z direction). The A-line profile is visualized by applying an imaging process to it (a process that assigns pixel values to reflection intensity values). This yields A-scan image data. By arranging a plurality of A-scan images according to the scan pattern, a cross-sectional image corresponding to that scan pattern (for example, B-scan image data or circle-scan image data) is constructed. If another OCT method is used, the cross-sectional image generation unit 221 executes known processing corresponding to that method.
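As a non-limiting illustration of the reconstruction steps just described, the following Python sketch converts raw spectral interference data into an intensity B-scan. The array shapes, the DC-subtraction step, and the Hanning window are illustrative assumptions standing in for the denoising and filtering mentioned above; an actual implementation would typically also include steps such as wavenumber resampling and dispersion correction.

```python
import numpy as np

def reconstruct_bscan(spectra, window=None):
    """Convert raw spectral interference data into an intensity B-scan.

    spectra: 2D array of shape (n_alines, n_spectral_samples), assumed to be
             already resampled so that samples are linear in wavenumber.
    Returns a 2D array (n_alines, n_depth_samples) of reflection intensities.
    """
    spectra = np.asarray(spectra, dtype=float)
    # Remove the DC (non-interferometric) background of each A-line spectrum.
    spectra = spectra - spectra.mean(axis=1, keepdims=True)
    # Spectral windowing as a simple stand-in for denoising/filtering.
    if window is None:
        window = np.hanning(spectra.shape[1])
    spectra = spectra * window
    # FFT along the spectral axis yields the depth-resolved A-line profile;
    # keep the positive-depth half and use the magnitude as the intensity.
    alines = np.fft.fft(spectra, axis=1)
    n_half = spectra.shape[1] // 2
    return np.abs(alines[:, :n_half])
```

Arranging many such B-scans according to the scan pattern yields the cross-sectional intensity images described above.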
In some aspects, the intensity image may be a data set including a group of A-scan image data obtained by visualizing the reflection intensity profiles of a plurality of A-lines arranged in the area to which the OCT scan is applied. In other words, in some aspects, the intensity image may be a data set including a group of A-scan image data and their position information (coordinates). In another aspect, the intensity image may be stack data constructed by embedding a plurality of B-scan images in a single three-dimensional coordinate system, that is, a data set including a plurality of B-scan images and their position information. In yet another aspect, the intensity image may be volume data (voxel data) generated by applying a voxelization process to stack data. Stack data and volume data are non-limiting examples of three-dimensional image data in which pixel coordinates are defined using a three-dimensional coordinate system. The process of generating the three-dimensional image data is executed by the image generation unit 220.
The image generation unit 220 can process the three-dimensional image data. For example, the image generation unit 220 can generate new image data by applying rendering to the three-dimensional image data. Rendering techniques include volume rendering, surface rendering, multi-planar reconstruction (MPR), maximum intensity projection (MIP), minimum intensity projection (MinIP), and average intensity projection (AIP). The image generation unit 220 can construct projection data by integrating (projecting) the three-dimensional image data in the z direction. The image generation unit 220 can construct a shadowgram by integrating (projecting) a part of the three-dimensional image data (three-dimensional partial image data) in the z direction. The three-dimensional partial image data is extracted from the three-dimensional image data using any image segmentation method.
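The following minimal sketch illustrates the z-integration just described, assuming the volume is stored as an array indexed as (x, y, z). Summing over the full depth corresponds to projection data, and summing over a restricted depth range (for example, a segmented layer) corresponds to a shadowgram; the parameter names are illustrative.

```python
import numpy as np

def project_en_face(volume, z_range=None):
    """Integrate a 3D intensity volume along z.

    volume: 3D array indexed as (x, y, z).
    z_range: optional (z_start, z_end) selecting a partial depth range, e.g.
             a segmented layer; if given, the result corresponds to a
             shadowgram, otherwise to full-depth projection data.
    """
    volume = np.asarray(volume, dtype=float)
    if z_range is not None:
        z_start, z_end = z_range
        volume = volume[:, :, z_start:z_end]
    return volume.sum(axis=2)
```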
The ophthalmic device 1 can apply OCT blood flow measurement to the fundus Ef. Below, the theoretical aspects of OCT blood flow measurement are explained, together with several non-limiting aspects of OCT blood flow measurement.
In a non-limiting aspect, the blood flow measurement applies two types of scans (a main scan and a supplementary scan) to the fundus Ef. The main scan repeatedly scans, with the measurement light LS, a region of interest (cross section of interest) that intersects a blood vessel of interest in the fundus Ef at a position of interest, in order to acquire phase image data. The supplementary scan, on the other hand, scans a predetermined cross section (supplementary cross section) with the measurement light LS in order to estimate the tilt of the blood vessel of interest in the cross section of interest. In a non-limiting aspect, the supplementary cross section may be, for example, a cross section that intersects the blood vessel of interest and is located near the cross section of interest (first supplementary cross section). In another non-limiting aspect, the supplementary cross section may be a cross section that intersects the cross section of interest and extends along the blood vessel of interest (second supplementary cross section). The tilt of the blood vessel of interest is the angle formed by the measurement light LS projected onto the cross section of interest and the blood vessel of interest, and is the Doppler angle in Doppler OCT.
An example in which the first supplementary cross section is applied is shown in Figure 5A. In this example, as shown in the fundus image D, one cross section of interest C0 located near the optic disc Da of the fundus Ef and two supplementary cross sections C1 and C2 located near it are set so as to intersect the blood vessel of interest Db. One of the two supplementary cross sections C1 and C2 is located upstream of the cross section of interest C0 along the blood vessel of interest Db, and the other is located downstream. The cross section of interest C0 and the supplementary cross sections C1 and C2 are oriented, for example, substantially perpendicular to the running direction of the blood vessel of interest Db.
An example in which the second supplementary cross section is applied is shown in Figure 5B. In this example, a cross section of interest C0 similar to that in the example shown in Figure 5A is set so as to be substantially perpendicular to the blood vessel of interest Db, and a supplementary cross section Cp is set so as to be substantially perpendicular to the cross section of interest C0. The supplementary cross section Cp is set along the blood vessel of interest Db. As one example, the supplementary cross section Cp may be set so as to pass through the central axis of the blood vessel of interest Db at the position of the cross section of interest C0.
It is desirable for the main scan of the OCT blood flow measurement to collect data over a period that includes at least one cardiac cycle of the subject's heart. This makes it possible to determine the blood flow dynamics in all cardiac phases. The time over which the main scan is performed may be a fixed, preset time, or a time set for each subject or each examination. The fixed time has conventionally been set to a time sufficiently longer than a standard cardiac cycle (for example, 2 seconds). The time set for each subject or each examination has conventionally been determined with reference to data from a biological signal detector such as an electrocardiograph.
The image generation unit 220 includes a cross-sectional image generation unit 221 and a phase image generation unit 222. The cross-sectional image generation unit 221 includes a processor, and its functions are realized, for example, by cooperation between hardware including circuits and cross-sectional image generation software. The phase image generation unit 222 includes a processor, and its functions are realized, for example, by cooperation between hardware including circuits and phase image generation software.
The cross-sectional image generation unit 221 generates an intensity image based on data collected by an OCT scan applied to the fundus Ef. The intensity image generation process may be the same as conventional image generation methods in the spectral-domain OCT method.
The cross-sectional image generation unit 221 generates cross-sectional images (main cross-sectional images) representing the time-series change of the morphology of the cross section of interest, based on the interference signals obtained by the spectrometer 130 in the main scan applied to the cross section of interest of the fundus Ef. As described above, in the main scan, the ophthalmic device 1 applies a repetitive scan to the cross section of interest C0. This repetitive scan includes a plurality of B-scans of the cross section of interest C0. The interference signals sequentially generated by the spectrometer 130 in the plurality of B-scans are sequentially input to the cross-sectional image generation unit 221. The cross-sectional image generation unit 221 generates one main cross-sectional image corresponding to the cross section of interest C0 based on the interference signal corresponding to each B-scan. By repeating this process as many times as the B-scan is repeated in the main scan, the cross-sectional image generation unit 221 generates a series of main cross-sectional images in time series. In this way, the cross-sectional image generation unit 221 generates, based on the data set collected by the repetitive scan of the main scan, a plurality of intensity images respectively corresponding to the plurality of B-scans. In some exemplary aspects, the image quality of the main cross-sectional images can be improved by dividing the series of main cross-sectional images obtained by the main scan into a plurality of groups and applying image composition (for example, averaging) to the main cross-sectional images in each group to generate a plurality of composite images.
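The grouping and averaging mentioned above can be sketched as follows, assuming the repeated B-scan frames are stacked in a single array; the group size and the handling of leftover frames are illustrative choices, not requirements of the method.

```python
import numpy as np

def average_in_groups(frames, group_size):
    """Split repeated B-scan images into groups and average each group.

    frames: 3D array (n_frames, height, width) of main cross-sectional images
            obtained by repeatedly scanning the same cross section of interest.
    Returns an array (n_groups, height, width) of averaged images; trailing
    frames that do not fill a complete group are discarded.
    """
    frames = np.asarray(frames, dtype=float)
    n_groups = frames.shape[0] // group_size
    usable = frames[: n_groups * group_size]
    grouped = usable.reshape(n_groups, group_size, frames.shape[1], frames.shape[2])
    return grouped.mean(axis=1)
```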
The cross-sectional image generation unit 221 generates a cross-sectional image representing the morphology of the supplementary cross section (supplementary cross-sectional image) based on the interference signals obtained by the spectrometer 130 in the supplementary scan applied to the supplementary cross section of the fundus Ef. The process of generating a supplementary cross-sectional image is executed in the same manner as the process of generating a main cross-sectional image. The supplementary cross-sectional image may be a single cross-sectional image, or two or more cross-sectional images. In some exemplary aspects, the image quality of the supplementary cross-sectional image can be improved by scanning the supplementary cross section a plurality of times to generate a plurality of cross-sectional images and applying image composition to these cross-sectional images to generate a composite image. When the supplementary cross sections C1 and C2 illustrated in Figure 5A are applied, the cross-sectional image generation unit 221 generates a supplementary cross-sectional image corresponding to the supplementary cross section C1 and a supplementary cross-sectional image corresponding to the supplementary cross section C2. When the supplementary cross section Cp illustrated in Figure 5B is applied, the cross-sectional image generation unit 221 generates a supplementary cross-sectional image corresponding to the supplementary cross section Cp.
The phase image generation unit 222 generates a phase image representing the time-series change of the phase difference in the cross section of interest, based on the interference signals obtained by the spectrometer 130 in the main scan. The interference signals used to generate the phase image may be the same as the interference signals used by the cross-sectional image generation unit 221 to generate the main cross-sectional images. In this case, a natural positional correspondence is defined between the pixels of the main cross-sectional image and the pixels of the phase image, so that registration between the main cross-sectional image and the phase image can be performed easily. In contrast, in some aspects, the main cross-sectional image and the phase image may be generated from different interference signals. In this case, registration between the main cross-sectional image and the phase image can be performed using, for example, a known image registration method.
A non-limiting example of the process of generating a phase image will now be described. The phase image of this example is obtained by calculating the phase difference between adjacent A-line complex signals (that is, signals corresponding to adjacent scan points). In other words, the phase image of this example is generated based on the time-series change of the pixel values (brightness values) of the main cross-sectional images. For an arbitrary pixel of the main cross-sectional images, the phase image generation unit 222 creates a graph showing the time-series change of the brightness value of that pixel. The phase image generation unit 222 obtains the phase difference Δφ between two time points t1 and t2 (t2 = t1 + Δt) separated by a predetermined time interval Δt in this graph. This phase difference Δφ is then defined as the phase difference Δφ(t1) at the time point t1 (more generally, at an arbitrary time point between t1 and t2). By executing this series of processes for each of many preset time points, the time-series change of the phase difference at that pixel is obtained. The time-series change of the phase difference is obtained by making the time interval Δt sufficiently small to secure the phase correlation. For this purpose, the scan with the measurement light LS (main scan) is performed as oversampling in which the time interval Δt is set to a value smaller than the time corresponding to the resolution of the cross-sectional image.
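A minimal sketch of this phase-difference computation is shown below, assuming the repeatedly scanned cross section of interest is available as a time series of complex-valued frames. The frame spacing used as the interval Δt and the array layout are illustrative assumptions.

```python
import numpy as np

def phase_difference_series(complex_frames, dt_frames=1):
    """Compute per-pixel phase differences between frames separated by dt.

    complex_frames: complex array (n_frames, depth, n_alines) holding the
                    complex OCT signal of the repeatedly scanned cross section
                    (oversampled so adjacent frames remain phase-correlated).
    dt_frames: frame separation corresponding to the time interval dt.
    Returns an array (n_frames - dt_frames, depth, n_alines) whose element at
    index t is the phase difference between frames t and t + dt_frames,
    wrapped to (-pi, pi].
    """
    complex_frames = np.asarray(complex_frames)
    later = complex_frames[dt_frames:]
    earlier = complex_frames[:-dt_frames]
    # angle(a * conj(b)) is the wrapped phase difference between a and b.
    return np.angle(later * np.conj(earlier))
```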
The phase image is an image obtained by expressing the value of the phase difference at each time point for each pixel as visual information (imaging process). This imaging process includes, for example, a process of expressing the phase difference value using predetermined display parameters (for example, display color and brightness). In some aspects of the imaging process, the display color indicating that the phase is increasing along the time series and the display color indicating that it is decreasing can be made different from each other. For example, an increase of the phase along the time series can be expressed in red and a decrease in blue. Furthermore, in some aspects of the imaging process, the magnitude of the phase change (amount of phase change) can be expressed by the density of the display color. The aspects of the imaging process described here make it possible to visualize the direction and magnitude of the blood flow. A phase image is generated by executing such an imaging process for each pixel.
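As a non-limiting illustration of this imaging process, the following sketch maps a single frame of phase differences to an RGB image, showing phase increase in red, phase decrease in blue, and the amount of phase change as color intensity; the normalization constant is an illustrative assumption.

```python
import numpy as np

def phase_to_rgb(delta_phi, max_abs_phase=np.pi):
    """Visualize one frame of phase differences as an RGB image.

    Phase increase is shown in red, phase decrease in blue, and the magnitude
    of the phase change is expressed by the color intensity.
    delta_phi: 2D array of phase differences for one time point.
    Returns an (H, W, 3) float array with values in [0, 1].
    """
    delta_phi = np.asarray(delta_phi, dtype=float)
    magnitude = np.clip(np.abs(delta_phi) / max_abs_phase, 0.0, 1.0)
    rgb = np.zeros(delta_phi.shape + (3,))
    rgb[..., 0] = np.where(delta_phi > 0, magnitude, 0.0)  # red channel: phase increase
    rgb[..., 2] = np.where(delta_phi < 0, magnitude, 0.0)  # blue channel: phase decrease
    return rgb
```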
The data processing unit 230 includes, as exemplary elements for obtaining hemodynamic information, a vascular region identification unit 231 and a hemodynamic information generation unit 232. The hemodynamic information generation unit 232 may include a Doppler angle calculation unit 233, a blood flow velocity calculation unit 234, a vascular diameter calculation unit 235, and a blood flow volume calculation unit 236.
The vascular region identification unit 231 includes, for example, a processor operable according to a vascular region identification program. The hemodynamic information generation unit 232 includes, for example, a processor operable according to a hemodynamic information generation program. The Doppler angle calculation unit 233 includes, for example, a processor operable according to a Doppler angle calculation program. The blood flow velocity calculation unit 234 includes, for example, a processor operable according to a blood flow velocity calculation program. The vascular diameter calculation unit 235 includes, for example, a processor operable according to a vascular diameter calculation program. The blood flow volume calculation unit 236 includes, for example, a processor operable according to a blood flow volume calculation program.
The vascular region identification unit 231 analyzes an OCT image of the fundus and identifies an image region corresponding to a blood vessel (vascular region) in the OCT image. The vascular region identification unit 231 also analyzes a front image of the fundus (for example, an observation image or a photographed image acquired by the fundus camera unit 2) and identifies an image region corresponding to a blood vessel (vascular region) in the front image. The vascular region identification process executed by the vascular region identification unit 231 may be image processing using any image segmentation method, and is executed, for example, by analyzing the pixel values in the target image (for example, threshold processing). In some aspects, the vascular region identification unit 231 identifies the vascular region corresponding to the blood vessel of interest Db in each of the main cross-sectional image, the supplementary cross-sectional image, and the phase image.
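A simple thresholding operation is one non-limiting instance of the pixel-value analysis mentioned above. The following sketch assumes, purely for illustration, that vessel pixels appear darker than the surrounding tissue in the intensity image and derives the threshold from the image statistics; any other segmentation method may be used instead.

```python
import numpy as np

def segment_vessel_region(cross_section, threshold=None):
    """Identify a vascular region in a cross-sectional intensity image by
    simple thresholding of pixel values.

    cross_section: 2D intensity image (e.g., a main cross-sectional image).
    Returns a boolean mask that is True where a blood vessel is assumed.
    Vessels are assumed here to appear darker than the surrounding tissue,
    which is an illustrative assumption only.
    """
    img = np.asarray(cross_section, dtype=float)
    if threshold is None:
        threshold = img.mean() - img.std()  # heuristic, image-derived threshold
    return img < threshold
```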
In some cases, the main cross-sectional image and the supplementary cross-sectional image have sufficient resolution to be analyzed in the vascular region identification process, whereas the phase image does not have enough resolution to identify the boundary of the vascular region. Even in such cases, since the hemodynamic information is generated based on the phase image, the vascular region in the phase image needs to be identified with high accuracy. For this purpose, for example, the processing described next can be employed.
When the main cross-sectional image and the phase image are generated from the same interference signals, the natural positional correspondence (described above) defined between the pixels of the main cross-sectional image and the pixels of the phase image can be utilized. For example, the vascular region identification unit 231 can execute a process of analyzing the main cross-sectional image to identify the vascular region, and a process of identifying, based on this positional correspondence, the image region in the phase image that corresponds to that vascular region in the main cross-sectional image. This image region in the phase image is adopted as the vascular region in the phase image. In this way, the vascular region of the phase image can be determined with high accuracy. When the main cross-sectional image and the phase image are generated from different interference signals, the vascular region of the phase image can be determined by using the result of image registration (described above) between the main cross-sectional image and the phase image instead of the natural positional correspondence described above.
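The reuse of the vascular region across the two images can be sketched as follows. When the pixel grids coincide (same interference signals), the mask is carried over directly; otherwise an image-registration transform would have to be applied first, which is not shown here.

```python
import numpy as np

def vessel_region_in_phase_image(intensity_mask, phase_image_shape):
    """Carry the vascular region found in the main cross-sectional image over
    to the phase image using the pixel-wise positional correspondence.

    intensity_mask: boolean mask of the vascular region in the main image.
    phase_image_shape: shape of the phase image.
    Returns the mask to be used as the vascular region of the phase image.
    """
    intensity_mask = np.asarray(intensity_mask, dtype=bool)
    if intensity_mask.shape != tuple(phase_image_shape):
        # Different pixel grids: an image-registration transform is needed.
        raise ValueError("pixel grids differ; apply image registration first")
    return intensity_mask.copy()
```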
The hemodynamic information generation unit 232 generates information indicating the blood flow dynamics in the fundus blood vessels (hemodynamic information). The hemodynamic information may be information on any parameter indicating the fundus blood flow dynamics (hemodynamic parameter). Although this disclosure describes blood flow velocity and blood flow volume, the hemodynamic parameters are not limited to these.
The hemodynamic information generation unit 232 generates hemodynamic information on the blood vessel of interest Db. As described above, in some aspects the hemodynamic information generation unit 232 includes the Doppler angle calculation unit 233, the blood flow velocity calculation unit 234, the vascular diameter calculation unit 235, and the blood flow volume calculation unit 236.
The Doppler angle calculation unit 233 obtains an estimated value of the tilt of the blood vessel of interest based on the data of the supplementary cross section (cross-sectional data, supplementary cross-sectional image) collected by the supplementary scan. The calculated value may be, for example, a value based on a measured value of the tilt of the blood vessel of interest in the cross section of interest, or an approximation thereof. As described above, the tilt of the blood vessel of interest is a parameter corresponding to the Doppler angle in Doppler OCT. That is, since the Doppler angle is the angle formed by the incident direction of the measurement light LS in the main scan applied to the cross section of interest and the direction of the axis of the blood vessel of interest (that is, the tilt of the blood vessel of interest), the tilt of the blood vessel of interest is equivalent to the Doppler angle.
An example in which the value of the tilt of the blood vessel of interest is actually measured will be described (first example of tilt estimation). When the supplementary cross sections C1 and C2 shown in Figure 5A are applied, the Doppler angle calculation unit 233 can calculate the tilt of the blood vessel of interest Db in the cross section of interest C0 based on the positional relationship among the cross section of interest C0, the supplementary cross section C1, and the supplementary cross section C2, and on the vascular region identification results obtained by the vascular region identification unit 231.
A method of calculating the tilt of the blood vessel of interest Db will be described with reference to Figure 6A. Reference symbols G0, G1, and G2 denote the main cross-sectional image of the cross section of interest C0, the supplementary cross-sectional image of the supplementary cross section C1, and the supplementary cross-sectional image of the supplementary cross section C2, respectively. Reference symbols V0, V1, and V2 denote the vascular region in the main cross-sectional image G0, the vascular region in the supplementary cross-sectional image G1, and the vascular region in the supplementary cross-sectional image G2, respectively. The z coordinate axis shown in Figure 6A substantially coincides with the incident direction of the measurement light LS. The distance between the main cross-sectional image G0 (cross section of interest C0) and the supplementary cross-sectional image G1 (supplementary cross section C1) is denoted by d, and the distance between the main cross-sectional image G0 (cross section of interest C0) and the supplementary cross-sectional image G2 (supplementary cross section C2) is likewise denoted by d. The interval between adjacent cross-sectional images, that is, the interval between adjacent cross sections, is referred to as the inter-section distance.
ドップラー角度算出部233は、3つの血管領域V0、V1及びV2の間の位置関係に基づいて、注目断面C0における注目血管Dbの傾きAを算出することができる。この位置関係は、例えば、3つの血管領域V0、V1及びV2を接続することによって求められる。その具体例として、ドップラー角度算出部233は、3つの血管領域V0、V1及びV2のそれぞれの特徴位置を特定し、これら特徴位置を接続することができる。この特徴位置は、例えば、中心位置、重心位置、最上部(z座標値が最小の位置)、及び最下部(z座標値が最大の位置)のうちのいずれかであってよい。特徴位置同士を接続する方法は、任意であってよく、例えば、線分で結ぶ方法、又は近似曲線(スプライン曲線、ベジェ曲線等)で結ぶ方法であってよい。 The Doppler angle calculation unit 233 can calculate the gradient A of the blood vessel of interest Db in the cross section of interest C0 based on the positional relationship between the three blood vessel regions V0, V1, and V2. This positional relationship can be determined, for example, by connecting the three blood vessel regions V0, V1, and V2. As a specific example, the Doppler angle calculation unit 233 can identify the characteristic positions of each of the three blood vessel regions V0, V1, and V2 and connect these characteristic positions. This characteristic position may be, for example, one of the center position, center of gravity position, top (position with the smallest z coordinate value), and bottom (position with the largest z coordinate value). The characteristic positions may be connected by any method, such as connecting them with a line segment or an approximate curve (spline curve, Bezier curve, etc.).
更に、ドップラー角度算出部233は、3つの血管領域V0、V1及びV2から特定された特徴位置同士を結ぶ接続線に基づいて、注目断面C0における注目血管Dbの傾きAを算出する。接続線が線分である場合、ドップラー角度算出部233は、注目断面C0の特徴位置と補足断面C1の特徴位置とを結ぶ第1線分の傾きと、注目断面C0の特徴位置と補足断面C2の特徴位置とを結ぶ第2線分の傾きとに基づいて、傾きAを算出することができる。この算出処理の非限定的な例は、2つの線分の傾きの平均値を求めてよい。接続線が近似曲線である場合、ドップラー角度算出部233は、近似曲線と注目断面C0とが交差する位置における近似曲線の傾きを、傾きAとして求めることができる。ドップラー角度算出処理において、断面間距離dは、例えば、接続線を求めるために断面画像G0~G2をxyz座標系に埋め込むときに用いられる。 Furthermore, the Doppler angle calculation unit 233 calculates the gradient A of the blood vessel of interest Db in the cross section of interest C0 based on the connecting lines connecting the characteristic positions identified from the three vascular regions V0, V1, and V2. If the connecting lines are line segments, the Doppler angle calculation unit 233 can calculate the gradient A based on the gradient of a first line segment connecting the characteristic position of the cross section of interest C0 to the characteristic position of the supplementary cross section C1, and the gradient of a second line segment connecting the characteristic position of the cross section of interest C0 to the characteristic position of the supplementary cross section C2. A non-limiting example of this calculation process may be calculating the average gradient of the two line segments. If the connecting lines are approximated curves, the Doppler angle calculation unit 233 can calculate the gradient A as the gradient of the approximated curve at the position where the approximated curve intersects with the cross section of interest C0. In the Doppler angle calculation process, the inter-section distance d is used, for example, when embedding the cross-sectional images G0 to G2 in an xyz coordinate system to find the connecting lines.
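As a non-authoritative illustration of the line-segment variant described above, the following Python sketch averages the tilts of the two line segments obtained from the characteristic positions in the three cross sections. It assumes that the characteristic positions have already been extracted as (x, z) coordinates on a common physical scale and that the cross sections are parallel and separated by the inter-cross-section distance d; all function and variable names are introduced here for illustration only.

```python
import numpy as np

def vessel_tilt_from_three_sections(p1, p0, p2, d):
    """Estimate the vessel tilt (Doppler angle) at the cross section of interest.

    p1, p0, p2 : (x, z) characteristic positions of the vascular region in the
                 supplementary cross section C1, the cross section of interest C0,
                 and the supplementary cross section C2, in physical units (e.g. mm).
    d          : inter-cross-section distance between adjacent cross sections (same units).

    Returns the average of the tilts of the two line segments, in degrees,
    measured with respect to the z axis (incident direction of the measurement light).
    """
    # Embed the three characteristic positions in a common (x, y, z) frame,
    # stacking the cross sections along the y axis at spacing d.
    q1 = np.array([p1[0], -d, p1[1]], dtype=float)   # supplementary cross section C1
    q0 = np.array([p0[0], 0.0, p0[1]], dtype=float)  # cross section of interest C0
    q2 = np.array([p2[0], +d, p2[1]], dtype=float)   # supplementary cross section C2

    def angle_to_z(v):
        # Angle between a segment direction and the z axis, in degrees.
        cos_t = abs(v[2]) / np.linalg.norm(v)
        return np.degrees(np.arccos(np.clip(cos_t, 0.0, 1.0)))

    theta1 = angle_to_z(q0 - q1)        # tilt of the first line segment (C1 -> C0)
    theta2 = angle_to_z(q2 - q0)        # tilt of the second line segment (C0 -> C2)
    return (theta1 + theta2) / 2.0      # non-limiting choice: average the two tilts

# Example: the vessel descends by 0.05 mm in depth over each 0.1 mm between cross sections.
print(vessel_tilt_from_three_sections((1.00, 0.30), (1.02, 0.35), (1.04, 0.40), d=0.1))
```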
The above example considers the vascular regions in three cross sections. In some embodiments, the tilt may be determined by considering only two cross sections. As a non-limiting example, the tilt of the first line segment or the tilt of the second line segment may be adopted as the tilt A of the blood vessel of interest Db at the cross section of interest C0. Alternatively, the tilt A of the blood vessel of interest Db at the cross section of interest C0 may be calculated based on the two supplementary cross-sectional images G1 and G2.
An example in which an approximate value of the tilt of the blood vessel of interest is determined will now be described (second example of tilt estimation). When the supplementary cross section Cp shown in Figure 5B is applied, the Doppler angle calculation unit 233 can analyze the supplementary cross-sectional image corresponding to the supplementary cross section Cp to calculate an approximate value of the tilt of the blood vessel of interest Db at the cross section of interest C0.
A method for approximating the tilt of the blood vessel of interest Db will be described with reference to Figure 6B. The symbol Gp denotes the supplementary cross-sectional image of the supplementary cross section Cp. The symbol A denotes the tilt of the blood vessel of interest Db at the cross section of interest C0, as in the example shown in Figure 6A.
In this example, the Doppler angle calculation unit 233 can analyze the supplementary cross-sectional image Gp to identify an image region corresponding to a predetermined tissue of the fundus Ef. For example, the Doppler angle calculation unit 233 can identify an image region (internal limiting membrane region) M corresponding to the internal limiting membrane (ILM), which is a surface tissue of the retina. A known image segmentation method, for example, is used to identify the image region.
The internal limiting membrane and the fundus blood vessels are known to be approximately parallel to each other. The Doppler angle calculation unit 233 calculates the tilt Aapp of the internal limiting membrane region M at the cross section of interest C0. The tilt Aapp of the internal limiting membrane region M at the cross section of interest C0 is used as an approximate value of the tilt A of the blood vessel of interest Db at the cross section of interest C0.
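The following Python sketch illustrates one way such an approximation could be computed, assuming the ILM has already been segmented as a depth profile along the supplementary cross section; the local line fit, window size, and all names are illustrative assumptions rather than part of the embodiment.

```python
import numpy as np

def ilm_tilt_approximation(ilm_z_um, pixel_pitch_x_um, center_index, half_window=10):
    """Approximate the Doppler angle from the internal limiting membrane (ILM) profile.

    ilm_z_um         : 1-D array of ILM depths (micrometers) along the supplementary
                       cross section Cp, one value per A-line.
    pixel_pitch_x_um : lateral spacing between adjacent A-lines (micrometers).
    center_index     : A-line index corresponding to the cross section of interest C0.
    half_window      : number of A-lines on each side used for the local line fit.

    Returns the angle, in degrees, between the fitted ILM direction and the z axis.
    """
    lo = max(0, center_index - half_window)
    hi = min(len(ilm_z_um), center_index + half_window + 1)
    x_um = np.arange(lo, hi) * pixel_pitch_x_um

    # Local line fit z = a*x + b around the position of the cross section of interest.
    a, _ = np.polyfit(x_um, ilm_z_um[lo:hi], deg=1)

    # The ILM direction vector is (1, a) in the (x, z) plane; its angle to the
    # z axis is used as the approximate Doppler angle Aapp.
    return np.degrees(np.arctan2(1.0, abs(a)))

# Example: an ILM sloping 0.2 um in depth per 1 um laterally (about 78.7 degrees to z).
profile = 0.2 * np.arange(100) * 3.0 + 150.0   # synthetic ILM depths, 3-um A-line pitch
print(ilm_tilt_approximation(profile, pixel_pitch_x_um=3.0, center_index=50))
```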
Note that the tilt A shown in Figures 6A and 6B is a vector representing the orientation of the blood vessel of interest Db, and its value may be defined in any manner. As some non-limiting examples, the value of the tilt A can be defined as the angle (Doppler angle) formed between the tilt (vector) A and the z axis. Similarly, the tilt Aapp shown in Figure 6B is a vector representing the orientation of the internal limiting membrane region M, and its value may be defined in any manner. For example, the value of the tilt Aapp can be defined as the angle (an approximation of the Doppler angle) formed between the tilt (vector) Aapp and the z axis. Here, the direction of the z axis substantially coincides with the incident direction of the measurement light LS.
As a third example of estimating the tilt of the blood vessel of interest, the Doppler angle calculation unit 233 can analyze the supplementary cross-sectional image Gp shown in Figure 6B to identify an image region corresponding to the blood vessel of interest Db and determine the tilt of that image region at the position corresponding to the cross section of interest C0. In doing so, the Doppler angle calculation unit 233 may, for example, fit a curve to the boundary or central axis of the image region corresponding to the blood vessel of interest Db and determine the slope of the fitted curve at the position corresponding to the cross section of interest C0. A similar curve fitting can also be applied to an image region corresponding to a predetermined tissue of the fundus Ef described above (for example, the internal limiting membrane region M).
The processing executed by the Doppler angle calculation unit 233 is not limited to the above examples, and may be any processing capable of determining an estimated value of the tilt of the blood vessel of interest Db (for example, a tilt value of the blood vessel of interest Db itself, an approximation thereof, etc.) based on cross-sectional data collected by applying an OCT scan to a cross section of the fundus Ef.
The blood flow velocity calculation unit 234 calculates the blood flow velocity, at the cross section of interest C0, of the blood flowing through the blood vessel of interest Db based on the information on the time-series change in phase difference obtained as the phase image. The calculated information may be the value of the blood flow velocity at a specific time point (blood flow velocity value) or the time-series change in the blood flow velocity value (blood flow velocity change information). The blood flow velocity value may be a value at a specific cardiac phase selected from the cardiac cycle (for example, the phase of the R wave). The period over which the blood flow velocity change information is defined may be the entire period during which the main scan was applied to the cross section of interest C0, or a portion selected from that period.
When blood flow velocity change information has been obtained, the blood flow velocity calculation unit 234 may calculate a statistic of the blood flow velocity over the measurement period. This statistic may be, for example, any of the mean, standard deviation, variance, median, mode, maximum, minimum, local maximum, and local minimum, although it is not limited to these. Furthermore, when blood flow velocity change information has been obtained, the change in blood flow velocity can be visualized to generate visual information (for example, a graph or a histogram).
The blood flow velocity calculation unit 234 calculates the blood flow velocity using the Doppler OCT technique. In doing so, the tilt A (or its approximate value Aapp) of the blood vessel of interest Db at the cross section of interest C0, calculated by the Doppler angle calculation unit 233, is taken into account. Specifically, the blood flow velocity calculation unit 234 can use the following formula: Δf = 2nv cos θ / λ,
where:
Δf denotes the Doppler shift experienced by the scattered light of the measurement light LS;
n denotes the refractive index of the medium;
v denotes the flow velocity of the medium (blood flow velocity);
θ denotes the angle between the incident direction of the measurement light LS and the flow vector of the medium; and
λ denotes the center wavelength of the measurement light LS.
In some aspects, n and λ are known, Δf is obtained from the time-series change in the phase difference, and θ is the Doppler angle (obtained from the tilt A or its approximate value Aapp). The blood flow velocity calculation unit 234 calculates the blood flow velocity v by substituting the refractive index n of the medium, the center wavelength λ of the measurement light LS, the Doppler shift Δf, and the Doppler angle θ into the above equation: v = λΔf / (2n cos θ). Note that the method for calculating the blood flow velocity is not limited to the method described here, and may be any method that can be employed in Doppler OCT.
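A minimal sketch of this velocity calculation is shown below. It assumes, as one common convention, that the Doppler shift Δf is obtained from the measured phase difference Δφ and the repetition interval Δt as Δf = Δφ / (2πΔt); the default wavelength and refractive index values, and all names, are illustrative assumptions.

```python
import numpy as np

def doppler_blood_flow_velocity(delta_phi_rad, delta_t_s, doppler_angle_deg,
                                center_wavelength_m=1.05e-6, refractive_index=1.38):
    """Convert a measured phase difference into a blood flow velocity.

    delta_phi_rad     : phase difference between repeated measurements at the same position.
    delta_t_s         : time interval between those measurements (seconds).
    doppler_angle_deg : Doppler angle theta (tilt A or its approximation Aapp), in degrees.
    center_wavelength_m, refractive_index : lambda and n in the Doppler relation.

    Uses Delta_f = 2*n*v*cos(theta)/lambda, i.e. v = lambda*Delta_f / (2*n*cos(theta)).
    """
    # One common convention: Doppler shift from phase difference and repetition interval.
    delta_f_hz = delta_phi_rad / (2.0 * np.pi * delta_t_s)

    cos_theta = np.cos(np.radians(doppler_angle_deg))
    if abs(cos_theta) < 1e-3:
        raise ValueError("Doppler angle too close to 90 degrees; velocity is ill-conditioned.")

    return center_wavelength_m * delta_f_hz / (2.0 * refractive_index * cos_theta)

# Example: 1.2 rad phase difference over 12.5 microseconds at a Doppler angle of 80 degrees.
v = doppler_blood_flow_velocity(1.2, 12.5e-6, 80.0)
print(f"blood flow velocity: {v * 1e3:.2f} mm/s")
```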
The blood vessel diameter calculation unit 235 calculates the diameter of the blood vessel of interest Db at the cross section of interest C0. Examples of this calculation include a first calculation method using a front image of the fundus and a second calculation method using a cross-sectional image.
When the first calculation method is applied, an area of the fundus Ef including the position of the cross section of interest C0 is photographed in advance. The resulting front image of the fundus may be, for example, any of a frame of an observation image, a photographed image (color image, fluorescence contrast image), and an OCT angiography image (motion contrast image).
The blood vessel diameter calculation unit 235 sets a scale for the front image of the fundus based on various factors that determine the relationship between the scale in the image and the scale in real space, such as the imaging angle of view (imaging magnification, scan dimensions), the working distance, and information on the ocular optical system. This scale associates, for example, the spacing between adjacent pixels (pixel pitch) with a length in real space (for example, pixel pitch = 10 micrometers). The blood vessel diameter calculation unit 235 can calculate the diameter of the blood vessel of interest Db at the cross section of interest C0, that is, the diameter of the vascular region V0, based on the scale set for the front image of the fundus and the pixels belonging to the vascular region V0.
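As a non-authoritative illustration of the first calculation method, the following sketch derives a diameter from a binary mask of the vascular region V0 and a pixel pitch set as described above; defining the diameter as the widest extent of the mask is one of several possible choices, and all names are illustrative.

```python
import numpy as np

def vessel_diameter_um(vessel_mask, pixel_pitch_um):
    """Estimate a vessel diameter from a binary vascular-region mask.

    vessel_mask    : 2-D boolean array; True where pixels belong to the vascular region V0.
    pixel_pitch_um : physical spacing between adjacent pixels (micrometers), e.g. 10 um.

    The diameter is defined here as the largest extent of the region along one image
    axis; other definitions (equivalent-circle diameter, mean chord length, etc.)
    could be substituted.
    """
    if not vessel_mask.any():
        raise ValueError("empty vascular region")

    # Count vessel pixels per row and take the widest row as the diameter in pixels.
    widths_px = vessel_mask.sum(axis=1)
    return float(widths_px.max()) * pixel_pitch_um

# Example: a synthetic circular vascular region of radius 6 pixels at 10 um pixel pitch.
yy, xx = np.mgrid[:32, :32]
mask = (yy - 16) ** 2 + (xx - 16) ** 2 <= 6 ** 2
print(vessel_diameter_um(mask, pixel_pitch_um=10.0))  # roughly 130 um
```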
The second calculation method will now be described. The second calculation method typically uses a cross-sectional image of the cross section of interest C0. This cross-sectional image may be the principal cross-sectional image or a different cross-sectional image. The scale of the cross-sectional image is determined based on the OCT measurement conditions and the like. In some embodiments, the cross section of interest C0 is scanned as shown in Figure 5A or Figure 5B. The length of the cross section of interest C0 is determined based on various factors that determine the relationship between the scale in the image and the scale in real space, such as the scan dimensions, the working distance, and information on the ocular optical system. The blood vessel diameter calculation unit 235 can calculate the diameter of the blood vessel of interest Db at the cross section of interest C0 by executing a process of determining the pixel pitch based on the length of the cross section of interest C0 and a process similar to that of the first calculation method.
The blood flow rate calculation unit 236 calculates the flow rate of blood in the blood vessel of interest Db based on the blood flow velocity calculated by the blood flow velocity calculation unit 234 and the blood vessel diameter calculated by the blood vessel diameter calculation unit 235. An example of this processing is described below. The blood flow within the blood vessel is assumed to be a Hagen-Poiseuille flow. The blood vessel diameter is denoted by w, and the maximum value of the blood flow velocity is denoted by Vm. The blood flow rate Q is then expressed by the following equation: Q = πw²Vm / 8.
The blood flow rate calculation unit 236 calculates the blood flow rate Q by substituting, into this equation, the blood vessel diameter value w calculated by the blood vessel diameter calculation unit 235 and the maximum value Vm based on the blood flow velocity values calculated by the blood flow velocity calculation unit 234.
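The following sketch evaluates this equation, with the diameter w and maximum velocity Vm expressed in units commonly used for retinal vessels; the unit choices and names are illustrative assumptions.

```python
import math

def blood_flow_rate(vessel_diameter_um, max_velocity_mm_s):
    """Blood flow rate Q under the Hagen-Poiseuille assumption, Q = pi * w^2 * Vm / 8.

    vessel_diameter_um : vessel diameter w (micrometers).
    max_velocity_mm_s  : maximum blood flow velocity Vm (millimeters per second).

    Returns Q in microliters per minute.
    """
    w_mm = vessel_diameter_um / 1000.0                    # diameter in mm
    q_mm3_per_s = math.pi * w_mm ** 2 * max_velocity_mm_s / 8.0
    return q_mm3_per_s * 60.0                             # 1 mm^3 = 1 microliter

# Example: a 120-um vessel with a peak velocity of 35 mm/s gives roughly 11.9 uL/min.
print(blood_flow_rate(120.0, 35.0))
```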
The types of parameters determined by the hemodynamic information generation unit 232 are not limited to the several parameters described above. For example, the hemodynamic information generation unit 232 can determine parameters obtained by relative measurement in addition to, or instead of, parameters obtained by absolute measurement such as blood flow velocity and blood flow rate. Some non-limiting examples of hemodynamic parameters that can be employed in this embodiment are described below.
When only a Doppler angle value of low accuracy can be obtained, it is still possible to extract and analyze a profile from an image representing the interior of the blood vessel, or to extract the shape of the pulse wave curve (a waveform derived from the heartbeat) from the time course of an image representing the interior of the blood vessel. Several studies have demonstrated the usefulness of relative values obtained in this way. It is also possible to extract predetermined characteristic parameters from the waveform of the pulse wave curve. The extracted characteristic parameters can be used for hemodynamic evaluation, disease evaluation, and the like.
The flow of blood within a blood vessel can be regarded as essentially laminar: under the influence of the frictional drag exerted by the vessel wall, the flow velocity decreases toward the vessel wall and is greatest at the center of the vessel. Laminar flow has a parabolic velocity profile, and the diastolic and systolic waveforms of the cardiac cycle can be read from the pulse wave. Characteristics of these waveforms can therefore be determined. For example, parameters relating to deviation from a specific waveform (presence or absence, degree, frequency of deviation, etc.) may be determined.
The blood flow velocity in veins is not necessarily constant, and slight fluctuations (pulsations) are observed. Parameters representing these minute pulsations (absolute velocity parameters, relative velocity parameters, etc.) can also be determined.
<Non-limiting aspects of the ophthalmic device>
Several non-limiting aspects realized by applying the ophthalmic apparatus 1 having the hardware aspects, software aspects, and functional aspects described above will be described. In the following description, matters related to the ophthalmic apparatus 1 will be referred to and used as appropriate. Any matter related to the ophthalmic apparatus 1 can be at least partially combined with each aspect. Two or more aspects can be at least partially combined.
Figure 7 shows the configuration of an ophthalmic device 1000 according to one non-limiting aspect. The ophthalmic device 1000 includes a scanning unit 1010, a cross-sectional image generation unit 1020, a vascular region identification unit 1030, a vascular region matching unit 1040, and a vascular map generation unit 1050.
The scanning unit 1010 is configured to apply an OCT scan to the fundus Ef of the subject's eye E to collect data. In a non-limiting aspect, the ophthalmic apparatus 1 can realize the functions of the scanning unit 1010 using the fundus camera unit 2 and the OCT unit 100.
The scanning unit 1010 can apply an OCT scan to the fundus Ef using a scan pattern that does not involve flyback. Flyback is an operation in which the scan position is moved to a predetermined initial position without collecting data. For example, in a raster scan consisting of a plurality of B-scans that are arranged parallel to one another and oriented in the same scan direction, the operation of moving the scan position (the target projection position of the measurement light LS) from the end position of one B-scan to the start position of the next B-scan is a flyback. A scan pattern involving flyback increases the time during which data collection is suspended, which is a relative disadvantage for processing and operations requiring speed. Some embodiments have been conceived in view of this issue and aim, for example, to improve the quality of the Doppler angle estimation and the alignment performed in the preparation stage for the main scan (repetitive scan) of OCT blood flow measurement.
Figures 8A to 8C show one example of a scan pattern that does not involve flyback. The scan pattern of this example is a concentric circle pattern that is a combination of two circular patterns. Note that the number of circular patterns included in the concentric circle pattern may be arbitrary, and may be three or more.
Figure 8A shows the fundus Ef. Reference numeral 1200 denotes the optic disc, and reference numeral 1200a denotes the center position of the optic disc 1200 (optic disc center). Reference numerals 1201 and 1202 denote two circular patterns. The centers of the two circular patterns 1201 and 1202 are both located at the optic disc center 1200a.
As shown in Figure 8B, the radius of the circular pattern 1201 is R1, and the radius of the circular pattern 1202 is R2, where R2 is larger than R1. The region of the fundus Ef to which the circular pattern 1201 is applied is a cylindrical side surface 1201b defined by a central axis 1201a passing through the optic disc center 1200a and the radius R1. The region of the fundus Ef to which the circular pattern 1202 is applied is a cylindrical side surface 1202b defined by a central axis 1202a passing through the optic disc center 1200a and the radius R2. The central axis 1202a coincides with the central axis 1201a.
Figure 8C shows an example of an OCT scan using the scan pattern consisting of the two circular patterns 1201 and 1202. In the OCT scan of this example, a circle scan along the circular pattern 1202 is executed immediately after a circle scan along the circular pattern 1201. More specifically, the OCT scan of this example first starts a counterclockwise circle scan 1203a along the circular pattern 1201 from a position 1201c on the circular pattern 1201. This circle scan 1203a is executed once or two or more times. The circle scan 1203a along the circular pattern 1201 ends at the position 1201c. Immediately after the circle scan 1203a, the OCT scan of this example executes an operation 1203b (a change in the orientation of the optical scanner 44) for moving the scan target position from the position 1201c on the circular pattern 1201 to a position 1202c on the circular pattern 1202. Immediately after this operation (scan target movement) 1203b, the OCT scan of this example starts a counterclockwise circle scan 1203c along the circular pattern 1202 from the position 1202c on the circular pattern 1202. This circle scan 1203c is executed once or two or more times. Data may also be collected while the scan target movement 1203b is being performed; that is, in some aspects the scan target movement 1203b may itself be a scan (data collection).
The scan pattern shown in Figures 8A to 8C is one example of a scan pattern including a plurality of patterns (the two circular patterns 1201 and 1202), none of which involves flyback. Other scan patterns having such an aspect include a scan pattern combining a plurality of closed curves of the same shape or different shapes, a scan pattern combining a plurality of polygons of the same shape or different shapes, and a raster scan combining a plurality of B-scans oriented in alternating directions.
The scan pattern shown in Figures 8A to 8C is also one example of a scan pattern including a concentric pattern, which is a combination of a plurality of concentrically arranged patterns. Other scan patterns having such an aspect include a scan pattern combining a plurality of concentrically arranged closed curves of the same shape and a scan pattern combining a plurality of concentrically arranged polygons of the same shape. The plurality of cross sections of the fundus Ef scanned with the scan pattern of this aspect includes a plurality of concentric cross sections respectively corresponding to the plurality of concentrically arranged patterns constituting the scan pattern. In the example shown in Figures 8A to 8C, the two cylindrical side surfaces 1201b and 1202b correspond to such a plurality of concentric cross sections.
The scan pattern without flyback is not limited to the above examples and may be any pattern. For example, a spiral scan pattern, a Lissajous scan pattern, or a similar scan pattern may be employed.
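As a non-authoritative illustration, the following sketch generates scan target coordinates for such a flyback-free concentric circle pattern, tracing the inner circle, a short radial transition, and then the outer circle; the numbers of points and all names are illustrative assumptions.

```python
import numpy as np

def concentric_circle_scan(center_xy, radii, points_per_circle=512, transition_points=16):
    """Generate scan target coordinates for concentric circular patterns without flyback.

    center_xy         : (x, y) of the common center (e.g. the optic disc center).
    radii             : iterable of circle radii, inner to outer (e.g. (R1, R2)).
    points_per_circle : number of scan points (A-lines) per circle.
    transition_points : points used to move the scan target from one circle to the next.

    Returns an (N, 2) array of (x, y) scan target positions: each circle is traced
    counterclockwise from angle 0, followed by a radial transition to the next circle.
    """
    cx, cy = center_xy
    theta = np.linspace(0.0, 2.0 * np.pi, points_per_circle, endpoint=False)
    segments = []
    for i, r in enumerate(radii):
        circle = np.column_stack((cx + r * np.cos(theta), cy + r * np.sin(theta)))
        segments.append(circle)
        if i + 1 < len(radii):
            # Radial move from the end of this circle toward the start of the next one
            # (scan target movement; data may or may not be collected here).
            r_next = radii[i + 1]
            radial = np.linspace(r, r_next, transition_points + 2)[1:-1]
            segments.append(np.column_stack((cx + radial, cy + np.zeros_like(radial))))
    return np.vstack(segments)

# Example: two circles of radius 1.5 mm and 2.0 mm centered on the optic disc center.
path = concentric_circle_scan((0.0, 0.0), (1.5, 2.0))
print(path.shape)  # (512 + 16 + 512, 2)
```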
The cross-sectional image generation unit 1020 is configured to generate a plurality of cross-sectional images respectively corresponding to a plurality of cross sections of the fundus Ef based on the data collected from the fundus Ef by the scanning unit 1010. The cross-sectional image generation unit 1020 is realized by cooperation between hardware including circuitry and cross-sectional image generation software. In a non-limiting aspect, the ophthalmic apparatus 1 can realize the functions of the cross-sectional image generation unit 1020 using the data processing unit 230 (image generation unit 220).
When a scan pattern that is a combination of a plurality of patterns, none of which involves flyback, is applied, the cross-sectional image generation unit 1020 generates, for each of the plurality of patterns, a cross-sectional image corresponding to the cross section to which that pattern was applied.
When the applied scan pattern consists of a concentric pattern in which a plurality of flyback-free patterns are arranged concentrically, the scanning unit 1010 collects data from a plurality of concentric cross sections respectively corresponding to the plurality of patterns. The cross-sectional image generation unit 1020 generates a cross-sectional image corresponding to each concentric cross section based on the data collected from the plurality of concentric cross sections.
In the example of Figures 8A to 8C, the scanning unit 1010 collects data from the cylindrical side surface 1201b corresponding to the circular pattern 1201 and collects data from the cylindrical side surface 1202b corresponding to the circular pattern 1202. The cross-sectional image generation unit 1020 generates a cross-sectional image depicting the cylindrical side surface 1201b based on the data collected from the cylindrical side surface 1201b, and generates a cross-sectional image depicting the cylindrical side surface 1202b based on the data collected from the cylindrical side surface 1202b.
When the scanning unit 1010 applies an OCT scan with a flyback-free scan pattern to the fundus Ef a plurality of times to collect a plurality of data sets, the cross-sectional image generation unit 1020 can generate a plurality of cross-sectional images respectively corresponding to the plurality of scans and combine these cross-sectional images to generate a composite cross-sectional image. The combination of the plurality of cross-sectional images is, for example, addition averaging. Addition averaging reduces random noise such as speckle noise.
In the example of Figures 8A to 8C, the scanning unit 1010 performs the circle scan along the circular pattern 1201 a plurality of times to collect a plurality of data sets from the cylindrical side surface 1201b, and performs the circle scan along the circular pattern 1202 a plurality of times to collect a plurality of data sets from the cylindrical side surface 1202b. The cross-sectional image generation unit 1020 generates a plurality of cross-sectional images depicting the cylindrical side surface 1201b based respectively on the plurality of data sets collected from the cylindrical side surface 1201b, and generates a plurality of cross-sectional images depicting the cylindrical side surface 1202b based respectively on the plurality of data sets collected from the cylindrical side surface 1202b. Furthermore, the cross-sectional image generation unit 1020 combines the plurality of cross-sectional images of the cylindrical side surface 1201b to generate a composite cross-sectional image, and combines the plurality of cross-sectional images of the cylindrical side surface 1202b to generate a composite cross-sectional image.
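A minimal sketch of the composite-image step is shown below; it assumes the repeated cross-sectional images have already been aligned with one another, and simply takes their pixelwise average (addition averaging).

```python
import numpy as np

def composite_cross_sectional_image(bscans):
    """Combine repeated cross-sectional images of the same cross section by averaging.

    bscans : sequence of 2-D arrays with identical shape, assumed to be already
             aligned (registered) with one another.

    Returns the pixelwise mean image; the standard deviation of uncorrelated noise
    such as speckle falls roughly as the square root of the number of images.
    """
    stack = np.stack([np.asarray(b, dtype=float) for b in bscans], axis=0)
    return stack.mean(axis=0)

# Example: four noisy repetitions of the same synthetic cross-sectional image.
rng = np.random.default_rng(0)
truth = rng.random((4, 8))
reps = [truth + 0.1 * rng.standard_normal(truth.shape) for _ in range(4)]
print(np.abs(composite_cross_sectional_image(reps) - truth).mean())
```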
When the scanning unit 1010 applies an OCT scan with a flyback-free scan pattern to the fundus Ef a plurality of times to collect a plurality of data sets, and the cross-sectional image generation unit 1020 generates a plurality of cross-sectional images respectively corresponding to the plurality of scans, the ophthalmic device 1000 may select, from among the generated cross-sectional images, a cross-sectional image suitable for subsequent processing (for example, identification of vascular regions). The evaluation of each cross-sectional image may be performed by any method. As a non-limiting example, Portilla-Simoncelli Statistics (PSS), an image texture feature based on human visual perception, can be used as the evaluation value. In this case, a cross-sectional image with a large PSS value is selected.
The vascular region identification unit 1030 is configured to analyze each cross-sectional image generated by the cross-sectional image generation unit 1020 to detect a vascular region group. In this way, a plurality of vascular region groups respectively corresponding to the plurality of cross-sectional images generated by the cross-sectional image generation unit 1020 are identified. A vascular region group includes one or more vascular regions. A vascular region is an image region corresponding to a cross section of a blood vessel.
Any image analysis technique may be used to detect vascular regions from a cross-sectional image. For example, this image analysis may be any image segmentation. The image segmentation technique executed by the vascular region identification unit 1030 may be, for example, either or both of an image segmentation method using a machine learning algorithm (machine learning model) and an image segmentation method using a non-machine-learning algorithm.
Non-limiting examples of image segmentation methods applicable to the vascular region identification unit 1030 include threshold-based methods, edge-detection-based methods, region-based methods, clustering-based methods, convolutional neural network (CNN) based methods, transformer-based methods, generative adversarial network (GAN) based methods, self-supervised learning (SSL) based methods, and graph-based methods. The image segmentation executed by the vascular region identification unit 1030 may be a combination of two or more methods.
In some aspects, the vascular region identification unit 1030 applies, to each cross-sectional image generated by the cross-sectional image generation unit 1020, a projection (image integration) along the A-scan direction (z direction) of the OCT scan applied to the fundus Ef by the scanning unit 1010. This generates a plurality of projection images respectively corresponding to the plurality of cross-sectional images generated by the cross-sectional image generation unit 1020. The vascular region identification unit 1030 then detects a vascular region group from the cross-sectional image that is the source of each projection image (that is, the cross-sectional image corresponding to that projection image) based on the intensity distribution in the projection image. In general, the intensity of pixels corresponding to blood vessels is lower than the intensity of pixels corresponding to tissue other than blood vessels. The vascular region detection of this aspect may include processing for detecting low-intensity regions in the projection image (for example, thresholding, binarization, etc.). This aspect may also combine an intensity-based image segmentation method with an image segmentation method based on an index other than intensity.
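The following sketch illustrates this projection-based detection on a single cross-sectional image, under the assumptions that blood vessels appear as low-intensity runs of A-lines in the z-projection and that a simple median-relative threshold is sufficient; the threshold, minimum width, and all names are illustrative.

```python
import numpy as np

def detect_vessel_regions(bscan, threshold_fraction=0.8, min_width=3):
    """Detect candidate vascular regions from a cross-sectional image by z-projection.

    bscan              : 2-D array of OCT intensities, shape (depth, n_alines),
                         with the A-scan (z) direction along axis 0.
    threshold_fraction : A-lines whose projected intensity falls below this fraction
                         of the median projection are treated as vessel candidates
                         (blood attenuates the signal, so vessels appear dark).
    min_width          : minimum run length (in A-lines) accepted as a vessel region.

    Returns a list of (start, end) A-line index ranges, end exclusive.
    """
    projection = np.asarray(bscan, dtype=float).sum(axis=0)   # integrate along z
    low = projection < threshold_fraction * np.median(projection)

    regions, start = [], None
    for i, flag in enumerate(np.append(low, False)):           # sentinel closes the last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_width:
                regions.append((start, i))
            start = None
    return regions

# Example: a synthetic B-scan with one dark (vessel-like) band at A-lines 40-49.
img = np.full((100, 128), 100.0)
img[:, 40:50] *= 0.5
print(detect_vessel_regions(img))  # [(40, 50)]
```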
The vascular region identification unit 1030 is realized by cooperation between hardware including circuitry and vascular region identification software. In a non-limiting aspect, the ophthalmic apparatus 1 can perform the analysis processing for detecting vascular regions from OCT cross-sectional images by means of a vascular region identification function realized using the data processing unit 230 (vascular region identification unit 231).
The vascular region matching unit 1040 is configured to associate, among the plurality of vascular region groups respectively identified from the plurality of cross-sectional images by the vascular region identification unit 1030, vascular regions that correspond to different cross sections of the same blood vessel.
To explain the vascular region matching processing, two mutually adjacent cross-sectional images are referred to as a first cross-sectional image and a second cross-sectional image. The cross section of the fundus Ef corresponding to the first cross-sectional image is referred to as a first cross section, and the cross section of the fundus Ef corresponding to the second cross-sectional image is referred to as a second cross section. Furthermore, the vascular region group identified from the first cross-sectional image is referred to as a first vascular region group, and the vascular region group identified from the second cross-sectional image is referred to as a second vascular region group.
When the spacing between the first cross section and the second cross section is sufficiently small, the vascular region matching unit 1040 of a non-limiting aspect determines the position of each vascular region in the first cross-sectional image (referred to as a first vascular region) and the position of each vascular region in the second cross-sectional image (referred to as a second vascular region). The position of a vascular region may be the position of a feature point of the vascular region (for example, the center position, the centroid position, the upper end position, the lower end position, etc.). The position of a vascular region may be information indicating its position in the cross-sectional image from which it was detected (that is, coordinates in a two-dimensional coordinate system representing pixel positions in that cross-sectional image), information indicating its position in a three-dimensional image containing both the first cross section and the second cross section (coordinates in a three-dimensional coordinate system representing positions in that three-dimensional region), or other information usable to define a position.
It is rare for the course of a blood vessel to change abruptly (discontinuously), and this is especially rare for the relatively thick blood vessels that are the targets of OCT blood flow measurement. Therefore, when the spacing between the first cross section and the second cross section is sufficiently small, it can be assumed that there is no large displacement between the position of the vascular region of a given blood vessel in the first cross-sectional image (the first vascular region) and the position of the vascular region of that blood vessel in the second cross-sectional image (the second vascular region).
Under this assumption, the vascular region matching unit 1040 can identify pairs of a first vascular region and a second vascular region that correspond to different cross sections of the same blood vessel by comparing the position of each vascular region in the first cross-sectional image with the position of each vascular region in the second cross-sectional image, and can associate these regions with each other. For example, for an arbitrary first vascular region belonging to the first vascular region group, the vascular region matching unit 1040 may be configured to compare the position of that first vascular region with the position of each second vascular region belonging to the second vascular region group, identify the second vascular region whose distance from that first vascular region is smallest, and associate the identified second vascular region with that first vascular region.
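As a non-authoritative illustration of this nearest-position matching, the following sketch pairs each vascular region of the first cross-sectional image with the closest vascular region of the second cross-sectional image, optionally rejecting pairs that are too far apart; all names and the details of the distance criterion are illustrative assumptions.

```python
import numpy as np

def match_vessel_regions(first_positions, second_positions, max_distance=None):
    """Pair vascular regions in two adjacent cross sections by nearest feature position.

    first_positions  : list of (x, z) characteristic positions in the first cross-sectional image.
    second_positions : list of (x, z) characteristic positions in the second cross-sectional image.
    max_distance     : optional upper bound; pairs farther apart than this are rejected.

    Returns a list of (i, j) index pairs: first_positions[i] matched to second_positions[j].
    Assumes the two cross sections are close enough that the same vessel does not
    shift much between them.
    """
    a = np.asarray(first_positions, dtype=float)
    b = np.asarray(second_positions, dtype=float)
    pairs = []
    for i, p in enumerate(a):
        distances = np.linalg.norm(b - p, axis=1)
        j = int(distances.argmin())
        if max_distance is None or distances[j] <= max_distance:
            pairs.append((i, j))
    return pairs

# Example: four regions per cross section, slightly displaced between the two sections.
first = [(10, 50), (40, 55), (80, 52), (120, 60)]
second = [(12, 51), (41, 56), (78, 50), (118, 62)]
print(match_vessel_regions(first, second))  # [(0, 0), (1, 1), (2, 2), (3, 3)]
```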
When the spacing between the first cross section and the second cross section is sufficiently small, not only can the positions of a first vascular region and a second vascular region corresponding to the same blood vessel be assumed to be close, but other assumptions can also be considered. For example, it may be assumed that the area of the first vascular region and the area of the second vascular region corresponding to the same blood vessel are substantially equal, or that their shapes are substantially equal. Furthermore, when the spacing between the first cross section and the second cross section is sufficiently small, it can be considered that no crossing of blood vessels occurs in the region between these cross sections, so it can be assumed that the spatial order (positional order) of the plurality of blood vessels depicted in the two cross sections is the same. The vascular region matching unit 1040 may be configured to execute the vascular region matching processing taking such assumptions into account.
The criterion for whether the spacing between the first cross section and the second cross section is sufficiently small may be determined arbitrarily, taking into account, for example, the position on the fundus Ef of the region to which the OCT scan is applied and the condition of the fundus blood vessels. When OCT blood flow measurement is performed, the distance between the two supplementary cross sections C1 and C2 shown in Figure 5A (the supplementary cross-sectional images G1 and G2 shown in Figure 6A), for example, is generally set sufficiently small, so in embodiments intended for similar applications the spacing between the first cross section and the second cross section can be assumed to be sufficiently small.
In contrast, when the spacing between the first cross section and the second cross section is not sufficiently small, the vascular region matching unit 1040 of a non-limiting aspect may be configured to grasp the positional relationship between the first cross section and the second cross section, and the positional relationship between each vascular region belonging to the first vascular region group and each vascular region belonging to the second vascular region group, by referring, for example, to a separately acquired image (a front image of the fundus and/or a three-dimensional image of the fundus). The vascular region matching unit 1040 may further be configured to identify, by referring to the obtained positional relationships, pairs of a first vascular region and a second vascular region corresponding to different cross sections of the same blood vessel, and to associate these regions with each other.
The vascular region matching unit 1040 is realized by cooperation between hardware including circuitry and vascular region matching software. In a non-limiting aspect, the ophthalmic apparatus 1 can perform the analysis processing for matching between vascular regions by means of a vascular region matching function realized using the data processing unit 230.
The vascular map generation unit 1050 is configured to generate information indicating the distribution of blood vessels (a vascular map) based on the result of the vascular region matching obtained by the vascular region matching unit 1040. The range of the vascular distribution represented by the vascular map includes at least part of the region of the fundus Ef defined by the plurality of cross sections corresponding to the plurality of cross-sectional images generated by the cross-sectional image generation unit 1020.
The vascular map is information indicating at least the positions (coordinates) of the blood vessels (the aforementioned same blood vessels) corresponding to the plurality of vascular regions associated with one another by the vascular region matching unit 1040, and is information indicating the distribution of blood vessels in the fundus Ef. The vascular map may be visual information obtained by visualizing the blood vessel positions, or may be non-visualized blood vessel position data. The visual information may be provided for image processing, image analysis, display, printing, and the like. The blood vessel position data may be provided for data processing, data analysis, imaging (visualization), and the like.
The vascular map generation unit 1050 is realized by cooperation between hardware including circuitry and vascular map generation software. In a non-limiting aspect, the ophthalmic apparatus 1 can perform the processing for generating a vascular map by means of a vascular map generation function realized using the data processing unit 230.
Several operation examples of the ophthalmic device 1000 will now be described. The content of the processing in each step of each operation example is non-limiting and may be modified as desired. The order of the steps in each operation example is likewise non-limiting and may be modified as desired.
Figure 9 shows one operation example of the ophthalmic device 1000. Figures 8A to 8C are referred to as a non-limiting specific example.
First, in step S1, the scanning unit 1010 applies an OCT scan with a flyback-free scan pattern to the fundus Ef of the subject's eye E to collect data.
In the specific example of step S1, the OCT scan may be a combination of the circle scan 1203a along the circular pattern 1201, the scan target movement 1203b, and the circle scan 1203c along the circular pattern 1202 shown in Figure 8C. In this way, data are collected from the cylindrical side surface 1201b along the circular pattern 1201 and from the cylindrical side surface 1202b along the circular pattern 1202.
In step S2, the cross-sectional image generation unit 1020 generates a plurality of cross-sectional images respectively corresponding to a plurality of cross sections of the fundus Ef based on the data collected in step S1.
In the specific example of step S2, the cross-sectional image generation unit 1020 generates a cross-sectional image 1301 representing the cylindrical side surface 1201b based on the data collected from the cylindrical side surface 1201b along the circular pattern 1201 in step S1 (see Figure 10A). Furthermore, the cross-sectional image generation unit 1020 generates a cross-sectional image 1302 representing the cylindrical side surface 1202b based on the data collected from the cylindrical side surface 1202b along the circular pattern 1202 in step S1 (see Figure 10A).
In step S3, the vascular region identification unit 1030 detects a vascular region group from each of the plurality of cross-sectional images generated in step S2. In this way, a plurality of vascular region groups respectively corresponding to the plurality of cross-sectional images are identified.
In the specific example of step S3, the vascular region identification unit 1030 detects, from the cross-sectional image 1301 generated in step S2, a vascular region group consisting of vascular regions 1301a, 1301b, 1301c, and 1301d (see Figure 10A). Furthermore, the vascular region identification unit 1030 detects, from the cross-sectional image 1302 generated in step S2, a vascular region group consisting of vascular regions 1302a, 1302b, 1302c, and 1302d (see Figure 10A).
In step S4, the vascular region matching unit 1040 associates, among the plurality of vascular region groups identified in step S3, vascular regions corresponding to different cross sections of the same blood vessel.
In the specific example of step S4, the vascular region matching unit 1040 identifies, between the vascular region group 1301a, 1301b, 1301c, and 1301d corresponding to the cross-sectional image 1301 and the vascular region group 1302a, 1302b, 1302c, and 1302d corresponding to the cross-sectional image 1302, pairs of vascular regions corresponding to different cross sections of the same blood vessel. In this example, the following four vascular region pairs are identified: a first pair consisting of the vascular region 1301a and the vascular region 1302a corresponding to two cross sections of a first blood vessel; a second pair consisting of the vascular region 1301b and the vascular region 1302b corresponding to two cross sections of a second blood vessel; a third pair consisting of the vascular region 1301c and the vascular region 1302c corresponding to two cross sections of a third blood vessel; and a fourth pair consisting of the vascular region 1301d and the vascular region 1302d corresponding to two cross sections of a fourth blood vessel. The vascular region matching unit 1040 associates the two vascular regions belonging to each pair with each other. In this way, an association 1303a within the first pair, an association 1303b within the second pair, an association 1303c within the third pair, and an association 1303d within the fourth pair are obtained.
In step S5, the vascular map generation unit 1050 generates a vascular map based on the result of the vascular region matching obtained in step S4.
In the specific example of step S5, the vascular map generation unit 1050 determines a first connection region connecting the vascular region 1301a and the vascular region 1302a belonging to the first pair, a second connection region connecting the vascular region 1301b and the vascular region 1302b belonging to the second pair, a third connection region connecting the vascular region 1301c and the vascular region 1302c belonging to the third pair, and a fourth connection region connecting the vascular region 1301d and the vascular region 1302d belonging to the fourth pair.
In the first pair, the vascular region 1301a is a region on the cylindrical side surface 1201b along the circular pattern 1201, and the vascular region 1302a is a region on the cylindrical side surface 1202b along the circular pattern 1202. The first connection region joins the vascular region 1301a and the vascular region 1302a. The first connection region has two ends: the first end is located in the vascular region 1301a (cylindrical side surface 1201b, circular pattern 1201), and the second end is located in the vascular region 1302a (cylindrical side surface 1202b, circular pattern 1202). The same applies to the second connection region based on the second pair, the third connection region based on the third pair, and the fourth connection region based on the fourth pair.
図10B及び図10Cは、血管マップの2つの非限定的な例を示す。図10Bの血管マップ1310は、正面の視点から見たときの2次元的な血管分布を表す。破線で示すオブジェクトは、説明を補助するために記載されたものであり、血管マップ1310に可視化されていなくてよい。少なくとも円形パターン1201及び円形パターン1202は、可視化されてもよい。符号1311は、視神経乳頭を示す。符号1312a、1312b、1312c、及び1312dのそれぞれは、血管を示す。 10B and 10C show two non-limiting examples of vascular maps. The vascular map 1310 in FIG. 10B represents a two-dimensional vascular distribution when viewed from a frontal perspective. Objects shown with dashed lines are included to aid in explanation and may not be visualized in the vascular map 1310. At least the circular patterns 1201 and 1202 may be visualized. Reference numeral 1311 denotes the optic disc. Reference numerals 1312a, 1312b, 1312c, and 1312d each denote a blood vessel.
符号1313aは、第1のペアに属する2つの血管領域1301a及び1302aを結ぶ第1の接続領域を示す。第1の接続領域1313aは、血管1312aの一部に相当する。符号1313bは、第2のペアに属する2つの血管領域1301b及び1302bを結ぶ第2の接続領域を示す。第2の接続領域1313bは、血管1312bの一部に相当する。符号1313cは、第3のペアに属する2つの血管領域1301c及び1302cを結ぶ第3の接続領域を示す。第3の接続領域1313cは、血管1312cの一部に相当する。符号1313dは、第4のペアに属する2つの血管領域1301d及び1302dを結ぶ第4の接続領域を示す。第4の接続領域1313dは、血管1312dの一部に相当する。 Reference symbol 1313a indicates a first connection region connecting two vascular regions 1301a and 1302a belonging to the first pair. The first connection region 1313a corresponds to a portion of the blood vessel 1312a. Reference symbol 1313b indicates a second connection region connecting two vascular regions 1301b and 1302b belonging to the second pair. The second connection region 1313b corresponds to a portion of the blood vessel 1312b. Reference symbol 1313c indicates a third connection region connecting two vascular regions 1301c and 1302c belonging to the third pair. The third connection region 1313c corresponds to a portion of the blood vessel 1312c. Reference symbol 1313d indicates a fourth connection region connecting two vascular regions 1301d and 1302d belonging to the fourth pair. The fourth connection region 1313d corresponds to a portion of the blood vessel 1312d.
視神経乳頭1311及び/又は血管1312a~1312dを可視化するために、正面眼底画像を利用することができる。この正面眼底画像は、例えば、眼底カメラ画像、SLO画像、又はOCT正面画像(プロジェクション画像、アンファス画像など)であってよい。 A frontal fundus image can be used to visualize the optic disc 1311 and/or blood vessels 1312a-1312d. This frontal fundus image may be, for example, a fundus camera image, an SLO image, or an OCT frontal image (a projection image, an en face image, etc.).
図10Cの血管マップ1320は、3次元的な血管分布を表す。血管マップ1320は、3次元座標系において定義された3次元データでもよいし、この3次元データを画像化した3次元画像データ(ボリュームデータ、スタックデータなど)でもよいし、この3次元画像データのレンダリング画像(ボリュームレンダリング画像など)でもよい。破線で示すオブジェクトは、説明を補助するために記載されたものであり、血管マップ1320に可視化されていなくてよい。なお、円柱側面1201b及び円柱側面1202bや、図示しない円形パターン1201及び1202は、可視化されてもよい。符号1321は、視神経乳頭を示す。符号1322a、1322b、13222c、及び1322dのそれぞれは、血管を示す。 The vascular map 1320 in Figure 10C represents a three-dimensional distribution of blood vessels. The vascular map 1320 may be three-dimensional data defined in a three-dimensional coordinate system, three-dimensional image data (volume data, stack data, etc.) that visualizes this three-dimensional data, or a rendered image of this three-dimensional image data (volume rendering image, etc.). Objects indicated by dashed lines are shown to aid in explanation and do not need to be visualized in the vascular map 1320. Note that the cylinder side surfaces 1201b and 1202b, as well as the circular patterns 1201 and 1202 (not shown), may be visualized. Reference numeral 1321 denotes the optic disc. Reference numerals 1322a, 1322b, 13222c, and 1322d each denote a blood vessel.
符号1323aは、第1のペアに属する2つの血管領域1301a及び1302aを結ぶ第1の接続領域を示す。第1の接続領域1323aは、血管1322aの一部に相当する。符号1323bは、第2のペアに属する2つの血管領域1301b及び1302bを結ぶ第2の接続領域を示す。第2の接続領域1323bは、血管1322bの一部に相当する。符号1323cは、第3のペアに属する2つの血管領域1301c及び1302cを結ぶ第3の接続領域を示す。第3の接続領域1323cは、血管1322cの一部に相当する。符号1323dは、第4のペアに属する2つの血管領域1301d及び1302dを結ぶ第4の接続領域を示す。第4の接続領域1323dは、血管1322dの一部に相当する。 Reference symbol 1323a indicates a first connection region connecting two vascular regions 1301a and 1302a belonging to the first pair. The first connection region 1323a corresponds to a portion of the blood vessel 1322a. Reference symbol 1323b indicates a second connection region connecting two vascular regions 1301b and 1302b belonging to the second pair. The second connection region 1323b corresponds to a portion of the blood vessel 1322b. Reference symbol 1323c indicates a third connection region connecting two vascular regions 1301c and 1302c belonging to the third pair. The third connection region 1323c corresponds to a portion of the blood vessel 1322c. Reference symbol 1323d indicates a fourth connection region connecting two vascular regions 1301d and 1302d belonging to the fourth pair. The fourth connection region 1323d corresponds to a portion of the blood vessel 1322d.
本動作例に係る処理手順を実行可能な眼科装置1000は、フライバックを伴わないスキャンパターンを用いたOCTスキャンにより眼底Efからデータを収集して複数の断面にそれぞれ対応する複数の断面画像を生成し、複数の断面画像にそれぞれ対応する複数の血管領域群を特定し、同一血管の異なる断面に相当する血管領域同士を対応付けして血管マップを生成する。眼科装置1000の1つの利点は、フライバックを伴わないスキャンパターンを用いることによって、迅速に血管マップを生成できることである。これにより、OCT血流計測(眼底血流動態計測)における準備段階の工程(例えば、ドップラー角度推定、アライメント、注目血管の探索など)に血管マップを利用することが可能になり、よって、OCT血流計測の品質向上を図ることが可能になる。 The ophthalmic device 1000 capable of executing the processing procedures of this operational example collects data from the fundus Ef by OCT scanning using a scan pattern without flyback, generates multiple cross-sectional images corresponding to multiple cross sections, identifies multiple groups of vascular regions corresponding to the multiple cross-sectional images, and generates a vascular map by matching vascular regions corresponding to different cross sections of the same blood vessel. One advantage of the ophthalmic device 1000 is that it can quickly generate a vascular map by using a scan pattern without flyback. This makes it possible to use the vascular map in preparatory steps (e.g., Doppler angle estimation, alignment, search for blood vessels of interest, etc.) in OCT blood flow measurement (fundus blood flow dynamics measurement), thereby improving the quality of OCT blood flow measurement.
図11は、1つの非限定的な態様に係る眼科装置1400の構成を示す。眼科装置1400の要素のうち、図7の眼科装置1000の要素と同じ名称を有し且つ同じ符号を付されたものは、特に言及しない限り、眼科装置1000における対応要素と同様の構成及び機能を有していてよい。ただし、当該対応要素の変形手段、均等手段、代替手段などを採用することは、除外されない。 FIG. 11 shows the configuration of an ophthalmic device 1400 according to one non-limiting embodiment. Elements of the ophthalmic device 1400 that have the same names and symbols as elements of the ophthalmic device 1000 in FIG. 7 may have the same configuration and function as the corresponding elements in the ophthalmic device 1000, unless otherwise specified. However, this does not exclude the adoption of modified means, equivalent means, alternative means, etc. for the corresponding elements.
眼科装置1000と同様の要素として、眼科装置1400は、スキャン部1010と、断面画像生成部1020と、血管領域特定部1030と、血管領域対応付け部1040と、血管マップ生成部1050とを有する。これらに加えて、眼科装置1400は、スキャン制御部1060と、注目血管指定部1070と、血流動態情報生成部1080と、表示制御部1090と、表示装置1100とを備えている。 The ophthalmic device 1400 has similar elements to the ophthalmic device 1000, including a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, and a vascular map generating unit 1050. In addition to these, the ophthalmic device 1400 also has a scanning control unit 1060, a blood vessel designation unit 1070, a hemodynamic information generating unit 1080, a display control unit 1090, and a display device 1100.
スキャン制御部1060は、スキャン部1010を制御するように構成されている。例えば、スキャン制御部1060は、予め指定された眼底Efの注目血管に対してOCT血流計測の反復スキャンを適用するようにスキャン部1010の制御を実行することができる。注目血管の指定は、自動で実行されてもよいし、手動で実行されてよい。注目血管の自動指定は、例えば、後述の注目血管指定部1070により実行される。手動指定は、例えば、ユーザーインターフェイス(前述のユーザーインターフェイス240)を用いた、眼底Efの画像の表示と、この表示画像に対する位置指定操作との組み合わせによって行われる。 The scan control unit 1060 is configured to control the scan unit 1010. For example, the scan control unit 1060 can control the scan unit 1010 to apply repeated scans of OCT blood flow measurement to a pre-specified blood vessel of interest in the fundus Ef. The blood vessel of interest may be designated automatically or manually. Automatic designation of the blood vessel of interest is performed, for example, by the blood vessel designation unit 1070 described below. Manual designation is performed, for example, by combining the display of an image of the fundus Ef using a user interface (the above-mentioned user interface 240) with a position designation operation on this displayed image.
スキャン制御部1060は、回路を含むハードウェアと、スキャン制御ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、制御部210(主制御部211)を用いてスキャン制御部1060の機能を実現することができる。 The scan control unit 1060 is realized by cooperation between hardware including circuits and scan control software. In a non-limiting embodiment, the ophthalmologic apparatus 1 can realize the functions of the scan control unit 1060 using the control unit 210 (main control unit 211).
注目血管指定部1070は、血管マップ生成部1050により生成された血管マップ1051を解析して、OCT血流計測の適用対象となる注目血管を指定するように構成されている。前述したように、血管マップ生成部1050により生成された血管マップ1051には、眼底Efにおける複数の血管の分布状態が可視化されている。注目血管指定部1070は、血管マップ1051に提示されている複数の血管のうちから注目血管を選択するように構成されてよい。選択される注目血管の個数は、1つ以上の任意の個数であってよい。注目血管指定部1070により指定された注目血管は、例えば、図5A又は図5Bに示す注目血管Dbとして扱われる。 The vessel of interest designation unit 1070 is configured to analyze the vessel map 1051 generated by the vessel map generation unit 1050 and designate a vessel of interest to which OCT blood flow measurement is to be applied. As described above, the vessel map 1051 generated by the vessel map generation unit 1050 visualizes the distribution of multiple blood vessels in the fundus Ef. The vessel of interest designation unit 1070 may be configured to select a vessel of interest from among the multiple blood vessels presented in the vessel map 1051. The number of vessels of interest selected may be any number greater than or equal to one. The vessel of interest designated by the vessel of interest designation unit 1070 is treated as, for example, the vessel of interest Db shown in FIG. 5A or 5B.
注目血管指定部1070は、予め設定された基準に基づいて注目血管の選択を行う。この基準を選択基準と称する。選択基準は、血管の向きに関する指標を含んでいてよい。この血管向き指標の例として、OCT血流計測におけるドップラー角の大きさ(ドップラー角度)、OCT血流計測におけるドップラー角度の好適度などがある。 The vessel of interest designation unit 1070 selects the vessel of interest based on preset criteria. These criteria are referred to as selection criteria. The selection criteria may include an index related to the direction of the vessel. Examples of such vessel direction indices include the magnitude of the Doppler angle (Doppler angle) in OCT blood flow measurement, the suitability of the Doppler angle in OCT blood flow measurement, etc.
いくつかの態様では、OCT血流計測におけるドップラー角度の好適な値は、約80度であり、実際の計測においては、77度~83度の範囲を目標として注目血管及び注目断面の探索が行われる。ただし、ドップラー角度が75度程度まで小さくなっても計測を行うことは可能であるが、位相ラッピングが生じやすくなるという問題が生じるため、目標範囲の下限値は77度程度に設定することが望ましいと考えられる。また、ドップラー角度が85度程度まで大きくなると、検出されるドップラー信号の強度が低下するという問題が生じるため、目標範囲の上限値は83度程度に設定することが望ましいと考えられる。なお、この目標範囲は非限定的な例であり、別の目標範囲を採用してもよい。 In some embodiments, the preferred value for the Doppler angle in OCT blood flow measurement is approximately 80 degrees, and in actual measurements, the search for the blood vessel and cross section of interest is performed with a target range of 77 to 83 degrees. However, while measurements can be performed even if the Doppler angle is as small as 75 degrees, there is a problem of phase wrapping becoming more likely to occur, so it is considered desirable to set the lower limit of the target range to approximately 77 degrees. Furthermore, if the Doppler angle is as large as 85 degrees, there is a problem of a decrease in the strength of the detected Doppler signal, so it is considered desirable to set the upper limit of the target range to approximately 83 degrees. Note that this target range is a non-limiting example, and a different target range may be used.
ドップラー角度に関するこのような実情を考慮し、OCT血流計測におけるドップラー角度の好適な範囲は、典型的には、77度~83度の範囲に設定される。このようにして設定される範囲をドップラー角度許容範囲と称する。眼科装置1400(例えば、血管マップ生成部1050、注目血管指定部1070、及び血流動態情報生成部1080のうちのいずれかの要素)は、血管マップ1051に提示されている各血管のドップラー角度を推定することができる。その演算手法については、前述した眼科装置1のドップラー角度算出部233を参照されたい。 Taking these actual circumstances regarding Doppler angles into consideration, the preferred range of Doppler angles in OCT blood flow measurement is typically set to between 77 degrees and 83 degrees. This range is referred to as the acceptable Doppler angle range. The ophthalmic device 1400 (for example, any element of the vascular map generation unit 1050, the vessel of interest designation unit 1070, and the hemodynamic information generation unit 1080) can estimate the Doppler angle of each blood vessel presented in the vascular map 1051. For details on this calculation method, please refer to the Doppler angle calculation unit 233 of the ophthalmic device 1 described above.
注目血管指定部1070は、血管マップ1051に提示されている各血管のドップラー角度と、ドップラー角度許容範囲とを比較する。或る血管のドップラー角度の値が当該許容範囲に含まれる場合、注目血管指定部1070は、当該血管のドップラー角度は良好であると判定する。良好なドップラー角度を有する血管が2つ以上存在し、且つ、指定される注目血管の個数が1つである場合、注目血管指定部1070は、最適値(典型的には80度)に最も近いドップラー角度を有する血管を選択するように構成されてよい。いくつかの態様では、良好なドップラー角度を有する各血管を注目血管に指定してもよい。また、いくつかの態様では、良好なドップラー角度を有する複数の血管のうちから所定個数の注目血管を選択してもよい。 The vessel of interest designation unit 1070 compares the Doppler angle of each blood vessel displayed on the blood vessel map 1051 with the Doppler angle tolerance range. If the Doppler angle value of a certain blood vessel falls within the tolerance range, the vessel of interest designation unit 1070 determines that the Doppler angle of that blood vessel is good. If there are two or more blood vessels with good Doppler angles and only one blood vessel of interest is designated, the vessel of interest designation unit 1070 may be configured to select the blood vessel with the Doppler angle closest to the optimal value (typically 80 degrees). In some embodiments, each blood vessel with a good Doppler angle may be designated as a blood vessel of interest. In some embodiments, a predetermined number of blood vessels of interest may be selected from a plurality of blood vessels with good Doppler angles.
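A non-limiting sketch of this selection logic is shown below (Python; the vessel identifiers and angle values are hypothetical). It keeps only the vessels whose estimated Doppler angle falls within the acceptable range of 77 to 83 degrees and ranks them by their distance from the optimal value of about 80 degrees.

```python
def select_vessels_of_interest(doppler_angles, lower=77.0, upper=83.0,
                               optimum=80.0, max_count=1):
    """Keep vessels whose estimated Doppler angle lies within the acceptable
    range and rank them by closeness to the optimal value (~80 degrees).
    `doppler_angles` maps a vessel identifier to its estimated angle."""
    good = {v: a for v, a in doppler_angles.items() if lower <= a <= upper}
    ranked = sorted(good, key=lambda v: abs(good[v] - optimum))
    return ranked[:max_count]

# Hypothetical example: four vessels with estimated Doppler angles (degrees)
angles = {"1312a": 81.2, "1312b": 74.5, "1312c": 79.6, "1312d": 84.1}
print(select_vessels_of_interest(angles))  # -> ['1312c']
```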
ドップラー角度や好適度などの向き情報に加えて、又は、向き情報の代わりに、選択基準は、別の指標を含んでいてもよい。別の指標の例として、血管の種類(例えば、動脈、静脈)、寸法(血管径)、屈曲度、位置(例えば、視神経乳頭に対する位置)などがある。注目血管指定部1070は、2種類以上の指標を段階的に又は並行的に考慮して注目血管を選択することができる。 In addition to or instead of orientation information such as Doppler angle and suitability, the selection criteria may include other indices. Examples of other indices include blood vessel type (e.g., artery, vein), size (blood vessel diameter), tortuosity, and position (e.g., position relative to the optic disc). The blood vessel of interest designation unit 1070 can select a blood vessel of interest by considering two or more types of indices in stages or in parallel.
別の非限定的な例の注目血管指定部1070は、血管マップ1051に提示された複数の血管のうちからユーザーが選択した血管を注目血管に指定するように構成されてよい。本例において、眼科装置1400(表示制御部1090及び表示装置1100)は、血管マップ1051(これを可視化した画像)を表示する。ユーザーは、表示された血管マップ1051を参照して注目血管を選択することができる。選択された注目血管を入力する操作は、図示しないユーザーインターフェイス(例えば、眼科装置1のユーザーインターフェイス240)を用いて行われる。 In another non-limiting example, the vessel of interest designation unit 1070 may be configured to designate a vessel selected by the user from among the multiple blood vessels presented in the vessel map 1051 as the vessel of interest. In this example, the ophthalmic device 1400 (display control unit 1090 and display device 1100) displays the vessel map 1051 (a visualized image of the vessel map 1051). The user can select a vessel of interest by referring to the displayed vessel map 1051. The operation of inputting the selected vessel of interest is performed using a user interface (e.g., the user interface 240 of the ophthalmic device 1) not shown.
注目血管指定部1070は、回路を含むハードウェアと、注目血管指定ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて注目血管指定部1070の機能を実現することができる。 The vessel of interest designation unit 1070 is realized by the cooperation of hardware including circuitry and software for designating the vessel of interest. The non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the vessel of interest designation unit 1070 using the data processing unit 230.
血流動態情報生成部1080は、スキャン部1010がOCT血流計測のためのスキャンを眼底Efに適用して収集されたデータに基づいて、注目血管指定部1070により指定された注目血管における血流動態情報を生成するように構成されている。 The hemodynamic information generation unit 1080 is configured to generate hemodynamic information for the blood vessel of interest designated by the blood vessel of interest designation unit 1070 based on data collected by the scanning unit 1010 when the scanning unit 1010 applies a scan for OCT blood flow measurement to the fundus Ef.
血流動態情報生成部1080は、眼科装置1のデータ処理部230と同様の構成を備えていてよい。具体的には、血流動態情報生成部1080は、図4に示す画像生成部220、血管領域特定部231、及び血流動態情報生成部232を含んでいてよい。 The hemodynamic information generation unit 1080 may have a configuration similar to that of the data processing unit 230 of the ophthalmologic apparatus 1. Specifically, the hemodynamic information generation unit 1080 may include the image generation unit 220, vascular region identification unit 231, and hemodynamic information generation unit 232 shown in FIG. 4.
本態様において、スキャン制御部1060は、注目血管指定部1070により指定された注目血管に対して、血流動態計測のための反復スキャンを適用するように、スキャン部1010を制御する。図5A~図6Bに示すように、反復スキャンは、注目血管の特定の断面(注目断面)に対して適用される。本態様の注目断面は、血管マップ1051に提示されている血管における任意の位置に設定されてよい。 In this embodiment, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement to the blood vessel of interest designated by the blood vessel of interest designation unit 1070. As shown in Figures 5A to 6B, the repeated scans are applied to a specific cross section (cross section of interest) of the blood vessel of interest. In this embodiment, the cross section of interest may be set at any position in the blood vessel displayed on the blood vessel map 1051.
非限定的な具体例として、図10Bの血管マップ1310に提示された血管1312aが注目血管に指定された場合、スキャン制御部1060は、第1の接続領域1313aにおける任意の位置に断面が設定される。例えば、第1の接続領域1313aにおいて2つの円形パターン1201及び1202から等距離の位置(つまり、第1の接続領域1313aが接続する2つの血管領域1301a及び1302aの中間の位置)に注目断面を設定してもよい。この注目断面は、図5A及び図6Aに示す注目断面C0に相当し、2つの血管領域1301a及び1302aは、2つの補足断面C1及びC2(2つの補足断面画像G1及びG2)に相当する。 As a non-limiting example, when the blood vessel 1312a presented in the blood vessel map 1310 in FIG. 10B is designated as the blood vessel of interest, the scan control unit 1060 sets a cross section at an arbitrary position in the first connection region 1313a. For example, the cross section of interest may be set at a position equidistant from the two circular patterns 1201 and 1202 in the first connection region 1313a (i.e., a position midway between the two blood vessel regions 1301a and 1302a connected by the first connection region 1313a). This cross section of interest corresponds to the cross section of interest C0 shown in FIGS. 5A and 6A, and the two blood vessel regions 1301a and 1302a correspond to the two supplementary cross sections C1 and C2 (two supplementary cross section images G1 and G2).
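As a non-limiting sketch, the position of such a cross section of interest, set midway between the two circular patterns along the connection region, can be computed as follows (Python). The end points are assumed to come from a helper such as the illustrative `connection_segment` sketched earlier, and the midpoint choice is only one example.

```python
import numpy as np

def cross_section_of_interest(p_inner, p_outer, t=0.5):
    """Place the cross section of interest on a connection region at the
    fractional position t between its inner end point (on circular pattern
    1201) and its outer end point (on circular pattern 1202); t = 0.5 gives
    the position midway between the two circular patterns."""
    p_inner = np.asarray(p_inner, float)
    p_outer = np.asarray(p_outer, float)
    position = (1.0 - t) * p_inner + t * p_outer
    direction = p_outer - p_inner
    direction = direction / np.linalg.norm(direction)  # local vessel direction
    return position, direction
```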
注目血管の注目断面に対する反復スキャンによってスキャン部1010が収集したデータを、OCT血流計測データ1075と称する。血流動態情報生成部1080は、OCT血流計測データ1075に基づいて、注目血管指定部1070により指定された注目血管における血流動態情報1085を生成する。 The data collected by the scanning unit 1010 through repeated scans of the cross section of interest of the blood vessel of interest is referred to as OCT blood flow measurement data 1075. The blood flow dynamics information generating unit 1080 generates blood flow dynamics information 1085 of the blood vessel of interest designated by the blood vessel of interest designating unit 1070 based on the OCT blood flow measurement data 1075.
血流動態情報生成部1080は、回路を含むハードウェアと、血流動態情報生成ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて血流動態情報生成部1080の機能を実現することができる。 The hemodynamic information generation unit 1080 is realized by the cooperation of hardware including circuits and hemodynamic information generation software. The ophthalmologic apparatus 1 in a non-limiting embodiment can realize the functions of the hemodynamic information generation unit 1080 using the data processing unit 230.
表示装置1100は、任意の種類のディスプレイである。表示装置1100は、眼科装置1の表示部241であってよく、例えば、表示装置3であってよい。本態様の表示装置1100は、眼科装置1400の要素である。別の態様の表示装置は、眼科装置の周辺機器(外部装置)であってよい。 The display device 1100 is any type of display. The display device 1100 may be the display unit 241 of the ophthalmic device 1, for example, the display device 3. The display device 1100 of this embodiment is a component of the ophthalmic device 1400. In another embodiment, the display device may be a peripheral device (external device) of the ophthalmic device.
表示制御部1090は、表示装置1100を制御して情報を表示させるように構成されている。表示制御部1090は、回路を含むハードウェアと、表示制御ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、制御部210(主制御部211)を用いて表示制御部1090の機能を実現することができる。 The display control unit 1090 is configured to control the display device 1100 to display information. The display control unit 1090 is realized by cooperation between hardware including circuits and display control software. A non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the display control unit 1090 using the control unit 210 (main control unit 211).
表示制御部1090は、眼科装置1400により生成された情報を表示装置1100に表示させる。眼科装置1400により生成された情報の例として、被検眼Eの画像、眼底Efの血管の向き情報(ドップラー角度、評価情報など)、血管マップ、これらのいずれかを処理(補正、解析、評価など)して得られた情報などがある。また、表示制御部1090は、眼科装置1400が外部から取得した情報(例えば、画像)を表示装置1100に表示させることもできる。 The display control unit 1090 causes the display device 1100 to display information generated by the ophthalmic device 1400. Examples of information generated by the ophthalmic device 1400 include an image of the subject's eye E, information on the direction of blood vessels in the fundus Ef (Doppler angle, evaluation information, etc.), a vascular map, and information obtained by processing any of these (correction, analysis, evaluation, etc.). The display control unit 1090 can also cause the display device 1100 to display information (e.g., images) acquired by the ophthalmic device 1400 from an external source.
上記した血管の向き情報の非限定的な例である評価情報について説明する。評価情報は、血流動態計測のドップラー角度を評価して得られる情報である。血流動態情報生成部1080は、前述した手法でドップラー角の大きさを示す情報(ドップラー角度情報)を求める。更に、血流動態情報生成部1080は、このドップラー角度情報に所定の評価処理を適用する。この評価処理は、例えば、OCT血流計測におけるドップラー角の大きさの好適度を評価する。 Evaluation information, which is a non-limiting example of the blood vessel orientation information described above, will now be described. Evaluation information is information obtained by evaluating the Doppler angle of hemodynamic measurement. The hemodynamic information generation unit 1080 obtains information indicating the magnitude of the Doppler angle (Doppler angle information) using the method described above. Furthermore, the hemodynamic information generation unit 1080 applies a predetermined evaluation process to this Doppler angle information. This evaluation process, for example, evaluates the suitability of the magnitude of the Doppler angle in OCT blood flow measurement.
前述したように、OCT血流計測におけるドップラー角度の好適な範囲(目標範囲)は77度~83度の範囲に設定される場合があり、更に、最も好適な値は約80度に設定される場合がある。血流動態情報生成部1080は、評価処理において、対象の血管のドップラー角度の値を目標範囲と比較する。当該ドップラー角度の値が目標範囲に含まれる場合、血流動態情報生成部1080は、当該血管のドップラー角度は好適であると判定し、この判定結果を示す評価情報を生成する。他方、当該血管のドップラー角度が目標範囲に属さない場合、血流動態情報生成部1080は、当該血管のドップラー角度は好適でないことを示す評価情報を生成する。いくつかの態様において、目標範囲を複数の区間に分けることにより、ドップラー角度が好適である場合における評価をより詳細に行ってもよい。また、目標範囲の外部範囲を複数の区間に分けることによって、ドップラー角度が好適でない場合の評価をより詳細に行ってもよい。 As mentioned above, the preferred range (target range) of the Doppler angle in OCT blood flow measurement may be set between 77 degrees and 83 degrees, with the most preferred value being set at approximately 80 degrees. In the evaluation process, the hemodynamic information generation unit 1080 compares the Doppler angle value of the target blood vessel with the target range. If the Doppler angle value falls within the target range, the hemodynamic information generation unit 1080 determines that the Doppler angle of the blood vessel is preferred and generates evaluation information indicating this determination result. On the other hand, if the Doppler angle of the blood vessel does not fall within the target range, the hemodynamic information generation unit 1080 generates evaluation information indicating that the Doppler angle of the blood vessel is not preferred. In some embodiments, the target range may be divided into multiple sections to allow for more detailed evaluation of cases where the Doppler angle is preferred. Furthermore, the range outside the target range may be divided into multiple sections to allow for more detailed evaluation of cases where the Doppler angle is not preferred.
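A minimal sketch of such an evaluation process is given below (Python). The subdivision of the target range into an "optimal" band around 80 degrees and the wording of the labels are illustrative assumptions, not the only possible evaluation scheme.

```python
def evaluate_doppler_angle(angle_deg, lower=77.0, upper=83.0, optimum=80.0):
    """Return a coarse evaluation label for a measured Doppler angle.
    The subdivision of the target range and the label wording are
    illustrative only."""
    if lower <= angle_deg <= upper:
        return "optimal" if abs(angle_deg - optimum) <= 1.0 else "acceptable"
    if angle_deg < lower:
        return "too small (phase wrapping likely)"
    return "too large (weak Doppler signal)"
```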
本態様の眼科装置1400が、眼底を正面から撮影する機能(例えば、眼科装置1の眼底カメラユニット2に相当する不図示の正面撮影部により実現される機能)、及び/又は、眼底Efを正面から撮影して生成された正面眼底画像を外部から受け付ける機能(例えば、眼科装置1のデータ入出力部290に相当する不図示の正面画像受付部)を有する場合、表示制御部1090は、正面眼底画像と血管マップ1051とを、表示装置1100に表示させることができる。例えば、表示制御部1090は、正面眼底画像を表示装置1100に表示させるとともに、血管マップ1051(これを可視化して得られた画像)を正面眼底画像上に表示させる。 If the ophthalmic device 1400 of this embodiment has the function of photographing the fundus from the front (for example, a function realized by a front photographing unit (not shown) corresponding to the fundus camera unit 2 of the ophthalmic device 1), and/or the function of externally receiving a front fundus image generated by photographing the fundus Ef from the front (for example, a front image receiving unit (not shown) corresponding to the data input/output unit 290 of the ophthalmic device 1), the display control unit 1090 can display the front fundus image and the vascular map 1051 on the display device 1100. For example, the display control unit 1090 can display the front fundus image on the display device 1100, and can also display the vascular map 1051 (an image obtained by visualizing this) on the front fundus image.
正面眼底画像上に血管マップ1051を表示させる場合、正面眼底画像と血管マップ1051との間の位置合わせ(レジストレーション)を行う必要がある。このレジストレーションは、例えば、血流動態情報生成部1080又は表示制御部1090に含まれる不図示のレジストレーション部によって実行されてよい。レジストレーション部は、回路を含むハードウェアと、レジストレーションソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて実現されるレジストレーション機能によって正面眼底画像と血管マップ1051との間のレジストレーションを実行することができる。このレジストレーションのいくつかの非限定的な例を以下に説明する。 When displaying the vascular map 1051 on a frontal fundus image, it is necessary to perform alignment (registration) between the frontal fundus image and the vascular map 1051. This registration may be performed, for example, by a registration unit (not shown) included in the hemodynamic information generation unit 1080 or the display control unit 1090. The registration unit is realized by cooperation between hardware including a circuit and registration software. A non-limiting aspect of the ophthalmologic apparatus 1 can perform registration between the frontal fundus image and the vascular map 1051 using a registration function realized using the data processing unit 230. Some non-limiting examples of this registration are described below.
非限定的な第1の例において、レジストレーション部は、血管マップ1051に提示されている複数の血管を複数のランドマークとして、血管マップ1051と正面眼底画像との間のレジストレーションを実行するように構成される。血管マップ1051に提示されている各血管は、円形パターン1201と円形パターン1202との間の領域に位置しており、また、2つの円形パターン1201及び1202は、乳頭中心1200aを中心とし且つ所定の半径R1及びR2をそれぞれ有している。この情報を利用することにより、複数のランドマークに相当する正面眼底画像中の部分を探索する範囲を制限することができる。血管マップ1051が図10Cの血管マップ1320のような3次元データである場合、血管マップ1051にプロジェクションを適用して2次元データ(図10Bの血管マップ1310のような正面画像)を生成し、この2次元データと正面眼底画像との間のレジストレーションを行ってもよい。 In a first non-limiting example, the registration unit is configured to perform registration between the vascular map 1051 and the frontal fundus image using the blood vessels displayed in the vascular map 1051 as landmarks. Each blood vessel displayed in the vascular map 1051 is located in the area between the circular pattern 1201 and the circular pattern 1202, and the two circular patterns 1201 and 1202 are centered on the optic disc center 1200a and have predetermined radii R1 and R2, respectively. By using this information, it is possible to limit the range for searching for portions in the frontal fundus image corresponding to the landmarks. If the vascular map 1051 is three-dimensional data such as the vascular map 1320 in FIG. 10C, projection may be applied to the vascular map 1051 to generate two-dimensional data (a frontal image such as the vascular map 1310 in FIG. 10B), and registration may be performed between this two-dimensional data and the frontal fundus image.
非限定的な第2の例において、レジストレーション部は、血管マップ1051に提示されている複数の血管のそれぞれの両端部を複数のランドマークとして、血管マップ1051と正面眼底画像との間のレジストレーションを実行するように構成される。血管マップ1051に提示されている各血管の両端部は、円形パターン1201上に位置する第1の端部、及び、円形パターン1202上に位置する第2の端部である。本例においても、第1の例と同様に、探索範囲を制限することができる。血管マップ1051が3次元データである場合の処理手順についても、第1の例と同様であってよい。 In a second non-limiting example, the registration unit is configured to perform registration between the vascular map 1051 and the frontal fundus image using both ends of each of the multiple blood vessels displayed on the vascular map 1051 as multiple landmarks. The both ends of each blood vessel displayed on the vascular map 1051 are a first end located on the circular pattern 1201 and a second end located on the circular pattern 1202. In this example, as in the first example, the search range can be limited. The processing procedure when the vascular map 1051 is three-dimensional data may also be the same as in the first example.
上記した第1及び第2の例では、血管マップ1051中のオブジェクトをランドマークとしてレジストレーションを行っている。これに対し、非限定的な第3の例は、血管マップ1051の生成に使用された複数の断面画像のうちの少なくとも1つの断面画像を利用する。本例のレジストレーション部は、断面画像にプロジェクションを適用することによって、サークル状画像を生成する。断面画像はOCT強度画像であり、血管が低輝度で描出されている。このような断面画像のプロジェクションとして生成されたサークル状画像においても、血管に相当する画素(血管画素)の輝度は、別の組織に相当する画素の輝度よりも低い。このような複数の血管画素を複数のランドマークに用いることにより、サークル状画像と正面眼底画像との間のレジストレーションを行うことができる。このレジストレーションの結果を用いることにより、サークル状画像の元画像である断面画像と正面眼底画像との間のレジストレーションを行うことができ、更に、この断面画像から生成された血管マップ1051と正面眼底画像との間のレジストレーションを行うことが可能である。 In the first and second examples described above, registration is performed using objects in the vascular map 1051 as landmarks. In contrast, a non-limiting third example uses at least one cross-sectional image from the multiple cross-sectional images used to generate the vascular map 1051. The registration unit in this example generates a circular image by applying projection to the cross-sectional image. The cross-sectional image is an OCT intensity image, in which blood vessels are depicted at low brightness. Even in a circular image generated as a projection of such a cross-sectional image, the brightness of pixels corresponding to blood vessels (vascular pixels) is lower than the brightness of pixels corresponding to other tissues. By using such multiple blood vessel pixels as multiple landmarks, registration can be performed between the circular image and the frontal fundus image. Using the results of this registration, registration can be performed between the cross-sectional image, which is the original image of the circular image, and the frontal fundus image. Furthermore, registration can be performed between the vascular map 1051 generated from this cross-sectional image and the frontal fundus image.
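The following is a non-limiting Python sketch of the kind of landmark-based registration described in the first and second examples, under the simplifying assumptions that corresponding landmark pairs have already been found and that a pure translation is sufficient. The annulus mask shows one way of limiting the search range in the frontal fundus image using the disc center and the radii R1 and R2; all function names are hypothetical.

```python
import numpy as np

def annulus_mask(image_shape, disc_center_xy, r1, r2):
    """Boolean mask restricting the landmark search in the frontal fundus
    image to the annulus between the two circular scan patterns
    (radii r1 and r2 around the optic disc center)."""
    yy, xx = np.mgrid[0:image_shape[0], 0:image_shape[1]]
    rr = np.hypot(xx - disc_center_xy[0], yy - disc_center_xy[1])
    return (rr >= r1) & (rr <= r2)

def estimate_translation(map_landmarks, fundus_landmarks):
    """Least-squares translation that registers vessel landmarks of the
    vascular map onto their counterparts found in the frontal fundus image
    (the landmark pairs are assumed to be already matched)."""
    d = np.asarray(fundus_landmarks, float) - np.asarray(map_landmarks, float)
    return d.mean(axis=0)  # (dx, dy)
```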
前述したように、本態様の眼科装置1400(例えば、血管マップ生成部1050、注目血管指定部1070、及び血流動態情報生成部1080のうちのいずれかの要素)は、血管マップ1051に提示されている各血管のドップラー角度を推定することができる。更に、眼科装置1400は、血管マップ1051中の各血管の位置情報とドップラー角度の推定値とを対応付けることによってドップラー角度マップを生成することができる。更に、眼科装置1400は、血管マップ1051中の各血管のドップラー角度の推定値に評価処理を適用して評価情報を生成し、各血管の位置情報と評価情報とを対応付けることによってドップラー角度評価マップを生成することができる。ドップラー角度マップやドップラー角度評価マップは、血管の向きの分布を示す情報を含んでいる。このようなマップを可視化することによって、血管の向きの分布を表現する視覚情報(向き分布画像)を生成することができる。表示制御部1090は、向き分布画像を表示装置1100に表示させることができる。また、表示制御部1090は、正面眼底画像上に向き分布画像を表示させてもよい。正面眼底画像と向き分布画像との間のレジストレーションは、上記のレジストレーション部により実行されてよい。 As described above, the ophthalmic device 1400 of this embodiment (e.g., any of the elements of the vascular map generation unit 1050, the vessel of interest designation unit 1070, and the hemodynamic information generation unit 1080) can estimate the Doppler angle of each blood vessel presented in the vascular map 1051. Furthermore, the ophthalmic device 1400 can generate a Doppler angle map by associating the position information of each blood vessel in the vascular map 1051 with the estimated Doppler angle. Furthermore, the ophthalmic device 1400 can generate evaluation information by applying an evaluation process to the estimated Doppler angle of each blood vessel in the vascular map 1051, and generate a Doppler angle evaluation map by associating the position information of each blood vessel with the evaluation information. Doppler angle maps and Doppler angle evaluation maps contain information indicating the distribution of blood vessel orientation. By visualizing such maps, visual information (orientation distribution image) representing the distribution of blood vessel orientation can be generated. The display control unit 1090 can display the orientation distribution image on the display device 1100. The display control unit 1090 may also display the orientation distribution image on the frontal fundus image. Registration between the frontal fundus image and the orientation distribution image may be performed by the registration unit described above.
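As a non-limiting sketch, the Doppler angle map and the Doppler angle evaluation map described above can be represented as simple associations between vessel positions and values (Python; the data structures and the `evaluate` callable are illustrative assumptions).

```python
def build_doppler_angle_maps(vessel_positions, doppler_angles, evaluate):
    """Associate each vessel's position with its estimated Doppler angle
    (Doppler angle map) and with the corresponding evaluation result
    (Doppler angle evaluation map).

    vessel_positions : dict mapping a vessel identifier to a position,
                       e.g. an (x, y) tuple in the vascular map
    doppler_angles   : dict mapping the same identifiers to angle estimates
    evaluate         : callable turning an angle into evaluation information
    """
    angle_map, evaluation_map = {}, {}
    for vessel_id, position in vessel_positions.items():
        angle = doppler_angles[vessel_id]
        angle_map[position] = angle
        evaluation_map[position] = evaluate(angle)
    return angle_map, evaluation_map
```

The `evaluate` callable could be, for example, the illustrative `evaluate_doppler_angle` sketch given earlier; rendering either dictionary as a color-coded overlay would be one way to obtain the orientation distribution image mentioned above.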
眼科装置1400の非限定的な動作例を説明する。図12は、眼科装置1400の1つの動作例を示す。ステップS11~S15は、それぞれ、図9におけるステップS1~S5と同じ要領で実行されてよい。 A non-limiting example of the operation of the ophthalmic device 1400 will now be described. Figure 12 shows one example of the operation of the ophthalmic device 1400. Steps S11 to S15 may be performed in the same manner as steps S1 to S5 in Figure 9, respectively.
ステップS16において、注目血管指定部1070は、ステップS15で生成された血管マップ1051に提示されている各血管の向き情報(ドップラー角度、好適度など)を生成する。 In step S16, the vessel of interest designation unit 1070 generates orientation information (Doppler angle, suitability, etc.) for each blood vessel presented in the vessel map 1051 generated in step S15.
ステップS17において、注目血管指定部1070は、ステップS16で生成された各血管の向き情報に基づいて、血管マップ1051に提示されている複数の血管のうちから注目血管を指定する。なお、向き情報に加えて又は向き情報の代わりに、別の選択基準を参照して注目血管を指定してもよい。指定された注目血管の情報は、スキャン制御部1060に送られる。 In step S17, the vessel of interest designation unit 1070 designates a vessel of interest from among the multiple blood vessels presented in the vessel map 1051, based on the orientation information of each blood vessel generated in step S16. Note that in addition to or instead of the orientation information, the vessel of interest may be designated with reference to other selection criteria. Information on the designated vessel of interest is sent to the scan control unit 1060.
ステップS18において、スキャン制御部1060は、ステップS17で指定された注目血管に対して、血流動態計測(OCT血流計測)のための反復スキャンを適用するように、スキャン部1010を制御する。 In step S18, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessel of interest designated in step S17.
指定された注目血管に相当する眼底Efの血管は、例えば、眼底Efの赤外観察画像(リアルタイム動画像)を参照して探索され特定される。このとき、前述したレジストレーションと同様の処理によって赤外観察画像と血管マップ1051との間の位置合わせを行ってもよい。 The blood vessels in the fundus Ef corresponding to the specified blood vessels of interest are searched for and identified, for example, by referring to an infrared observation image (real-time moving image) of the fundus Ef. At this time, alignment between the infrared observation image and the blood vessel map 1051 may be performed using a process similar to the registration described above.
ステップS18において、スキャン制御部1060は、注目血管に対する反復スキャン(前述の主走査)をスキャン部1010に実行させる制御に加えて、ドップラー角度を新たに求めるためのスキャン(前述の補足走査)をスキャン部1010に実行させる制御も行ってもよい。反復的スキャンにより収集されたOCT血流計測データ1075(及び、補足走査により収集されたデータ)は、血流動態情報生成部1080に送られる。 In step S18, the scan control unit 1060 may control the scan unit 1010 to perform a repetitive scan (the aforementioned main scan) on the blood vessel of interest, as well as to perform a scan (the aforementioned supplementary scan) to newly determine the Doppler angle. The OCT blood flow measurement data 1075 collected by the repetitive scan (and the data collected by the supplementary scan) is sent to the hemodynamic information generation unit 1080.
ステップS19において、血流動態情報生成部1080は、ステップS18の反復スキャンにより収集されたOCT血流計測データ1075(及び、補足走査により収集されたデータ)に基づいて、注目血管指定部1070により指定された注目血管における血流動態情報1085を生成する。 In step S19, the hemodynamic information generation unit 1080 generates hemodynamic information 1085 for the blood vessel of interest designated by the vessel of interest designation unit 1070 based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S18 (and the data collected by the supplementary scan).
いくつかの態様では、血流動態情報生成部1080は、ステップS18で注目血管から収集されたOCT血流計測データ1075と、ステップS16で算出された注目血管の向き情報(ドップラー角度)とに基づいて、注目血管における血流速度を算出する。 In some embodiments, the hemodynamic information generation unit 1080 calculates the blood flow velocity in the blood vessel of interest based on the OCT blood flow measurement data 1075 collected from the blood vessel of interest in step S18 and the orientation information (Doppler angle) of the blood vessel of interest calculated in step S16.
別のいくつかの態様では、血流動態情報生成部1080は、ステップS18の補足走査で収集されたデータに基づき注目血管のドップラー角度を算出し、このドップラー角度とステップS18の主走査で収集されたOCT血流計測データ1075とに基づいて、注目血管における血流速度を算出する。 In some other aspects, the hemodynamic information generation unit 1080 calculates the Doppler angle of the blood vessel of interest based on the data collected in the supplementary scan of step S18, and calculates the blood flow velocity in the blood vessel of interest based on this Doppler angle and the OCT blood flow measurement data 1075 collected in the main scan of step S18.
更に、血流動態情報生成部1080は、注目血管の血管径を算出する処理と、この血管径と血流速度とに基づき血流量を算出する処理とを実行してもよい。 Furthermore, the hemodynamic information generation unit 1080 may perform a process of calculating the vascular diameter of the blood vessel of interest and a process of calculating the blood flow volume based on this vascular diameter and the blood flow velocity.
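For illustration, the velocity and flow-volume calculations can be sketched as follows (Python). The phase-to-velocity relation v = λ·Δφ / (4π·n·T·cosθ) is a commonly cited Doppler-OCT relation, and the default parameter values (center wavelength, tissue refractive index, inter-scan time) are generic assumptions for the sketch rather than values taken from this disclosure; the actual computation performed by the hemodynamic information generation unit 1080 may differ.

```python
import math

def blood_flow_velocity(phase_shift_rad, doppler_angle_deg,
                        wavelength_m=1.05e-6, n_tissue=1.38,
                        interscan_time_s=1.0e-4):
    """Convert a measured Doppler phase shift into a flow velocity using the
    commonly cited relation v = lambda * dphi / (4 * pi * n * T * cos(theta)),
    where theta is the Doppler angle between the measurement beam and the
    vessel.  The default parameter values are generic placeholders."""
    theta = math.radians(doppler_angle_deg)
    return (wavelength_m * phase_shift_rad) / (
        4.0 * math.pi * n_tissue * interscan_time_s * math.cos(theta))

def blood_flow_volume(mean_velocity_m_s, vessel_diameter_m):
    """Blood flow volume = mean velocity x vessel cross-sectional area."""
    area = math.pi * (vessel_diameter_m / 2.0) ** 2
    return mean_velocity_m_s * area
```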
ステップS20において、表示制御部1090は、ステップS19で生成された血流動態情報(血流速度、血流量、血管径、ドップラー角度など)を表示装置1100に表示させる。表示制御部1090は、本動作例で得られた任意の情報(画像、演算結果、解析結果など)を表示装置1100に表示させてもよい。例えば、血流動態情報生成部1080が、ドップラー角度に評価処理を適用して評価情報を生成し、更に、表示制御部1090が、この評価情報を表示装置1100に表示させることができる。 In step S20, the display control unit 1090 causes the display device 1100 to display the hemodynamic information (blood flow velocity, blood flow volume, blood vessel diameter, Doppler angle, etc.) generated in step S19. The display control unit 1090 may also cause the display device 1100 to display any information obtained in this operation example (images, calculation results, analysis results, etc.). For example, the hemodynamic information generation unit 1080 may apply evaluation processing to the Doppler angle to generate evaluation information, and the display control unit 1090 may then cause the display device 1100 to display this evaluation information.
本態様に係る眼科装置1400は、眼科装置1000の作用及び効果に加えて、以下に説明する作用及び効果を奏する。 The ophthalmic device 1400 according to this embodiment has the functions and advantages described below in addition to the functions and advantages of the ophthalmic device 1000.
眼科装置1400は、眼底の血流動態計測を行うことができる。更に、眼科装置1400は、血管マップに基づき注目血管を自動で指定することができ、又は、血管マップに基づき注目血管を指定する作業を支援することができる。加えて、眼科装置1400は、血管マップを利用して指定された注目血管を対象とする血流動態計測を自動で行うことができる。これにより、血流動態計測を適用する位置を指定する作業の容易化や省力化を促進することができ、当該作業の精度や正確度を向上することができる。したがって、本態様の眼科装置1400は、検査の容易化、省力化、迅速化を促進するものであり、検査の品質向上に貢献する。 The ophthalmic device 1400 is capable of measuring the hemodynamics of the fundus. Furthermore, the ophthalmic device 1400 can automatically designate a blood vessel of interest based on a vascular map, or can assist in the task of designating a blood vessel of interest based on a vascular map. In addition, the ophthalmic device 1400 can automatically perform hemodynamic measurements of a designated blood vessel of interest using a vascular map. This facilitates and reduces the labor required to designate the position at which to apply hemodynamic measurements, and improves the precision and accuracy of this task. Therefore, the ophthalmic device 1400 of this embodiment facilitates easier, more labor-saving, and faster examinations, contributing to improved examination quality.
眼科装置1400は、眼底イメージングで得られた画像から血管走行の情報を取得して眼底血管の向き情報を生成するという、新規な手法を提供するものである。眼科装置1400により生成される向き情報は、血流計測位置を指定する作業の容易化や省力化に寄与するものであり、当該作業の精度や正確度の向上に寄与するものである。また、眼科装置1400により生成される向き情報は、ドップラー角の大きさやその評価結果を提供するものであるから、好適な血流計測位置を予測することを可能にする。 The ophthalmic device 1400 provides a novel method of generating orientation information for fundus blood vessels by obtaining information on the course of blood vessels from images obtained by fundus imaging. The orientation information generated by the ophthalmic device 1400 contributes to simplification and labor savings in the task of specifying blood flow measurement positions, and contributes to improved precision and accuracy of the task. Furthermore, the orientation information generated by the ophthalmic device 1400 provides the magnitude of the Doppler angle and its evaluation results, making it possible to predict the optimal blood flow measurement position.
眼科装置1400は、血管マップ、眼底の画像、ドップラー角度、評価情報といった各種の情報を示す視覚情報を提供することができる。これにより、ユーザーは、被検眼(眼底)に関する情報を把握することができる。 The ophthalmologic device 1400 can provide visual information showing various types of information, such as vascular maps, fundus images, Doppler angles, and evaluation information. This allows the user to understand information about the subject's eye (fundus).
図13は、1つの非限定的な態様に係る眼科装置1500の構成を示す。眼科装置1500の要素のうち、図11の眼科装置1400の要素と同じ名称を有し且つ同じ符号を付されたものは、特に言及しない限り、眼科装置1400の当該対応要素と同様の構成及び機能を有していてよい。ただし、当該対応要素の変形手段、均等手段、代替手段を採用することは、除外されない。 Figure 13 shows the configuration of an ophthalmic device 1500 according to one non-limiting embodiment. Elements of the ophthalmic device 1500 that have the same names and symbols as elements of the ophthalmic device 1400 in Figure 11 may have the same configuration and function as the corresponding elements of the ophthalmic device 1400, unless otherwise specified. However, this does not exclude the use of modified, equivalent, or alternative means for the corresponding elements.
眼科装置1400と同様の要素として、眼科装置1500は、スキャン部1010と、断面画像生成部1020と、血管領域特定部1030と、血管領域対応付け部1040と、血管マップ生成部1050と、スキャン制御部1060と、血流動態情報生成部1080とを有する。これらに加えて、眼科装置1500は、観察画像生成部1110と、移動制御部1120と、移動機構1130とを備えている。 The ophthalmic device 1500 has the same elements as the ophthalmic device 1400: a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, a vascular map generating unit 1050, a scan control unit 1060, and a blood flow dynamics information generating unit 1080. In addition to these, the ophthalmic device 1500 also has an observation image generating unit 1110, a movement control unit 1120, and a movement mechanism 1130.
観察画像生成部1110は、眼底Efの赤外観察画像(リアルタイム動画像)を生成するように構成されている。非限定的な態様の眼科装置1は、眼底カメラユニット2を用いて観察画像生成部1110の機能を実現することができる。 The observation image generation unit 1110 is configured to generate an infrared observation image (real-time moving image) of the fundus oculi Ef. A non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the observation image generation unit 1110 using the fundus camera unit 2.
移動機構1130は、スキャン部1010を移動するように構成されている。非限定的な態様の眼科装置1は、移動機構150を用いて移動機構1130の機能を実現することができる。 The movement mechanism 1130 is configured to move the scanning unit 1010. In a non-limiting embodiment, the ophthalmologic apparatus 1 can achieve the functions of the movement mechanism 1130 using the movement mechanism 150.
移動制御部1120は、移動機構1130を制御するように構成されている。移動制御部1120は、回路を含むハードウェアと、移動制御ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、制御部210(主制御部211)及びデータ処理部230を用いて移動制御部1120の機能を実現することができる。 The movement control unit 1120 is configured to control the movement mechanism 1130. The movement control unit 1120 is realized by cooperation between hardware including circuits and movement control software. A non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the movement control unit 1120 using the control unit 210 (main control unit 211) and the data processing unit 230.
血管マップ生成部1050により生成された血管マップ1051は、移動制御部1120に入力される。また、観察画像生成部1110により生成される赤外観察画像は、移動制御部1120にリアルタイムで入力される。移動制御部1120は、血管マップ1051と赤外観察画像(リアルタイム動画像のフレーム)とを比較して、血管マップ1051と赤外観察画像の間の偏位を算出し、この偏位に基づいて移動機構1130を制御する。血管マップ1051と赤外観察画像の間の偏位の算出は、例えば、前述したレジストレーションと同様の手法で実行されてよい。移動制御部1120は、算出された偏位を打ち消すように移動機構1130を制御する。具体的には、移動制御部1120は、血管マップ1051に対する赤外観察画像の各フレーム(又は、間引き処理を介して得られた各フレーム)の偏位を算出し、逐次に算出される各偏位を所定の閾値と比較する。移動制御部1120は、この一連の処理を、偏位が閾値よりも小さくなるまで繰り返す。これにより、血管マップ1051に対する赤外観察画像の位置合わせが達成され、血管マップ1051を用いて行われる新規のアライメントが実現される。なお、ここに説明した一連の処理と並行して、血管マップ1051を生成するためのOCTスキャン及びデータ処理を繰り返し実行することによって、リアルタイムで更新される血管マップ1051と赤外観察画像(リアルタイム動画像)とを用いてアライメントを行ってもよい。 The vascular map 1051 generated by the vascular map generation unit 1050 is input to the movement control unit 1120. Furthermore, the infrared observation image generated by the observation image generation unit 1110 is input to the movement control unit 1120 in real time. The movement control unit 1120 compares the vascular map 1051 with the infrared observation image (a frame of a real-time moving image), calculates the deviation between the vascular map 1051 and the infrared observation image, and controls the movement mechanism 1130 based on this deviation. Calculation of the deviation between the vascular map 1051 and the infrared observation image may be performed, for example, using a method similar to the registration described above. The movement control unit 1120 controls the movement mechanism 1130 so as to cancel out the calculated deviation. Specifically, the movement control unit 1120 calculates the deviation of each frame of the infrared observation image (or each frame obtained through thinning processing) relative to the vascular map 1051, and compares each sequentially calculated deviation with a predetermined threshold. The movement control unit 1120 repeats this series of processes until the deviation becomes smaller than the threshold value. This achieves alignment of the infrared observation image with the vascular map 1051, and realizes new alignment using the vascular map 1051. Note that in parallel with the series of processes described here, alignment may be performed using the vascular map 1051, which is updated in real time, and the infrared observation image (real-time moving image), by repeatedly performing OCT scanning and data processing to generate the vascular map 1051.
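The alignment loop described above can be summarized by the following non-limiting Python sketch, in which `get_ir_frame`, `compute_shift`, and `move_stage` are hypothetical callables standing in for the observation image generation unit 1110, the deviation calculation, and the movement mechanism 1130, respectively.

```python
def run_alignment(vascular_map, get_ir_frame, compute_shift, move_stage,
                  threshold_px=2.0, max_iters=100):
    """Iteratively align the apparatus to the eye: estimate the deviation of
    each infrared observation frame relative to the vascular map and drive
    the movement mechanism so as to cancel it, until the deviation falls
    below the threshold."""
    for _ in range(max_iters):
        frame = get_ir_frame()                       # latest observation frame
        dx, dy = compute_shift(vascular_map, frame)  # deviation (e.g. by registration)
        if (dx * dx + dy * dy) ** 0.5 < threshold_px:
            return True                              # alignment achieved
        move_stage(-dx, -dy)                         # move so as to cancel the deviation
    return False
```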
眼科装置1500の非限定的な動作例を説明する。図14は、眼科装置1500の1つの動作例を示す。ステップS31~S35は、それぞれ、図9におけるステップS1~S5と同じ要領で実行されてよい。 A non-limiting example of the operation of the ophthalmic device 1500 will now be described. Figure 14 shows one example of the operation of the ophthalmic device 1500. Steps S31 to S35 may be performed in the same manner as steps S1 to S5 in Figure 9, respectively.
ステップS36において、眼科装置1500は、アライメントを開始する。具体的には、眼科装置1500は、観察画像生成部1110による眼底Efの赤外観察画像の生成を開始し、移動制御部1120及び移動機構1130の動作を開始する。 In step S36, the ophthalmic device 1500 starts alignment. Specifically, the ophthalmic device 1500 starts generating an infrared observation image of the fundus oculi Ef using the observation image generation unit 1110, and starts operation of the movement control unit 1120 and the movement mechanism 1130.
ステップS37において、移動制御部1120は、ステップS35で生成された血管マップ1051に対する赤外観察画像の偏位を算出し、算出された偏位を閾値と比較する。この処理は、偏位が閾値よりも小さくなるまで繰り返し実行される(ステップS38:No)。血管マップ1051に対する赤外観察画像の偏位が閾値よりも小さくなったら(ステップS38:Yes)、処理手順はステップS39に移行する。 In step S37, the movement control unit 1120 calculates the deviation of the infrared observation image relative to the vascular map 1051 generated in step S35 and compares the calculated deviation with a threshold value. This process is repeated until the deviation becomes smaller than the threshold value (step S38: No). When the deviation of the infrared observation image relative to the vascular map 1051 becomes smaller than the threshold value (step S38: Yes), the processing procedure proceeds to step S39.
ステップS39において、スキャン制御部1060は、眼底Efの注目血管に対して、血流動態計測(OCT血流計測)のための反復スキャンを適用するように、スキャン部1010を制御する。注目血管の指定は、前述した眼科装置1400(注目血管指定部1070)と同様の手法で実行されてよい。 In step S39, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessels of interest in the fundus oculi Ef. The blood vessels of interest may be designated in the same manner as in the ophthalmic apparatus 1400 (blood vessel designation unit 1070) described above.
ステップS40において、血流動態情報生成部1080は、ステップS39の反復スキャンにより収集されたOCT血流計測データ1075に基づいて、注目血管における血流動態情報を生成する。 In step S40, the hemodynamic information generation unit 1080 generates hemodynamic information for the blood vessel of interest based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S39.
本態様に係る眼科装置1500は、眼科装置1400の作用及び効果に加えて、OCT画像から得られる眼底血管の位置を示す血管マップを利用した新規なアライメント手法を提供することが可能である。 In addition to the functions and effects of the ophthalmic device 1400, the ophthalmic device 1500 according to this embodiment can provide a novel alignment method that utilizes a vascular map that indicates the position of fundus blood vessels obtained from OCT images.
図15は、1つの非限定的な態様に係る眼科装置1600の構成を示す。眼科装置1600の要素のうち、図11の眼科装置1400の要素と同じ名称を有し且つ同じ符号を付されたものは、特に言及しない限り、眼科装置1400の当該対応要素と同様の構成及び機能を有していてよい。ただし、当該対応要素の変形手段、均等手段、代替手段を採用することは、除外されない。 Figure 15 shows the configuration of an ophthalmic device 1600 according to one non-limiting embodiment. Elements of the ophthalmic device 1600 that have the same names and are assigned the same reference numerals as elements of the ophthalmic device 1400 in Figure 11 may have the same configuration and function as the corresponding elements of the ophthalmic device 1400, unless otherwise specified. However, this does not exclude the adoption of modified, equivalent, or alternative means for the corresponding elements.
眼科装置1400と同様の要素として、眼科装置1600は、スキャン部1010と、断面画像生成部1020と、血管領域特定部1030と、血管領域対応付け部1040と、血管マップ生成部1050と、スキャン制御部1060と、血流動態情報生成部1080とを有する。これらに加えて、眼科装置1600は、移動機構1130と、移動制御部1140と、プロジェクション画像生成部1150とを備えている。移動機構1130は、図13の眼科装置1500の移動機構1130と同様の構成及び機能を有する。 The ophthalmic device 1600 has the same elements as the ophthalmic device 1400, including a scanning unit 1010, a cross-sectional image generating unit 1020, a vascular region identifying unit 1030, a vascular region matching unit 1040, a vascular map generating unit 1050, a scan control unit 1060, and a hemodynamic information generating unit 1080. In addition to these, the ophthalmic device 1600 has a movement mechanism 1130, a movement control unit 1140, and a projection image generating unit 1150. The movement mechanism 1130 has the same configuration and functions as the movement mechanism 1130 of the ophthalmic device 1500 in FIG. 13.
本例では、血管マップ1051は、図10Cの血管マップ1320のような3次元データである。プロジェクション画像生成部1150は、眼底Efに対するOCTスキャンにおけるAスキャン方向(z方向)へのプロジェクションを血管マップ1051に適用することによってプロジェクション画像を生成する。この血管マップ1051のプロジェクションは、血管マップ1051に対するプロジェクションに限定されず、血管マップ1051の元画像である複数の断面画像1021に対するプロジェクションでもよい。 In this example, the vascular map 1051 is three-dimensional data such as the vascular map 1320 in Figure 10C. The projection image generation unit 1150 generates a projection image by applying a projection in the A-scan direction (z direction) of the OCT scan of the fundus Ef to the vascular map 1051. The projection used to generate this image is not limited to a projection applied to the vascular map 1051 itself; it may instead be applied to the multiple cross-sectional images 1021 from which the vascular map 1051 was generated.
血管マップ1051が図10Bの血管マップ1310のような2次元データである場合、プロジェクション画像生成部1150が設けられている必要は無い。或いは、血管マップ1051が3次元データである場合には動作し、血管マップ1051が2次元データである場合には動作しないように構成されたプロジェクション画像生成部1150を設けてもよい。 If the vascular map 1051 is two-dimensional data such as the vascular map 1310 in Figure 10B, there is no need to provide a projection image generation unit 1150. Alternatively, a projection image generation unit 1150 may be provided that is configured to operate when the vascular map 1051 is three-dimensional data and not operate when the vascular map 1051 is two-dimensional data.
プロジェクション画像生成部1150は、回路を含むハードウェアと、プロジェクション画像生成ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いてプロジェクション画像生成部1150の機能を実現することができる。 The projection image generation unit 1150 is realized by the cooperation of hardware including circuits and projection image generation software. The ophthalmologic apparatus 1 in a non-limiting embodiment can realize the functions of the projection image generation unit 1150 using the data processing unit 230.
移動制御部1140は、移動機構1130を制御するように構成されている。移動制御部1140は、回路を含むハードウェアと、移動制御ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、制御部210(主制御部211)及びデータ処理部230を用いて移動制御部1140の機能を実現することができる。 The movement control unit 1140 is configured to control the movement mechanism 1130. The movement control unit 1140 is realized by cooperation between hardware including circuits and movement control software. A non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the movement control unit 1140 using the control unit 210 (main control unit 211) and the data processing unit 230.
プロジェクション画像生成部1150により血管マップ1051から生成されたプロジェクション画像は、移動制御部1140に入力される。本態様では、血管マップ1051が繰り返し生成され、各血管マップ1051(又は、間引き処理を介して得られた各血管マップ1051)からプロジェクション画像が生成される。 The projection image generated from the vascular map 1051 by the projection image generation unit 1150 is input to the movement control unit 1140. In this embodiment, the vascular map 1051 is repeatedly generated, and a projection image is generated from each vascular map 1051 (or each vascular map 1051 obtained through thinning processing).
眼科装置1600の非限定的な動作例を説明する。図16は、眼科装置1600の1つの動作例を示す。ステップS51~S55は、それぞれ、図9におけるステップS1~S5と同じ要領で実行されてよい。ステップS51~S55は、繰り返し実行される。これにより、血管マップ1051が逐次に生成される。血管マップ1051を生成する時間間隔、つまり、眼底Efに対してスキャンを適用する時間間隔は、一定であってよい。逐次に生成される血管マップ1051は、プロジェクション画像生成部1150に逐次に入力される。 A non-limiting example of the operation of the ophthalmic device 1600 will now be described. Figure 16 shows one example of the operation of the ophthalmic device 1600. Steps S51 to S55 may be executed in the same manner as steps S1 to S5 in Figure 9, respectively. Steps S51 to S55 are executed repeatedly. As a result, a blood vessel map 1051 is generated sequentially. The time interval for generating the blood vessel map 1051, i.e., the time interval for applying a scan to the fundus Ef, may be constant. The blood vessel map 1051 generated sequentially is input sequentially to the projection image generation unit 1150.
ステップS56において、プロジェクション画像生成部1150は、逐次に入力される血管マップ1051に対してプロジェクションを適用してプロジェクション画像を生成する。血管マップ1051の入力に対応して逐次に生成されるプロジェクション画像は、移動制御部1140に逐次に入力される。 In step S56, the projection image generation unit 1150 applies projection to the vascular map 1051 that is sequentially input to generate a projection image. The projection images that are sequentially generated in response to the input of the vascular map 1051 are sequentially input to the movement control unit 1140.
ステップS57において、移動制御部1140は、入力された2つのプロジェクション画像を比較してこれらプロジェクション画像の間の偏位を算出し、算出された偏位を閾値と比較する。この処理は、偏位が閾値よりも小さくなるまで繰り返し実行される(ステップS58:No)。偏位が閾値よりも小さくなったら(ステップS58:Yes)、処理手順はステップS59に移行する。偏位が閾値よりも小さくなったことは、被検眼Eの動きが十分に小さく、被検眼Eの位置が安定していることを意味する。 In step S57, the movement control unit 1140 compares the two input projection images, calculates the deviation between these projection images, and compares the calculated deviation with a threshold value. This process is repeated until the deviation becomes smaller than the threshold value (step S58: No). When the deviation becomes smaller than the threshold value (step S58: Yes), the processing procedure proceeds to step S59. The deviation becoming smaller than the threshold value means that the movement of the subject's eye E is sufficiently small and the position of the subject's eye E is stable.
ステップS57の処理は、例えば、連続する2つのプロジェクション画像から血管に相当する接続領域を検出する工程と、検出された接続領域をランドマークとして2つのプロジェクション画像の間における位置の誤差を算出する工程と、この位置誤差(偏位)を閾値と比較する工程とを含む。 The processing of step S57 includes, for example, a step of detecting connection regions corresponding to blood vessels from two consecutive projection images, a step of calculating the positional error between the two projection images using the detected connection regions as landmarks, and a step of comparing this positional error (deviation) with a threshold value.
プロジェクション画像間の偏位の評価に加えて、単一のプロジェクション画像におけるランドマークの分布に基づくアライメント評価を行ってもよい。例えば、単一のプロジェクション画像におけるランドマークの分布に基づいて、眼底Efの特徴点の位置を推定することができる。ステップS51のスキャンは、乳頭中心を中心とするサークルスキャンであり、単一のプロジェクション画像におけるランドマークは、サークル上に配置されている。よって、これらランドマークを結んで形成される円の中心は、乳頭中心に相当する。移動制御部1140は、このようにして検出される円の中心がプロジェクション画像の中心位置に配置されるように、移動機構1130を制御することができる。より一般に、移動制御部1140は、検出される円の中心がプロジェクション画像の所定位置に配置されるように、移動機構1130を制御することができる。 In addition to evaluating the deviation between projection images, alignment evaluation may be performed based on the distribution of landmarks in a single projection image. For example, the positions of feature points of the fundus oculi Ef can be estimated based on the distribution of landmarks in a single projection image. The scan in step S51 is a circle scan centered on the optic disc center, and the landmarks in a single projection image are arranged on a circle. Therefore, the center of the circle formed by connecting these landmarks corresponds to the optic disc center. The movement control unit 1140 can control the movement mechanism 1130 so that the center of the circle detected in this manner is positioned at the center of the projection image. More generally, the movement control unit 1140 can control the movement mechanism 1130 so that the detected circle center is positioned at a predetermined position in the projection image.
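A non-limiting sketch of estimating the disc-center position from the landmark distribution in a single projection image is shown below (Python). It uses an ordinary least-squares circle fit, which is only one of several possible ways to realize the estimation described above; the function name is hypothetical.

```python
import numpy as np

def fit_circle_center(landmarks_xy):
    """Least-squares (Kasa) circle fit to the vessel landmarks detected in a
    single projection image.  Because the landmarks lie on the circular scan
    trajectory, the fitted center approximates the optic disc center."""
    pts = np.asarray(landmarks_xy, float)
    x, y = pts[:, 0], pts[:, 1]
    # Solve x^2 + y^2 = 2*cx*x + 2*cy*y + c in the least-squares sense
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x * x + y * y
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([coef[0] / 2.0, coef[1] / 2.0])  # (cx, cy)
```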
ステップS59において、スキャン制御部1060は、眼底Efの注目血管に対して、血流動態計測(OCT血流計測)のための反復スキャンを適用するように、スキャン部1010を制御する。注目血管の指定は、前述した眼科装置1400(注目血管指定部1070)と同様の手法で実行されてよい。 In step S59, the scan control unit 1060 controls the scan unit 1010 to apply repeated scans for hemodynamic measurement (OCT blood flow measurement) to the blood vessels of interest in the fundus oculi Ef. The blood vessels of interest may be designated in the same manner as in the ophthalmic apparatus 1400 (blood vessel designation unit 1070) described above.
ステップS60において、血流動態情報生成部1080は、ステップS59の反復スキャンにより収集されたOCT血流計測データ1075に基づいて、注目血管における血流動態情報を生成する。 In step S60, the hemodynamic information generation unit 1080 generates hemodynamic information for the blood vessel of interest based on the OCT blood flow measurement data 1075 collected by the repeated scan in step S59.
本態様に係る眼科装置1600は、眼科装置1400の作用及び効果に加えて、OCT画像から得られる眼底血管の位置を示す血管マップを利用した新規なアライメント手法を提供することが可能である。 In addition to the functions and effects of the ophthalmic device 1400, the ophthalmic device 1600 according to this embodiment can provide a novel alignment method that utilizes a vascular map that indicates the position of fundus blood vessels obtained from OCT images.
図17は、1つの非限定的な態様に係る眼科装置2000の構成を示す。眼科装置2000は、スキャン部2010と、3次元画像生成部2020と、断面画像抽出部2030と、血管領域特定部2040と、血管領域対応付け部2050と、血管マップ生成部2060とを備えている。 FIG. 17 shows the configuration of an ophthalmic device 2000 according to one non-limiting embodiment. The ophthalmic device 2000 includes a scanning unit 2010, a three-dimensional image generating unit 2020, a cross-sectional image extracting unit 2030, a vascular region identifying unit 2040, a vascular region matching unit 2050, and a vascular map generating unit 2060.
図7の眼科装置1000に関する事項、図11の眼科装置1400に関する事項、図13の眼科装置1500に関する事項、図15の眼科装置1600に関する事項など、本開示に記載又は示唆されている任意の事項を、本態様の眼科装置2000に組み合わせることができる。 Any of the matters described or suggested in this disclosure, such as the matters relating to the ophthalmic device 1000 in FIG. 7, the matters relating to the ophthalmic device 1400 in FIG. 11, the matters relating to the ophthalmic device 1500 in FIG. 13, and the matters relating to the ophthalmic device 1600 in FIG. 15, can be combined with the ophthalmic device 2000 of this embodiment.
スキャン部2010は、前述した眼科装置1000のスキャン部1010と同様に、被検眼Eの眼底EfにOCTスキャンを適用してデータを収集するように構成されている。特に、スキャン部2010は、眼底Efの3次元領域にOCTスキャンを適用してデータを収集する。このOCTスキャンの非限定的な例として、ラスタースキャン、リサジュースキャンなどがある。非限定的な態様の眼科装置1は、眼底カメラユニット2及びOCTユニット100を用いてスキャン部2010の機能を実現することができる。 The scanning unit 2010, like the scanning unit 1010 of the ophthalmic apparatus 1000 described above, is configured to collect data by applying an OCT scan to the fundus Ef of the subject's eye E. In particular, the scanning unit 2010 collects data by applying an OCT scan to a three-dimensional region of the fundus Ef. Non-limiting examples of this OCT scan include a raster scan and a Lissajous scan. A non-limiting aspect of the ophthalmic apparatus 1 can realize the functions of the scanning unit 2010 using the fundus camera unit 2 and the OCT unit 100.
3次元画像生成部2020は、スキャン部2010により眼底Efの3次元領域から収集されたデータに基づいて眼底Efの3次元画像を生成するように構成されている。生成される3次元画像は、例えば、スタックデータ又はボリュームデータであってよい。3次元画像生成部2020は、回路を含むハードウェアと、3次元画像生成ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230(画像生成部220)を用いて3次元画像生成部2020の機能を実現することができる。 The three-dimensional image generating unit 2020 is configured to generate a three-dimensional image of the fundus oculi Ef based on data collected from a three-dimensional region of the fundus oculi Ef by the scanning unit 2010. The generated three-dimensional image may be, for example, stack data or volume data. The three-dimensional image generating unit 2020 is realized by cooperation between hardware including circuits and three-dimensional image generating software. A non-limiting aspect of the ophthalmologic apparatus 1 can realize the functions of the three-dimensional image generating unit 2020 using the data processing unit 230 (image generating unit 220).
断面画像抽出部2030は、3次元画像生成部2020により生成された3次元画像から断面画像を抽出する。特に、断面画像抽出部2030は、眼底Efの複数の断面にそれぞれ対応する複数の断面画像を3次元画像から抽出する。複数の断面は、フライバックを伴わないスキャンパターンに相当する。フライバックを伴わないスキャンパターンは、例えば、前述した眼科装置1000のスキャン部1010に関する説明におけるいずれかのスキャンパターンであってよく、例えば、図8A~図8Cに示す同心円スキャンであってよい。断面画像抽出部2030は、回路を含むハードウェアと、断面画像抽出ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて断面画像抽出部2030の機能を実現することができる。 The cross-sectional image extraction unit 2030 extracts cross-sectional images from the three-dimensional image generated by the three-dimensional image generation unit 2020. In particular, the cross-sectional image extraction unit 2030 extracts multiple cross-sectional images corresponding to multiple cross sections of the fundus oculi Ef from the three-dimensional image. The multiple cross sections correspond to a scan pattern without a flyback. The scan pattern without a flyback may be, for example, any of the scan patterns described above regarding the scan unit 1010 of the ophthalmic device 1000, such as the concentric circular scan shown in Figures 8A to 8C. The cross-sectional image extraction unit 2030 is realized by cooperation between hardware including circuits and cross-sectional image extraction software. A non-limiting aspect of the ophthalmic device 1 can realize the functions of the cross-sectional image extraction unit 2030 using the data processing unit 230.
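As an illustrative sketch of extracting a cross-sectional image corresponding to a circle scan from a 3D image, the following snippet samples a volume along a circle in the en-face plane at every depth, which unrolls the side surface of a cylinder into a 2D image. The volume axis order, the interpolation order, and the circle parameters are assumptions for this example only.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_circle_cross_section(volume: np.ndarray, cx: float, cy: float,
                                 radius_px: float, n_theta: int = 512) -> np.ndarray:
    """Sample a volume along a circle in the en-face (x, y) plane.

    `volume` is assumed to be ordered (x, y, z) with z the A-scan (depth) axis.
    The result has shape (n_theta, n_depth), i.e. the side surface of a cylinder
    unrolled into a 2-D cross-sectional image.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    xs = cx + radius_px * np.cos(theta)
    ys = cy + radius_px * np.sin(theta)
    n_depth = volume.shape[2]
    zs = np.arange(n_depth)
    # Build (3, n_theta, n_depth) coordinate arrays for trilinear interpolation.
    coords = np.stack([
        np.repeat(xs[:, None], n_depth, axis=1),
        np.repeat(ys[:, None], n_depth, axis=1),
        np.repeat(zs[None, :], n_theta, axis=0),
    ])
    return map_coordinates(volume, coords, order=1, mode="nearest")

# Toy volume (128 x 128 x 64) and two concentric circles around the volume center.
vol = np.random.rand(128, 128, 64)
inner = extract_circle_cross_section(vol, 64, 64, 20)
outer = extract_circle_cross_section(vol, 64, 64, 30)
print(inner.shape, outer.shape)  # (512, 64) (512, 64)
```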
血管領域特定部2040は、断面画像抽出部2030により3次元画像から抽出された複数の断面画像のそれぞれから血管領域群を検出することによって、これら複数の断面画像にそれぞれ対応する複数の血管領域群を特定するように構成されている。血管領域特定部2040は、前述した眼科装置1000の血管領域特定部1030と同様の機能を有する。血管領域特定部2040は、回路を含むハードウェアと、血管領域特定ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230(血管領域特定部231)を用いて血管領域特定部2040の機能を実現することができる。 The vascular region identification unit 2040 is configured to detect vascular region groups from each of the multiple cross-sectional images extracted from the three-dimensional image by the cross-sectional image extraction unit 2030, and thereby identify multiple vascular region groups corresponding to these multiple cross-sectional images. The vascular region identification unit 2040 has the same functions as the vascular region identification unit 1030 of the ophthalmic device 1000 described above. The vascular region identification unit 2040 is realized by cooperation between hardware including circuits and vascular region identification software. A non-limiting aspect of the ophthalmic device 1 can realize the functions of the vascular region identification unit 2040 using the data processing unit 230 (vascular region identification unit 231).
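One common heuristic for locating vessel regions in an OCT intensity cross section is that blood vessels attenuate the signal and cast shadows, so the depth projection shows locally darker bands at vessel positions. The sketch below applies that heuristic; the assumption that vessels appear dark, the smoothing width, and the relative threshold are illustrative choices, not the method actually used by the vascular region identification unit 2040.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d, label

def detect_vessel_regions(cross_section: np.ndarray, rel_drop: float = 0.9,
                          smooth: int = 9) -> list[tuple[int, int]]:
    """Detect candidate vessel regions in a cross-sectional image.

    The image is assumed to have shape (n_positions, n_depth). It is projected
    along the depth (A-scan) axis, and positions whose smoothed projection value
    falls below `rel_drop` times the median are treated as vessel candidates.
    Returns a list of (start, end) index ranges along the scan direction.
    """
    profile = cross_section.mean(axis=1)
    profile = uniform_filter1d(profile, smooth, mode="wrap")  # circle scan wraps around
    mask = profile < rel_drop * np.median(profile)
    labels, n = label(mask)
    regions = []
    for i in range(1, n + 1):
        idx = np.flatnonzero(labels == i)
        regions.append((int(idx[0]), int(idx[-1])))
    return regions

# Toy cross-section with two darker bands standing in for vessel shadows.
xsec = np.ones((360, 200))
xsec[50:60, :] = 0.4
xsec[200:215, :] = 0.5
print(detect_vessel_regions(xsec))  # two regions, roughly around indices 55 and 207
```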
血管領域対応付け部2050は、血管領域特定部2040により複数の断面画像からそれぞれ特定された複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行うように構成されている。血管領域対応付け部2050は、前述した眼科装置1000の血管領域対応付け部1040と同様の機能を有する。血管領域対応付け部2050は、回路を含むハードウェアと、血管領域対応付けソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて血管領域対応付け部2050の機能を実現することができる。 The vascular region matching unit 2050 is configured to match vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups identified from multiple cross-sectional images by the vascular region identification unit 2040. The vascular region matching unit 2050 has the same functions as the vascular region matching unit 1040 of the ophthalmic device 1000 described above. The vascular region matching unit 2050 is realized by cooperation between hardware including circuits and vascular region matching software. A non-limiting aspect of the ophthalmic device 1 can achieve the functions of the vascular region matching unit 2050 using the data processing unit 230.
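As a simple illustration of associating regions of the same vessel across two concentric circle cross sections, the sketch below matches vessel regions by their angular position on each circle, on the assumption that a vessel running outward from the disc crosses both circles at nearly the same angle; the tolerance, data layout, and function name are placeholders.

```python
import numpy as np

def match_by_angle(inner_angles_deg: np.ndarray, outer_angles_deg: np.ndarray,
                   tol_deg: float = 10.0) -> list[tuple[int, int]]:
    """Pair vessel regions on an inner and an outer circle scan by angular position.

    Each array holds the angular positions (degrees) of the vessel-region centers
    on its circle.  A pair is matched when the circular angular difference is
    within `tol_deg`.  Returns index pairs (inner_idx, outer_idx).
    """
    pairs = []
    for i, a in enumerate(inner_angles_deg):
        diff = np.abs((outer_angles_deg - a + 180.0) % 360.0 - 180.0)  # circular distance
        j = int(np.argmin(diff))
        if diff[j] <= tol_deg:
            pairs.append((i, j))
    return pairs

# Vessel-region centers (degrees) found on the inner and outer circles (toy values).
inner = np.array([12.0, 95.0, 210.0, 300.0])
outer = np.array([14.5, 92.0, 207.0, 355.0])
print(match_by_angle(inner, outer))  # [(0, 0), (1, 1), (2, 2)]
```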
血管マップ生成部2060は、血管領域対応付け部2050により得られた血管領域の対応付けの結果に基づいて、血管の分布を示す血管マップを生成するように構成されている。血管マップ生成部2060は、前述した眼科装置1000の血管マップ生成部1050と同様の機能を有する。血管マップ生成部2060は、回路を含むハードウェアと、血管マップ生成ソフトウェアとの協働により実現される。非限定的な態様の眼科装置1は、データ処理部230を用いて血管マップ生成部2060の機能を実現することができる。 The vascular map generation unit 2060 is configured to generate a vascular map showing the distribution of blood vessels based on the vascular region matching results obtained by the vascular region matching unit 2050. The vascular map generation unit 2060 has the same functions as the vascular map generation unit 1050 of the ophthalmic device 1000 described above. The vascular map generation unit 2060 is realized by cooperation between hardware including circuits and vascular map generation software. The ophthalmic device 1 of a non-limiting aspect can realize the functions of the vascular map generation unit 2060 using the data processing unit 230.
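Purely as an illustration of turning the matching results into a map of vessel distribution, the sketch below rasterises each matched pair of vessel positions as a short segment in an en-face image; the map representation (a binary en-face raster) and all coordinates are assumptions of this example, not the representation used by the vascular map generation unit 2060.

```python
import numpy as np

def build_vascular_map(inner_pos: np.ndarray, outer_pos: np.ndarray,
                       pairs: list[tuple[int, int]],
                       map_shape: tuple[int, int] = (256, 256)) -> np.ndarray:
    """Rasterise matched vessel positions into a simple en-face vascular map.

    `inner_pos` / `outer_pos` are (N, 2) arrays of en-face (x, y) pixel positions
    of vessel regions on the inner / outer cross sections; `pairs` are matched
    index pairs.  Each matched pair is drawn as a short line segment, so the map
    shows where each vessel runs between the two scan circles.
    """
    vmap = np.zeros(map_shape, dtype=np.uint8)
    for i, j in pairs:
        p0, p1 = inner_pos[i], outer_pos[j]
        for t in np.linspace(0.0, 1.0, 32):
            x, y = np.round(p0 + t * (p1 - p0)).astype(int)
            if 0 <= x < map_shape[0] and 0 <= y < map_shape[1]:
                vmap[x, y] = 1
    return vmap

inner_pos = np.array([[100.0, 80.0], [150.0, 180.0]])
outer_pos = np.array([[90.0, 60.0], [165.0, 200.0]])
vessel_map = build_vascular_map(inner_pos, outer_pos, [(0, 0), (1, 1)])
print(vessel_map.sum())  # number of map pixels marked as vessel
```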
図18は、眼科装置2000の1つの動作例を示す。 Figure 18 shows an example of the operation of the ophthalmic device 2000.
まず、ステップS71において、スキャン部2010は、被検眼Eの眼底Efの3次元領域にOCTスキャンを適用してデータを収集する。収集されたデータは、3次元画像生成部2020に送られる。 First, in step S71, the scanning unit 2010 applies OCT scanning to a three-dimensional region of the fundus Ef of the subject's eye E to collect data. The collected data is sent to the three-dimensional image generating unit 2020.
ステップS72において、3次元画像生成部2020は、ステップS71で眼底Efの3次元領域から収集されたデータに基づいて、眼底Efの3次元画像を生成する。生成された3次元画像は、断面画像抽出部2030に送られる。 In step S72, the three-dimensional image generation unit 2020 generates a three-dimensional image of the fundus oculi Ef based on the data collected from the three-dimensional region of the fundus oculi Ef in step S71. The generated three-dimensional image is sent to the cross-sectional image extraction unit 2030.
ステップS73において、断面画像抽出部2030は、ステップS72で生成された3次元画像から、フライバックを伴わないスキャンパターンに相当する眼底Efの複数の断面にそれぞれ対応する複数の断面画像を抽出する。例えば、断面画像抽出部2030は、ステップS72で生成された3次元画像から、同心に配置された2つのサークルスキャン(同心円スキャン)にそれぞれ対応する2つの断面画像を抽出する。抽出された複数の断面画像は、血管領域特定部2040に送られる。 In step S73, the cross-sectional image extraction unit 2030 extracts, from the three-dimensional image generated in step S72, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus oculi Ef that correspond to a scan pattern without flyback. For example, the cross-sectional image extraction unit 2030 extracts, from the three-dimensional image generated in step S72, two cross-sectional images respectively corresponding to two concentrically arranged circle scans (a concentric circle scan). The extracted cross-sectional images are sent to the vascular region identification unit 2040.
ステップS74において、血管領域特定部2040は、ステップS73で3次元画像から抽出された複数の断面画像のそれぞれから血管領域群を検出する。これにより、複数の断面画像にそれぞれ対応する複数の血管領域群が特定される。特定された複数の血管領域群は、血管領域対応付け部2050に送られる。 In step S74, the vascular region identification unit 2040 detects vascular region groups from each of the multiple cross-sectional images extracted from the 3D image in step S73. This identifies multiple vascular region groups corresponding to each of the multiple cross-sectional images. The identified multiple vascular region groups are sent to the vascular region association unit 2050.
ステップS75において、血管領域対応付け部2050は、ステップS74で特定された複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う。この血管領域対応付け処理の結果は、血管マップ生成部2060に送られる。 In step S75, the vascular region matching unit 2050 matches vascular regions corresponding to different cross sections of the same blood vessel among the multiple vascular region groups identified in step S74. The results of this vascular region matching process are sent to the vascular map generation unit 2060.
ステップS76において、血管マップ生成部2060は、ステップS75で得られた血管領域の対応付けの結果に基づいて血管マップを生成する。 In step S76, the vascular map generation unit 2060 generates a vascular map based on the vascular region matching results obtained in step S75.
本動作例に係る処理手順を実行可能な眼科装置2000は、被検眼の眼底の3次元OCT画像を生成し、フライバックを伴わないスキャンパターンに相当する複数の断面にそれぞれ対応する複数の断面画像を3次元OCT画像から抽出し、複数の断面画像にそれぞれ対応する複数の血管領域群を特定し、同一血管の異なる断面に相当する血管領域同士を対応付けして血管マップを生成する。この血管マップは、OCT血流計測(眼底血流動態計測)における準備段階の工程(例えば、ドップラー角度推定、アライメント、注目血管の探索など)において利用可能である。よって、本態様の眼科装置2000は、OCT血流計測の品質向上に貢献するものである。 The ophthalmic device 2000 capable of executing the processing procedures of this operational example generates a 3D OCT image of the fundus of the subject's eye, extracts from the 3D OCT image multiple cross-sectional images corresponding to multiple cross sections corresponding to a scan pattern without flyback, identifies multiple groups of vascular regions corresponding to the multiple cross-sectional images, and generates a vascular map by associating vascular regions corresponding to different cross sections of the same blood vessel. This vascular map can be used in preparatory steps (e.g., Doppler angle estimation, alignment, search for blood vessels of interest, etc.) in OCT blood flow measurement (fundus blood flow dynamics measurement). Therefore, the ophthalmic device 2000 of this embodiment contributes to improving the quality of OCT blood flow measurement.
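As one hedged example of the Doppler angle estimation mentioned above, the 3D positions of the same vessel on the inner and outer cross sections define the local vessel direction, and the Doppler angle can be taken as the angle between that direction and the A-scan (depth) axis; the coordinates below are arbitrary toy values, and the coordinate convention is an assumption of this example.

```python
import numpy as np

def doppler_angle_deg(p_inner: np.ndarray, p_outer: np.ndarray) -> float:
    """Angle between the local vessel direction and the A-scan (z) axis.

    `p_inner` and `p_outer` are 3-D positions (x, y, z) of the same blood vessel
    on the inner and outer circle cross sections; z is the depth direction.
    """
    direction = p_outer - p_inner
    cos_angle = abs(direction[2]) / np.linalg.norm(direction)
    return float(np.degrees(np.arccos(cos_angle)))

# Same vessel found on the two concentric cross sections (toy coordinates, e.g. in mm).
p_inner = np.array([1.20, 0.10, 0.310])
p_outer = np.array([1.80, 0.15, 0.295])
angle = doppler_angle_deg(p_inner, p_outer)
print(f"Doppler angle = {angle:.1f} deg")
```

For retinal vessels this angle is typically close to 90 degrees, which is why an accurate estimate of it matters when converting the measured axial velocity component into a velocity along the vessel.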
前述したように、本態様の眼科装置2000に対して様々な事項を組み合わせることができる。特に、図7の眼科装置1000に関する事項、図11の眼科装置1400に関する事項、図13の眼科装置1500に関する事項、図15の眼科装置1600に関する事項を、眼科装置2000に組み合わせることが可能である。このような組み合わせにより得られる眼科装置は、組み合わされた事項による作用及び効果を奏するものであり、更に、組み合わされた事項と眼科装置2000との相乗的な作用及び効果を奏する。また、複数の事項を眼科装置2000に組み合わせて得られる眼科装置は、当該複数の事項のうちの2つ以上の事項の相乗的な作用及び効果、並びに、当該複数の事項のうちの2つ以上の事項と眼科装置2000との相乗的な作用及び効果を奏する。これらの作用及び効果については、当業者であれば、本開示から理解することができるであろう。 As mentioned above, various features can be combined with the ophthalmic device 2000 of this embodiment. In particular, the features related to the ophthalmic device 1000 in FIG. 7, the features related to the ophthalmic device 1400 in FIG. 11, the features related to the ophthalmic device 1500 in FIG. 13, and the features related to the ophthalmic device 1600 in FIG. 15 can be combined with the ophthalmic device 2000. An ophthalmic device obtained by such a combination exhibits the functions and effects of the combined features, and further exhibits synergistic functions and effects between the combined features and the ophthalmic device 2000. Furthermore, an ophthalmic device obtained by combining multiple features with the ophthalmic device 2000 exhibits synergistic functions and effects between two or more of the multiple features, and synergistic functions and effects between two or more of the multiple features and the ophthalmic device 2000. Those skilled in the art will be able to understand these functions and effects from this disclosure.
〈他の実施形態〉
本開示が、眼科装置以外のカテゴリーの実施形態も提供するものであることは、当業者であれば理解することができるであろう。例えば、本開示は、眼科装置を制御する方法の実施形態、眼科情報処理装置を制御する方法の実施形態、いずれかの方法における各ステップをコンピュータに実行させるプログラムの実施形態、及び、いずれかのプログラムが記録されたコンピュータ可読な非一時的記録媒体の実施形態を提供することができる。記録媒体の形態は任意であってよい。例えば、記録媒体は、磁気ディスク、光ディスク、光磁気ディスク、及び半導体メモリのいずれかであってよい。
Other Embodiments
It will be understood by those skilled in the art that the present disclosure also provides embodiments in categories other than ophthalmic devices. For example, the present disclosure may provide an embodiment of a method for controlling an ophthalmic device, an embodiment of a method for controlling an ophthalmic information processing device, an embodiment of a program for causing a computer to execute each step of any of the methods, and an embodiment of a computer-readable non-transitory recording medium on which any of the programs is recorded. The recording medium may take any form. For example, the recording medium may be any of a magnetic disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
いくつかの実施形態は、OCTスキャンを実行するスキャン部とプロセッサとを有する眼科装置を制御する方法である。本実施形態に係る方法は、プロセッサに、スキャン制御と、断面画像生成処理と、血管領域特定処理と、血管領域対応付け処理と、血管マップ生成処理とを実行させる。スキャン制御は、フライバックを伴わないスキャンパターンによるOCTスキャンを被検眼の眼底に適用してデータを収集するようにスキャン部を制御する。断面画像生成処理は、スキャン制御の下に眼底から収集されたデータに基づいて、眼底の複数の断面にそれぞれ対応する複数の断面画像を生成する。血管領域特定処理は、生成された複数の断面画像のそれぞれから血管領域群を検出することによって、これら複数の断面画像にそれぞれ対応する複数の血管領域群を特定する。血管領域対応付け処理は、複数の断面にそれぞれ対応する複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う。血管マップ生成処理は、血管領域対応付け処理の結果に基づいて、血管の分布を示す血管マップを生成する。本実施形態に係る方法により、眼科装置は、前述した図9に示す手順の処理を実行することが可能である。 Some embodiments are methods for controlling an ophthalmic device having a scanning unit and a processor that performs OCT scans. The method according to this embodiment causes the processor to perform scan control, cross-sectional image generation processing, vascular region identification processing, vascular region correspondence processing, and vascular map generation processing. The scan control controls the scanning unit to collect data by applying an OCT scan using a scan pattern that does not involve flyback to the fundus of the subject's eye. The cross-sectional image generation processing generates multiple cross-sectional images corresponding to multiple cross sections of the fundus based on data collected from the fundus under scan control. The vascular region identification processing identifies multiple vascular region groups corresponding to the multiple cross-sectional images by detecting vascular region groups from each of the multiple cross-sectional images generated. The vascular region correspondence processing associates vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups corresponding to the multiple cross sections. The vascular map generation processing generates a vascular map showing the distribution of blood vessels based on the results of the vascular region correspondence processing. The method according to this embodiment enables the ophthalmic device to perform the processing steps shown in FIG. 9 described above.
本実施形態に係る方法をコンピュータを含む眼科装置に実行させるプログラムを構成することが可能である。また、このようなプログラムを記録したコンピュータ可読な非一時的記録媒体を作成することが可能である。この非一時的記録媒体は、任意の形態であってよく、その例として、磁気ディスク、光ディスク、光磁気ディスク、半導体メモリなどがある。本実施形態に係る方法、プログラム、及び記録媒体に対して、本開示で説明した任意の事項を組み合わせることができる。 It is possible to create a program that causes an ophthalmic device including a computer to execute the method according to this embodiment. It is also possible to create a computer-readable non-transitory recording medium on which such a program is recorded. This non-transitory recording medium may be in any form, including, for example, a magnetic disk, optical disk, magneto-optical disk, and semiconductor memory. Any of the features described in this disclosure may be combined with the method, program, and recording medium according to this embodiment.
いくつかの実施形態は、OCTスキャンを実行するスキャン部とプロセッサとを有する眼科装置を制御する方法である。本実施形態に係る方法は、プロセッサに、スキャン制御と、3次元画像生成処理と、断面画像抽出処理と、血管領域特定処理と、血管領域対応付け処理と、血管マップ生成処理とを実行させる。スキャン制御は、被検眼の眼底の3次元領域にOCTスキャンを適用してデータを収集するようにスキャン部を制御する。3次元画像生成処理は、スキャン制御の下に眼底の3次元領域から収集されたデータに基づいて眼底の3次元画像を生成する。断面画像抽出処理は、フライバックを伴わないスキャンパターンに相当する眼底の複数の断面にそれぞれ対応する複数の断面画像を、3次元画像生成処理で生成された3次元画像から抽出する。血管領域特定処理は、3次元画像から抽出された複数の断面画像のそれぞれから血管領域群を検出することによって、これら複数の断面画像にそれぞれ対応する複数の血管領域群を特定する。血管領域対応付け処理は、複数の断面にそれぞれ対応する複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う。血管マップ生成処理は、血管領域対応付け処理の結果に基づいて、血管の分布を示す血管マップを生成する。本実施形態に係る方法により、眼科装置は、前述した図18に示す手順の処理を実行することが可能である。 Some embodiments are methods for controlling an ophthalmic device having a scanning unit and a processor that performs OCT scanning. The method of this embodiment causes the processor to perform scan control, 3D image generation processing, cross-sectional image extraction processing, vascular region identification processing, vascular region correspondence processing, and vascular map generation processing. The scan control controls the scanning unit to apply an OCT scan to a 3D region of the fundus of the subject's eye to collect data. The 3D image generation processing generates a 3D image of the fundus based on data collected from the 3D region of the fundus under the scan control. The cross-sectional image extraction processing extracts, from the 3D image generated by the 3D image generation processing, multiple cross-sectional images corresponding to multiple cross sections of the fundus corresponding to a scan pattern without flyback. The vascular region identification processing detects vascular region groups from each of the multiple cross-sectional images extracted from the 3D image, thereby identifying multiple vascular region groups corresponding to these multiple cross-sectional images. The vascular region correspondence processing associates vascular regions corresponding to different cross sections of the same blood vessel among multiple vascular region groups corresponding to multiple cross sections. The vascular map generation process generates a vascular map showing the distribution of blood vessels based on the results of the vascular region association process. The method according to this embodiment enables the ophthalmologic apparatus to execute the process steps shown in FIG. 18 described above.
本実施形態に係る方法をコンピュータを含む眼科装置に実行させるプログラムを構成することが可能である。また、このようなプログラムを記録したコンピュータ可読な非一時的記録媒体を作成することが可能である。本実施形態に係る方法、プログラム、及び記録媒体に対して、本開示で説明した任意の事項を組み合わせることができる。 It is possible to create a program that causes an ophthalmic device including a computer to execute the method according to this embodiment. It is also possible to create a computer-readable non-transitory recording medium on which such a program is recorded. Any of the features described in this disclosure can be combined with the method, program, and recording medium according to this embodiment.
以上、図面を参照して本開示に係るいくつかの実施形態について述べた。ただし、これらは、非限定的な例示であり、上記以外の様々な構成を採用することもできる。 Several embodiments of the present disclosure have been described above with reference to the drawings. However, these are non-limiting examples, and various configurations other than those described above may also be adopted.
また、上記の説明で用いた複数のフローチャートでは、複数の工程(処理)が順番に記載されている。しかしながら、各実施形態で実行される工程の実行順序は、記載された順序に限定されない。各実施形態では、図示される工程の順序を内容的に支障の無い範囲で変更することができる。また、上記の各実施形態は、内容が相反しない範囲において少なくとも部分的に組み合わせることができる。 Furthermore, in the multiple flowcharts used in the above explanation, multiple steps (processes) are described in order. However, the order in which the steps are performed in each embodiment is not limited to the order described. In each embodiment, the order of the steps shown in the figures can be changed to the extent that the content is not affected. Furthermore, the above embodiments can be at least partially combined to the extent that the content is not contradictory.
上記の実施形態の一部又は全部は、以下の付記のようにも記載されうる。しかしながら、本開示に係る実施形態は、以下の付記に限定されない。 Some or all of the above embodiments may also be described as follows. However, embodiments of the present disclosure are not limited to the following notes.
〔1〕フライバックを伴わないスキャンパターンによる光コヒーレンストモグラフィスキャンを被検眼の眼底に適用してデータを収集するスキャン部と、
前記スキャン部により収集された前記データに基づいて、前記眼底の複数の断面にそれぞれ対応する複数の断面画像を生成する断面画像生成部と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定部と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け部と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成部と
を備える、眼科装置。
[1] a scanning unit that applies an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data;
a cross-sectional image generating unit that generates a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data collected by the scanning unit;
a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images;
a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
〔2〕前記スキャンパターンは、それぞれがフライバックを伴わない複数のパターンを含む、
上記1の眼科装置。
[2] The scan pattern includes a plurality of patterns each of which does not involve a flyback.
The ophthalmic device according to item 1 above.
〔3〕前記スキャンパターンは、同心に配置された前記複数のパターンの組み合わせである同心パターンを含み、
前記眼底の前記複数の断面は、前記複数のパターンにそれぞれ相当する複数の同心断面を含み、
前記スキャン部は、前記同心パターンによる光コヒーレンストモグラフィスキャンを前記眼底に適用してデータを収集し、
前記断面画像生成部は、前記同心パターンによる前記光コヒーレンストモグラフィスキャンによって収集された前記データに基づいて、前記複数の同心断面にそれぞれ対応する前記複数の断面画像を生成する、
上記2の眼科装置。
[3] The scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically,
the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns;
the scanning unit applies an optical coherence tomography scan using the concentric pattern to the fundus to collect data;
the cross-sectional image generation unit generates the plurality of cross-sectional images corresponding to the plurality of concentric cross sections, respectively, based on the data collected by the optical coherence tomography scan using the concentric pattern;
The ophthalmic device according to item 2 above.
〔4〕前記同心パターンは、同心に配置された複数の円形パターンの組み合わせである同心円パターンを含み、
前記複数の同心断面は、前記複数の円形パターンにそれぞれ相当する同心に配置された複数の円柱側面を含み、
前記スキャン部は、前記同心円パターンによる光コヒーレンストモグラフィスキャンを前記眼底に適用してデータを収集し、
前記断面画像生成部は、前記同心円パターンによる前記光コヒーレンストモグラフィスキャンによって収集された前記データに基づいて、前記複数の円柱側面にそれぞれ対応する前記複数の断面画像を生成する、
上記3の眼科装置。
[4] The concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically,
the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged and correspond to the plurality of circular patterns, respectively;
the scanning unit applies an optical coherence tomography scan using the concentric circle pattern to the fundus to collect data;
the cross-sectional image generating unit generates the plurality of cross-sectional images corresponding to the plurality of cylindrical side surfaces, respectively, based on the data collected by the optical coherence tomography scan using the concentric circle pattern.
The ophthalmic device according to item 3 above.
〔5〕被検眼の眼底の3次元領域に光コヒーレンストモグラフィスキャンを適用してデータを収集するスキャン部と、
前記スキャン部により収集された前記データに基づいて、前記眼底の3次元画像を生成する3次元画像生成部と、
フライバックを伴わないスキャンパターンに相当する前記眼底の複数の断面にそれぞれ対応する複数の断面画像を前記3次元画像から抽出する断面画像抽出部と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定部と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け部と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成部と
を備える、眼科装置。
[5] a scanning unit that applies optical coherence tomography scanning to a three-dimensional region of the fundus of the subject's eye to collect data;
a three-dimensional image generating unit that generates a three-dimensional image of the fundus based on the data collected by the scanning unit;
a cross-sectional image extracting unit that extracts, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback;
a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images;
a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
〔6〕前記スキャンパターンは、それぞれがフライバックを伴わない複数のパターンを含む、
上記5の眼科装置。
[6] The scan pattern includes a plurality of patterns each of which does not involve a flyback.
The ophthalmic device according to item 5 above.
〔7〕前記スキャンパターンは、同心に配置された前記複数のパターンの組み合わせである同心パターンを含み、
前記眼底の前記複数の断面は、前記複数のパターンにそれぞれ相当する複数の同心断面を含み、
前記断面画像抽出部は、前記複数の同心断面にそれぞれ対応する前記複数の断面画像を前記3次元画像から抽出する、
上記6の眼科装置。
[7] The scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically,
the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns;
the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of concentric cross sections from the three-dimensional image;
The ophthalmic device according to item 6 above.
〔8〕前記同心パターンは、同心に配置された複数の円形パターンの組み合わせである同心円パターンを含み、
前記複数の同心断面は、前記複数の円形パターンにそれぞれ相当する同心に配置された複数の円柱側面を含み、
前記断面画像抽出部は、前記複数の円柱側面にそれぞれ対応する前記複数の断面画像を前記3次元画像から抽出する、
上記7の眼科装置。
[8] The concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically,
the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged and correspond to the plurality of circular patterns, respectively;
the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of cylinder side surfaces, respectively, from the three-dimensional image;
The ophthalmic device according to item 7 above.
〔9〕前記血管領域特定部は、
前記光コヒーレンストモグラフィスキャンのAスキャン方向へのプロジェクションを前記複数の断面画像のそれぞれに適用することによって、前記複数の断面画像にそれぞれ対応する複数のプロジェクション画像を生成し、
前記複数のプロジェクション画像のそれぞれについて、当該プロジェクション画像における輝度分布に基づいて、対応する断面画像から血管領域群を検出する、
上記1~8のいずれかの眼科装置。
[9] The vascular region specifying unit
generating a plurality of projection images corresponding to the plurality of cross-sectional images by applying a projection of the optical coherence tomography scan in an A-scan direction to each of the plurality of cross-sectional images;
for each of the plurality of projection images, detecting a blood vessel region group from a corresponding cross-sectional image based on a luminance distribution in the projection image;
The ophthalmic device according to any one of items 1 to 8 above.
〔10〕予め指定された前記眼底の注目血管に反復スキャンを適用するように前記スキャン部の制御を実行するスキャン制御部と、
前記反復スキャンにより収集されたデータに基づいて、前記注目血管における血流動態情報を生成する血流動態情報生成部と
を更に備える、
上記1~9のいずれかの眼科装置。
[10] A scan control unit that controls the scanning unit so as to apply repeated scans to a pre-specified blood vessel of interest in the fundus;
and a hemodynamic information generating unit that generates hemodynamic information in the blood vessel of interest based on data collected by the repeated scans.
The ophthalmic device according to any one of items 1 to 9 above.
〔11〕前記血管マップを解析して前記注目血管を指定する注目血管指定部を更に備える、
上記10の眼科装置。
[11] Further comprising a blood vessel of interest designation unit that analyzes the blood vessel map and designates the blood vessel of interest.
The ophthalmic device according to item 10 above.
〔12〕前記複数の断面画像は、第1の断面画像と第2の断面画像とを含み、
前記血管領域特定部は、
前記第1の断面画像から第1の血管領域を検出し、
前記第2の断面画像から第2の血管領域を検出し、
前記血管領域対応付け部は、前記第1の血管領域と前記第2の血管領域とを前記注目血管の異なる断面として互いに対応付け、
前記血流動態情報生成部は、
前記第1の血管領域の位置を示す第1の位置情報を生成し、
前記第2の血管領域の位置を示す第2の位置情報を生成し、
前記第1の位置情報と前記第2の位置情報とに基づいて、前記注目血管に対する光コヒーレンストモグラフィ血流計測におけるドップラー角の大きさを示すドップラー角度情報を生成する、
上記10又は11の眼科装置。
[12] The plurality of cross-sectional images include a first cross-sectional image and a second cross-sectional image,
The vascular region specifying unit
detecting a first blood vessel region from the first cross-sectional image;
detecting a second blood vessel region from the second cross-sectional image;
the vascular region associating unit associates the first vascular region and the second vascular region with each other as different cross sections of the blood vessel of interest;
The hemodynamic information generating unit
generating first position information indicating a position of the first vascular region;
generating second position information indicating a position of the second vascular region;
generating Doppler angle information indicating the magnitude of the Doppler angle in the optical coherence tomography blood flow measurement for the blood vessel of interest based on the first position information and the second position information;
The ophthalmic device according to item 10 or 11 above.
〔13〕前記血流動態情報生成部は、前記反復スキャンにより収集された前記データと前記ドップラー角度情報とに基づいて、前記注目血管における前記血流動態情報を生成する、
上記12の眼科装置。
[13] The hemodynamic information generating unit generates the hemodynamic information in the blood vessel of interest based on the data collected by the repeated scans and the Doppler angle information.
The ophthalmic device according to item 12 above.
〔14〕前記血流動態情報生成部は、前記ドップラー角の前記大きさに評価処理を適用して評価情報を生成し、
前記評価情報を表示装置に表示させる表示制御部を更に備える、
上記12又は13の眼科装置。
[14] The hemodynamic information generating unit generates evaluation information by applying an evaluation process to the magnitude of the Doppler angle;
Further, a display control unit is provided to display the evaluation information on a display device.
The ophthalmic device according to item 12 or 13 above.
〔15〕前記眼底の赤外観察画像を生成する観察画像生成部と、
前記スキャン部を移動する移動機構と、
前記血管マップに対する前記赤外観察画像の偏位に基づいて前記移動機構を制御する移動制御部と
を更に備え、
前記偏位が予め設定された閾値よりも小さくなったときに、前記スキャン制御部は、前記スキャン部の前記制御を実行する、
上記10~14のいずれかの眼科装置。
[15] An observation image generating unit that generates an infrared observation image of the fundus;
a movement mechanism that moves the scanning unit;
a movement control unit that controls the movement mechanism based on a displacement of the infrared observation image relative to the blood vessel map,
When the deviation becomes smaller than a preset threshold, the scan control unit executes the control of the scan unit.
The ophthalmic device according to any one of items 10 to 14 above.
〔16〕前記血管マップ生成部は、前記スキャン部により前記眼底から新たに収集された新たなデータに基づく新たな血管マップを生成し、
前記光コヒーレンストモグラフィスキャンのAスキャン方向へのプロジェクションを前記血管マップに適用して第1のプロジェクション画像を生成し、且つ、前記プロジェクションを前記新たな血管マップに適用して第2のプロジェクション画像を生成するプロジェクション画像生成部と、
前記スキャン部を移動する移動機構と、
前記第1のプロジェクション画像と前記第2のプロジェクション画像との間の偏位に基づいて前記移動機構を制御する移動制御部と
を更に備える、
上記10~14のいずれかの眼科装置。
[16] The vascular map generating unit generates a new vascular map based on new data newly collected from the fundus by the scanning unit;
a projection image generator that applies a projection of the optical coherence tomography scan in an A-scan direction to the vascular map to generate a first projection image, and applies the projection to the new vascular map to generate a second projection image;
a movement mechanism that moves the scanning unit;
a movement control unit that controls the movement mechanism based on a deviation between the first projection image and the second projection image.
The ophthalmic device according to any one of items 10 to 14 above.
〔17〕光コヒーレンストモグラフィスキャンを実行するスキャン部とプロセッサとを有する眼科装置を制御する方法であって、
前記プロセッサに、
フライバックを伴わないスキャンパターンによる光コヒーレンストモグラフィスキャンを被検眼の眼底に適用してデータを収集するように前記スキャン部を制御するスキャン制御と、
前記データに基づいて、前記眼底の複数の断面にそれぞれ対応する複数の断面画像を生成する断面画像生成処理と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定処理と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け処理と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成処理と
を実行させる、
方法。
[17] A method for controlling an ophthalmic apparatus having a scanning unit that performs optical coherence tomography scans and a processor, the method causing the processor to execute:
a scan control that controls the scan unit to apply an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data;
a cross-sectional image generation process for generating a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data;
a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images;
a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
method.
〔18〕光コヒーレンストモグラフィスキャンを実行するスキャン部とプロセッサとを有する眼科装置を制御する方法であって、
前記プロセッサに、
被検眼の眼底の3次元領域に光コヒーレンストモグラフィスキャンを適用してデータを収集するように前記スキャン部を制御するスキャン制御と、
前記データに基づいて前記眼底の3次元画像を生成する3次元画像生成処理と、
フライバックを伴わないスキャンパターンに相当する前記眼底の複数の断面にそれぞれ対応する複数の断面画像を前記3次元画像から抽出する断面画像抽出処理と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定処理と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け処理と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成処理と
を実行させる、
方法。
[18] A method for controlling an ophthalmic apparatus having a scanning unit that performs optical coherence tomography scans and a processor, the method causing the processor to execute:
a scan control that controls the scanning unit to collect data by applying an optical coherence tomography scan to a three-dimensional region of the fundus of the subject's eye;
a three-dimensional image generation process for generating a three-dimensional image of the fundus based on the data;
a cross-sectional image extraction process for extracting, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback;
a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images;
a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
method.
〔19〕上記17又は18の方法をコンピュータに実行させるプログラム。 [19] A program that causes a computer to execute method 17 or 18 above.
〔20〕上記19のプログラムが記録された、コンピュータ可読な非一時的記録媒体。 [20] A computer-readable non-transitory recording medium on which the program set forth in 19 above is recorded.
本開示は、この発明の実施の例示に過ぎない。この発明を実施しようとする者は、この発明の要旨の範囲内における任意の変形(省略、置換、付加など)を施すことが可能である。 This disclosure is merely an example of how to implement the invention. Anyone who intends to implement the invention may make any modifications (omissions, substitutions, additions, etc.) within the scope of the spirit of the invention.
1000 眼科装置 Ophthalmic apparatus
1010 スキャン部 Scan unit
1020 断面画像生成部 Cross-sectional image generating unit
1030 血管領域特定部 Blood vessel region identifying unit
1040 血管領域対応付け部 Blood vessel region matching unit
1050 血管マップ生成部 Blood vessel map generating unit
Claims (20)
前記スキャン部により収集された前記データに基づいて、前記眼底の複数の断面にそれぞれ対応する複数の断面画像を生成する断面画像生成部と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定部と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け部と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成部と
を備える、眼科装置。 a scanning unit that applies an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data;
a cross-sectional image generating unit that generates a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data collected by the scanning unit;
a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images;
a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
請求項1の眼科装置。 the scan pattern includes a plurality of patterns each of which does not involve a flyback;
The ophthalmic device of claim 1.
前記眼底の前記複数の断面は、前記複数のパターンにそれぞれ相当する複数の同心断面を含み、
前記スキャン部は、前記同心パターンによる光コヒーレンストモグラフィスキャンを前記眼底に適用してデータを収集し、
前記断面画像生成部は、前記同心パターンによる前記光コヒーレンストモグラフィスキャンによって収集された前記データに基づいて、前記複数の同心断面にそれぞれ対応する前記複数の断面画像を生成する、
請求項2の眼科装置。 the scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically;
the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns;
the scanning unit applies an optical coherence tomography scan using the concentric pattern to the fundus to collect data;
the cross-sectional image generation unit generates the plurality of cross-sectional images corresponding to the plurality of concentric cross sections, respectively, based on the data collected by the optical coherence tomography scan using the concentric pattern;
The ophthalmic device of claim 2.
前記複数の同心断面は、前記複数の円形パターンにそれぞれ相当する同心に配置された複数の円柱側面を含み、
前記スキャン部は、前記同心円パターンによる光コヒーレンストモグラフィスキャンを前記眼底に適用してデータを収集し、
前記断面画像生成部は、前記同心円パターンによる前記光コヒーレンストモグラフィスキャンによって収集された前記データに基づいて、前記複数の円柱側面にそれぞれ対応する前記複数の断面画像を生成する、
請求項3の眼科装置。 The concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically,
the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged corresponding to the plurality of circular patterns, respectively;
the scanning unit applies an optical coherence tomography scan using the concentric circle pattern to the fundus to collect data;
the cross-sectional image generating unit generates the plurality of cross-sectional images corresponding to the plurality of cylindrical side surfaces, respectively, based on the data collected by the optical coherence tomography scan using the concentric circle pattern.
The ophthalmic apparatus of claim 3.
前記スキャン部により収集された前記データに基づいて、前記眼底の3次元画像を生成する3次元画像生成部と、
フライバックを伴わないスキャンパターンに相当する前記眼底の複数の断面にそれぞれ対応する複数の断面画像を前記3次元画像から抽出する断面画像抽出部と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定部と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け部と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成部と
を備える、眼科装置。 a scanning unit that applies optical coherence tomography scanning to a three-dimensional region of the fundus of the subject's eye to collect data;
a three-dimensional image generating unit that generates a three-dimensional image of the fundus based on the data collected by the scanning unit;
a cross-sectional image extracting unit that extracts, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback;
a vascular region specifying unit that specifies a plurality of vascular region groups corresponding to the plurality of cross-sectional images by detecting a vascular region group from each of the plurality of cross-sectional images;
a vascular region associating unit that associates vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a blood vessel map generating unit that generates a blood vessel map indicating the distribution of blood vessels based on a result of the association.
請求項5の眼科装置。 the scan pattern includes a plurality of patterns each of which does not involve a flyback;
The ophthalmic device of claim 5.
前記眼底の前記複数の断面は、前記複数のパターンにそれぞれ相当する複数の同心断面を含み、
前記断面画像抽出部は、前記複数の同心断面にそれぞれ対応する前記複数の断面画像を前記3次元画像から抽出する、
請求項6の眼科装置。 the scan pattern includes a concentric pattern that is a combination of the plurality of patterns arranged concentrically;
the plurality of cross sections of the fundus include a plurality of concentric cross sections respectively corresponding to the plurality of patterns;
the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of concentric cross sections from the three-dimensional image;
The ophthalmic device of claim 6.
前記複数の同心断面は、前記複数の円形パターンにそれぞれ相当する同心に配置された複数の円柱側面を含み、
前記断面画像抽出部は、前記複数の円柱側面にそれぞれ対応する前記複数の断面画像を前記3次元画像から抽出する、
請求項7の眼科装置。 The concentric pattern includes a concentric circular pattern that is a combination of a plurality of circular patterns arranged concentrically,
the plurality of concentric cross sections include a plurality of cylindrical side surfaces that are concentrically arranged and correspond to the plurality of circular patterns, respectively;
the cross-sectional image extraction unit extracts the plurality of cross-sectional images corresponding to the plurality of cylinder side surfaces, respectively, from the three-dimensional image;
The ophthalmic device of claim 7.
前記光コヒーレンストモグラフィスキャンのAスキャン方向へのプロジェクションを前記複数の断面画像のそれぞれに適用することによって、前記複数の断面画像にそれぞれ対応する複数のプロジェクション画像を生成し、
前記複数のプロジェクション画像のそれぞれについて、当該プロジェクション画像における輝度分布に基づいて、対応する断面画像から血管領域群を検出する、
請求項1~8のいずれかの眼科装置。 The vascular region specifying unit
generating a plurality of projection images corresponding to the plurality of cross-sectional images by applying a projection of the optical coherence tomography scan in an A-scan direction to each of the plurality of cross-sectional images;
for each of the plurality of projection images, detecting a blood vessel region group from a corresponding cross-sectional image based on a luminance distribution in the projection image;
The ophthalmic device according to any one of claims 1 to 8.
前記反復スキャンにより収集されたデータに基づいて、前記注目血管における血流動態情報を生成する血流動態情報生成部と
を更に備える、
請求項1~9のいずれかの眼科装置。 a scan control unit that controls the scanning unit so as to apply repeated scans to a pre-specified blood vessel of interest in the fundus;
and a hemodynamic information generating unit that generates hemodynamic information in the blood vessel of interest based on data collected by the repeated scans.
The ophthalmic apparatus according to any one of claims 1 to 9.
請求項10の眼科装置。 further comprising a blood vessel of interest designation unit that analyzes the blood vessel map and designates the blood vessel of interest;
The ophthalmic device of claim 10.
前記血管領域特定部は、
前記第1の断面画像から第1の血管領域を検出し、
前記第2の断面画像から第2の血管領域を検出し、
前記血管領域対応付け部は、前記第1の血管領域と前記第2の血管領域とを前記注目血管の異なる断面として互いに対応付け、
前記血流動態情報生成部は、
前記第1の血管領域の位置を示す第1の位置情報を生成し、
前記第2の血管領域の位置を示す第2の位置情報を生成し、
前記第1の位置情報と前記第2の位置情報とに基づいて、前記注目血管に対する光コヒーレンストモグラフィ血流計測におけるドップラー角の大きさを示すドップラー角度情報を生成する、
請求項10又は11の眼科装置。 the plurality of cross-sectional images include a first cross-sectional image and a second cross-sectional image;
The vascular region specifying unit
detecting a first blood vessel region from the first cross-sectional image;
detecting a second blood vessel region from the second cross-sectional image;
the vascular region associating unit associates the first vascular region and the second vascular region with each other as different cross sections of the blood vessel of interest;
The hemodynamic information generating unit
generating first position information indicating a position of the first vascular region;
generating second position information indicating a position of the second vascular region;
generating Doppler angle information indicating the magnitude of the Doppler angle in the optical coherence tomography blood flow measurement for the blood vessel of interest based on the first position information and the second position information;
An ophthalmic apparatus according to claim 10 or 11.
請求項12の眼科装置。 the hemodynamic information generating unit generates the hemodynamic information in the blood vessel of interest based on the data collected by the repeated scans and the Doppler angle information.
The ophthalmic device of claim 12.
前記評価情報を表示装置に表示させる表示制御部を更に備える、
請求項12又は13の眼科装置。 the hemodynamic information generating unit applies an evaluation process to the magnitude of the Doppler angle to generate evaluation information;
Further, a display control unit is provided to display the evaluation information on a display device.
An ophthalmic apparatus according to claim 12 or 13.
前記スキャン部を移動する移動機構と、
前記血管マップに対する前記赤外観察画像の偏位に基づいて前記移動機構を制御する移動制御部と
を更に備え、
前記偏位が予め設定された閾値よりも小さくなったときに、前記スキャン制御部は、前記スキャン部の前記制御を実行する、
請求項10~14のいずれかの眼科装置。 an observation image generating unit that generates an infrared observation image of the fundus;
a movement mechanism that moves the scanning unit;
a movement control unit that controls the movement mechanism based on a displacement of the infrared observation image relative to the blood vessel map,
When the deviation becomes smaller than a preset threshold, the scan control unit executes the control of the scan unit.
The ophthalmic device according to any one of claims 10 to 14.
前記光コヒーレンストモグラフィスキャンのAスキャン方向へのプロジェクションを前記血管マップに適用して第1のプロジェクション画像を生成し、且つ、前記プロジェクションを前記新たな血管マップに適用して第2のプロジェクション画像を生成するプロジェクション画像生成部と、
前記スキャン部を移動する移動機構と、
前記第1のプロジェクション画像と前記第2のプロジェクション画像との間の偏位に基づいて前記移動機構を制御する移動制御部と
を更に備える、
請求項10~14のいずれかの眼科装置。 the vascular map generating unit generates a new vascular map based on new data newly collected from the fundus by the scanning unit;
a projection image generator that applies a projection of the optical coherence tomography scan in an A-scan direction to the vascular map to generate a first projection image, and applies the projection to the new vascular map to generate a second projection image;
a movement mechanism that moves the scanning unit;
a movement control unit that controls the movement mechanism based on a deviation between the first projection image and the second projection image.
The ophthalmic device according to any one of claims 10 to 14.
前記プロセッサに、
フライバックを伴わないスキャンパターンによる光コヒーレンストモグラフィスキャンを被検眼の眼底に適用してデータを収集するように前記スキャン部を制御するスキャン制御と、
前記データに基づいて、前記眼底の複数の断面にそれぞれ対応する複数の断面画像を生成する断面画像生成処理と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定処理と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け処理と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成処理と
を実行させる、
方法。 A method for controlling an ophthalmic device having a scanning unit that performs optical coherence tomography scans and a processor, the method causing the processor to execute:
a scan control that controls the scan unit to apply an optical coherence tomography scan using a scan pattern without flyback to the fundus of the subject's eye to collect data;
a cross-sectional image generation process for generating a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus based on the data;
a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images;
a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
method.
前記プロセッサに、
被検眼の眼底の3次元領域に光コヒーレンストモグラフィスキャンを適用してデータを収集するように前記スキャン部を制御するスキャン制御と、
前記データに基づいて前記眼底の3次元画像を生成する3次元画像生成処理と、
フライバックを伴わないスキャンパターンに相当する前記眼底の複数の断面にそれぞれ対応する複数の断面画像を前記3次元画像から抽出する断面画像抽出処理と、
前記複数の断面画像のそれぞれから血管領域群を検出することにより、前記複数の断面画像にそれぞれ対応する複数の血管領域群を特定する血管領域特定処理と、
前記複数の血管領域群の間において、同一血管の異なる断面に相当する血管領域同士の対応付けを行う血管領域対応付け処理と、
前記対応付けの結果に基づいて、血管の分布を示す血管マップを生成する血管マップ生成処理と
を実行させる、
方法。 A method for controlling an ophthalmic device having a scanning unit that performs optical coherence tomography scans and a processor, the method causing the processor to execute:
a scan control that controls the scanning unit to collect data by applying an optical coherence tomography scan to a three-dimensional region of the fundus of the subject's eye;
a three-dimensional image generation process for generating a three-dimensional image of the fundus based on the data;
a cross-sectional image extraction process for extracting, from the three-dimensional image, a plurality of cross-sectional images corresponding to a plurality of cross sections of the fundus, the cross-sectional images corresponding to a scan pattern without a flyback;
a vascular region specifying process for detecting a vascular region group from each of the plurality of cross-sectional images to specify a plurality of vascular region groups corresponding to the plurality of cross-sectional images;
a vascular region association process for associating vascular regions corresponding to different cross sections of the same blood vessel among the plurality of vascular region groups;
and a vascular map generation process for generating a vascular map showing the distribution of blood vessels based on the result of the association.
method.
A computer-readable non-transitory recording medium on which the program of claim 19 is recorded.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202463557720P | 2024-02-26 | 2024-02-26 | |
US63/557,720 | 2024-02-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2025182923A1 true WO2025182923A1 (en) | 2025-09-04 |
Family
ID=96920281
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2024/045494 Pending WO2025182271A1 (en) | 2024-02-26 | 2024-12-23 | Ophthalmic apparatus, method for controlling ophthalmic apparatus, program, and recording medium |
PCT/JP2025/005329 Pending WO2025182661A1 (en) | 2024-02-26 | 2025-02-18 | Ophthalmological device |
PCT/JP2025/005332 Pending WO2025182663A1 (en) | 2024-02-26 | 2025-02-18 | Ophthalmologic information processing device and ophthalmologic device |
PCT/JP2025/006393 Pending WO2025182918A1 (en) | 2024-02-26 | 2025-02-25 | Ophthalmic device, control method therefor, program, and recording medium |
PCT/JP2025/006402 Pending WO2025182923A1 (en) | 2024-02-26 | 2025-02-25 | Ophthalmic device, control method therefor, program, and recording medium |
PCT/JP2025/006722 Pending WO2025183024A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, method for controlling same, program, and recording medium |
PCT/JP2025/006717 Pending WO2025183022A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, control method therefor, program, and recording medium |
PCT/JP2025/006474 Pending WO2025182947A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, control method therefor, program, and recording medium |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2024/045494 Pending WO2025182271A1 (en) | 2024-02-26 | 2024-12-23 | Ophthalmic apparatus, method for controlling ophthalmic apparatus, program, and recording medium |
PCT/JP2025/005329 Pending WO2025182661A1 (en) | 2024-02-26 | 2025-02-18 | Ophthalmological device |
PCT/JP2025/005332 Pending WO2025182663A1 (en) | 2024-02-26 | 2025-02-18 | Ophthalmologic information processing device and ophthalmologic device |
PCT/JP2025/006393 Pending WO2025182918A1 (en) | 2024-02-26 | 2025-02-25 | Ophthalmic device, control method therefor, program, and recording medium |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2025/006722 Pending WO2025183024A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, method for controlling same, program, and recording medium |
PCT/JP2025/006717 Pending WO2025183022A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, control method therefor, program, and recording medium |
PCT/JP2025/006474 Pending WO2025182947A1 (en) | 2024-02-26 | 2025-02-26 | Ophthalmic device, control method therefor, program, and recording medium |
Country Status (1)
Country | Link |
---|---|
WO (8) | WO2025182271A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014193225A (en) * | 2013-03-29 | 2014-10-09 | Nidek Co Ltd | Fundus image processing device and fundus image processing program |
JP2016040005A (en) * | 2015-12-21 | 2016-03-24 | 株式会社トプコン | Blood flow information generation device, blood flow information generation method, and program |
JP2021058285A (en) * | 2019-10-03 | 2021-04-15 | キヤノン株式会社 | Image processing device, image processing method and program |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5601613B2 (en) * | 2010-07-05 | 2014-10-08 | 株式会社ニデック | Fundus photographing device |
JP5693101B2 (en) * | 2010-08-30 | 2015-04-01 | キヤノン株式会社 | Image processing apparatus and image processing method |
FR2973997B1 (en) * | 2011-04-15 | 2014-08-22 | Univ Paris Curie | DEVICE FOR AIDING THE DETECTION OF ANATOMICAL CHARACTERISTICS OF AT LEAST ONE PORTION OF A FABRIC. METHOD FOR AIDING THE DETECTION OF ANATOMIC CHARACTERISTICS OF AT LEAST ONE PORTION OF A FABRIC |
JP6166871B2 (en) * | 2012-07-26 | 2017-07-19 | 国立大学法人京都大学 | Blood vessel age output device, blood vessel age output method, and program |
US20140073917A1 (en) * | 2012-09-10 | 2014-03-13 | Oregon Health & Science University | Quantification of local circulation with oct angiography |
WO2014203901A1 (en) * | 2013-06-19 | 2014-12-24 | 株式会社トプコン | Ophthalmological imaging device and ophthalmological image display device |
JP6481250B2 (en) * | 2013-10-29 | 2019-03-13 | 株式会社ニデック | Fundus analysis apparatus and fundus analysis program |
JP6469387B2 (en) * | 2014-08-26 | 2019-02-13 | 株式会社トプコン | Fundus analyzer |
JP6491540B2 (en) * | 2015-05-27 | 2019-03-27 | 株式会社トーメーコーポレーション | Optical coherence tomography and control method thereof |
JP6633468B2 (en) * | 2015-08-26 | 2020-01-22 | 株式会社トプコン | Blood flow measurement device |
JP6502791B2 (en) * | 2015-08-26 | 2019-04-17 | 株式会社トプコン | Blood flow measuring device |
JP6453191B2 (en) * | 2015-09-24 | 2019-01-16 | 株式会社トプコン | Blood flow measuring device |
JP6922152B2 (en) * | 2015-10-21 | 2021-08-18 | 株式会社ニデック | Ophthalmology analyzer, ophthalmology analysis program |
JP6624641B2 (en) * | 2016-03-18 | 2019-12-25 | 株式会社トプコン | Ophthalmic equipment |
JP6864450B2 (en) * | 2016-09-21 | 2021-04-28 | 株式会社トプコン | Ophthalmologic imaging equipment |
JP2019055134A (en) * | 2017-09-22 | 2019-04-11 | 株式会社トプコン | Ophthalmic photographing apparatus and ophthalmic information processing apparatus |
JP2019058491A (en) * | 2017-09-27 | 2019-04-18 | 株式会社トプコン | Ophthalmic device |
JP7106304B2 (en) * | 2018-03-12 | 2022-07-26 | キヤノン株式会社 | Image processing device, image processing method and program |
JP7216514B2 (en) * | 2018-09-28 | 2023-02-01 | 株式会社トプコン | Blood vessel analyzer |
JP6776313B2 (en) * | 2018-12-07 | 2020-10-28 | 国立大学法人旭川医科大学 | Blood flow measuring device |
JP2020121012A (en) * | 2019-01-31 | 2020-08-13 | キヤノン株式会社 | Ophthalmology information processing device, learned model, learning device, ophthalmology information processing method, and program |
JP2020130266A (en) * | 2019-02-14 | 2020-08-31 | 株式会社トプコン | Ophthalmic equipment |
JP7286422B2 (en) * | 2019-06-11 | 2023-06-05 | 株式会社トプコン | Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program |
US12229947B2 (en) * | 2019-06-12 | 2025-02-18 | Carl Zeiss Meditec Inc. | OCT-based retinal artery/vein classification |
JP7343331B2 (en) * | 2019-08-08 | 2023-09-12 | 株式会社トプコン | Ophthalmological device, its control method, program, and recording medium |
US20220322932A1 (en) * | 2019-09-04 | 2022-10-13 | Nidek Co., Ltd. | Oct device |
JP7359724B2 (en) * | 2020-03-11 | 2023-10-11 | 株式会社トプコン | Ophthalmology information processing device, ophthalmology device, ophthalmology information processing method, and program |
KR102482680B1 (en) * | 2020-06-03 | 2022-12-28 | 고려대학교 산학협력단 | Apparatus and method for predicting biometry based on fundus image |
JP7601643B2 (en) * | 2021-01-19 | 2024-12-17 | 株式会社トプコン | Ophthalmic device, method for controlling ophthalmic device, and program |
JP7718139B2 (en) * | 2021-07-27 | 2025-08-05 | 株式会社ニデック | ophthalmology equipment |
JP2023076025A (en) * | 2021-11-22 | 2023-06-01 | キヤノン株式会社 | Ophthalmologic apparatus, control method of ophthalmologic apparatus, and program |
- 2024
  - 2024-12-23 WO PCT/JP2024/045494 patent/WO2025182271A1/en active Pending
- 2025
  - 2025-02-18 WO PCT/JP2025/005329 patent/WO2025182661A1/en active Pending
  - 2025-02-18 WO PCT/JP2025/005332 patent/WO2025182663A1/en active Pending
  - 2025-02-25 WO PCT/JP2025/006393 patent/WO2025182918A1/en active Pending
  - 2025-02-25 WO PCT/JP2025/006402 patent/WO2025182923A1/en active Pending
  - 2025-02-26 WO PCT/JP2025/006722 patent/WO2025183024A1/en active Pending
  - 2025-02-26 WO PCT/JP2025/006717 patent/WO2025183022A1/en active Pending
  - 2025-02-26 WO PCT/JP2025/006474 patent/WO2025182947A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014193225A (en) * | 2013-03-29 | 2014-10-09 | Nidek Co Ltd | Fundus image processing device and fundus image processing program |
JP2016040005A (en) * | 2015-12-21 | 2016-03-24 | 株式会社トプコン | Blood flow information generation device, blood flow information generation method, and program |
JP2021058285A (en) * | 2019-10-03 | 2021-04-15 | キヤノン株式会社 | Image processing device, image processing method and program |
Also Published As
Publication number | Publication date |
---|---|
WO2025182663A1 (en) | 2025-09-04 |
WO2025182947A1 (en) | 2025-09-04 |
WO2025182661A1 (en) | 2025-09-04 |
WO2025183024A1 (en) | 2025-09-04 |
WO2025182271A1 (en) | 2025-09-04 |
WO2025183022A1 (en) | 2025-09-04 |
WO2025182918A1 (en) | 2025-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6624945B2 (en) | | Image forming method and apparatus |
JP6550745B2 (en) | | Blood flow measuring device |
JP6580448B2 (en) | | Ophthalmic photographing apparatus and ophthalmic information processing apparatus |
JP5628636B2 (en) | | Fundus image processing apparatus and fundus observation apparatus |
JP5916110B2 (en) | | Image display device, image display method, and program |
JP7736870B2 (en) | | Ophthalmic imaging equipment |
JP2008267892A (en) | | Optical image measuring device and program for controlling the same |
JP2018019771A (en) | | Optical coherence tomography apparatus and optical coherence tomography control program |
JP6471593B2 (en) | | OCT signal processing apparatus and OCT signal processing program |
JP2023038280A (en) | | Blood flow measurement device |
JP6633468B2 (en) | | Blood flow measurement device |
WO2017069019A1 (en) | | Blood flow measurement device |
JP7096116B2 (en) | | Blood flow measuring device |
US12150711B2 (en) | | Blood flow analysis apparatus, blood flow analysis method, and recording medium |
JP7260426B2 (en) | | Optical coherence tomography device, control method thereof, optical measurement method, program, and storage medium |
JP6736734B2 (en) | | Ophthalmic photographing device and ophthalmic information processing device |
JP6646021B2 (en) | | Ophthalmic image processing device |
JP7216514B2 (en) | | Blood vessel analyzer |
WO2025182923A1 (en) | | Ophthalmic device, control method therefor, program, and recording medium |
WO2016098472A1 (en) | | Blood flow measurement device |
JP2017086182A (en) | | Oct data processing device, and oct data processing program |
JP7624619B2 (en) | | BLOOD FLOW ANALYSIS APPARATUS, OPHTHALMOLOGIC APPARATUS, OPERATION METHOD OF BLOOD FLOW ANALYSIS APPARATUS, AND PROGRAM |
JP7068366B2 (en) | | Blood flow measuring device |
JP7221628B2 (en) | | Blood flow measuring device, information processing device, information processing method, and program |
JP2019054993A (en) | | Blood flow measurement device |