WO2024075411A1 - Image processing device, image processing method, and storage medium - Google Patents
- Publication number
- WO2024075411A1 (PCT/JP2023/029842)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- lesion
- score
- image processing
- processing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G06T7/0012 — Biomedical image inspection
- A61B1/000094 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/00045 — Display arrangement
- A61B1/0005 — Display arrangement combining images e.g. side-by-side, superimposed or tiled
- G06T11/23
- G06T11/26
- G06T7/00 — Image analysis
- G06V10/761 — Proximity, similarity or dissimilarity measures
- G16H50/50 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining for simulation or modelling of medical disorders
- A61B1/000095 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
- G06T2207/10068 — Endoscopic image
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30096 — Tumor; Lesion
- G06T2210/41 — Medical
Description
- This disclosure relates to the technical fields of image processing devices, image processing methods, and storage media that process images acquired during endoscopic examinations.
- Patent Document 1 discloses a learning method for a learning model that outputs information about diseased areas contained in endoscopic image data when the endoscopic image data generated by an imaging device is input.
- Patent Document 2 discloses a classification method for classifying sequence data using a method that applies the Sequential Probability Ratio Test (SPRT).
- Non-Patent Document 1 discloses an approximation method for matrices when performing multi-class classification in the SPRT-based method disclosed in Patent Document 2.
- Lesion detection methods include methods based on a fixed, predetermined number of images and methods based on a variable number of images, as described in Patent Document 2.
- Lesion detection methods based on a predetermined number of images can detect lesions with high accuracy even when there is no change in the image, but have the problem of being easily affected by momentary noise such as motion blur and defocus.
- Lesion detection methods based on a variable number of images, as described in Patent Document 2, are less susceptible to momentary noise and can detect easily identifiable lesions early, but have the problem that lesion detection may be delayed, or lesions may be overlooked, when there is no change in the image.
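The tradeoff described above can be illustrated with a small sketch; the detectors below, including the window size, the thresholds, and the treatment of each per-frame score as a lesion probability, are simplified assumptions for illustration and not the models of the present disclosure.

```python
import math

def fixed_window_detect(scores, window=4, threshold=0.6):
    """Fixed-number method: average the latest `window` per-frame scores."""
    if len(scores) < window:
        return False
    return sum(scores[-window:]) / window >= threshold

def sequential_detect(scores, upper=2.0):
    """Variable-number (SPRT-style) method: accumulate log-likelihood ratios
    frame by frame and decide as soon as the running sum crosses `upper`."""
    llr = 0.0
    for s in scores:
        p = min(max(s, 1e-6), 1.0 - 1e-6)  # clip to avoid log(0)
        llr += math.log(p / (1.0 - p))
        if llr >= upper:
            return True
    return False

# An easily identifiable lesion (steadily high scores) is detected by the
# sequential method before the fixed method even has a full window.
clear = [0.9, 0.9, 0.9]
assert sequential_detect(clear) and not fixed_window_detect(clear)
```

Conversely, a momentary noisy frame merely lowers the running sum rather than corrupting an entire fixed window, which mirrors the noise robustness attributed here to the variable-number method.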
- one of the objectives of the present disclosure is to provide an image processing device, an image processing method, and a storage medium that can suitably perform lesion detection in endoscopic images.
- One aspect of the image processing device includes: acquisition means for acquiring endoscopic images of a subject captured by an imaging unit provided in an endoscope; and lesion detection means for detecting a lesion based on a selected model, the selected model being selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images;
- wherein the lesion detection means changes a parameter used for detecting the lesion based on the selected model, in accordance with a non-selected model, which is the first model or the second model other than the selected model.
- One aspect of the image processing method is an image processing method in which a computer: acquires endoscopic images of a subject captured by an imaging unit provided in an endoscope; detects a lesion based on a selected model selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and changes a parameter used for detecting the lesion based on the selected model, in accordance with a non-selected model, which is the first model or the second model other than the selected model.
- One aspect of the storage medium is a storage medium storing a program that causes a computer to execute processing to: acquire endoscopic images of a subject captured by an imaging unit provided in an endoscope; detect a lesion based on a selected model selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and change a parameter used for detecting the lesion based on the selected model, in accordance with a non-selected model, which is the first model or the second model other than the selected model.
- One example of the effect of this disclosure is that it becomes possible to effectively detect lesions in endoscopic images.
- FIG. 1 shows a schematic configuration of an endoscopic examination system.
- FIG. 2 shows the hardware configuration of an image processing device.
- FIG. 3 is a functional block diagram of the image processing device.
- FIG. 4 shows an example of a display screen displayed by a display device during an endoscopic examination.
- 1A is a graph showing the progress of a first score from processing time t0 when acquisition of an endoscopic image is started in a first specific example
- FIG. 1B is a graph showing the progress of a second score from processing time t0 in a first specific example.
- 13A is a graph showing the transition of the first score from the processing time t0 in the second specific example
- FIG. 13B is a graph showing the transition of the second score from the processing time t0 in the second specific example.
- FIG. 5 is an example of a flowchart executed by the image processing apparatus in the first embodiment.
- FIG. 8A is a graph showing the progress of the first score from processing time t0 in the second embodiment
- FIG. 8B is a graph showing the progress of the second score from processing time t0 in the second embodiment.
- 13 is an example of a flowchart executed by the image processing apparatus in the second embodiment.
- 13 is an example of a flowchart executed by the image processing apparatus in the third embodiment.
- FIG. 13 is a block diagram of an image processing device according to a fourth embodiment.
- 13 is an example of a flowchart executed by the image processing apparatus in the fourth embodiment.
- FIG. 1 shows a schematic configuration of an endoscopic examination system 100.
- the endoscopic examination system 100 detects a site of a subject suspected of being a lesion (lesion site), and presents the detection result to an examiner, such as a doctor, who performs an examination or treatment using an endoscope.
- the endoscopic examination system 100 can support the decision-making of the examiner such as a doctor, such as determining a treatment plan for the subject of the examination.
- the endoscopic examination system 100 mainly includes an image processing device 1, a display device 2, and an endoscope scope 3 connected to the image processing device 1.
- the image processing device 1 acquires images (also called "endoscopic images Ia") captured by the endoscope 3 in time series from the endoscope 3, and displays a screen based on the endoscopic images Ia on the display device 2.
- the endoscopic images Ia are images captured at a predetermined frame rate during at least one of the process of inserting the endoscope 3 into the subject and the process of withdrawing it.
- the image processing device 1 analyzes the endoscopic images Ia to detect endoscopic images Ia that include a lesion site, and displays information related to the detection results on the display device 2.
- the display device 2 is a display or the like that performs a predetermined display based on a display signal supplied from the image processing device 1.
- the endoscope 3 mainly comprises an operation section 36 that allows the examiner to input the required information, a flexible shaft 37 that is inserted into the subject's organ to be imaged, a tip section 38 that incorporates an imaging section such as a miniature image sensor, and a connection section 39 for connecting to the image processing device 1.
- the configuration of the endoscopic examination system 100 shown in FIG. 1 is one example, and various modifications may be made.
- the image processing device 1 may be configured integrally with the display device 2.
- the image processing device 1 may be configured from multiple devices.
- endoscopes that are targets of the present disclosure include pharyngeal endoscopes, bronchoscopes, upper gastrointestinal endoscopes, duodenoscopes, small intestinal endoscopes, colonoscopes, capsule endoscopes, thoracoscopes, laparoscopes, cystoscopes, cholangioscopes, arthroscopes, spinal endoscopes, angioscopes, and epidural endoscopes.
- Examples of pathological conditions at lesion sites that are targets of detection in the present disclosure include the following (a) to (f).
- (a) Esophagus: esophageal cancer, esophagitis, hiatal hernia, Barrett's esophagus, esophageal varices, esophageal achalasia, esophageal submucosal tumor, benign esophageal tumor
- (b) Stomach: gastric cancer, gastritis, gastric ulcer, gastric polyp, gastric tumor
- (c) Duodenum: duodenal cancer, duodenal ulcer, duodenitis, duodenal tumor, duodenal lymphoma
- (d) Small intestine: small intestine cancer, small intestine neoplastic disease, small intestine inflammatory disease, small intestine vascular disease
- Large intestine large intestine
- FIG. 2 shows the hardware configuration of the image processing device 1.
- the image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, and a sound output unit 16. These elements are connected via a data bus 19.
- the processor 11 executes predetermined processing by executing programs stored in the memory 12.
- the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
- the processor 11 may be composed of multiple processors.
- the processor 11 is an example of a computer.
- the memory 12 is composed of a volatile memory such as a RAM (Random Access Memory) used as a working memory, and a non-volatile memory, such as a ROM (Read Only Memory), that stores information necessary for the processing of the image processing device 1.
- the memory 12 may include an external storage device such as a hard disk connected to or built into the image processing device 1, or may include a removable storage medium such as a flash memory.
- the memory 12 stores programs that enable the image processing device 1 to execute each process in this embodiment.
- the memory 12 has a first model information storage unit D1 that stores first model information, and a second model information storage unit D2 that stores second model information.
- the first model information includes information on parameters of the first model used by the image processing device 1 to detect a lesion site.
- the first model information may further include information indicating the calculation results of the lesion site detection process using the first model.
- the second model information includes information on parameters of the second model used by the image processing device 1 to detect a lesion site.
- the second model information may further include information indicating the calculation results of the lesion site detection process using the second model.
- the first model is a model that performs inference regarding a lesion in a subject based on a fixed number of endoscopic images (which may be one or more).
- the first model is a model that has machine-learned the relationship between a predetermined number of endoscopic images, or their features, input to the first model and a determination result regarding a lesion site in the endoscopic images.
- the first model is a model that has been trained to output a determination result regarding a lesion site in an endoscopic image when input data that is a predetermined number of endoscopic images or their features is input.
- the determination result regarding a lesion site output by the first model includes at least a score (index value) regarding the presence or absence of a lesion site in the endoscopic image, and this score is hereinafter also referred to as the "first score S1".
- a higher first score S1 indicates a higher degree of certainty that a lesion site exists in the target endoscopic image.
- the above-mentioned determination result regarding the lesion site may further include information indicating the position or area of the lesion site in the endoscopic image.
- the first model is, for example, a deep learning model that includes a convolutional neural network in its architecture.
- the first model may be a Fully Convolutional Network, SegNet, U-Net, V-Net, Feature Pyramid Network, Mask R-CNN, DeepLab, or the like.
- the first model information storage unit D1 stores various parameters required to configure the first model, such as the layer structure, the neuron structure of each layer, the number of filters and filter size in each layer, and the weight of each element of each filter.
- the first model is trained in advance based on a pair of an endoscopic image or its features, which is input data conforming to the input format of the first model, and correct answer data indicating the correct answer determination result regarding the lesion site in the endoscopic image.
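As a loose illustration of such pair-based supervised training, the sketch below fits a tiny logistic-regression scorer to (feature, label) pairs; the model family, the toy two-dimensional features, and the hyperparameters are assumptions standing in for the convolutional architectures described above, not the disclosure's actual training procedure.

```python
import math

def train_first_model(pairs, lr=0.5, epochs=200):
    """Fit a logistic-regression scorer to (feature_vector, label) pairs,
    where label 1 means a lesion site is present."""
    dim = len(pairs[0][0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in pairs:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log loss with respect to z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    def first_score(x):
        """Return a score in (0, 1) playing the role of the first score S1."""
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        return 1.0 / (1.0 + math.exp(-z))
    return first_score

# Toy features: the first component is high for lesion frames.
pairs = [([1.0, 0.0], 1), ([0.9, 0.1], 1), ([0.1, 0.9], 0), ([0.0, 1.0], 0)]
first_score = train_first_model(pairs)
assert first_score([1.0, 0.0]) > 0.8
assert first_score([0.0, 1.0]) < 0.2
```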
- the second model is a model that performs inference regarding lesions of a subject based on a variable number of endoscopic images.
- the second model is a model that performs machine learning of the relationship between a variable number of endoscopic images or their features and a judgment result regarding a lesion site in the endoscopic images.
- the second model is a model that has been trained to output a judgment result regarding a lesion site in an endoscopic image when input data that is a variable number of endoscopic images or their features is input.
- the "judgment result regarding a lesion site" output by the second model includes at least a score regarding the presence or absence of a lesion site in the endoscopic image, and this score is hereinafter also referred to as the "second score S2".
- a higher second score S2 indicates a higher degree of certainty that a lesion site exists in the target endoscopic image.
- the second model can be, for example, a model based on the SPRT described in Patent Document 2. A specific example of the second model based on the SPRT will be described later.
- Various parameters necessary for configuring the second model are stored in the second model information storage unit D2.
- the memory 12 also stores various information such as parameters necessary for the lesion detection process. At least a part of the information stored in the memory 12 may be stored by an external device other than the image processing device 1.
- the above-mentioned external device may be one or more server devices capable of data communication with the image processing device 1 via a communication network or the like or by direct communication.
- the interface 13 performs interface operations between the image processing device 1 and an external device. For example, the interface 13 supplies the display information "Ib" generated by the processor 11 to the display device 2. The interface 13 also supplies light generated by the light source unit 15 to the endoscope scope 3. The interface 13 also supplies an electrical signal indicating the endoscopic image Ia supplied from the endoscope scope 3 to the processor 11.
- the interface 13 may be a communication interface such as a network adapter for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), etc.
- the input unit 14 generates an input signal based on the operation by the examiner.
- the input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, etc.
- the light source unit 15 generates light to be supplied to the tip 38 of the endoscope 3.
- the light source unit 15 may also incorporate a pump or the like for sending water or air to be supplied to the endoscope 3.
- the sound output unit 16 outputs sound based on the control of the processor 11.
- when the image processing device 1 performs lesion detection based on the first score S1 output by the first model, it changes the parameters used for the lesion detection based on the second score S2 output by the second model.
- the above parameters are parameters that define the conditions for determining that a lesion has been detected based on the first score S1, and the image processing device 1 changes the parameters so that the higher the confidence level of the lesion indicated by the second score S2, the more relaxed the above conditions are.
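One hypothetical way to realize this relaxation is to lower the threshold applied to the first score S1 in proportion to the second score S2, here assumed to be normalized to [0, 1]; the function name, the linear rule, and the relaxation cap are illustrative assumptions rather than the disclosure's exact parameter change.

```python
def relaxed_threshold(base_threshold, s2, max_relax=0.2):
    """Lower the first-score detection threshold as the second score S2
    expresses more confidence in a lesion; `max_relax` caps the relaxation."""
    s2 = max(0.0, min(s2, 1.0))  # clip S2 to [0, 1]
    return base_threshold - max_relax * s2

# Higher S2 -> lower (more easily satisfied) threshold for S1.
assert relaxed_threshold(0.8, 0.0) == 0.8
assert relaxed_threshold(0.8, 0.9) < relaxed_threshold(0.8, 0.1)
```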
- the image processing device 1 performs accurate lesion detection by utilizing the advantages of both the first model and the second model, and presents the detection results.
- the first model is an example of a "selection model", and the second model is an example of a "non-selection model".
- FIG. 3 is a functional block diagram of the image processing device 1.
- the processor 11 of the image processing device 1 functionally has an endoscopic image acquisition unit 30, a feature extraction unit 31, a first score calculation unit 32, a second score calculation unit 33, a lesion detection unit 34, and a display control unit 35.
- in FIG. 3, blocks that exchange data are connected by solid lines, but the combinations of blocks that exchange data are not limited to those shown in FIG. 3. The same applies to the other functional block diagrams described later.
- the endoscopic image acquisition unit 30 acquires the endoscopic image Ia captured by the endoscope 3 via the interface 13 at predetermined intervals in accordance with the frame period of the endoscope 3, and supplies the acquired endoscopic image Ia to the feature extraction unit 31 and the display control unit 35. Each processing unit in the subsequent stages then performs the processing described below at a period equal to the time interval at which the endoscopic image acquisition unit 30 acquires endoscopic images.
- the time for each frame period will also be referred to as the "processing time".
- the feature extraction unit 31 converts the endoscopic image Ia supplied from the endoscopic image acquisition unit 30 into a feature quantity (specifically, a feature vector or third-order or higher tensor data) expressed in a feature space of a predetermined dimension.
- the feature extraction unit 31 configures a feature extractor based on parameters stored in advance in the memory 12 or the like, and acquires the feature quantity output by the feature extractor by inputting the endoscopic image Ia to the feature extractor.
- the feature extractor may be a deep learning model having an architecture such as a convolutional neural network. In this case, the feature extractor is machine-learned in advance, and parameters obtained by learning are stored in the memory 12 or the like in advance.
- the feature extractor may extract a feature quantity representing the relationship between time series data based on any method for calculating the relationship between time series data, such as LSTM (Long Short Term Memory). Then, the feature extraction unit 31 supplies the feature data representing the generated feature quantity to the first score calculation unit 32 and the second score calculation unit 33.
- the feature extractor described above may be incorporated into at least one of the first model or the second model.
- alternatively, the first score calculation unit 32 may input the endoscopic image Ia directly to the first model, and supply feature data indicating the features generated by the feature extractor in the first model (i.e., the output of an intermediate layer of the first model) to the second score calculation unit 33. In this case, the feature extraction unit 31 need not be provided.
- the first score calculation unit 32 calculates the first score S1 based on the first model information storage unit D1 and the feature data supplied from the feature extraction unit 31.
- the first score calculation unit 32 inputs the feature data supplied from the feature extraction unit 31 to the first model configured with reference to the first model information storage unit D1, thereby acquiring the first score S1 output by the first model.
- when the first model is a model that outputs the first score S1 based on a single endoscopic image Ia, the first score calculation unit 32 calculates the first score S1 at the current processing time by, for example, inputting the feature data supplied from the feature extraction unit 31 at the current processing time to the first model.
- the first score calculation unit 32 may calculate the first score S1 at the current processing time, for example, by inputting a combination of the feature data supplied from the feature extraction unit 31 at the current processing time and the feature data supplied in the past to the first model.
- the first score calculation unit 32 may also calculate the first score S1 by averaging (i.e., performing a moving average) the score obtained at the past processing time and the score obtained at the current processing time.
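Such a moving average over recent scores might be sketched as follows; the class name and the window size are illustrative assumptions, not part of the disclosure.

```python
from collections import deque

class FirstScoreSmoother:
    """Moving average over the most recent first scores, a simple way to
    damp momentary noise in the per-frame score."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)  # oldest scores drop out automatically

    def update(self, score):
        self.buf.append(score)
        return sum(self.buf) / len(self.buf)

sm = FirstScoreSmoother(window=3)
assert sm.update(0.9) == 0.9                 # only one score so far
assert abs(sm.update(0.0) - 0.45) < 1e-9     # (0.9 + 0.0) / 2
assert abs(sm.update(0.9) - 0.6) < 1e-9      # (0.9 + 0.0 + 0.9) / 3
```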
- the first score calculation unit 32 supplies the calculated first score S1 to the lesion detection unit 34.
- the second score calculation unit 33 calculates a second score S2 indicating the likelihood that a lesion exists based on the second model information storage unit D2 and feature data corresponding to a variable number of time-series endoscopic images Ia obtained up to now.
- the second score calculation unit 33 determines the second score S2 based on the likelihood ratio for the time-series endoscopic images Ia calculated using the second model based on SPRT for each processing time.
- the "likelihood ratio for the time-series endoscopic images Ia" refers to the ratio between the likelihood that a lesion exists in the time-series endoscopic images Ia and the likelihood that a lesion does not exist in the time-series endoscopic images Ia.
- the greater the likelihood that a lesion exists, the greater the likelihood ratio.
- a specific example of a method for calculating the second score S2 using the second model based on SPRT will be described later.
- the second score calculation unit 33 supplies the calculated second score S2 to the lesion detection unit 34.
- the lesion detection unit 34 detects a lesion in the endoscopic image Ia (i.e., determines whether or not a lesion exists) based on the first score S1 supplied from the first score calculation unit 32 and the second score S2 supplied from the second score calculation unit 33. In this case, the lesion detection unit 34 changes the threshold value that defines the condition for determining that a lesion has been detected based on the first score S1, based on the second score S2. A specific example of the processing by the lesion detection unit 34 will be described later.
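A hypothetical per-frame decision loop combining the two scores in this way might look like the following; the base threshold, the linear relaxation rule, and the stubbed (S1, S2) inputs are assumptions for illustration, not the specific processing of the lesion detection unit 34.

```python
def detect_lesion_stream(frames, base_th=0.8, max_relax=0.2):
    """For each frame, relax the first-score threshold in proportion to the
    second score S2 and test the first score S1 against it.
    `frames` is a list of (s1, s2) pairs standing in for the per-frame
    outputs of the first and second models."""
    detections = []
    for s1, s2 in frames:
        th = base_th - max_relax * max(0.0, min(s2, 1.0))
        detections.append(s1 >= th)
    return detections

# The same S1 = 0.7 misses the unrelaxed threshold 0.8 while S2 is low,
# but counts as a detection once S2 expresses enough confidence.
assert detect_lesion_stream([(0.7, 0.1), (0.7, 0.9)]) == [False, True]
```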
- the lesion detection unit 34 supplies the lesion detection result to the display control unit 35.
- the display control unit 35 generates display information Ib based on the endoscopic image Ia and the lesion detection result supplied from the lesion detection unit 34, and supplies the display information Ib to the display device 2 via the interface 13, thereby causing the display device 2 to display information relating to the endoscopic image Ia and the lesion detection result by the lesion detection unit 34.
- the display control unit 35 may also cause the display device 2 to further display information relating to the first score S1 calculated by the first score calculation unit 32 and the second score S2 calculated by the second score calculation unit 33.
- FIG. 4 shows an example of a display screen displayed by the display device 2 during an endoscopic examination.
- the display control unit 35 of the image processing device 1 outputs to the display device 2 display information Ib generated based on the endoscopic image Ia acquired by the endoscopic image acquisition unit 30 and the lesion detection result by the lesion detection unit 34, etc.
- the display control unit 35 transmits the endoscopic image Ia and the display information Ib to the display device 2, thereby causing the display device 2 to display the above-mentioned display screen.
- the display control unit 35 of the image processing device 1 provides a real-time image display area 71, a lesion detection result display area 72, and a score transition display area 73 on the display screen.
- the display control unit 35 displays a moving image representing the latest endoscopic image Ia in the real-time image display area 71. Furthermore, in the lesion detection result display area 72, the display control unit 35 displays the lesion detection result by the lesion detection unit 34. Note that, since the lesion detection unit 34 has determined that a lesion site exists at the time when the display screen shown in FIG. 4 is displayed, the display control unit 35 displays a text message indicating that a lesion is highly likely to exist in the lesion detection result display area 72.
- the display control unit 35 may output a sound (including voice) notifying that a lesion is highly likely to exist from the sound output unit 16.
- in the score transition display area 73, the display control unit 35 displays a score transition graph showing the progress of the first score S1 from the start of the endoscopic examination to the present time, together with a dash-dotted line indicating a reference value (the first score threshold value Sth1 described later) for determining the presence or absence of a lesion from the first score S1.
- each of the components of the endoscopic image acquisition unit 30, the feature extraction unit 31, the first score calculation unit 32, the second score calculation unit 33, the lesion detection unit 34, and the display control unit 35 can be realized, for example, by the processor 11 executing a program. Also, each component may be realized by recording the necessary programs in any non-volatile storage medium and installing them as necessary. Note that at least a portion of each of these components is not limited to being realized by software using a program, but may be realized by any combination of hardware, firmware, and software. Also, at least a portion of each of these components may be realized using a user-programmable integrated circuit, such as an FPGA (Field-Programmable Gate Array) or a microcontroller.
- in this case, each of the above components may be realized using this integrated circuit.
- at least a portion of each component may be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or a quantum processor (quantum computer control chip).
- the second score calculation unit 33 calculates likelihood ratios for the latest "N" (N is an integer equal to or greater than 2) endoscopic images Ia for each processing time, and determines the second score S2 based on a likelihood ratio (also called an "integrated likelihood ratio") that integrates the likelihood ratios calculated at the current processing time and past processing times.
- the second score S2 may be the integrated likelihood ratio itself, or may be a function that includes the integrated likelihood ratio as a variable.
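How per-frame confidences could be folded into an integrated log-likelihood ratio can be sketched as follows. This is a minimal illustration that assumes conditionally independent frames; the publication's learned likelihood ratio calculation model need not make this assumption, and the function name is hypothetical:

```python
import math

def integrated_log_likelihood_ratio(frame_probs):
    """Combine per-frame lesion confidences into one integrated
    log-likelihood ratio, assuming conditionally independent frames.

    frame_probs: confidences p in (0, 1) that each frame belongs to the
    lesion class C1 (so 1 - p is the confidence for class C0).
    """
    llr = 0.0
    for p in frame_probs:
        llr += math.log(p / (1.0 - p))  # per-frame log-likelihood ratio
    return llr
```

With this accumulation, evidence from a variable-length window of frames adds up: uninformative frames (p near 0.5) contribute almost nothing, while consistently lesion-like frames drive the ratio upward.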
- the second model is assumed to include a likelihood ratio calculation model, which is a processing unit that calculates the likelihood ratio, and a score calculation model, which is a processing unit that calculates the second score S2 from the likelihood ratio.
- the likelihood ratio calculation model is a model trained to output likelihood ratios for N endoscopic images Ia when feature data of the N endoscopic images Ia are input.
- the likelihood ratio calculation model may be a deep learning model or any other machine learning model or statistical model.
- the second model information storage unit D2 stores trained parameters of the second model including the likelihood ratio calculation model.
- various parameters such as the layer structure, the neuron structure of each layer, the number of filters and filter size in each layer, and the weight of each element of each filter are stored in advance in the second model information storage unit D2.
- even when fewer than N endoscopic images Ia have been acquired, the second score calculation unit 33 can obtain likelihood ratios from the available (fewer than N) endoscopic images Ia using the likelihood ratio calculation model.
- the second score calculation unit 33 may store the acquired likelihood ratios in the second model information storage unit D2.
- the "start time” represents the first processing time of the past processing times considered in the calculation of the second score S2.
- the integrated likelihood ratio for the binary classification between a class "C1", in which the endoscopic image Ia includes a lesion site, and a class "C0", in which the endoscopic image Ia does not include a lesion site, is represented by the following formula (1).
- here, p represents the probability of belonging to each class (i.e., a confidence level between 0 and 1).
- the likelihood ratio output by the likelihood ratio calculation model can be used.
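Formula (1) itself is not reproduced in this excerpt. Under the class definitions above, the conventional integrated log-likelihood ratio for binary classification, which this passage appears to describe, takes the form:

```latex
\log \Lambda_t
  = \log \frac{p(x_1,\dots,x_t \mid C_1)}{p(x_1,\dots,x_t \mid C_0)}
  \;\approx\; \sum_{s=1}^{t} \log \frac{p(C_1 \mid x_s)}{p(C_0 \mid x_s)}
```

where the right-hand approximation treats the frames x_1, ..., x_t as conditionally independent with equal class priors. This is a reconstruction of the standard SPRT form under those stated assumptions, not necessarily the exact formula (1) of the publication.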
- the time index t representing the current processing time increases with the passage of time, so the length of the time series of the endoscopic image Ia used to calculate the integrated likelihood ratio (i.e., the number of frames) is variable.
- a first advantage is that the second score calculation unit 33 can calculate the second score S2 taking a variable number of endoscopic images Ia into account.
- a second advantage is that time-dependent features can be classified, and a third advantage is that the accuracy of the second score S2 is unlikely to decrease even for data that is difficult to distinguish.
- the second score calculation unit 33 may store the integrated likelihood ratio and the second score S2 calculated at each processing time in the second model information storage unit D2.
- in some cases (for example, when the integrated likelihood ratio indicates the absence of a lesion), the second score calculation unit 33 may determine that no lesion is present, initialize the second score S2 and the time index t to 0, and restart the calculation of the second score S2 based on the endoscopic images Ia obtained from the next processing time onward.
- the lesion detection unit 34 compares the first score S1 with a threshold value for the first score S1 (also referred to as the "first score threshold value Sth1”), and the second score S2 with a threshold value for the second score S2 (also referred to as the "second score threshold value Sth2”) at each processing time. Then, the lesion detection unit 34 determines that a lesion exists when the first score S1 exceeds the first score threshold value Sth1 consecutively for more than a predetermined number of times (also referred to as the "threshold number of times Mth").
- when the second score S2 exceeds the second score threshold value Sth2, the lesion detection unit 34 reduces the threshold number of times Mth. In this way, in a situation where the presence of a lesion is suspected based on the second score S2 output by the second model, the lesion detection unit 34 relaxes the condition for determining, based on the first score S1, that a lesion exists. This makes it possible to accurately detect the lesion site both in a situation in which the first model is likely to detect the lesion site accurately and in a situation in which the second model is likely to detect the lesion site accurately.
- the number of times that the first score S1 consecutively exceeds the first score threshold Sth1 is referred to as the "consecutive number of times exceeding the threshold M.”
- suitable values of the first score threshold Sth1 and the second score threshold Sth2 are stored in advance in, for example, the memory 12.
- the threshold number of times Mth is a value that varies according to the second score S2, and an initial value, etc., is stored in advance in the memory 12.
- the threshold number of times Mth is an example of a "parameter used for lesion detection based on a selection model.”
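The detection rule described above (a consecutive-exceedance count M on the first score, with the required count Mth relaxed while the second score is high) can be sketched as follows. All threshold values and names are illustrative placeholders, not values from the publication:

```python
def detect_lesion_step(s1, s2, state,
                       sth1=0.7, sth2=0.5,
                       mth_initial=5, mth_relaxed=2):
    """One processing-time step of the first embodiment's detection rule.

    state: dict holding the consecutive over-threshold count M across calls.
    Returns True when a lesion is judged to be present at this step.
    """
    # Relax the required consecutive count when the second model
    # already suspects a lesion (S2 above its threshold).
    mth = mth_relaxed if s2 > sth2 else mth_initial

    if s1 > sth1:
        state["M"] = state.get("M", 0) + 1   # consecutive exceedance grows
    else:
        state["M"] = 0                        # any miss resets the count
    return state["M"] > mth
```

With a low S2, six consecutive high first scores are needed before detection; with a high S2, three suffice, which mirrors the relaxed condition in the text.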
- FIG. 5(A) is a graph showing the progress of the first score S1 from processing time "t0" when acquisition of the endoscopic image Ia begins in the first specific example
- FIG. 5(B) is a graph showing the progress of the second score S2 from processing time t0 in the first specific example.
- the first specific example is an example of lesion detection processing in a situation where the accuracy of lesion detection based on the first model is higher than the accuracy of lesion detection based on the second model.
- an example of such a situation is when the fluctuation in the endoscopic image Ia over time is relatively small.
- the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold Sth1, and the second score S2 with the second score threshold Sth2. Then, at processing time "t1", the lesion detection unit 34 determines that the first score S1 exceeds the first score threshold Sth1, starts counting the number of consecutive times M that exceed the threshold, and determines that the number of consecutive times M that exceed the threshold exceeds the threshold number Mth at processing time "t1 ⁇ ". Therefore, in this case, the lesion detection unit 34 determines that a lesion site is present in the endoscopic image Ia obtained from processing time t1 to t1 ⁇ . On the other hand, after processing time t0, the lesion detection unit 34 determines that the second score S2 is equal to or less than the second score threshold Sth2, and keeps the threshold number Mth fixed even after processing time t0.
- the lesion detection unit 34 can accurately perform lesion detection.
- FIG. 6(A) is a graph showing the progress of the first score S1 from processing time t0 in the second specific example
- FIG. 6(B) is a graph showing the progress of the second score S2 from processing time t0 in the second specific example.
- the second specific example is an example of lesion detection processing in a situation where the accuracy of lesion detection based on the second model is higher than the accuracy of lesion detection based on the first model.
- such a situation may be one where there is a relatively large fluctuation in the endoscopic image Ia over time.
- the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold Sth1, and the second score S2 with the second score threshold Sth2. Then, in the period from processing time "t2" to processing time "t3", the first score S1 exceeds the first score threshold Sth1, so the consecutive number of times M that exceed the threshold increases. On the other hand, since the first score S1 becomes equal to or less than the first score threshold Sth1 after processing time t3 without the consecutive number of times M that exceed the threshold value Mth exceeding the initial value, the lesion detection unit 34 determines that no lesion area is present during the above period.
- the lesion detection unit 34 determines that the second score S2 is greater than the second score threshold Sth2, and sets the threshold count Mth to a predetermined relaxed value that is smaller than the initial value (i.e., a value in which the condition for determining that a lesion exists is relaxed from the initial value).
- the initial value of the threshold count Mth and the relaxed value of the threshold count Mth are each stored in advance in, for example, the memory 12, etc.
- the lesion detection unit 34 determines that a lesion area is present in the period from processing time t5 to processing time t6.
- the lesion detection unit 34 can accurately perform lesion detection based on the first model. Furthermore, when an easily identifiable lesion exists, relaxing the above-mentioned condition allows the lesion to be detected quickly from a smaller number of endoscopic images Ia. In this case, reducing the number of endoscopic images Ia required to detect a lesion also reduces the possibility that momentary noise resets the consecutive over-threshold count M.
- in the first embodiment, the presence or absence of a lesion is determined by comparing the consecutive over-threshold count M with the threshold number Mth.
- such lesion detection has the advantage that it can be performed even under conditions in which the log-likelihood ratio calculated by the SPRT-based second model is unlikely to increase, such as when there is no temporal change in the endoscopic image Ia.
- on the other hand, it is vulnerable to momentary noise (including blur and defocus), and the number of endoscopic images Ia required to detect a lesion is large even for easily identifiable lesions.
- the SPRT-based second model is resistant to instantaneous noise and can quickly detect easily identifiable lesions, but when there is little temporal change in the endoscopic image Ia, the log-likelihood ratio is unlikely to increase, and the number of endoscopic images Ia required to detect a lesion may become large.
- in the first embodiment, these two approaches are combined to perform lesion detection that enjoys the advantages of both.
- (Processing Flow) Fig. 7 is an example of a flowchart executed by the image processing device 1 in the first embodiment.
- the image processing device 1 repeatedly executes the processing of this flowchart until the end of the endoscopic examination. For example, the image processing device 1 determines that the endoscopic examination has ended when it detects a predetermined input to the input unit 14 or the operation unit 36.
- the endoscopic image acquisition unit 30 of the image processing device 1 acquires the endoscopic image Ia (step S11).
- the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscopic scope 3 via the interface 13.
- the display control unit 35 also executes processing such as displaying the endoscopic image Ia acquired in step S11 on the display device 2.
- the feature extraction unit 31 also generates feature data indicating the feature amount of the acquired endoscopic image Ia.
- the second score calculation unit 33 calculates a second score S2 based on the variable number of endoscopic images Ia (step S12).
- the second score calculation unit 33 calculates the second score S2 based on the feature data of the variable number of endoscopic images Ia acquired at the current processing time and past processing times and the second model configured based on the second model information storage unit D2.
- the first score calculation unit 32 calculates a first score S1 based on a predetermined number of endoscopic images Ia in parallel with step S12 (step S16).
- the first score calculation unit 32 calculates the first score S1 based on the feature data of the predetermined number of endoscopic images Ia acquired at the current processing time (and past processing times) and the first model configured based on the first model information storage unit D1.
- the lesion detection unit 34 determines whether the second score S2 is greater than the second score threshold Sth2 (step S13). If the second score S2 is greater than the second score threshold Sth2 (step S13; Yes), the lesion detection unit 34 sets the threshold number of times Mth to a relaxed value that is smaller than the initial value (step S14). On the other hand, if the second score S2 is equal to or less than the second score threshold Sth2 (step S13; No), the lesion detection unit 34 sets the threshold number of times Mth to the initial value (step S15).
- the lesion detection unit 34 determines whether the first score S1 is greater than the first score threshold Sth1 (step S17). Then, if the first score S1 is greater than the first score threshold Sth1 (step S17; Yes), the lesion detection unit 34 increases the number of consecutive times M that exceeds the threshold by 1 (step S18). Note that the initial value of the number of consecutive times M that exceeds the threshold is set to 0. On the other hand, if the first score S1 is equal to or less than the first score threshold Sth1 (step S17; No), the lesion detection unit 34 sets the number of consecutive times M that exceeds the threshold to the initial value of 0 (step S19).
- the lesion detection unit 34 then determines whether the consecutive over-threshold count M is greater than the threshold number Mth (step S20). If the consecutive over-threshold count M is greater than the threshold number Mth (step S20; Yes), the lesion detection unit 34 determines that a lesion area is present and notifies the user that a lesion area has been detected by at least one of display and sound output (step S21). On the other hand, if the consecutive over-threshold count M is equal to or less than the threshold number Mth (step S20; No), the process returns to step S11.
- the lesion detection unit 34 switches the threshold number of times Mth from the initial value to a relaxed value when the second score S2 exceeds the second score threshold Sth2.
- the lesion detection unit 34 is not limited to this mode, and may decrease the threshold number of times Mth stepwise or continuously as the second score S2 increases (i.e., relax the conditions for determining that a lesion exists).
- correspondence information such as an equation or lookup table showing the relationship between each conceivable second score S2 and the appropriate threshold number Mth for each second score S2 is stored in advance in the memory 12, etc., and the lesion detection unit 34 determines the threshold number Mth based on the second score S2 and the above-mentioned correspondence information.
- the lesion detection unit 34 can set the threshold number Mth according to the second score S2 and perform lesion detection that makes use of the advantages of both the first model and the second model.
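One way to realize the stepwise correspondence between the second score S2 and the threshold number Mth is a small lookup table, as the text suggests. The table values and function name below are purely illustrative:

```python
def threshold_count_for_score(s2, table=((0.0, 8), (0.3, 6), (0.6, 3), (0.9, 1))):
    """Look up the threshold count Mth for a given second score S2.

    table: (score, Mth) pairs in ascending score order; the entry with the
    largest score not exceeding s2 applies, so Mth decreases stepwise
    as S2 increases.  All values are illustrative placeholders.
    """
    mth = table[0][1]
    for score, count in table:
        if s2 >= score:
            mth = count
        else:
            break
    return mth
```

A continuous variant could instead interpolate between the table entries, which corresponds to the "continuously" decreasing option mentioned above.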
- the lesion detection unit 34 may change the first score threshold Sth1 based on the second score S2. In this case, for example, the lesion detection unit 34 may gradually or continuously decrease the first score threshold Sth1 as the second score S2 increases. Even with this aspect, the lesion detection unit 34 can appropriately relax the conditions for lesion detection based on the first model in a situation where lesion detection based on the second model is effective, and accurately perform lesion detection.
- when a predetermined condition is satisfied, the image processing device 1 may start the process of calculating the second score S2 and changing the threshold number Mth.
- in this case, after the start of the lesion detection process, the image processing device 1 does not initially calculate the second score S2 by the second score calculation unit 33; when it determines that the first score S1 exceeds the first score threshold Sth1, it starts calculating the second score S2 by the second score calculation unit 33 and changes the threshold number of times Mth (or the first score threshold Sth1) in accordance with the second score S2, as in the above-mentioned embodiment.
- if the image processing device 1 determines that the first score S1 has become equal to or less than the first score threshold Sth1 after starting the calculation of the second score S2 by the second score calculation unit 33, the image processing device 1 stops the calculation of the second score S2 by the second score calculation unit 33 again.
- the "predetermined condition” is not limited to the condition that the first score S1 is greater than the first score threshold Sth1, but may be any condition under which it is determined that the probability of the presence of a lesion site has increased. Examples of such conditions include a condition that the first score S1 is greater than a predetermined threshold value that is smaller than the first score threshold value Sth1, a condition that the increase in the first score S1 per unit time (i.e., the derivative of the first score S1) is greater than or equal to a predetermined value, and a condition that the number of consecutive occurrences M exceeding the threshold value is greater than or equal to a predetermined value.
- the image processing device 1 may calculate the second score S2 going back to a past processing time, and change the threshold number of times Mth (or the first score threshold Sth1) based on the second score S2.
- the image processing device 1 may store, for example, feature data calculated by the feature extraction unit 31 at a past processing time in the memory 12, etc., and the second score calculation unit 33 may calculate the second score S2 at the past processing time based on the feature data, and change the threshold number of times Mth (or the first score threshold Sth1) based on the second score S2.
- the image processing device 1 can limit the period for calculating the second score S2, thereby effectively reducing the calculation load.
- the image processing device 1 may process the video composed of the endoscopic images Ia generated during the endoscopic examination after the examination.
- in this case, when a video to be processed is specified based on user input via the input unit 14 at any time after the examination, the image processing device 1 repeatedly performs the process of the flowchart shown in FIG. 7 on the time-series endoscopic images Ia that constitute the specified video until it determines that the target video has ended.
- in the second embodiment, the image processing device 1 performs lesion detection with the second score S2 based on the second model as the primary criterion, and changes the second score threshold Sth2, against which the second score S2 is compared, based on the first score S1 from the first model. This allows accurate detection of a lesion site both in a situation where the first model is likely to detect a lesion site accurately and in a situation where the second model is likely to detect a lesion site accurately.
- the hardware configuration of the image processing device 1 according to the second embodiment is the same as the hardware configuration of the image processing device 1 shown in FIG. 2, and the functional block configuration of the processor 11 of the image processing device 1 according to the second embodiment is the same as the functional block configuration shown in FIG. 3.
- the lesion detection unit 34 gradually or continuously lowers the second score threshold Sth2 (i.e., relaxes the conditions for determining that a lesion area has been detected) as the number of consecutive occurrences exceeding the threshold value M increases. This allows the lesion detection unit 34 to appropriately relax the conditions for lesion detection based on the second model and accurately perform lesion detection even in a situation in which lesion detection based on the first model is effective.
- the second model is an example of a "selection model," and the first model is an example of a “non-selection model.” Also, the second score threshold Sth2 is an example of a "parameter used to detect a lesion based on the selection model.”
- Fig. 8(A) is a graph showing the progress of the first score S1 from processing time t0 when acquisition of the endoscopic image Ia is started in the second embodiment
- Fig. 8(B) is a graph showing the progress of the second score S2 from processing time t0 in the second embodiment.
- the specific examples shown in Fig. 8(A) and Fig. 8(B) are examples of lesion detection processing in a situation where the accuracy of lesion detection based on the first model is higher than the accuracy of lesion detection based on the second model.
- the lesion detection unit 34 compares the first score S1 obtained at each processing time with the first score threshold Sth1, and the second score S2 with the second score threshold Sth2. Then, at processing time "t11", the lesion detection unit 34 determines that the first score S1 exceeds the first score threshold Sth1, and increases the number of consecutive times M that the threshold is exceeded.
- the lesion detection unit 34 changes the second score threshold Sth2 in accordance with the consecutive over-threshold count M.
- the lesion detection unit 34 continuously decreases the second score threshold Sth2 as the consecutive over-threshold count M increases.
- processing time "t12" which is included in the period in which the consecutive over-threshold count M increases, the lesion detection unit 34 determines that a lesion area is present at processing time t12, because the second score S2 becomes greater than the second score threshold Sth2.
- in this way, the second score threshold Sth2 is decreased as the consecutive over-threshold count M increases, so that the lesion detection condition on the second score S2 based on the second model is suitably relaxed and lesion detection is performed accurately. Also, in a situation where the accuracy of lesion detection based on the second model is higher than that based on the first model, the second score S2 based on the second model reaches the second score threshold Sth2 even if the threshold does not change, so the lesion detection unit 34 can still perform lesion detection accurately.
- (Processing Flow) Fig. 9 is an example of a flowchart executed by the image processing device 1 in the second embodiment.
- the image processing device 1 repeatedly executes the processing of this flowchart until the endoscopy is completed.
- the endoscopic image acquisition unit 30 of the image processing device 1 acquires the endoscopic image Ia (step S31).
- the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscopic scope 3 via the interface 13.
- the display control unit 35 also executes processing such as displaying the endoscopic image Ia acquired in step S31 on the display device 2.
- the feature extraction unit 31 also generates feature data indicating the feature amount of the acquired endoscopic image Ia.
- the second score calculation unit 33 calculates a second score S2 based on the variable number of endoscopic images Ia (step S32).
- the second score calculation unit 33 calculates the second score S2 based on the feature data of the variable number of endoscopic images Ia acquired at the current processing time and past processing times and the second model configured based on the second model information storage unit D2.
- the first score calculation unit 32 calculates a first score S1 based on a predetermined number of endoscopic images Ia in parallel with step S32 (step S33).
- the first score calculation unit 32 calculates the first score S1 based on the feature data of the predetermined number of endoscopic images Ia acquired at the current processing time (and past processing times) and the first model configured based on the first model information storage unit D1.
- the lesion detection unit 34 determines whether the first score S1 is greater than the first score threshold Sth1 (step S34). If the first score S1 is greater than the first score threshold Sth1 (step S34; Yes), the lesion detection unit 34 increases the number of consecutive occurrences M that exceed the threshold by 1 (step S35). Note that the initial value of the number of consecutive occurrences M that exceed the threshold is set to 0. On the other hand, if the first score S1 is equal to or less than the first score threshold Sth1 (step S34; No), the lesion detection unit 34 sets the number of consecutive occurrences M that exceed the threshold to the initial value of 0 (step S36).
- the lesion detection unit 34 determines the second score threshold Sth2, which is a threshold to be compared with the second score S2, based on the number of consecutive occurrences exceeding the threshold M (step S37).
- the lesion detection unit 34 refers to, for example, a pre-stored formula or lookup table, and reduces the second score threshold Sth2 as the number of consecutive occurrences exceeding the threshold M increases.
- the lesion detection unit 34 determines whether the second score S2 is greater than the second score threshold Sth2 (step S38). If the second score S2 is greater than the second score threshold Sth2 (step S38; Yes), the lesion detection unit 34 determines that a lesion area is present and notifies the user that a lesion area has been detected by at least one of display and sound output (step S39). On the other hand, if the second score S2 is equal to or less than the second score threshold Sth2 (step S38; No), the process returns to step S31.
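The second embodiment's per-step rule (increment M while S1 exceeds Sth1, lower Sth2 as M grows, and detect when S2 exceeds the lowered Sth2) can be sketched as follows. Constants and names are placeholders, not values from the publication:

```python
def detect_lesion_step_v2(s1, s2, state,
                          sth1=0.7, sth2_base=1.0,
                          decay=0.1, sth2_min=0.2):
    """One processing-time step of the second embodiment's rule (sketch).

    state: dict holding the consecutive over-threshold count M across calls.
    The second score threshold Sth2 is lowered as the consecutive count M
    of first-score exceedances grows, but never below a floor value.
    """
    if s1 > sth1:
        state["M"] = state.get("M", 0) + 1   # first score keeps exceeding
    else:
        state["M"] = 0                        # reset on any miss

    # Relax Sth2 continuously with M (one simple linear choice).
    sth2 = max(sth2_base - decay * state["M"], sth2_min)
    return s2 > sth2
```

Here a second score of 0.75 is below the base threshold of 1.0, but after the first score has exceeded its threshold for three consecutive steps, the relaxed Sth2 drops far enough for detection to fire.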
- when a predetermined condition is satisfied, the image processing device 1 may start the calculation of the first score S1 using the first model and the process of changing the second score threshold Sth2.
- in this case, after the start of the lesion detection process, the image processing device 1 does not initially calculate the first score S1 by the first score calculation unit 32; when the second score S2 becomes greater than a predetermined threshold (e.g., 0) that is smaller than the second score threshold Sth2, the image processing device 1 starts calculating the first score S1 by the first score calculation unit 32 and changes the second score threshold Sth2 in accordance with the consecutive over-threshold count M, as in the above-mentioned embodiment.
- if the image processing device 1 determines that the second score S2 has become equal to or smaller than the predetermined threshold, the image processing device 1 stops the calculation of the first score S1 by the first score calculation unit 32 again.
- the "predetermined condition” is not limited to the condition that the second score S2 is greater than the predetermined threshold, and may be any condition that determines that the probability of the presence of a lesion site has increased.
- examples of such conditions include a condition that the increase per unit time of the second score S2 (i.e., the derivative of the second score S2) is equal to or greater than a predetermined value.
- the image processing device 1 may calculate the first score S1 going back to a past processing time and change the second score threshold Sth2 based on the first score S1.
- the image processing device 1 may store, for example, feature data calculated by the feature extraction unit 31 at a past processing time in the memory 12 or the like, and the first score calculation unit 32 may calculate the first score S1 at the past processing time based on the feature data and change the second score threshold Sth2 at the past processing time based on the first score S1.
- the image processing device 1 compares the second score S2 with the second score threshold Sth2 at each past processing time to determine the presence or absence of a lesion.
- in this way, the image processing device 1 can limit the period for calculating the first score S1, thereby effectively reducing the calculation load.
- the image processing device 1 may process the video composed of the endoscopic images Ia generated during the endoscopic examination after the examination.
- in this case, when a video to be processed is specified based on user input via the input unit 14 at any time after the examination, the image processing device 1 repeatedly performs the process of the flowchart shown in FIG. 9 on the time-series endoscopic images Ia that constitute the specified video until it determines that the target video has ended.
- in the third embodiment, the image processing device 1 switches between the lesion detection process based on the first embodiment and the lesion detection process based on the second embodiment according to the degree of time-series fluctuation of the endoscopic image Ia.
- the lesion detection process based on the first embodiment will be referred to as the "first model-based lesion detection process” and the lesion detection process based on the second embodiment will be referred to as the "second model-based lesion detection process.”
- the hardware configuration of the image processing device 1 according to the third embodiment is the same as the hardware configuration of the image processing device 1 shown in FIG. 2, and the functional block configuration of the processor 11 of the image processing device 1 according to the third embodiment is the same as the functional block configuration shown in FIG. 3.
- the lesion detection unit 34 calculates a score (also called a "variation score") representing the degree of variation between an endoscopic image Ia (also called a “currently processed image”) at a time index t representing the current processing time and an endoscopic image Ia (also called a “past image”) acquired at the time immediately prior to that (i.e., time index "t-1").
- for example, the lesion detection unit 34 calculates, as the variation score, an arbitrary similarity index based on a comparison between the currently processed image and the past image.
- examples of such similarity indices include the correlation coefficient, SSIM (Structural SIMilarity), PSNR (Peak Signal-to-Noise Ratio), and the squared error between corresponding pixels.
- the lesion detection unit 34 may compare the feature amounts of the currently processed image with the feature amounts of the previous image and calculate the similarity between them as the variation score.
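As a minimal sketch of the simplest variation score mentioned above, the mean squared error between corresponding pixels of consecutive frames (SSIM and PSNR would typically come from an image processing library instead); the function name is illustrative:

```python
def variation_score(curr, prev):
    """Mean squared difference between two same-size grayscale frames,
    as one simple choice of inter-frame variation score.

    curr, prev: equal-length sequences of pixel intensities (e.g. 0-255).
    A larger value means a larger frame-to-frame variation.
    """
    if len(curr) != len(prev):
        raise ValueError("frames must have the same size")
    return sum((a - b) ** 2 for a, b in zip(curr, prev)) / len(curr)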
- when the fluctuation score is equal to or less than the fluctuation threshold, the lesion detection unit 34 performs the first model-based lesion detection process. That is, in this case, the lesion detection unit 34 determines the threshold number Mth based on the second score S2, and determines that a lesion site exists when the consecutive over-threshold count M based on the first score S1 is greater than the threshold number Mth.
- the fluctuation threshold is, for example, stored in advance in the memory 12, etc.
- when the fluctuation score is greater than the fluctuation threshold, the lesion detection unit 34 performs the second model-based lesion detection process.
- the lesion detection unit 34 determines the second score threshold Sth2 based on the first score S1, and determines that a lesion site exists when the second score S2 is greater than the second score threshold Sth2.
- the lesion detection unit 34 selects a selection model, which is a model to be used for lesion detection, from the first model and the second model based on the degree of fluctuation of the endoscopic image Ia.
- lesion detection based on the first model has the advantage that it can be performed even under conditions where the log-likelihood ratio based on the second model is unlikely to increase, i.e., when there is no temporal change in the endoscopic image Ia (when the fluctuation score is relatively low). Lesion detection based on the second model has the advantage that it is robust to instantaneous noise and can quickly detect lesion sites that are easy to identify.
- accordingly, when the fluctuation score is equal to or less than the fluctuation threshold, i.e., when lesion detection based on the first model is effective, the lesion detection unit 34 determines whether or not a lesion has been detected based on the first score S1 and the threshold-exceeding consecutive number M, and when the fluctuation score is greater than the fluctuation threshold, it determines whether or not a lesion has been detected based on the second score S2. This makes it possible to suitably improve the lesion detection accuracy.
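The first model-based decision rule above can be sketched as follows; the class name and all concrete threshold values (Sth1, Sth2, Mth) are hypothetical choices for illustration, not values from the patent:

```python
# Hypothetical sketch of the first model-based detection rule: a lesion is
# reported when the first score S1 stays above Sth1 for more than Mth
# consecutive frames, and Mth is relaxed while the second score S2 is high.
class FirstModelDetector:
    def __init__(self, sth1=0.8, mth_initial=5, mth_relaxed=2, sth2=0.7):
        self.sth1 = sth1                # first score threshold Sth1
        self.mth_initial = mth_initial  # default threshold number Mth
        self.mth_relaxed = mth_relaxed  # relaxed Mth when S2 suggests a lesion
        self.sth2 = sth2                # second score threshold Sth2
        self.consecutive = 0            # threshold-exceeding consecutive number M

    def update(self, s1, s2):
        """Process one frame's scores; returns True when a lesion is detected."""
        mth = self.mth_relaxed if s2 > self.sth2 else self.mth_initial
        self.consecutive = self.consecutive + 1 if s1 > self.sth1 else 0
        return self.consecutive > mth
```

With a low second score the detector needs more than five consecutive high first scores; a high second score relaxes the requirement to more than two, which is the condition relaxation the text describes.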
- FIG. 10 is an example of a flowchart executed by the image processing device 1 in the third embodiment.
- the endoscopic image acquisition unit 30 of the image processing device 1 acquires the endoscopic image Ia (step S41).
- the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscopic scope 3 via the interface 13.
- the display control unit 35 executes processing such as displaying the endoscopic image Ia acquired in step S41 on the display device 2.
- the lesion detection unit 34 calculates a fluctuation score based on the current processing image, which is the endoscopic image Ia obtained in step S41 at the current processing time, and the past image, which is the endoscopic image Ia obtained in step S41 at the immediately preceding processing time (step S42). Then, the lesion detection unit 34 determines whether the fluctuation score is greater than the fluctuation threshold (step S43). Then, if the fluctuation score is greater than the fluctuation threshold (step S43; Yes), the image processing device 1 executes the second model-based lesion detection process (step S44). In this case, the image processing device 1 executes the flowchart of FIG. 9 excluding the process of step S31 that overlaps with step S41.
- if it is determined in step S38 that the second score S2 is equal to or less than the second score threshold Sth2, the process may proceed to step S46.
- if the fluctuation score is equal to or less than the fluctuation threshold (step S43; No), the image processing device 1 executes the first model-based lesion detection process (step S45). In this case, the image processing device 1 executes the flowchart of FIG. 7 excluding the process of step S11, which overlaps with step S41. If it is determined in step S20 that the threshold-exceeding consecutive number M is equal to or less than the threshold number Mth, or if the processing of step S19 is completed, the processing may proceed to step S46.
- the image processing device 1 determines whether or not the endoscopic examination has ended (step S46). For example, the image processing device 1 determines that the endoscopic examination has ended when it detects a predetermined input to the input unit 14 or the operation unit 36. Then, if the image processing device 1 determines that the endoscopic examination has ended (step S46; Yes), it ends the processing of the flowchart. On the other hand, if the image processing device 1 determines that the endoscopic examination has not ended (step S46; No), it returns the processing to step S41.
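The branching of steps S42 to S45 reduces to a simple dispatch; the function name and the threshold value here are hypothetical:

```python
# Hypothetical sketch of the third embodiment's switching (steps S43-S45):
# a high fluctuation score selects the second model-based process, while an
# equal-or-lower score selects the first model-based process.
def select_process(fluctuation_score, fluctuation_threshold=0.5):
    if fluctuation_score > fluctuation_threshold:
        return "second_model_based"  # robust to noise, quick on clear lesions
    return "first_model_based"       # effective when the image barely changes
```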
- Fourth Embodiment: FIG. 11 is a block diagram of an image processing device 1X according to the fourth embodiment.
- the image processing device 1X includes an acquisition means 30X and a lesion detection means 34X.
- the image processing device 1X may be composed of a plurality of devices.
- the acquisition means 30X acquires an endoscopic image of the subject captured by an imaging unit provided in the endoscope.
- the acquisition means 30X may acquire, in real time, an endoscopic image generated by the imaging unit, or may acquire, at a predetermined timing, an endoscopic image generated in advance by the imaging unit and stored in a storage device.
- the acquisition means 30X may be, for example, the endoscopic image acquisition unit 30 in the first to third embodiments.
- the lesion detection means 34X detects lesions based on a selection model selected from a first model that performs inferences regarding lesions in a subject based on a predetermined number of endoscopic images and a second model that performs inferences regarding lesions in a subject based on a variable number of endoscopic images.
- the lesion detection means 34X changes the parameters used for lesion detection based on the selection model, according to a non-selected model, which is whichever of the first model and the second model is not the selection model.
- the "selection model" can be the "first model” in the first model-based lesion detection process in the first or third embodiment, and the "second model” in the second model-based lesion detection process in the second or third embodiment.
- the "parameters used for lesion detection based on the selection model” can be the “threshold number of times Mth" or the "first score threshold Sth1" in the first model-based lesion detection process in the first or third embodiment, and the “second score threshold Sth2" in the second model-based lesion detection process in the second or third embodiment.
- the selection of the "selected model” and the “non-selected model” here is not limited to being made autonomously based on the variation score as in the third embodiment, but may be predetermined by settings as in the first or second embodiment.
- the lesion detection means 34X may be, for example, the first score calculation unit 32, the second score calculation unit 33, and the lesion detection unit 34 in the first to third embodiments.
- FIG. 12 is an example of a flowchart showing the processing procedure in the fourth embodiment.
- the acquisition means 30X acquires endoscopic images of the subject captured by an imaging unit provided in the endoscope (step S51).
- the lesion detection means 34X detects a lesion based on a selected model selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of endoscopic images, and a second model that performs inference regarding a lesion in the subject based on a variable number of endoscopic images.
- the lesion detection means 34X changes the parameters used to detect a lesion based on the selected model based on a non-selected model, which is the first model that is not the selected model or the second model (step S52).
- the image processing device 1X can accurately detect a lesion area present in an endoscopic image.
- Non-transitory computer readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic storage media (e.g., floppy disks, magnetic tapes, hard disk drives), optical storage media (e.g., optical disks), CD-ROMs (Read-Only Memory), CD-Rs, CD-R/Ws, and semiconductor memories (e.g., mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, and RAMs (Random Access Memory)).
- Programs may also be supplied to computers by various types of transitory computer-readable media.
- Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
- Transitory computer-readable media can supply programs to computers via wired communication paths, such as electric wires and optical fibers, or via wireless communication paths.
- an image processing device in which the lesion detection means changes parameters used for detecting the lesion based on the selected model, based on a non-selected model that is the first model or the second model other than the selected model.
- the parameter defines a condition for determining that the lesion has been detected
- the lesion detection means changes the parameters so as to relax the conditions as the degree of certainty that the lesion exists, which is indicated by the score calculated by the non-selection model, increases.
- the first model is a deep learning model that includes a convolutional neural network in its architecture.
- the selected model is the first model
- the lesion detection means determines that the lesion has been detected when a consecutive number of times that a certainty of the presence of the lesion, indicated by a score calculated by the first model from the endoscopic images acquired in time series, becomes greater than a predetermined threshold value is greater than a predetermined number of times; the parameter is at least one of the predetermined number of times or the predetermined threshold value,
- the image processing device according to claim 1, wherein the lesion detection means changes at least one of the predetermined number of times or the predetermined threshold value based on the score calculated by the second model.
- the second model is a model based on SPRT.
- the selected model is the second model
- the lesion detection means determines that the lesion has been detected when a certainty that the lesion exists, which is indicated by a score calculated by the second model, is greater than a predetermined threshold; the parameter is the predetermined threshold,
- the image processing device according to claim 1, wherein the lesion detection means changes the predetermined threshold value based on the score calculated by the first model.
- the lesion detection means determines the selected model from the first model and the second model based on a degree of variation in the endoscopic image.
- the image processing device wherein the lesion detection means starts calculating a score using the non-selected model when it determines that a predetermined condition based on the score calculated by the selected model is satisfied.
- the image processing device further comprising an output control means for displaying or outputting audio information regarding the detection result of the lesion by the lesion detection means.
- the output control means outputs information regarding the lesion detection result and information regarding the selection model to assist an examiner in making a decision.
- An image processing method comprising: acquiring an endoscopic image of the subject captured by an imaging unit provided in the endoscope; detecting the lesion based on a selection model selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and changing a parameter used for detecting the lesion based on the selected model, based on a non-selected model which is the first model or the second model other than the selected model.
- A storage medium storing a program that causes a computer to execute a process of: acquiring an endoscopic image of the subject captured by an imaging unit provided in the endoscope; detecting the lesion based on a selection model selected from a first model that performs inference regarding a lesion in the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and changing parameters used for detecting the lesion based on the selected model, based on a non-selected model, which is the first model or the second model that is not the selected model.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Optics & Photonics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Quality & Reliability (AREA)
- Data Mining & Analysis (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Endoscopes (AREA)
Abstract
Description
An image processing device comprising:
an acquisition means for acquiring an endoscopic image of a subject captured by an imaging unit provided in an endoscope; and
a lesion detection means for detecting a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images,
wherein the lesion detection means changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
An image processing method in which a computer:
acquires an endoscopic image of a subject captured by an imaging unit provided in an endoscope;
detects a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and
changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
A storage medium storing a program that causes a computer to:
acquire an endoscopic image of a subject captured by an imaging unit provided in an endoscope;
detect a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and
change a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
(1-1) System Configuration
FIG. 1 shows a schematic configuration of an endoscopic examination system 100. The endoscopic examination system 100 detects a site of a subject suspected of being a lesion (lesion site) for an examiner, such as a physician who performs an examination or treatment using an endoscope, and presents the detection result. The endoscopic examination system 100 can thereby support the decision-making of the examiner, such as determining a treatment plan for the person being examined. As shown in FIG. 1, the endoscopic examination system 100 mainly includes an image processing device 1, a display device 2, and an endoscope scope 3 connected to the image processing device 1.
(a) Head and neck: pharyngeal cancer, malignant lymphoma, papilloma
(b) Esophagus: esophageal cancer, esophagitis, esophageal hiatal hernia, Barrett's esophagus, esophageal varices, esophageal achalasia, esophageal submucosal tumor, benign esophageal tumor
(c) Stomach: gastric cancer, gastritis, gastric ulcer, gastric polyp, gastric tumor
(d) Duodenum: duodenal cancer, duodenal ulcer, duodenitis, duodenal tumor, duodenal lymphoma
(e) Small intestine: small intestine cancer, small intestine neoplastic disease, small intestine inflammatory disease, small intestine vascular disease
(f) Large intestine: colorectal cancer, colorectal neoplastic disease, colorectal inflammatory disease, colorectal polyp, colorectal polyposis, Crohn's disease, colitis, intestinal tuberculosis, hemorrhoids
FIG. 2 shows the hardware configuration of the image processing device 1. The image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, and a sound output unit 16. These elements are connected via a data bus 19.
Next, an overview of the lesion site detection process (lesion detection process) performed by the image processing device 1 will be described. Roughly speaking, when performing lesion detection based on the first score S1 output by the first model, the image processing device 1 changes the parameter used for that lesion detection based on the second score S2 output by the second model. Specifically, the parameter defines a condition for determining, based on the first score S1, that a lesion has been detected, and the image processing device 1 changes the parameter so as to relax the condition as the degree of certainty that a lesion exists, indicated by the second score S2, increases. The image processing device 1 thereby performs accurate lesion detection that exploits the advantages of both the first model and the second model, and presents the detection result. In the first embodiment, the first model is an example of the "selection model," and the second model is an example of the "non-selected model."
Next, an example of calculating the second score S2 using the second model based on SPRT (Sequential Probability Ratio Test) will be described.
Here, "p" represents the probability of belonging to each class (i.e., a certainty between 0 and 1). The likelihood ratio output by a likelihood-ratio calculation model can be used to calculate the term on the right-hand side of Equation (1).
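The SPRT-style accumulation can be sketched minimally; the exact form of Equation (1) is not reproduced here, and treating the second score as a running sum of frame-wise log-likelihood ratios log(p/(1-p)) is a simplifying assumption made for this example:

```python
# Illustrative sketch (assumed simplification of the patent's Equation (1)):
# the second score accumulates frame-wise log-likelihood ratios, so evidence
# from a variable number of frames builds up until a threshold is crossed.
from math import log

def second_score(certainties):
    """certainties: per-frame probability p (0 < p < 1) that a lesion is present."""
    return sum(log(p / (1.0 - p)) for p in certainties)
```

With p = 0.5 a frame contributes nothing; frames with p > 0.5 push the score up and frames with p < 0.5 pull it down, which is why the score grows only when evidence of a lesion persists.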
Next, a specific method by which the lesion detection unit 34 determines the presence or absence of a lesion site will be described. At each processing time, the lesion detection unit 34 compares the first score S1 with a threshold for the first score S1 (also called the "first score threshold Sth1"), and the second score S2 with a threshold for the second score S2 (also called the "second score threshold Sth2"). When the first score S1 exceeds the first score threshold Sth1 consecutively more than a predetermined number of times (also called the "threshold number Mth"), the lesion detection unit 34 determines that a lesion site exists. On the other hand, when the second score S2 becomes greater than the second score threshold Sth2, the lesion detection unit 34 decreases the threshold number Mth. In this way, in a situation where the presence of a lesion site is suspected based on the second score S2 output by the second model, the lesion detection unit 34 relaxes the condition for determining, based on the first score S1, that a lesion site exists. This makes it possible to accurately detect a lesion site both in situations where the first model tends to detect lesion sites accurately and in situations where the second model tends to detect lesion sites accurately.
FIG. 7 is an example of a flowchart executed by the image processing device 1 in the first embodiment. The image processing device 1 repeatedly executes the processing of this flowchart until the end of the endoscopic examination. For example, the image processing device 1 determines that the endoscopic examination has ended when it detects a predetermined input to the input unit 14 or the operation unit 36.
Next, modifications of the first embodiment described above will be described. The following modifications may be combined arbitrarily.
In the description above, the lesion detection unit 34 switched the threshold number Mth from an initial value to a relaxed value when the second score S2 exceeded the second score threshold Sth2. However, the lesion detection unit 34 is not limited to this mode, and may decrease the threshold number Mth stepwise or continuously (i.e., relax the condition for determining that a lesion site exists) as the second score S2 increases.
Instead of, or in addition to, changing the threshold number Mth based on the second score S2, the lesion detection unit 34 may change the first score threshold Sth1 based on the second score S2. In this case, for example, the lesion detection unit 34 may decrease the first score threshold Sth1 stepwise or continuously as the second score S2 increases. In this mode as well, the lesion detection unit 34 can suitably relax the condition for lesion detection based on the first model in situations where lesion detection based on the second model is effective, and can perform lesion detection accurately.
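A stepless version of this modification could map the second score S2 to a relaxed first score threshold Sth1; the linear form and every constant below are hypothetical illustration choices:

```python
# Hypothetical continuous relaxation: Sth1 decreases linearly as S2 grows,
# but never below a floor, relaxing the first model's detection condition
# while the second model suspects a lesion.
def relaxed_sth1(s2, base=0.9, floor=0.6, gain=0.3):
    return max(floor, base - gain * max(0.0, s2))
```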
The image processing device 1 may start calculating the second score S2 and changing the threshold number Mth when it determines that a predetermined condition based on the first score S1 is satisfied.
The image processing device 1 may also process, after the examination, a video composed of endoscopic images Ia generated during the endoscopic examination.
(2-1) Overview
In the second embodiment, the image processing device 1 performs lesion detection with reference to the second score S2 based on the second model, while changing the second score threshold Sth2, against which the second score S2 is compared, based on the first score S1 from the first model. This enables accurate detection of lesion sites both in situations where the first model tends to detect lesion sites accurately and in situations where the second model tends to detect lesion sites accurately.
FIG. 8(A) is a graph showing the transition of the first score S1 from the processing time t0 at which acquisition of the endoscopic images Ia started in the second embodiment, and FIG. 8(B) is a graph showing the transition of the second score S2 from the processing time t0 in the second embodiment. The specific examples shown in FIGS. 8(A) and 8(B) illustrate the lesion detection process in a situation where the accuracy of lesion detection based on the first model is higher than that based on the second model.
FIG. 9 is an example of a flowchart executed by the image processing device 1 in the second embodiment. The image processing device 1 repeatedly executes the processing of this flowchart until the end of the endoscopic examination.
Next, modifications of the second embodiment described above will be described. The following modifications may be combined arbitrarily.
The image processing device 1 may start calculating the first score S1 with the first model and changing the second score threshold Sth2 when it determines that a predetermined condition based on the second score S2 is satisfied.
The image processing device 1 may also process, after the examination, a video composed of endoscopic images Ia generated during the endoscopic examination.
In the third embodiment, the image processing device 1 switches between the lesion detection process based on the first embodiment and the lesion detection process based on the second embodiment, according to the degree of variation of the endoscopic images Ia over time. Hereinafter, the lesion detection process based on the first embodiment is called the "first model-based lesion detection process," and the lesion detection process based on the second embodiment is called the "second model-based lesion detection process."
FIG. 11 is a block diagram of an image processing device 1X according to the fourth embodiment. The image processing device 1X includes an acquisition means 30X and a lesion detection means 34X. The image processing device 1X may be composed of a plurality of devices.
An image processing device comprising:
an acquisition means for acquiring an endoscopic image of a subject captured by an imaging unit provided in an endoscope; and
a lesion detection means for detecting a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images,
wherein the lesion detection means changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
[Supplementary Note 2]
The image processing device according to Supplementary Note 1, wherein the parameter defines a condition for determining that the lesion has been detected, and the lesion detection means changes the parameter so as to relax the condition as the degree of certainty that the lesion exists, indicated by a score calculated by the non-selected model, increases.
[Supplementary Note 3]
The image processing device according to Supplementary Note 1, wherein the first model is a deep learning model that includes a convolutional neural network in its architecture.
[Supplementary Note 4]
The image processing device according to Supplementary Note 1, wherein the selection model is the first model; the lesion detection means determines that the lesion has been detected when the number of consecutive times that the certainty of the presence of the lesion, indicated by a score calculated by the first model from the endoscopic images acquired in time series, exceeds a predetermined threshold is greater than a predetermined number of times; the parameter is at least one of the predetermined number of times and the predetermined threshold; and the lesion detection means changes at least one of the predetermined number of times and the predetermined threshold based on a score calculated by the second model.
[Supplementary Note 5]
The image processing device according to Supplementary Note 1, wherein the second model is a model based on SPRT.
[Supplementary Note 6]
The image processing device according to Supplementary Note 1, wherein the selection model is the second model; the lesion detection means determines that the lesion has been detected when the certainty that the lesion exists, indicated by a score calculated by the second model, is greater than a predetermined threshold; the parameter is the predetermined threshold; and the lesion detection means changes the predetermined threshold based on a score calculated by the first model.
[Supplementary Note 7]
The image processing device according to Supplementary Note 1, wherein the lesion detection means determines the selection model from the first model and the second model based on a degree of variation of the endoscopic images.
[Supplementary Note 8]
The image processing device according to Supplementary Note 1, wherein the lesion detection means starts calculating a score using the non-selected model when it determines that a predetermined condition based on a score calculated by the selection model is satisfied.
[Supplementary Note 9]
The image processing device according to Supplementary Note 1, further comprising an output control means for displaying, or outputting by audio, information regarding the lesion detection result produced by the lesion detection means.
[Supplementary Note 10]
The image processing device according to Supplementary Note 9, wherein the output control means outputs information regarding the lesion detection result and information regarding the selection model in order to support decision-making by an examiner.
[Supplementary Note 11]
An image processing method in which a computer:
acquires an endoscopic image of a subject captured by an imaging unit provided in an endoscope;
detects a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and
changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
[Supplementary Note 12]
A storage medium storing a program that causes a computer to execute a process of: acquiring an endoscopic image of a subject captured by an imaging unit provided in an endoscope; detecting a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and changing a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
2 Display device
3 Endoscope scope
11 Processor
12 Memory
13 Interface
14 Input unit
15 Light source unit
16 Sound output unit
100 Endoscopic examination system
Claims (12)
- An image processing device comprising:
an acquisition means for acquiring an endoscopic image of a subject captured by an imaging unit provided in an endoscope; and
a lesion detection means for detecting a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images,
wherein the lesion detection means changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
- The image processing device according to claim 1, wherein the parameter defines a condition for determining that the lesion has been detected, and the lesion detection means changes the parameter so as to relax the condition as the degree of certainty that the lesion exists, indicated by a score calculated by the non-selected model, increases.
- The image processing device according to claim 1, wherein the first model is a deep learning model that includes a convolutional neural network in its architecture.
- The image processing device according to claim 1, wherein the selection model is the first model; the lesion detection means determines that the lesion has been detected when the number of consecutive times that the certainty of the presence of the lesion, indicated by a score calculated by the first model from the endoscopic images acquired in time series, exceeds a predetermined threshold is greater than a predetermined number of times; the parameter is at least one of the predetermined number of times and the predetermined threshold; and the lesion detection means changes at least one of the predetermined number of times and the predetermined threshold based on a score calculated by the second model.
- The image processing device according to claim 1, wherein the second model is a model based on SPRT.
- The image processing device according to claim 1, wherein the selection model is the second model; the lesion detection means determines that the lesion has been detected when the certainty that the lesion exists, indicated by a score calculated by the second model, is greater than a predetermined threshold; the parameter is the predetermined threshold; and the lesion detection means changes the predetermined threshold based on a score calculated by the first model.
- The image processing device according to claim 1, wherein the lesion detection means determines the selection model from the first model and the second model based on a degree of variation of the endoscopic images.
- The image processing device according to claim 1, wherein the lesion detection means starts calculating a score using the non-selected model when it determines that a predetermined condition based on a score calculated by the selection model is satisfied.
- The image processing device according to claim 1, further comprising an output control means for displaying, or outputting by audio, information regarding the lesion detection result produced by the lesion detection means.
- The image processing device according to claim 9, wherein the output control means outputs information regarding the lesion detection result and information regarding the selection model in order to support decision-making by an examiner.
- An image processing method in which a computer:
acquires an endoscopic image of a subject captured by an imaging unit provided in an endoscope;
detects a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and
changes a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
- A storage medium storing a program that causes a computer to execute a process of: acquiring an endoscopic image of a subject captured by an imaging unit provided in an endoscope; detecting a lesion based on a selection model selected from a first model that performs inference regarding a lesion of the subject based on a predetermined number of the endoscopic images and a second model that performs inference regarding the lesion based on a variable number of the endoscopic images; and changing a parameter used for detecting the lesion based on the selection model, based on a non-selected model that is the first model or the second model which is not the selection model.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/561,130 US20250078259A1 (en) | 2022-10-06 | 2023-08-18 | Image processing device, image processing method, and storage medium |
| JP2024555650A JPWO2024075411A5 (ja) | 2023-08-18 | Image processing device, image processing method, and program | |
| US18/544,857 US20240127443A1 (en) | 2022-10-06 | 2023-12-19 | Image processing device, image processing method, and storage medium |
| US18/544,886 US20240135539A1 (en) | 2022-10-05 | 2023-12-19 | Image processing device, image processing method, and storage medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JPPCT/JP2022/037418 | 2022-10-06 | ||
| PCT/JP2022/037418 WO2024075240A1 (ja) | 2022-10-06 | 2022-10-06 | Image processing device, image processing method, and storage medium |
Related Child Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/561,130 A-371-Of-International US20250078259A1 (en) | 2022-10-05 | 2023-08-18 | Image processing device, image processing method, and storage medium |
| US18/544,886 Continuation US20240135539A1 (en) | 2022-10-05 | 2023-12-19 | Image processing device, image processing method, and storage medium |
| US18/544,857 Continuation US20240127443A1 (en) | 2022-10-06 | 2023-12-19 | Image processing device, image processing method, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024075411A1 true WO2024075411A1 (ja) | 2024-04-11 |
Family
ID=90607884
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/037418 Ceased WO2024075240A1 (ja) | Image processing device, image processing method, and storage medium |
| PCT/JP2023/029842 Ceased WO2024075411A1 (ja) | Image processing device, image processing method, and storage medium |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/037418 Ceased WO2024075240A1 (ja) | Image processing device, image processing method, and storage medium |
Country Status (2)
| Country | Link |
|---|---|
| US (3) | US20250078259A1 (ja) |
| WO (2) | WO2024075240A1 (ja) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018225448A1 (ja) * | 2017-06-09 | 2018-12-13 | Tomohiro Tada | Disease diagnosis support method based on endoscopic images of digestive organs, diagnosis support system, diagnosis support program, and computer-readable recording medium storing the diagnosis support program |
| WO2020003607A1 (ja) * | 2018-06-25 | 2020-01-02 | Olympus Corporation | Information processing device, model learning method, data recognition method, and trained model |
| WO2020071086A1 (ja) * | 2018-10-04 | 2020-04-09 | NEC Corporation | Information processing device, control method, and program |
| WO2020194497A1 (ja) * | 2019-03-26 | 2020-10-01 | NEC Corporation | Information processing device, personal identification device, information processing method, and storage medium |
| JP2020156903A (ja) * | 2019-03-27 | 2020-10-01 | Hoya Corporation | Endoscope processor, information processing device, program, information processing method, and learning model generation method |
-
2022
- 2022-10-06 WO PCT/JP2022/037418 patent/WO2024075240A1/ja not_active Ceased
-
2023
- 2023-08-18 US US18/561,130 patent/US20250078259A1/en active Pending
- 2023-08-18 WO PCT/JP2023/029842 patent/WO2024075411A1/ja not_active Ceased
- 2023-12-19 US US18/544,886 patent/US20240135539A1/en active Pending
- 2023-12-19 US US18/544,857 patent/US20240127443A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| US20240135539A1 (en) | 2024-04-25 |
| WO2024075240A1 (ja) | 2024-04-11 |
| US20250078259A1 (en) | 2025-03-06 |
| US20240127443A1 (en) | 2024-04-18 |
| JPWO2024075411A1 (ja) | 2024-04-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7647864B2 (ja) | Image processing device, image processing method, and program | |
| WO2007119297A1 (ja) | Medical image processing device and medical image processing method | |
| JP7485193B2 (ja) | Image processing device, image processing method, and program | |
| WO2023042273A1 (ja) | Image processing device, image processing method, and storage medium | |
| WO2024075411A1 (ja) | Image processing device, image processing method, and storage medium | |
| WO2024075410A1 (ja) | Image processing device, image processing method, and storage medium | |
| US20250209776A1 (en) | Image processing device, image processing method, and storage medium | |
| WO2022224446A1 (ja) | Image processing device, image processing method, and storage medium | |
| WO2024013848A1 (ja) | Image processing device, image processing method, and storage medium | |
| US20250157028A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250078254A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250104222A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250232434A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250104241A1 (en) | Image processing device, image processing method, and storage medium | |
| WO2024180593A1 (ja) | Image processing device, image processing method, and storage medium | |
| WO2023187886A1 (ja) | Image processing device, image processing method, and storage medium | |
| US20240212142A1 (en) | Image processing device, image processing method, and storage medium | |
| US20250228427A1 (en) | Image processing device, image processing method, and storage medium | |
| WO2025083856A1 (ja) | Image processing device, image processing method, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 18561130 Country of ref document: US |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23874543 Country of ref document: EP Kind code of ref document: A1 |
|
| WWP | Wipo information: published in national office |
Ref document number: 18561130 Country of ref document: US |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024555650 Country of ref document: JP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23874543 Country of ref document: EP Kind code of ref document: A1 |