WO2024018581A1 - Image Processing Device, Image Processing Method, and Storage Medium - Google Patents
Image processing device, image processing method, and storage medium
- Publication number
- WO2024018581A1 (PCT/JP2022/028321)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- tumor
- distance
- image processing
- processing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
Definitions
- the present disclosure relates to the technical field of an image processing device, an image processing method, and a storage medium that process images acquired in endoscopy.
- Patent Document 1 discloses a method for supporting diagnosis of diseases using endoscopic images of the digestive organs.
- CAD: Computer Aided Detection/Diagnosis
- one of the objects of the present disclosure is to provide an image processing device, an image processing method, and a storage medium that can present information about tumors during endoscopy.
- One aspect of the image processing device is an image processing device having: invasion distance acquisition means for acquiring an invasion distance of a tumor site of the subject in an endoscopic image, based on the endoscopic image taken of the subject by a photographing unit provided in the endoscope; and output control means for outputting an image or sound based on the invasion distance to an output device.
- One aspect of the image processing method is an image processing method in which a computer acquires the invasion distance of the tumor site of the subject in an endoscopic image, based on the endoscopic image taken of the subject by an imaging unit provided in the endoscope, and outputs an image or sound based on the invasion distance to an output device.
- One aspect of the storage medium is a storage medium storing a program that causes a computer to execute a process of acquiring the invasion distance of the tumor site of the subject in an endoscopic image, based on the endoscopic image taken of the subject by an imaging unit provided in the endoscope, and outputting an image or sound based on the invasion distance to an output device.
- information regarding tumors can be presented during endoscopy.
- FIG. 4 is a functional block diagram of an image processing device regarding display processing based on the invasion distance of a tumor site.
- FIG. 5(A) shows an overview of the invasion distance estimation process according to the first example.
- FIG. 5(B) is a diagram showing an outline of a method for estimating the invasion distance along the cross-section designation line.
- FIG. 6 shows a first display example of a display screen displayed by the display device during an endoscopy.
- FIG. 7 shows a second display example of a display screen displayed by the display device during an endoscopy.
- an example of a flowchart illustrating an overview of the processing performed by the image processing device during an endoscopy in the first embodiment is shown.
- a schematic block diagram of the endoscopy system in a modification is shown.
- a block diagram of an image processing device in a second embodiment is shown.
- an example of a flowchart executed by the image processing device in the second embodiment is shown.
- FIG. 1 shows a schematic configuration of an endoscopy system 100.
- the endoscopy system 100 presents information regarding a part of a subject suspected of having a tumor (also referred to as a "tumor site") to an examiner, such as a doctor, who performs an examination or treatment using an endoscope, and mainly includes an image processing device 1, a display device 2, and an endoscope 3 connected to the image processing device 1.
- the image processing device 1 acquires images photographed by the endoscope 3 (also referred to as "endoscopic images Ia") from the endoscope 3 in chronological order, and displays a screen based on the endoscopic images Ia on the display device 2.
- the endoscopic image Ia is an image captured at a predetermined frame period during at least one of the insertion and withdrawal of the endoscope 3 with respect to the subject.
- when the image processing apparatus 1 detects an endoscopic image Ia that includes a tumor site (also referred to as a "tumor-containing image"), it estimates the invasion distance of the tumor site of the subject shown in the tumor-containing image.
- images based on the invasion distance include a map representing the invasion distance, a cross-sectional view at a cutting plane specified by the user, and a 3D model representing the three-dimensional shape of the tumor using CG (Computer Graphics).
- the display device 2 is a display or the like that performs a predetermined display based on a display signal supplied from the image processing device 1.
- the endoscope 3 mainly includes an operating section 36 for the examiner to perform predetermined inputs, a flexible shaft 37 that is inserted into the organ to be imaged of the subject, a distal end portion 38 containing a photographing section such as an ultra-compact imaging element, and a connecting section 39 for connecting to the image processing device 1.
- the operation unit 36 includes a button (also called a "still image save button") for instructing that the endoscopic image displayed on the display device 2 be captured and saved as a still image.
- the configuration of the endoscopy system 100 shown in FIG. 1 is an example, and various changes may be made.
- the image processing device 1 may be configured integrally with the display device 2.
- the image processing device 1 may be composed of a plurality of devices.
- the subject of endoscopy in the present disclosure is not limited to the large intestine, but may be any organ that can be subjected to endoscopy, such as the esophagus, stomach, and pancreas.
- examples of the target endoscopes in the present disclosure include a pharyngoscope, a bronchoscope, an upper gastrointestinal endoscope, a duodenoscope, a small intestine endoscope, a colonoscope, a capsule endoscope, a thoracoscope, a laparoscope, a cystoscope, a cholangioscope, an arthroscope, a spinal endoscope, an angioscope, and an epidural space endoscope.
- FIG. 2 shows the hardware configuration of the image processing device 1.
- the image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input section 14, a light source section 15, and a sound output section 16. Each of these elements is connected via a data bus 19.
- the processor 11 executes a predetermined process by executing a program stored in the memory 12.
- the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit).
- Processor 11 may be composed of multiple processors.
- Processor 11 is an example of a computer.
- the memory 12 is composed of various types of memory, such as RAM (Random Access Memory) and ROM (Read Only Memory) used as working memory, and non-volatile memory that stores information necessary for the processing of the image processing device 1.
- the memory 12 may include an external storage device such as a hard disk connected to or built in the image processing device 1, or may include a removable storage medium such as a flash memory.
- the memory 12 stores programs for the image processing device 1 to execute each process in this embodiment.
- the memory 12 also stores tumor detection model information D1 regarding a tumor detection model, which is a model for detecting an endoscopic image Ia that is a tumor-containing image from an input endoscopic image Ia, and invasion distance estimation model information D2 regarding an invasion distance estimation model, which is a model for estimating the invasion distance of a tumor site. The tumor detection model information D1 and the invasion distance estimation model information D2 will be described later.
- the interface 13 performs an interface operation between the image processing device 1 and an external device.
- the interface 13 supplies display information “Ib” generated by the processor 11 to the display device 2.
- the interface 13 supplies light etc. generated by the light source section 15 to the endoscope 3.
- the interface 13 supplies the processor 11 with an electrical signal indicating the endoscopic image Ia supplied from the endoscope 3.
- the interface 13 may be a communication interface, such as a network adapter, for communicating with an external device by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), or the like.
- the input unit 14 generates input signals based on operations by the examiner.
- the input unit 14 is, for example, a button, a touch panel stacked on the display device 2, a remote controller, a voice input device, or the like.
- the light source section 15 generates light to be supplied to the distal end section 38 of the endoscope 3. Further, the light source section 15 may also incorporate a pump or the like for sending out water and air to be supplied to the endoscope 3.
- the sound output section 16 outputs sound under the control of the processor 11.
- the tumor detection model information D1 and the invasion distance estimation model information D2 stored in the memory 12 will be explained in detail.
- the tumor detection model information D1 is information related to a tumor detection model that, when an endoscopic image Ia is input, outputs information regarding whether a tumor site is included in the input endoscopic image Ia.
- the tumor detection model information D1 includes parameters necessary to configure a tumor detection model.
- the tumor detection model is, for example, a classification model that, when an endoscopic image Ia is input, outputs a classification result regarding the presence or absence of a tumor site in the input endoscopic image Ia.
- the tumor detection model may be any machine learning model (including a statistical model, the same applies hereinafter) such as a neural network or a support vector machine.
- for example, when the tumor detection model is configured by a neural network, the tumor detection model information D1 includes various parameters such as the layer structure, the neuron structure of each layer, the number and size of filters in each layer, and the weight of each element of each filter.
- the invasion distance estimation model information D2 is information related to an invasion distance estimation model that estimates the invasion distance of a tumor site in an image when an image of a part of the subject including the tumor site is input, and contains the parameters necessary to configure the invasion distance estimation model.
- the invasion distance estimation model is a model that has learned the relationship between the image input to the invasion distance estimation model and the invasion distance of the tumor site of the subject represented in the image.
- the invasion distance estimation model may be, for example, any machine learning model (including a statistical model; the same applies hereinafter) such as a neural network or a support vector machine.
- for example, when the invasion distance estimation model is configured by a neural network, the invasion distance estimation model information D2 includes various parameters such as the layer structure, the neuron structure of each layer, the number and size of filters in each layer, and the weight of each element of each filter.
- the image input to the invasion distance estimation model may be a partial image of the tumor-containing image that is regularly (for example, in a grid pattern) cut out from the tumor-containing image, or it may be the tumor-containing image itself.
- when a partial image is input, the invasion distance estimation model outputs a numerical value indicating the estimation result of the invasion distance at the center position of the input partial image.
- when the tumor-containing image itself is input, the invasion distance estimation model outputs an image showing the estimation result of the invasion distance at each pixel (or in units of blocks of multiple pixels or sub-pixels).
- the invasion distance estimation model is a model that further outputs estimation results regarding the depth of each layer constituting the wall layer of the subject represented in the image input to the invasion distance estimation model.
- the invasion distance estimation model estimates the depth of each layer: mucosal layer, muscularis mucosae, submucosa, muscularis propria, subserosa, and serosa;
- the invasion distance estimation model estimates the depth of each layer: mucosal layer, submucosa, muscularis propria, and adventitia.
- the model for estimating the depth of each layer constituting the wall layer may be a separate model from the invasion distance estimation model.
- since the tumor detection model and the invasion distance estimation model are learning models, they are trained in advance based on pairs of an input image conforming to the input format of each model and correct answer data indicating the correct output when that input image is input to the model. The parameters of each model obtained through training are stored in the memory 12 as the tumor detection model information D1 and the invasion distance estimation model information D2, respectively.
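- the pair-based training described above can be sketched as follows; this is a minimal illustrative Python example that substitutes a linear model and plain gradient descent for the disclosed neural-network models, with invented ground-truth parameters standing in for the correct answer data:

```python
import numpy as np

# Supervised training sketch: pairs of model inputs and correct-answer
# invasion distances. A linear model stands in for the neural network.
rng = np.random.default_rng(0)
X = rng.random((200, 4))                  # e.g. features of partial images (assumption)
w_true = np.array([0.5, 1.0, -0.3, 2.0])  # hypothetical ground-truth relation
y = X @ w_true                            # correct-answer invasion distances

w = np.zeros(4)                           # parameters to be stored as model info D2
for _ in range(2000):                     # minimise mean squared error
    grad = X.T @ (X @ w - y) / len(X)
    w -= 0.5 * grad
```

after training, `w` approximates `w_true`; in the actual device the learned parameters would instead be the network weights stored in the memory 12.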
- when the image processing device 1 detects an endoscopic image Ia that is a tumor-containing image, it estimates the invasion distance at each position of the subject represented in the tumor-containing image, and displays an image based on the estimated invasion distance on the display device 2.
- the image processing device 1 can present information regarding the invasion distance of the tumor site to the examiner without requiring a detailed examination such as a biopsy, and can therefore immediately present information needed to judge the necessity of surgery during an endoscopy.
- FIGS. 3(A) to 3(D) are diagrams schematically showing the flow of display processing based on the invasion distance of the tumor site.
- the image processing device 1 acquires time-series endoscopic images Ia from the endoscope 3, as shown in FIG. 3(A). Then, as shown in FIG. 3(B), among the acquired endoscopic images Ia, the image processing device 1 recognizes the endoscopic image Ia that corresponds to a tumor-containing image, either by automatic detection using the tumor detection model or by the user's designation using the still image save button of the operation section 36, and displays the recognized tumor-containing image on the display device 2.
- the image processing device 1 displays the tumor-containing image on the display device 2 and, as shown in FIG. 3(C), receives the designation of the cross-section designation line Lc through the input unit 14 or the like.
- the cross-section designation line Lc may be specified by, for example, input using a mouse or a touch panel.
- the image processing device 1 displays on the display device 2 an image based on the invasion distance at each position of the tumor-containing image estimated using the invasion distance estimation model.
- the image processing device 1 generates and displays a map of the invasion distance of the subject corresponding to the endoscopic image Ia (also referred to as an "invasion distance map"), a cross-sectional view of the subject whose cutting plane is the cross-section designation line Lc (also referred to as a "tumor cross-sectional view"), or a three-dimensional model representing the three-dimensional shape of the tumor site estimated based on the estimated invasion distance (also referred to as a "tumor 3D model").
- the invasion distance map shown in FIG. 3(D) is a heat map in which the darker the color, the longer the invasion distance of the tumor site.
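- as an illustration of such a heat map, the following Python sketch darkens each pixel in proportion to the estimated invasion distance; the specific colour scheme is an assumption, not taken from the disclosure:

```python
import numpy as np

def distance_heatmap(dist_map):
    """Colour an invasion distance map so deeper invasion is darker."""
    d = np.clip(dist_map / max(dist_map.max(), 1e-9), 0.0, 1.0)
    # lower intensity (darker) where the invasion distance is longer;
    # the red channel is attenuated less to give a reddish tint (assumption)
    rgb = np.stack([1.0 - 0.8 * d, 1.0 - d, 1.0 - d], axis=-1)
    return (rgb * 255).astype(np.uint8)

heat = distance_heatmap(np.array([[0.0, 2.0], [1.0, 4.0]]))
```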
- the tumor 3D model may be displayed using, for example, a graphic representation (for example, a wire-frame display) used in CAD (Computer Aided Design).
- the cross-section designation line Lc may be designated after the invasion distance map is displayed. A specific example of this will be described later.
- FIG. 4 is a functional block diagram of the image processing device 1 regarding display processing based on the invasion distance of a tumor site.
- the processor 11 of the image processing device 1 functionally includes an endoscopic image acquisition section 30, a tumor determination section 31, an invasion distance estimation section 32, and a display control section 33. Note that in FIG. 4, blocks where data is exchanged are connected by solid lines, but the combination of blocks where data is exchanged is not limited to this. The same applies to other functional block diagrams to be described later.
- the endoscopic image acquisition unit 30 acquires endoscopic images Ia taken by the endoscope 3 via the interface 13 at predetermined intervals. Then, the endoscopic image acquisition unit 30 supplies the acquired endoscopic image Ia to the tumor determination unit 31 and the display control unit 33, respectively.
- the tumor determination unit 31 determines whether the endoscopic image Ia supplied from the endoscopic image acquisition unit 30 is a tumor-containing image. In this case, the tumor determination unit 31 detects an endoscopic image Ia that is a tumor-containing image based on, for example, at least one of a user input (i.e., an external input) and the analysis result of the endoscopic image Ia. When the tumor determination unit 31 detects an endoscopic image Ia that is a tumor-containing image, it supplies the detected tumor-containing image to the invasion distance estimation unit 32.
- when the tumor determination unit 31 detects, based on the signal supplied from the operation unit 36, that the still image save button has been selected, it detects the endoscopic image Ia displayed on the display device 2 at the time of selection as a tumor-containing image.
- the tumor determination unit 31 may detect the latest endoscopic image Ia supplied from the endoscopic image acquisition unit 30 at the time when the still image save button is selected as a tumor-containing image.
- the tumor determination unit 31 inputs the endoscopic image Ia supplied from the endoscopic image acquisition unit 30 into the tumor detection model configured with reference to the tumor detection model information D1, and determines whether the input endoscopic image Ia is a tumor-containing image based on the information output by the tumor detection model. For example, the tumor detection model outputs a classification result regarding the presence or absence of a tumor site in the input endoscopic image Ia, and the tumor determination unit 31 determines based on that classification result whether the input endoscopic image Ia is a tumor-containing image.
- the invasion distance estimation unit 32 estimates the invasion distance of the tumor site of the subject indicated by the tumor-containing image supplied from the tumor determination unit 31, based on the invasion distance estimation model configured by referring to the invasion distance estimation model information D2. Then, the estimation result is supplied to the display control section 33.
- in the first example, the invasion distance estimation unit 32 generates partial images by dividing the tumor-containing image regularly (for example, in a grid pattern), sequentially inputs each partial image into the invasion distance estimation model, and outputs to the display control unit 33 the invasion distance for each partial image sequentially output by the model.
- in the second example, the invasion distance estimating unit 32 inputs the tumor-containing image into the invasion distance estimation model, and outputs to the display control unit 33 an image indicating the invasion distance for each pixel of the tumor-containing image, which is output by the invasion distance estimation model.
- FIG. 5(A) shows an overview of the invasion distance estimation process based on the first example described above.
- the invasion distance estimating unit 32 generates a total of 42 partial images for one tumor-containing image by dividing the tumor-containing image into a grid of seven columns and six rows, and obtains the invasion distance at the center position of each partial image by inputting each partial image into the invasion distance estimation model. Thereby, the invasion distance estimating unit 32 acquires the distribution (map) of invasion distances on the tumor-containing image, which is necessary for generating the invasion distance map.
- as a modification, the invasion distance estimating unit 32 may generate partial images that overlap each other so that the distance between the center positions of adjacent partial images is shorter than the side length of the partial images. This makes it possible to obtain a more detailed distribution of the invasion distance on the tumor-containing image.
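- the grid division and the overlapping modification can be sketched as follows; `split_into_patches` is a hypothetical helper, not part of the disclosure:

```python
import numpy as np

def split_into_patches(image, patch, stride):
    """Cut square partial images out of a tumor-containing image with a
    sliding window; stride < patch yields the overlapping modification."""
    h, w = image.shape[:2]
    patches, centers = [], []
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            patches.append(image[y:y + patch, x:x + patch])
            centers.append((y + patch // 2, x + patch // 2))  # where the model's estimate applies
    return np.stack(patches), centers

# a 60x70 image with 10-pixel patches and stride 10 gives the 7x6 grid (42 patches)
img = np.zeros((60, 70))
grid_patches, grid_centers = split_into_patches(img, patch=10, stride=10)
# halving the stride makes adjacent patches overlap for a denser distance map
dense_patches, _ = split_into_patches(img, patch=10, stride=5)
```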
- FIG. 5(B) is a diagram showing an outline of a method for estimating the invasion distance along the cross-section designation line Lc.
- the invasion distance estimating unit 32 sets points C1 to C5 at equal intervals on the cross-section designation line Lc, and sets partial images Ip1 to Ip5, which are square areas centered on the points C1 to C5, respectively.
- the invasion distance estimating unit 32 sequentially inputs the partial images Ip1 to Ip5 to the invasion distance estimation model, and obtains the invasion distances sequentially output by the model as the invasion distances at the points C1 to C5.
- the invasion distance estimating unit 32 or the display control unit 33 may further interpolate the invasion distances output by the invasion distance estimation model using an arbitrary interpolation process, and calculate a function or the like representing the invasion distance at an arbitrary point on the cross-section designation line Lc. Thereby, the invasion distance estimating unit 32 or the display control unit 33 can accurately identify the shape of the tumor site when the cross-section designation line Lc is used as the cutting plane, and display a tumor cross-sectional view showing a smooth shape of the tumor site.
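- the interpolation along the cross-section designation line Lc can be sketched as follows; `invasion_profile` and the linear stand-in for the invasion distance estimation model are hypothetical:

```python
import numpy as np

def invasion_profile(p0, p1, n_points, estimate_at, upsample=10):
    """Estimate invasion distances at n_points equally spaced points on the
    line from p0 to p1, then linearly interpolate between them to obtain a
    smooth profile for a tumor cross-sectional view. `estimate_at(x, y)`
    stands in for the invasion distance estimation model."""
    t = np.linspace(0.0, 1.0, n_points)
    pts = (1.0 - t)[:, None] * np.asarray(p0, float) + t[:, None] * np.asarray(p1, float)
    coarse = np.array([estimate_at(x, y) for x, y in pts])  # distances at C1..Cn
    t_fine = np.linspace(0.0, 1.0, upsample * n_points)
    return np.interp(t_fine, t, coarse)                     # smooth interpolated profile

# hypothetical model whose estimate grows linearly along the line
profile = invasion_profile((0.0, 0.0), (100.0, 0.0), 5, lambda x, y: x / 100.0)
```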
- similarly, when generating an invasion distance map, the invasion distance estimating unit 32 or the display control unit 33 may generate an invasion distance map obtained by interpolating the invasion distances output by the invasion distance estimation model in the vertical and horizontal directions using an arbitrary interpolation process.
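- the vertical and horizontal interpolation of the per-patch estimates into a dense invasion distance map can be sketched with separable linear interpolation, one possible choice of the arbitrary interpolation process (function name hypothetical):

```python
import numpy as np

def upsample_map(coarse, out_h, out_w):
    """Interpolate a coarse per-patch invasion-distance grid horizontally
    and then vertically (separable linear interpolation) into a dense map."""
    ch, cw = coarse.shape
    xs = np.linspace(0, cw - 1, out_w)
    ys = np.linspace(0, ch - 1, out_h)
    rows = np.array([np.interp(xs, np.arange(cw), r) for r in coarse])  # widen each row
    return np.array([np.interp(ys, np.arange(ch), rows[:, j])
                     for j in range(out_w)]).T                          # then stretch columns

dense = upsample_map(np.array([[0.0, 1.0], [1.0, 2.0]]), 3, 3)
```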
- the display control section 33 will be explained with reference to FIG. 4 again.
- the display control unit 33 generates display information Ib based on the latest endoscopic image Ia supplied from the endoscopic image acquisition unit 30 and the invasion distance estimation result supplied from the invasion distance estimation unit 32. The display control unit 33 then supplies the generated display information Ib to the display device 2, thereby causing the display device 2 to display an image based on the latest endoscopic image Ia and the invasion distance. Further, when the tumor determination unit 31 detects a tumor-containing image, the display control unit 33 may display the latest tumor-containing image on the display device 2 in addition to, or in place of, the latest endoscopic image Ia.
- it is preferable that the display control unit 33 receive a user input specifying a cross-section designation line Lc (see FIG. 3(C)) on the tumor-containing image or the latest endoscopic image Ia, generate a tumor cross-sectional view whose cutting plane is the cross-section designation line Lc specified by the user input, and display it on the display device 2. In this case, when displaying at least the invasion distance map and the tumor cross-sectional view, the display control unit 33 may accept the designation of the cross-section designation line Lc after displaying the invasion distance map.
- the display control unit 33 may receive a user input specifying a cross-section designation line Lc on the invasion distance map, and generate a tumor cross-sectional view with the cross-section designation line Lc designated by the user input as the cutting plane.
- the display control unit 33 may control the sound output unit 16 to output a warning sound or voice guidance notifying the user that a tumor site has been detected. Further, the display control unit 33 may control the sound output of the sound output unit 16 according to the estimation result of the invasion distance. For example, when the tumor determining unit 31 detects a tumor-containing image, the display control unit 33 may output a sound (with a pitch or melody corresponding to the value) according to the invasion distance at the center of the tumor-containing image.
- the display control unit 33 may refer to this information and output a sound according to the invasion distance in the corresponding area. In this way, the display control unit 33 may output a sound corresponding to the operation of the endoscope 3 by the examiner.
- each component of the endoscopic image acquisition unit 30, tumor determination unit 31, invasion distance estimation unit 32, and display control unit 33 can be realized by, for example, the processor 11 executing a program. Further, each component may be realized by recording necessary programs in an arbitrary non-volatile storage medium and installing them as necessary. Note that at least a part of each of these components is not limited to being realized by software based on a program, but may be realized by a combination of hardware, firmware, and software. Furthermore, at least a portion of each of these components may be realized using a user-programmable integrated circuit, such as a field-programmable gate array (FPGA) or a microcontroller. In this case, this integrated circuit may be used to implement a program made up of the above-mentioned components.
- each component may also be configured by an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or a quantum processor (quantum computer control chip).
- FIG. 6 shows a first display example of the display screen displayed by the display device 2 during an endoscopy.
- as shown in FIG. 6, the display control unit 33 of the image processing device 1 transmits the display information Ib, generated based on the information supplied from the other processing units 30 to 32 in the processor 11, to the display device 2, thereby causing the display device 2 to display the display screen.
- in the first display example, the display control unit 33 of the image processing device 1 accepts the designation of the cross-section designation line Lc and displays, on the display screen, images and the like based on the estimation result of the invasion distance for the detected tumor-containing image. Specifically, the display control unit 33 displays an endoscopic image 70, an invasion distance map 71, and a tumor cross-sectional view 72 on the display screen.
- the endoscopic image 70 represents a moving image based on the latest endoscopic image Ia acquired by the endoscopic image acquisition unit 30 or a still image of the latest tumor-containing image detected by the tumor determination unit 31.
- the display control unit 33 may display both a moving image based on the latest endoscopic image Ia and a still image of the latest tumor-containing image detected by the tumor determination unit 31 on the display screen. Furthermore, the display control unit 33 displays the cross-section designation line Lc designated by user input on the endoscopic image 70.
- the display control unit 33 estimates the invasion distance of the latest tumor-containing image based on the invasion distance estimation model, and displays an invasion distance map 71 showing the estimation result.
- here, the display control unit 33 displays an invasion distance map 71 in which contour lines, drawn at predetermined intervals of invasion distance, connect positions having the same invasion distance.
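- one simple way to derive such contour bands from the invasion distance map is to quantise the distances at the predetermined interval; the following sketch (function name hypothetical) assigns each position a band index, so that positions in the same band lie between the same pair of contour lines:

```python
import numpy as np

def contour_bands(dist_map, interval):
    """Quantise an invasion distance map into bands separated by contour
    lines spaced `interval` apart; equal indices share the same band."""
    levels = np.arange(0.0, dist_map.max() + interval, interval)
    return np.digitize(dist_map, levels)

bands = contour_bands(np.array([[0.1, 0.4, 1.2]]), 0.5)
```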
- the display control unit 33 displays the tumor cross-sectional view 72 based on the invasion distance along the cross-section designation line Lc.
- the mucosal layer (M), muscularis mucosae (MM), and submucosal layer (SM) that constitute the wall layer of the large intestine, which is the subject, are also clearly shown on the tumor cross-sectional view 72.
- the display control unit 33 uses an invasion distance estimation model that is trained to estimate, in addition to the invasion distance, the depth of each wall layer of the large intestine when a partial image or a tumor-containing image is input, and generates the tumor cross-sectional view 72 based on the estimated depth (width) of each wall layer of the large intestine.
- in this case, the display control unit 33 may interpolate the estimation results of the depth of each wall layer along the cross-section designation line Lc for each wall layer, and generate the tumor cross-sectional view 72 based on the depth of each wall layer obtained by the interpolation.
- alternatively, the display control unit 33 may refer to this information and generate the tumor cross-sectional view 72 using the standard value of the depth of each wall layer described above.
- the bottom of the submucosal layer (SM) is drawn to coincide with the bottom edge of the diagram.
- the display control unit 33 may display a tumor 3D model on the display screen in addition to, or in place of, the invasion distance map 71 and the tumor cross-sectional view 72.
- in this case, the display control unit 33 geometrically identifies the three-dimensional shape of the tumor site based on the estimation result of the invasion distance in the tumor-containing image, and displays a tumor 3D model representing the identified three-dimensional shape on the display screen.
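- the geometric identification of the three-dimensional shape can be sketched as a height-field mesh, one possible realisation of the tumor 3D model suitable for a wire-frame display; the vertex and face layout below is an assumption:

```python
import numpy as np

def tumor_surface(dist_map, pixel_pitch=1.0):
    """Build a simple surface mesh from an invasion distance map: one
    vertex per pixel with height equal to the invasion distance, and one
    quad face per 2x2 neighbourhood of pixels."""
    h, w = dist_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    verts = np.stack([xs * pixel_pitch, ys * pixel_pitch, dist_map],
                     axis=-1).reshape(-1, 3)
    faces = [(r * w + c, r * w + c + 1, (r + 1) * w + c + 1, (r + 1) * w + c)
             for r in range(h - 1) for c in range(w - 1)]
    return verts, faces

verts, faces = tumor_surface(np.zeros((3, 4)))
```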
- FIG. 7 shows a second display example of the display screen displayed by the display device 2 during an endoscopy.
- the display control unit 33 of the image processing device 1 transmits display information Ib, generated based on the information supplied from the other processing units 30 to 32 in the processor 11, to the display device 2, thereby causing the display device 2 to display the display screen shown in FIG. 7.
- since the tumor determination unit 31 has detected a tumor-containing image, the display control unit 33 of the image processing device 1 generates an invasion distance map based on the estimation result of the invasion distance for the detected tumor-containing image, and displays the invasion distance map superimposed on the endoscopic image 70.
- the infiltration distance map superimposed on the endoscopic image 70 is the same as the infiltration distance map 71 in FIG. 6, and is displayed with a predetermined transmittance so that the endoscopic image 70 remains visible.
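Displaying the map "with a predetermined transmittance" amounts to alpha blending the map over the endoscopic image. The sketch below shows the per-pixel arithmetic for one RGB pixel; the pixel values and the 0.5 opacity are illustrative, not values taken from the disclosure.

```python
def blend(base_px, overlay_px, alpha):
    """Alpha-blend one overlay pixel onto one endoscopic-image pixel.

    alpha is the overlay opacity; (1 - alpha) is the 'transmittance'
    that keeps the underlying endoscopic image visible.
    """
    return tuple(round(alpha * o + (1 - alpha) * b)
                 for b, o in zip(base_px, overlay_px))

# Hypothetical gray base pixel overlaid with a red map pixel at 50% opacity.
px = blend((100, 100, 100), (200, 0, 0), 0.5)
```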
- the display control unit 33 displays a text message 75 on the display screen prompting the user to input the cross-section designation line Lc, and receives the designation of the cross-section designation line Lc based on the examiner's operation of the input unit 14. When the display control unit 33 detects a signal from the input unit 14 specifying the cross-section designation line Lc, it identifies the cross-section designation line Lc, generates a tumor cross-sectional view based on that line, and displays the tumor cross-sectional view on the display screen.
- the display control unit 33 can thus suitably receive the designation of the cross-section designation line Lc used to display the tumor cross-sectional view. Note that, instead of the second display example, the display control unit 33 may display the infiltration distance map and the endoscopic image 70 side by side without overlap, and accept input specifying the cross-section designation line Lc drawn on either the infiltration distance map or the endoscopic image 70.
- FIG. 8 is an example of a flowchart showing an overview of the processing executed by the image processing apparatus 1 during endoscopy in the first embodiment.
- the image processing device 1 acquires an endoscopic image Ia (step S11).
- the endoscopic image acquisition unit 30 of the image processing device 1 receives the endoscopic image Ia from the endoscope scope 3 via the interface 13.
- the image processing device 1 determines whether the endoscopic image Ia acquired in step S11 corresponds to a tumor-containing image that includes a tumor site (step S12). In this case, the image processing device 1 performs the above-described determination based on information output by the tumor detection model when the endoscopic image Ia is input to the tumor detection model configured based on the tumor detection model information D1.
- if the image processing device 1 determines that the endoscopic image Ia acquired in step S11 is a tumor-containing image (step S12; Yes), it calculates the invasion distance (step S13).
- the image processing device 1 acquires the invasion distance output by the invasion distance estimation model when the tumor-containing image or its partial image is input to the invasion distance estimation model configured based on the invasion distance estimation model information D2.
- the image processing device 1 displays the endoscopic image Ia acquired in step S11 and the image based on the infiltration distance calculated in step S13 on the display device 2 (step S14).
- the image based on the invasion distance is, for example, an invasion distance map, a tumor cross section, or a tumor 3D model.
- the image processing device 1 accepts a user input specifying a cross-section designation line Lc on the endoscopic image Ia.
- the image processing device 1 may display a moving image of the endoscopic image Ia and a still image of the latest tumor-containing image, respectively.
- on the other hand, if the image processing device 1 determines that the endoscopic image Ia acquired in step S11 is not a tumor-containing image (step S12; No), it displays the endoscopic image Ia acquired in step S11 on the display device 2 (step S15).
- in step S16, the image processing device 1 determines whether the endoscopy has ended. For example, when the image processing device 1 detects a predetermined input to the input unit 14 or the operation unit 36, it determines that the endoscopy has ended. If the image processing device 1 determines that the endoscopy has ended (step S16; Yes), it ends the processing of the flowchart. On the other hand, if it determines that the endoscopy has not ended (step S16; No), it returns to step S11 and executes the processes of steps S11 to S15 on the endoscopic image Ia newly generated by the endoscope 3.
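The loop of steps S11 to S16 can be summarized as follows. Every callable below (frame source, tumor detection model, invasion-distance estimation model, display, end-of-exam check) is a hypothetical stand-in for the components described around the flowchart of FIG. 8, not an API defined by this disclosure.

```python
def run_examination(get_frame, is_tumor_image, estimate_invasion,
                    display, exam_ended):
    """Per-frame loop mirroring steps S11-S16 of FIG. 8."""
    while not exam_ended():                       # S16: examination ended?
        frame = get_frame()                       # S11: acquire endoscopic image Ia
        if is_tumor_image(frame):                 # S12: tumor-containing image?
            distances = estimate_invasion(frame)  # S13: invasion distance
            display(frame, distances)             # S14: image + distance-based view
        else:
            display(frame, None)                  # S15: image only

# Hypothetical demo: three frames, one of which (score 2.5) "contains a tumor".
frames = [0.0, 2.5, 0.0]
log = []
it = iter(frames)
run_examination(
    get_frame=lambda: next(it),
    is_tumor_image=lambda f: f > 1.0,
    estimate_invasion=lambda f: {"max_mm": f},
    display=lambda f, d: log.append((f, d)),
    exam_ended=lambda: len(log) >= len(frames),
)
```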
- the image processing device 1 may generate an invasion distance map for a part of the tumor-containing image instead of generating the invasion distance map for the entire tumor-containing image.
- the image processing device 1 generates and displays an infiltration distance map targeting a rectangular area including the cross-section designation line Lc specified by the examiner (for example, the smallest rectangular area including the cross-section designation line Lc).
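Finding the smallest rectangular area that includes the cross-section designation line Lc reduces to an axis-aligned bounding box over the line's pixel coordinates. A minimal sketch, with illustrative coordinates:

```python
def bounding_rect(line_points):
    """Smallest axis-aligned rectangle containing the points of the
    cross-section designation line Lc, given as (x, y) pixel coordinates.

    Returns (x_min, y_min, x_max, y_max).
    """
    xs = [p[0] for p in line_points]
    ys = [p[1] for p in line_points]
    return (min(xs), min(ys), max(xs), max(ys))

# Hypothetical Lc sampled at three pixels.
rect = bounding_rect([(3, 7), (10, 2), (6, 4)])
```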
- the image processing device 1 specifies the cross-section designation line Lc based on the user input and then displays the invasion distance map and the tumor cross-sectional view.
- the image processing device 1 can display an infiltration distance map limited to a region of interest to the examiner.
- the image processing device 1 may automatically set the cross-section designation line Lc instead of using a cross-section designation line Lc specified based on user input.
- the image processing device 1 generates an invasion distance map for the entire tumor-containing image, and sets a cross-section designation line Lc of a predetermined length that passes through at least the point with the longest invasion distance in the invasion distance map.
- the image processing device 1 approximates a region where the infiltration distance is a predetermined distance or more with an ellipse, and sets a cross-section designation line Lc corresponding to the major axis of the approximated ellipse.
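Approximating the region where the invasion distance is at least a threshold by an ellipse can be done, for example, with second-order image moments, whose principal axis gives the direction of the major axis along which a cross-section designation line Lc could be set. The function and the toy distance field below are hypothetical; the disclosure does not prescribe a particular fitting method.

```python
import math

def major_axis_line(distance_map, threshold):
    """Approximate the region where invasion distance >= threshold by an
    ellipse via second-order image moments, returning the centroid and the
    major-axis orientation (radians) for placing a cross-section line Lc.
    """
    pts = [(x, y)
           for y, row in enumerate(distance_map)
           for x, d in enumerate(row) if d >= threshold]
    n = len(pts)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    # Central second-order moments of the thresholded region.
    sxx = sum((p[0] - cx) ** 2 for p in pts) / n
    syy = sum((p[1] - cy) ** 2 for p in pts) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in pts) / n
    # Orientation of the fitted ellipse's major axis.
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return (cx, cy), theta

# Hypothetical field: a horizontal strip exceeds the 2 mm threshold,
# so the major axis should be horizontal (theta == 0).
dmap = [[0, 0, 0, 0, 0],
        [2, 3, 4, 3, 2],
        [0, 0, 0, 0, 0]]
center, theta = major_axis_line(dmap, 2)
```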
- the image processing device 1 can display a tumor cross-sectional view or the like regarding a tumor site without input from an examiner.
- the image processing device 1 may process, after the endoscopy, a video composed of the endoscopic images Ia generated during the endoscopy.
- when a video to be processed is specified based on a user input through the input unit 14 at an arbitrary timing after an examination, the image processing device 1 sequentially performs the process shown in the flowchart of FIG. 8 on the time-series endoscopic images Ia constituting the video. If the image processing device 1 then determines in step S16 that the target video has ended, it ends the process of the flowchart; if the target video has not ended, it returns to step S11 and performs the process shown in the flowchart on the next endoscopic image Ia in chronological order.
- the tumor detection model information D1 and the invasion distance estimation model information D2 may be stored in a storage device different from the image processing device 1.
- FIG. 9 is a schematic configuration diagram of an endoscopy system 100A in Modification 3. Note that, for the sake of simplicity, the display device 2, endoscope 3, etc. are not shown.
- the endoscopy system 100A includes a server device 4 that stores tumor detection model information D1 and invasion distance estimation model information D2.
- the endoscopy system 100A also includes a plurality of image processing devices 1 (1A, 1B, . . . ) capable of data communication with the server device 4 via a network.
- each image processing device 1 refers to the tumor detection model information D1 and the invasion distance estimation model information D2 via the network.
- the interface 13 of each image processing device 1 includes a communication interface such as a network adapter for communication.
- each image processing device 1 can refer to the tumor detection model information D1 and the invasion distance estimation model information D2 and suitably execute processing related to lesion detection, as in the above-described embodiment.
- FIG. 10 is a block diagram of an image processing device 1X in the second embodiment.
- the image processing device 1X includes an infiltration distance acquisition means 32X and an output control means 33X.
- the image processing device 1X may be composed of a plurality of devices.
- the infiltration distance acquisition means 32X acquires the infiltration distance of the tumor site of the subject in the endoscopic image, based on an endoscopic image of the subject taken by an imaging unit provided in the endoscope.
- the infiltration distance acquisition means 32X can be, for example, the infiltration distance estimation unit 32 in the first embodiment (including modified examples, the same applies hereinafter).
- the infiltration distance acquisition means 32X may acquire the above-mentioned infiltration distance by receiving, from an external device (i.e., a device separate from the image processing device 1X) that executes processing equivalent to that of the infiltration distance estimation unit 32 in the first embodiment, the estimation result of the infiltration distance of the tumor site of the subject.
- the output control means 33X outputs an image or sound based on the infiltration distance to the output device.
- the output control means 33X can be the display control section 33 in the first embodiment.
- the output device can be at least one of the display device 2 and the sound output section 16 in the first embodiment. Further, the output device may be incorporated in the image processing device 1X.
- FIG. 11 is an example of a flowchart showing the processing procedure in the second embodiment.
- the infiltration distance acquisition means 32X acquires the infiltration distance of the tumor site of the subject in the endoscopic image based on the endoscopic image taken of the subject by the imaging unit provided in the endoscope (step S21).
- the output control means 33X outputs an image or sound based on the infiltration distance to the output device (step S22).
- the image processing device 1X can present information regarding the invasion distance of the tumor site of the subject included in the endoscopic image taken of the subject.
- Non-transitory computer-readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic storage media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical storage media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)).
- The program may also be supplied to the computer by various types of transitory computer readable media.
- Examples of transitory computer readable media include electrical signals, optical signals, and electromagnetic waves.
- Transitory computer readable media can supply the program to the computer via a wired communication path, such as an electric wire or optical fiber, or via a wireless communication path.
- [Supplementary note 1] An image processing device comprising: an infiltration distance acquisition means for acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image; and an output control means for outputting an image or sound based on the infiltration distance to an output device.
- [Supplementary note 2] The image processing device according to supplementary note 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a map of the infiltration distance in the endoscopic image including the tumor site.
- [Supplementary note 3] The image processing device according to supplementary note 2, wherein the map is a contour map of the infiltration distance or a heat map of the infiltration distance other than a contour map.
- [Supplementary note 10] The image processing device according to supplementary note 9, wherein the infiltration distance acquisition means estimates the infiltration distance based on a model into which the endoscopic image determined to include the tumor site, or a partial image of that endoscopic image, is input, and the model is a model that has learned the relationship between an image input to the model and the infiltration distance of the subject represented in the image.
- [Supplementary note 11] An image processing method in which a computer acquires, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputs an image or sound based on the infiltration distance to an output device.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Optics & Photonics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Endoscopes (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
An image processing device comprising:
an infiltration distance acquisition means for acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image; and
an output control means for outputting an image or sound based on the infiltration distance to an output device.
An image processing method in which a computer acquires, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputs an image or sound based on the infiltration distance to an output device.
A storage medium storing a program that causes a computer to execute processing of acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputting an image or sound based on the infiltration distance to an output device.
(1) System Configuration
FIG. 1 shows the schematic configuration of an endoscopy system 100. As shown in FIG. 1, the endoscopy system 100 is a system that presents information about a site of a subject suspected of being a tumor (also referred to as a "tumor site") to an examiner, such as a doctor, who performs an examination or treatment using an endoscope, and mainly includes an image processing device 1, a display device 2, and an endoscope scope 3 connected to the image processing device 1.
FIG. 2 shows the hardware configuration of the image processing device 1. The image processing device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, and a sound output unit 16. These elements are connected to one another via a data bus 19.
Display processing based on the infiltration distance of a tumor site will now be described.
Schematically, when the image processing device 1 detects an endoscopic image Ia that constitutes a tumor-containing image, it estimates the infiltration distance at each position of the subject represented in the tumor-containing image and causes the display device 2 to display an image based on the estimated infiltration distances. The image processing device 1 can thereby present information about the infiltration distance of the tumor site to the examiner without requiring a detailed examination such as a biopsy. Accordingly, the image processing device 1 can immediately present to the examiner, during the endoscopy, information needed for decisions such as whether surgery is necessary.
FIG. 4 is a functional block diagram of the image processing device 1 relating to display processing based on the infiltration distance of a tumor site. The processor 11 of the image processing device 1 functionally includes an endoscopic image acquisition unit 30, a tumor determination unit 31, an infiltration distance estimation unit 32, and a display control unit 33. In FIG. 4, blocks that exchange data are connected by solid lines, but the combinations of blocks that exchange data are not limited to those shown; the same applies to the other functional block diagrams described later.
Next, the display control of the display device 2 executed by the display control unit 33 will be described.
FIG. 8 is an example of a flowchart showing an overview of the processing executed by the image processing device 1 during an endoscopy in the first embodiment.
Next, modifications suitable for the embodiment described above will be described. The following modifications may be applied to the above-described embodiment in combination.
When displaying an infiltration distance map, the image processing device 1 may generate an infiltration distance map for a part of the tumor-containing image instead of generating the map for the entire tumor-containing image.
The image processing device 1 may automatically set the cross-section designation line Lc instead of using a cross-section designation line Lc specified based on user input.
The image processing device 1 may process, after the examination, a video composed of the endoscopic images Ia generated during the endoscopy.
The tumor detection model information D1 and the infiltration distance estimation model information D2 may be stored in a storage device separate from the image processing device 1.
FIG. 10 is a block diagram of an image processing device 1X in the second embodiment. The image processing device 1X includes an infiltration distance acquisition means 32X and an output control means 33X. The image processing device 1X may be composed of a plurality of devices.
- [Supplementary note 1]
An image processing device comprising:
an infiltration distance acquisition means for acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image; and
an output control means for outputting an image or sound based on the infiltration distance to an output device.
[Supplementary note 2]
The image processing device according to supplementary note 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a map of the infiltration distance in the endoscopic image including the tumor site.
[Supplementary note 3]
The image processing device according to supplementary note 2, wherein the map is a contour map of the infiltration distance or a heat map of the infiltration distance other than a contour map.
[Supplementary note 4]
The image processing device according to supplementary note 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a cross-sectional view of the subject at the tumor site.
[Supplementary note 5]
The image processing device according to supplementary note 4, wherein the cross-sectional view includes the tumor site and a wall layer of the subject.
[Supplementary note 6]
The image processing device according to supplementary note 4 or 5, wherein the output control means displays on the output device the cross-sectional view whose cutting plane is a line designated on the endoscopic image displayed on the output device.
[Supplementary note 7]
The image processing device according to supplementary note 6, wherein the output control means accepts an external input designating the line after displaying on the output device the map of the infiltration distance in the endoscopic image including the tumor site.
[Supplementary note 8]
The image processing device according to supplementary note 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a three-dimensional model of the tumor site.
[Supplementary note 9]
The image processing device according to supplementary note 1, further comprising a tumor determination means for determining, from among endoscopic images captured by the imaging unit, an endoscopic image including the tumor site, wherein the infiltration distance acquisition means estimates the infiltration distance based on the endoscopic image determined to include the tumor site.
[Supplementary note 10]
The image processing device according to supplementary note 9, wherein the infiltration distance acquisition means estimates the infiltration distance based on a model into which the endoscopic image determined to include the tumor site, or a partial image of that endoscopic image, is input, and the model is a model that has learned the relationship between an image input to the model and the infiltration distance of the subject represented in the image.
[Supplementary note 11]
An image processing method in which a computer acquires, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputs an image or sound based on the infiltration distance to an output device.
[Supplementary note 12]
A storage medium storing a program that causes a computer to execute processing of acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputting an image or sound based on the infiltration distance to an output device.
2 Display device
3 Endoscope scope
11 Processor
12 Memory
13 Interface
14 Input unit
15 Light source unit
16 Sound output unit
100, 100A Endoscopy system
Claims (12)
- An image processing device comprising:
an infiltration distance acquisition means for acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image; and
an output control means for outputting an image or sound based on the infiltration distance to an output device.
- The image processing device according to claim 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a map of the infiltration distance in the endoscopic image including the tumor site.
- The image processing device according to claim 2, wherein the map is a contour map of the infiltration distance or a heat map of the infiltration distance other than a contour map.
- The image processing device according to claim 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a cross-sectional view of the subject at the tumor site.
- The image processing device according to claim 4, wherein the cross-sectional view includes the tumor site and a wall layer of the subject.
- The image processing device according to claim 4 or 5, wherein the output control means displays on the output device the cross-sectional view whose cutting plane is a line designated on the endoscopic image displayed on the output device.
- The image processing device according to claim 6, wherein the output control means accepts an external input designating the line after displaying on the output device the map of the infiltration distance in the endoscopic image including the tumor site.
- The image processing device according to claim 1, wherein the output control means displays on the output device, as an image based on the infiltration distance, a three-dimensional model of the tumor site.
- The image processing device according to claim 1, further comprising a tumor determination means for determining, from among endoscopic images captured by the imaging unit, an endoscopic image including the tumor site, wherein the infiltration distance acquisition means estimates the infiltration distance based on the endoscopic image determined to include the tumor site.
- The image processing device according to claim 9, wherein the infiltration distance acquisition means estimates the infiltration distance based on a model into which the endoscopic image determined to include the tumor site, or a partial image of that endoscopic image, is input, and the model is a model that has learned the relationship between an image input to the model and the infiltration distance of the subject represented in the image.
- An image processing method in which a computer acquires, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputs an image or sound based on the infiltration distance to an output device.
- A storage medium storing a program that causes a computer to execute processing of acquiring, based on an endoscopic image of a subject captured by an imaging unit provided in an endoscope, an infiltration distance of a tumor site of the subject in the endoscopic image, and outputting an image or sound based on the infiltration distance to an output device.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024534852A JPWO2024018581A5 (ja) | 2022-07-21 | | Image processing device, image processing method, and program |
| PCT/JP2022/028321 WO2024018581A1 (ja) | 2022-07-21 | 2022-07-21 | Image processing device, image processing method, and storage medium |
| US18/881,403 US20260000270A1 (en) | 2022-07-21 | 2022-07-21 | Image processing device, image processing method, and storage medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/028321 WO2024018581A1 (ja) | 2022-07-21 | 2022-07-21 | Image processing device, image processing method, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024018581A1 true WO2024018581A1 (ja) | 2024-01-25 |
Family
ID=89617535
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/028321 Ceased WO2024018581A1 (ja) | 2022-07-21 | 2022-07-21 | 画像処理装置、画像処理方法及び記憶媒体 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20260000270A1 (ja) |
| WO (1) | WO2024018581A1 (ja) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015087167A (ja) * | 2013-10-29 | 2015-05-07 | Canon Inc. | Image processing method and image processing system |
| JP2017113390A (ja) * | 2015-12-25 | 2017-06-29 | Canon Marketing Japan Inc. | Medical image processing device, control method therefor, and program |
| WO2020105699A1 (ja) * | 2018-11-21 | 2020-05-28 | AI Medical Service Inc. | Disease diagnosis support method using endoscopic images of digestive organs, diagnosis support system, diagnosis support program, and computer-readable recording medium storing the diagnosis support program |
| WO2020166247A1 (ja) * | 2019-02-14 | 2020-08-20 | NEC Corporation | Lesion area segmentation device, medical image diagnosis system, lesion area segmentation method, and non-transitory computer-readable medium storing program |
| WO2021054477A2 (ja) * | 2019-09-20 | 2021-03-25 | AI Medical Service Inc. | Disease diagnosis support method using endoscopic images of digestive organs, diagnosis support system, diagnosis support program, and computer-readable recording medium storing the diagnosis support program |
- 2022-07-21 US US18/881,403 patent/US20260000270A1/en active Pending
- 2022-07-21 WO PCT/JP2022/028321 patent/WO2024018581A1/ja not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| US20260000270A1 (en) | 2026-01-01 |
| JPWO2024018581A1 (ja) | 2024-01-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3463032B1 (en) | Image-based fusion of endoscopic image and ultrasound images | |
| EP3705024B1 (en) | Inspection assistance device, endoscope device, inspection assistance method, and inspection assistance program | |
| JP2010279539A (ja) | 診断支援装置および方法並びにプログラム。 | |
| CN106659362A (zh) | 图像处理装置、图像处理方法、图像处理程序以及内窥镜系统 | |
| JP7385731B2 (ja) | 内視鏡システム、画像処理装置の作動方法及び内視鏡 | |
| US20230419517A1 (en) | Shape measurement system for endoscope and shape measurement method for endoscope | |
| WO2023126999A1 (ja) | 画像処理装置、画像処理方法、及び、記憶媒体 | |
| WO2023148812A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| JP7485193B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
| WO2023042273A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2024018581A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| US20250378556A1 (en) | Endoscopic examination assistance device, endoscopic examination system, processing method, and storage medium | |
| JP6745748B2 (ja) | 内視鏡位置特定装置、その作動方法およびプログラム | |
| WO2023181353A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| JP7647873B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
| WO2022224446A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2023234071A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2023187886A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| JP7750418B2 (ja) | 内視鏡検査支援装置、内視鏡検査支援方法、及び、プログラム | |
| US20250166297A1 (en) | Image processing apparatus, image processing method, and storage medium | |
| JP7609278B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
| WO2024180593A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2024013848A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2025083856A1 (ja) | 画像処理装置、画像処理方法及び記憶媒体 | |
| WO2025104800A1 (ja) | 内視鏡検査支援装置、内視鏡検査支援方法、及び、記録媒体 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22951967; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 18881403; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024534852; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22951967; Country of ref document: EP; Kind code of ref document: A1 |
| | WWP | Wipo information: published in national office | Ref document number: 18881403; Country of ref document: US |