WO2024029502A1 - 内視鏡検査支援装置、内視鏡検査支援方法、及び、記録媒体 - Google Patents
- Publication number
- WO2024029502A1 (PCT/JP2023/028001)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- endoscopic
- camera
- posture
- endoscopic camera
- estimating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/0002—Operational features of endoscopes provided with data storages
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/00055—Operational features of endoscopes provided with output arrangements for alerting the user
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/31—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the rectum, e.g. proctoscopes, sigmoidoscopes, colonoscopes
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M25/00—Catheters; Hollow probes
- A61M25/01—Introducing, guiding, advancing, emplacing or holding catheters
- A61M25/0105—Steering means as part of the catheter or advancing means; Markers for positioning
- A61M2025/0166—Sensors, electrodes or the like for guiding the catheter to a target zone, e.g. image guided or magnetically guided
Definitions
- the present disclosure relates to image processing related to endoscopy.
- Patent Document 1 proposes to provide an insertion system that presents a recommended insertion operation method when inserting a medical endoscope or the like into an inserted object.
- Patent Document 1 only presents a method for inserting an endoscope, and cannot present the direction of an endoscopic camera that allows appropriate observation of organs when the endoscope is removed.
- One purpose of the present disclosure is to present a direction of an endoscopic camera suitable for observation in endoscopy.
- an endoscopy support device includes: image acquisition means for acquiring a captured image when the endoscope is removed; posture change estimation means for estimating a change in the relative posture of the endoscopic camera from the captured image; distance estimation means for estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image; intestinal direction estimation means for estimating the intestinal direction of the large intestine based on the change in posture and the distance; calculation means for calculating the direction in which the endoscopic camera should be directed based on the intestinal direction and the relative posture of the endoscopic camera; and output means for outputting, to a display device, a display image including the direction in which the endoscopic camera should be directed.
- an endoscopy support method includes: acquiring a captured image when the endoscope is removed; estimating a change in the relative posture of the endoscopic camera from the captured image; estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image; estimating the intestinal direction of the large intestine based on the change in posture and the distance; calculating the direction in which the endoscopic camera should be directed based on the intestinal direction and the relative posture of the endoscopic camera; and outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
- the recording medium records a program that causes a computer to execute a process of: acquiring a captured image when the endoscope is removed; estimating a change in the relative posture of the endoscopic camera from the captured image; estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image; estimating the intestinal direction of the large intestine based on the change in posture and the distance; calculating the direction in which the endoscopic camera should be directed based on the intestinal direction and the relative posture of the endoscopic camera; and outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
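The claimed processing flow can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the function names, the constant depth map, and the assumed withdrawal motion along -z are all placeholders standing in for the trained estimators described later in the disclosure.

```python
import numpy as np

def estimate_posture_change(prev_frame, frame):
    # Placeholder for the posture-change estimation means:
    # returns (rotation matrix, translation vector) of the camera.
    return np.eye(3), np.array([0.0, 0.0, -1.0])  # assumed withdrawal along -z

def estimate_depth(frame):
    # Placeholder for the distance estimation means: per-pixel depth map.
    return np.full(frame.shape[:2], 50.0)  # constant depth for the sketch

def estimate_intestinal_direction(posture_changes):
    # Intestinal direction approximated as the mean camera translation
    # accumulated while the endoscope is withdrawn.
    t = np.mean([tr for _, tr in posture_changes], axis=0)
    return t / np.linalg.norm(t)

def direction_to_point_camera(intestinal_dir, camera_dir):
    # Calculation means: correction that turns the camera toward the axis.
    return intestinal_dir - camera_dir

frames = [np.zeros((8, 8, 3)) for _ in range(3)]
changes = [estimate_posture_change(a, b) for a, b in zip(frames, frames[1:])]
depth = estimate_depth(frames[-1])
axis = estimate_intestinal_direction(changes)

camera_dir = np.array([0.5, 0.0, -1.0])
camera_dir = camera_dir / np.linalg.norm(camera_dir)
correction = direction_to_point_camera(axis, camera_dir)
print(axis, correction)
```

The output means would then render `correction` as an on-screen indicator, as described for the display examples below.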
- FIG. 1 is a block diagram showing a schematic configuration of an endoscopy system.
- FIG. 2 is a block diagram showing the hardware configuration of an endoscopy support device.
- FIG. 3 is a block diagram showing the functional configuration of an endoscopy support device.
- An example of the direction in which the endoscopic camera should be directed is shown.
- Figures showing a display example of the calculation result, and other display examples thereof.
- A flowchart of the direction calculation processing of the endoscopic camera by the endoscopy support device.
- A block diagram showing the functional configuration of an endoscopy support device according to a second embodiment, and a flowchart of the processing by the endoscopy support device according to the second embodiment.
- FIG. 1 shows a schematic configuration of an endoscopy system 100.
- the endoscopy system 100 estimates the direction of the intestinal tract and the direction of the endoscopic camera during an examination (including treatment) using an endoscope. Then, if the direction of the endoscopic camera is not toward the intestinal tract, the endoscopy system 100 presents the direction so that the endoscopic camera is directed toward the intestinal tract. The doctor can observe the entire intestinal tract by pointing the endoscope camera toward the intestinal tract according to the instructions of the endoscopy system 100. This makes it possible to reduce areas that cannot be observed.
- the endoscopy system 100 mainly includes an endoscopy support device 1, a display device 2, and an endoscope scope 3 connected to the endoscopy support device 1.
- the endoscopic examination support device 1 acquires from the endoscope scope 3 an image (i.e., a video, hereinafter also referred to as "endoscopic image Ic") taken by the endoscope scope 3 during an endoscopy, and displays display data on the display device 2 for the examiner to confirm. Specifically, the endoscopy support device 1 acquires a moving image of the large intestine photographed by the endoscope scope 3 during an endoscopy as the endoscopic image Ic.
- the endoscopic examination support device 1 extracts frame images from the endoscopic image Ic and, based on the frame images, estimates the distance between the surface of the large intestine and the endoscopic camera (hereinafter also referred to as "depth") and the change in relative posture of the endoscopic camera. Then, the endoscopy support device 1 performs three-dimensional reconstruction of the intestinal tract of the large intestine based on the depth and the change in relative posture of the endoscopic camera, and estimates the intestinal tract direction. The endoscopic examination support device 1 then estimates the direction in which the endoscopic camera should be directed based on the direction of the intestinal tract and the relative posture of the endoscopic camera.
- the display device 2 is a display or the like that performs a predetermined display based on a display signal supplied from the endoscopy support device 1.
- the endoscope 3 mainly includes an operating section 36 through which the examiner inputs air supply, water supply, angle adjustment, photographing instructions, etc., a flexible shaft, a distal end portion 38 with a built-in endoscopic camera such as a micro-imaging device, and a connecting portion 39 for connecting to the endoscopic examination support device 1.
- FIG. 2 shows the hardware configuration of the endoscopy support device 1.
- the endoscopy support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input section 14, a light source section 15, a sound output section 16, and a database (hereinafter referred to as "DB") 17. Each of these elements is connected via a data bus 19.
- the processor 11 executes a predetermined process by executing a program stored in the memory 12.
- the processor 11 is a processor such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a TPU (Tensor Processing Unit). Note that the processor 11 may include a plurality of processors.
- Processor 11 is an example of a computer.
- the memory 12 includes volatile memory such as RAM (Random Access Memory) used as working memory, and non-volatile memory such as ROM (Read Only Memory) that stores information necessary for the processing of the endoscopy support device 1. The memory 12 may include an external storage device such as a hard disk connected to or built into the endoscopy support device 1, or a removable storage medium such as flash memory or a disk medium. The memory 12 stores programs for the endoscopy support device 1 to execute each process in this embodiment.
- the memory 12 temporarily stores a series of endoscopic images Ic taken by the endoscope 3 during an endoscopy, under the control of the processor 11.
- the interface 13 performs an interface operation between the endoscopy support device 1 and external devices. For example, the interface 13 supplies the display data Id generated by the processor 11 to the display device 2. Further, the interface 13 supplies illumination light generated by the light source section 15 to the endoscope 3. Further, the interface 13 supplies the processor 11 with an electrical signal indicating the endoscopic image Ic supplied from the endoscopic scope 3.
- the interface 13 may be a communication interface such as a network adapter for communicating with external devices by wire or wirelessly, or may be a hardware interface compliant with USB (Universal Serial Bus), SATA (Serial AT Attachment), or the like.
- the input unit 14 generates an input signal based on the operation of the examiner.
- the input unit 14 is, for example, a button, a touch panel, a remote controller, a voice input device, or the like.
- the light source section 15 generates light to be supplied to the distal end section 38 of the endoscope 3. Further, the light source section 15 may also incorporate a pump or the like for sending out water and air to be supplied to the endoscope 3.
- the sound output section 16 outputs sound under the control of the processor 11.
- the DB 17 stores endoscopic images obtained from past endoscopic examinations of the subject.
- the DB 17 may include an external storage device such as a hard disk connected to or built in the endoscopy support device 1, or may include a removable storage medium such as a flash memory. Note that instead of providing the DB 17 within the endoscopy system 100, the DB 17 may be provided in an external server or the like, and related information may be acquired from the server through communication.
- the endoscopic examination support device 1 may include a sensor capable of measuring rotation and translation of the endoscopic camera, such as a magnetic sensor.
- FIG. 3 is a block diagram showing the functional configuration of the endoscopy support device 1.
- the endoscopy support device 1 includes an interface 13, a depth estimation section 21, a camera posture estimation section 22, a three-dimensional reconstruction section 23, an operation direction estimation section 24, a lesion detection section 25, and a display image generation section 26.
- An endoscopic image Ic is input to the endoscopic examination support device 1 from the endoscope scope 3.
- the endoscopic image Ic is input to the interface 13.
- the interface 13 extracts a frame image (hereinafter also referred to as an "endoscopic image") from the input endoscopic image Ic and outputs it to the depth estimation section 21, the camera posture estimation section 22, and the lesion detection section 25. The interface 13 also outputs the input endoscopic image Ic to the display image generation section 26.
- An endoscopic image is input to the depth estimation unit 21 from the interface 13.
- the depth estimating unit 21 estimates the depth from the input endoscopic image using an image recognition model prepared in advance.
- the depth estimating unit 21 then outputs the estimated depth to the three-dimensional reconstruction unit 23.
- An endoscopic image is input to the camera posture estimation unit 22 from the interface 13.
- the camera posture estimating unit 22 uses two temporally consecutive endoscopic images to estimate the rotation and translation of the endoscopic camera from the photographing point of the first endoscopic image to the photographing point of the second endoscopic image (that is, the change in relative posture of the endoscopic camera; hereinafter also simply referred to as the "camera posture change").
- the camera posture estimation section 22 outputs the estimated camera posture change of the endoscopic camera to the three-dimensional reconstruction section 23.
- the camera posture estimating unit 22 estimates a change in camera posture from the input endoscopic image using an image recognition model prepared in advance.
- the camera posture estimating unit 22 may estimate a change in the relative posture of the endoscopic camera using measurement data from a magnetic sensor or the like.
- the image recognition models used by the depth estimation section 21 and the camera posture estimation section 22 are machine learning models trained in advance to estimate depth and camera posture changes from endoscopic images. These are also referred to as the "depth estimation model" and the "camera posture estimation model," respectively.
- the depth estimation model and camera pose estimation model can be generated by so-called supervised learning.
- For the depth estimation model, teacher data in which depth is assigned to an endoscopic image as a correct label is used.
- the endoscopic images and depth used for learning are collected in advance from an endoscopic camera and a ToF (Time of Flight) sensor installed in the endoscope. That is, a pair of an RGB image photographed by an endoscopic camera and a depth is created as training data, and learning is performed using the training data.
- For the camera posture estimation model, for example, teacher data in which a change in camera posture is assigned to an endoscopic image as a correct label is used.
- the change in camera posture can be obtained using a sensor capable of detecting rotation and translation, such as a magnetic sensor. That is, a pair of an RGB image photographed by an endoscopic camera and a change in the posture of the camera is created as teacher data, and learning is performed using the teacher data.
- the training data used for learning the depth estimation model and the camera pose estimation model may be created from a simulated image of an endoscope using computer graphics (CG). This allows a large amount of training data to be created at high speed.
- a depth estimation model and a camera attitude estimation model are generated by a machine learning device learning the relationship between an endoscopic image, depth, and camera attitude change using teacher data.
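The supervised teacher-data assembly described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: all arrays are synthetic stand-ins for the RGB frames, ToF depth maps, and magnetic-sensor pose measurements mentioned in the text.

```python
import numpy as np

rng = np.random.default_rng(0)
frames = [rng.random((64, 64, 3)) for _ in range(4)]        # endoscopic RGB frames
tof_depth = [rng.random((64, 64)) * 100 for _ in range(4)]  # ToF depth maps (mm)
# Pose change between consecutive frames, e.g. from a magnetic sensor:
pose_changes = [(np.eye(3), rng.random(3)) for _ in range(3)]

# Teacher data for the depth estimation model: (image, depth) pairs.
depth_dataset = list(zip(frames, tof_depth))

# Teacher data for the camera posture estimation model:
# ((frame_i, frame_j), (rotation, translation)) pairs of consecutive frames.
pose_dataset = list(zip(zip(frames, frames[1:]), pose_changes))

print(len(depth_dataset), len(pose_dataset))
```

As the text notes, the same pair structure can be produced far more cheaply from CG-simulated endoscope images.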
- the depth estimation model and camera pose estimation model may be generated by self-supervised learning.
- In self-supervised learning, training data is created using motion parallax.
- Specifically, a depth CNN (Convolutional Neural Network) that estimates depth from an endoscopic image, and a pose CNN that estimates the relative posture from the endoscopic image Ii and the endoscopic image Ij, are prepared.
- Using the estimated depth and relative posture, the endoscopic image Ij is reconstructed from the endoscopic image Ii (this is also referred to as the "endoscopic image Ii→j"). Then, the model is trained using the difference between the reconstructed endoscopic image Ii→j and the actual endoscopic image Ij as a loss.
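The view-synthesis loss just described can be sketched in simplified form. In a real implementation, Ii is warped into the viewpoint of Ij using the predicted depth, pose, and camera intrinsics; here the warp is reduced to an integer pixel shift (an assumption for illustration) so the photometric loss is easy to follow.

```python
import numpy as np

def warp(image_i, pixel_shift):
    # Stand-in for view synthesis: shift Ii by the (predicted) parallax.
    return np.roll(image_i, pixel_shift, axis=1)

def photometric_loss(reconstructed_j, image_j):
    # Mean absolute difference between reconstructed and actual frame Ij.
    return float(np.mean(np.abs(reconstructed_j - image_j)))

rng = np.random.default_rng(1)
image_i = rng.random((32, 32))
image_j = np.roll(image_i, 2, axis=1)  # Ij: Ii seen after camera motion

good = photometric_loss(warp(image_i, 2), image_j)  # correct parallax
bad = photometric_loss(warp(image_i, 5), image_j)   # wrong parallax
print(good, bad)
```

Minimizing this loss drives the depth CNN and pose CNN toward predictions whose synthesized view matches the actual next frame, without any labeled depth or pose.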
- the three-dimensional reconstruction unit 23 performs three-dimensional reconstruction processing of the intestinal tract based on the depth input from the depth estimation unit 21 and the relative posture change of the endoscopic camera input from the camera posture estimation unit 22, and estimates the direction of the intestinal tract. Then, the three-dimensional reconstruction unit 23 outputs the three-dimensional model, the intestinal direction, the relative posture change of the endoscopic camera, and the position of the endoscopic camera to the operation direction estimation unit 24.
- the three-dimensional model, the intestinal direction, and the relative posture change of the endoscopic camera are input to the operation direction estimation unit 24 from the three-dimensional reconstruction unit 23. The operation direction estimating unit 24 then calculates the direction in which the endoscopic camera should be directed based on the intestinal direction and the change in relative posture of the endoscopic camera, and outputs the three-dimensional model, the change in relative posture of the endoscopic camera, and the direction in which the endoscopic camera should be directed to the display image generation unit 26.
- FIG. 4 shows an example of the direction in which the endoscopic camera should be directed.
- a three-dimensional model 31 of the intestinal tract, an intestinal tract direction 32, and an endoscopic camera direction 33 are shown on the XYZ coordinates.
- the three-dimensional model 31 is a model of the intestinal tract that has been three-dimensionally reconstructed by the three-dimensional reconstruction unit 23, and includes a detailed three-dimensional structure of the intestinal tract.
- In FIG. 4, the three-dimensional model 31 is shown approximated as a cylinder.
- the intestinal tract direction 32 is the longitudinal direction or axial direction of the intestinal tract, and is estimated based on the three-dimensional model 31 of the intestinal tract.
- the endoscopic camera direction 33 is the direction of the lens of the endoscopic camera, that is, the photographing direction.
- the operation direction estimation unit 24 calculates the angle formed between the intestinal tract direction 32 and the endoscopic camera direction 33, that is, the deviation angle θ of the endoscopic camera direction 33 with respect to the intestinal tract direction 32. If the deviation angle θ is equal to or greater than a predetermined threshold value, the operation direction estimating unit 24 determines that the endoscopic camera is facing the intestinal wall. In that case, the operation direction estimating unit 24 calculates the direction in which the endoscopic camera should be directed so that the direction of the endoscopic camera matches the direction of the intestinal tract (so that the deviation angle θ becomes zero), and outputs it to the display image generation section 26.
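The deviation-angle check above amounts to an angle between two unit vectors compared against a threshold. The sketch below illustrates this; the threshold value and the example vectors are assumptions, not values from the disclosure.

```python
import numpy as np

def deviation_angle(intestinal_dir, camera_dir):
    # Angle (degrees) between the intestinal direction 32 and the
    # endoscopic camera direction 33, i.e. the deviation angle θ.
    a = intestinal_dir / np.linalg.norm(intestinal_dir)
    b = camera_dir / np.linalg.norm(camera_dir)
    return float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))

THRESHOLD_DEG = 30.0  # assumed threshold for "facing the intestinal wall"

intestinal = np.array([0.0, 0.0, 1.0])  # intestinal axis
camera = np.array([1.0, 0.0, 1.0])      # camera tilted toward the wall

theta = deviation_angle(intestinal, camera)
facing_wall = theta >= THRESHOLD_DEG
print(theta, facing_wall)
```

When `facing_wall` is true, the correction that drives θ to zero is presented to the examiner via the direction indicators described below.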
- An endoscopic image is input to the lesion detection unit 25 from the interface 13. Then, the lesion detection unit 25 detects lesion candidates from the endoscopic image using an image recognition model prepared in advance, and generates a lesion candidate image including the detected lesion candidates.
- the lesion detection unit 25 surrounds the lesion candidate on the lesion candidate image with an ellipse or the like and outputs it to the display image generation unit 26.
- the display image generation unit 26 generates display data using the three-dimensional model, the relative posture change of the endoscopic camera, and the direction in which the endoscopic camera should be directed input from the operation direction estimation unit 24, as well as the lesion candidate image input from the lesion detection unit 25, and outputs the display data to the display device 2.
- the interface 13 is an example of the image acquisition means
- the depth estimation unit 21 is an example of the distance estimation means
- the camera posture estimation unit 22 is an example of the posture change estimation means
- the three-dimensional reconstruction unit 23 is an example of the intestinal direction estimation means
- the operation direction estimation unit 24 is an example of the calculation means
- the display image generation unit 26 is an example of the output means
- FIG. 5 is an example of a display by the display device 2.
- the display device 2 displays an endoscopic image 41, a lesion history 42, a camera trajectory 43, a camera mark 44, an intestinal direction indicator 45, and a lesion direction indicator 46.
- the endoscopic image 41 is an endoscopic image Ic during the examination, and is updated as the endoscopic camera moves.
- the lesion history 42 is an image showing a lesion candidate detected in an endoscopy, and a lesion candidate image input from the lesion detection unit 25 is used.
- a lesion candidate site detected by the lesion detection unit 25 is indicated by an ellipse 42a. Note that if a lesion candidate is detected at multiple locations, the image of the most recent lesion candidate is displayed in the lesion history 42.
- the camera trajectory 43 indicates the trajectory of the endoscopic camera within a predetermined time.
- a three-dimensional intestinal model 43a is represented as a cylinder, and a camera mark 44 indicating the direction and position of the endoscopic camera at a predetermined time is displayed superimposed on the intestinal model 43a to indicate the trajectory of the camera.
- Camera marks 44 schematically indicate the orientation and position of the endoscopic camera at different timings.
- the camera mark 44 is represented by a cone, and the bottom surface of the cone indicates the lens side of the endoscopic camera.
- the camera marks 44 are color-coded in chronological order, and the darker the color, the more recent the orientation and position of the endoscopic camera. Note that FIG. 5 shows that the camera direction of the endoscopic camera changes from the direction of the intestinal tract to the direction of the intestinal wall, as indicated by the arrow.
- the intestinal tract direction indicator 45 indicates the direction in which the endoscopic camera should be directed so that the endoscopic camera points toward the intestinal tract.
- the intestinal tract direction indicator 45 is displayed when the endoscopic camera is facing the intestinal wall, specifically when the above-mentioned deviation angle θ is greater than or equal to a predetermined threshold.
- an intestinal tract direction indicator 45 is displayed at the left end and upper end of the endoscopic image 41. This allows the doctor to know that if the endoscopic camera is directed toward the upper left, the endoscopic camera will be directed toward the intestinal tract.
- When the direction in which the endoscopic camera should be directed is rightward, the intestinal tract direction indicator 45 is displayed at the right end of the endoscopic image 41, and when the direction is downward, the indicator is displayed at the lower end. In this way, when the endoscopic camera is facing the intestinal wall, the intestinal direction indicator 45 is displayed at at least one of the upper, lower, left, and right ends of the endoscopic image 41, depending on the direction in which the endoscopic camera should be directed.
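The edge-selection rule just described can be sketched as a small mapping from the required correction to the image edges where the indicator is drawn. The function name, the edge labels, and the sign convention of the correction vector are assumptions for illustration.

```python
def indicator_edges(dx, dy):
    """Edges of the endoscopic image on which to draw the indicator.

    Assumed convention: dx > 0 means the camera should turn right,
    dy > 0 means it should turn up.
    """
    edges = []
    if dx < 0:
        edges.append("left")
    if dx > 0:
        edges.append("right")
    if dy > 0:
        edges.append("top")
    if dy < 0:
        edges.append("bottom")
    return edges

print(indicator_edges(-1.0, 1.0))  # upper-left correction, as in FIG. 5
```

A diagonal correction lights two edges at once, matching the FIG. 5 example where the indicator appears at both the left and upper ends.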
- the lesion direction indicator 46 indicates the direction in which the endoscopic camera should be directed so that the endoscopic camera is directed toward the lesion.
- Lesion direction indicator 46 is displayed when a lesion candidate is detected. In FIG. 5, a lesion direction indicator 46 is displayed at the left end of the endoscopic image 41. This allows the doctor to understand that when the endoscopic camera is turned to the left, the endoscopic camera will be directed to the lesion candidate. In this manner, when a lesion candidate is detected, the lesion direction indicator 46 is displayed at at least one of the upper and lower ends and the left and right ends of the endoscopic image 41, depending on the position of the lesion candidate.
- the display image generation unit 26 may generate the display data of the camera trajectory 43 so as to display the intestinal tract model 43a viewed from a direction in which the plurality of camera marks 44 overlap as little as possible.
- Specifically, the display image generation unit 26 uses principal component analysis or the like to determine a direction in which the dispersion of the camera directions indicated by the plurality of camera marks 44 becomes large, and generates display data for displaying the camera trajectory 43 with the intestinal tract model 43a viewed from that direction. Thereby, the display device 2 can appropriately display the trajectory of the endoscopic camera using the intestinal tract model viewed from a direction in which the camera marks 44 overlap less, as shown in FIG. 6.
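The principal-component idea here reduces to finding the direction along which the camera-mark orientations vary the most. A minimal sketch, assuming a plain covariance eigendecomposition stands in for whatever PCA routine the implementation uses:

```python
import numpy as np

def max_variance_direction(camera_dirs):
    # First principal component of the camera direction vectors:
    # the eigenvector of the covariance matrix with the largest
    # eigenvalue. Viewing the model from this direction spreads the
    # camera marks apart on screen.
    dirs = np.asarray(camera_dirs, dtype=float)
    centered = dirs - dirs.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, np.argmax(eigvals)]

# Example: camera directions spread mostly along the x axis.
dirs = [[1, 0, 0.1], [0.5, 0, 0.1], [-1, 0, 0.1], [-0.5, 0, 0.1]]
v = max_variance_direction(dirs)
print(v)
```

The sign of the returned vector is arbitrary (both viewing directions along the axis are equivalent), so only its axis matters for choosing the display viewpoint.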
- FIG. 7 shows another display example by the display device 2.
- In this example, the intestinal tract direction indicator and the lesion direction indicator are displayed as arrows.
- an intestinal tract direction indicator 45a and a lesion direction indicator 46a are displayed on the endoscopic image 41.
- FIG. 8 shows another display example by the display device 2.
- In the examples above, the trajectory of the camera is displayed on the intestinal tract model 43a.
- In contrast, FIG. 8 is an example in which the trajectory of the camera is displayed on an endoscopic image.
- a camera mark 44 indicating the direction and position of the endoscopic camera at a predetermined time is superimposed on the endoscopic image 43b.
- As the endoscopic image 43b, an endoscopic image taken in a past ideal imaging direction is used, for example, an endoscopic image taken with the endoscopic camera facing in the intestinal tract direction.
- the ideal position of the camera is indicated by a camera mark 44a represented by a black cone.
- an endoscopic image photographed in the state indicated by the camera mark 44a can be used as the endoscopic image 43b shown in FIG. 8.
- the trajectory of the endoscopic camera is displayed on the actual endoscopic image, making it easier for the doctor to intuitively grasp the ideal position of the endoscopic camera.
- FIG. 9 is a flowchart of the processing by the endoscopy support device 1. This processing is realized by the processor 11 shown in FIG. 2 executing a program prepared in advance and operating as each element shown in FIG. 3. This processing is executed during an examination using the endoscope, that is, while the endoscope 3 is being withdrawn.
- an endoscopic video Ic is input from the endoscope scope 3 to the interface 13.
- The interface 13 acquires an endoscopic image from the input endoscopic video Ic (step S11).
- the depth estimation unit 21 estimates the distance between the surface of the large intestine and the endoscopic camera from the endoscopic image using an image recognition model prepared in advance or the like (step S12).
- the camera posture estimating unit 22 estimates a relative change in posture of the endoscopic camera from two temporally consecutive endoscopic images (step S13).
- the three-dimensional reconstruction unit 23 performs three-dimensional reconstruction of the intestinal tract based on the distance between the surface of the large intestine and the endoscopic camera and on the relative change in posture of the endoscopic camera, and the intestinal tract direction is estimated (step S14).
- the operation direction estimating unit 24 calculates the direction in which the endoscopic camera should be directed based on the relative posture change of the endoscopic camera and the intestinal direction (step S15).
- the display image generation unit 26 generates display data using the three-dimensional model, the relative change in posture of the endoscopic camera, and the direction in which the endoscopic camera should be directed, and outputs it to the display device 2 (step S16). A display such as that shown in FIG. 5 is thereby produced. Note that step S13 may be executed before step S12, or simultaneously with step S12.
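The flow of steps S11 to S16 can be sketched as follows. The depth and pose estimators here are trivial stand-ins for the learned models of the depth estimation unit 21 and the camera posture estimation unit 22; all function names and the constant outputs are assumptions for illustration only:

```python
import numpy as np

def estimate_depth(image):
    # Stand-in for step S12: the image-recognition model that estimates
    # the distance from the large-intestine surface to the camera.
    return np.full(image.shape[:2], 30.0)  # a constant 30 mm depth map

def estimate_pose_change(prev_image, cur_image):
    # Stand-in for step S13: relative rotation R and translation t
    # between two temporally consecutive endoscopic images.
    return np.eye(3), np.array([0.0, 0.0, -5.0])

def support_step(prev_image, cur_image, intestinal_dir):
    """One iteration of the processing of FIG. 9 (steps S11 to S16)."""
    depth = estimate_depth(cur_image)                    # step S12
    R, t = estimate_pose_change(prev_image, cur_image)   # step S13
    # Step S14 (three-dimensional reconstruction) is omitted here;
    # step S15 expresses the intestinal direction in the current camera
    # frame to obtain the direction the camera should be turned.
    d = np.asarray(intestinal_dir, dtype=float)
    target = R.T @ (d / np.linalg.norm(d))
    return depth, target                                 # handed to S16
```

As the note above says, the depth and pose steps are independent, so S12 and S13 could equally run in the other order or in parallel.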
- FIG. 10 is a block diagram showing the functional configuration of an endoscopy support device according to the second embodiment.
- the endoscopy support device 70 includes an image acquisition means 71, a posture change estimation means 72, a distance estimation means 73, an intestinal direction estimation means 74, a calculation means 75, and an output means 76.
- FIG. 11 is a flowchart of processing by the endoscopy support device of the second embodiment.
- the image acquisition means 71 acquires a captured image when the endoscope is removed (step S71).
- the posture change estimating means 72 estimates a relative change in posture of the endoscopic camera from the captured image (step S72).
- the distance estimating means 73 estimates the distance between the surface of the large intestine and the endoscopic camera from the captured image (step S73).
- the intestinal direction estimating means 74 estimates the intestinal direction of the large intestine based on the change in the posture of the endoscopic camera and the distance between the surface of the large intestine and the endoscopic camera (step S74).
- the calculation means 75 calculates the direction in which the endoscopic camera should be directed based on the intestinal direction and the relative posture of the endoscopic camera (step S75).
- the output means 76 outputs a display image including the direction in which the endoscopic camera should be directed to the display device (step S76).
- According to the endoscopy support device 70 of the second embodiment, it is possible to present the direction of the endoscopic camera suitable for observation during an endoscopy.
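A minimal sketch of what the calculation means 75 computes, assuming the intestinal direction and the camera's optical axis are available as 3-D vectors. The function name, the angular tolerance, and the orthogonal-component correction are illustrative choices, not the patented method itself:

```python
import numpy as np

def guidance(camera_forward, intestinal_dir, tol_deg=20.0):
    """Angle between the camera's optical axis and the intestinal
    direction, whether a direction indicator is needed, and the
    in-image correction (component of the intestinal direction
    orthogonal to the optical axis)."""
    a = np.asarray(camera_forward, dtype=float)
    a = a / np.linalg.norm(a)
    b = np.asarray(intestinal_dir, dtype=float)
    b = b / np.linalg.norm(b)
    angle = np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0)))
    correction = b - (a @ b) * a   # zero when already aligned
    return angle, angle > tol_deg, correction
```

When the returned flag is set, the correction vector indicates which way the output means 76 would place the indicator in the display image.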
- an image acquisition means for acquiring a captured image at the time of withdrawal of the endoscope; a posture change estimation means for estimating a relative change in posture of the endoscopic camera from the captured image; a distance estimation means for estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image; an intestinal direction estimation means for estimating the intestinal tract direction of the large intestine based on the change in posture and the distance; a calculation means for calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and an output means for outputting a display image including the direction in which the endoscopic camera should be directed to a display device;
- An endoscopy support device comprising the above.
- The endoscopy support device according to supplementary note 1, wherein the posture change estimation means estimates the change in posture using a machine learning model trained in advance to estimate depth and camera posture changes from endoscopic images.
- The directions in which the endoscopic camera should be directed are the intestinal tract direction and the lesion direction, and
- the endoscopy support device according to supplementary note 1, wherein the output means outputs a display image that displays the intestinal tract direction and the lesion direction in a distinguishable manner.
- (Supplementary note 6) The endoscopy support device according to supplementary note 5, wherein the output means outputs a display image in which the trajectory of the change in posture is superimposed on the model of the intestinal tract.
- (Supplementary note 7) The endoscopy support device according to supplementary note 6, wherein the output means outputs a display image of the model of the intestinal tract viewed from a direction in which the trajectory of the change in posture overlaps less.
- 1 Endoscopy support device 2 Display device 3 Endoscope scope 11 Processor 12 Memory 13 Interface 21 Depth estimation unit 22 Camera posture estimation unit 23 Three-dimensional reconstruction unit 24 Operation direction estimation unit 25 Lesion detection unit 26 Display image generation unit 100 Endoscopy system
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Signal Processing (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Robotics (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Endoscopes (AREA)
Abstract
Description
An image acquisition means for acquiring a captured image at the time of withdrawal of the endoscope;
a posture change estimation means for estimating a relative change in posture of the endoscopic camera from the captured image;
a distance estimation means for estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
an intestinal direction estimation means for estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
a calculation means for calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
an output means for outputting a display image including the direction in which the endoscopic camera should be directed to a display device
are provided.
A captured image at the time of withdrawal of the endoscope is acquired,
a relative change in posture of the endoscopic camera is estimated from the captured image,
the distance between the surface of the large intestine and the endoscopic camera is estimated from the captured image,
the intestinal tract direction of the large intestine is estimated based on the change in posture and the distance,
the direction in which the endoscopic camera should be directed is calculated based on the intestinal tract direction and the relative posture of the endoscopic camera, and
a display image including the direction in which the endoscopic camera should be directed is output to a display device.
A captured image at the time of withdrawal of the endoscope is acquired,
a relative change in posture of the endoscopic camera is estimated from the captured image,
the distance between the surface of the large intestine and the endoscopic camera is estimated from the captured image,
the intestinal tract direction of the large intestine is estimated based on the change in posture and the distance,
the direction in which the endoscopic camera should be directed is calculated based on the intestinal tract direction and the relative posture of the endoscopic camera, and
a program that causes a computer to execute a process of outputting a display image including the direction in which the endoscopic camera should be directed to a display device is recorded.
<First Embodiment>
[System Configuration]
FIG. 1 shows the schematic configuration of an endoscopy system 100. During an examination (including treatment) using an endoscope, the endoscopy system 100 estimates the intestinal tract direction and the direction of the endoscopic camera. When the endoscopic camera is not facing the intestinal tract direction, the endoscopy system 100 presents the direction in which the endoscopic camera should be turned so that it faces the intestinal tract. By turning the endoscopic camera toward the intestinal tract in accordance with the presentation by the endoscopy system 100, the doctor can observe the entire intestinal tract. This reduces the area that cannot be observed.
FIG. 2 shows the hardware configuration of the endoscopy support device 1. The endoscopy support device 1 mainly includes a processor 11, a memory 12, an interface 13, an input unit 14, a light source unit 15, a sound output unit 16, and a database (hereinafter referred to as "DB") 17. These elements are connected via a data bus 19.
FIG. 3 is a block diagram showing the functional configuration of the endoscopy support device 1. Functionally, the endoscopy support device 1 includes the interface 13, a depth estimation unit 21, a camera posture estimation unit 22, a three-dimensional reconstruction unit 23, an operation direction estimation unit 24, a lesion detection unit 25, and a display image generation unit 26.
Next, display examples by the display device 2 will be described.
Next, the display processing that produces the displays described above will be explained. FIG. 9 is a flowchart of the processing by the endoscopy support device 1. This processing is realized by the processor 11 shown in FIG. 2 executing a program prepared in advance and operating as each element shown in FIG. 3. This processing is executed during an examination using the endoscope, that is, while the endoscope scope 3 is being withdrawn.
FIG. 10 is a block diagram showing the functional configuration of an endoscopy support device according to the second embodiment. The endoscopy support device 70 includes an image acquisition means 71, a posture change estimation means 72, a distance estimation means 73, an intestinal direction estimation means 74, a calculation means 75, and an output means 76.
An endoscopy support device comprising:
an image acquisition means for acquiring a captured image at the time of withdrawal of an endoscope;
a posture change estimation means for estimating a relative change in posture of the endoscopic camera from the captured image;
a distance estimation means for estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
an intestinal direction estimation means for estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
a calculation means for calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
an output means for outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
The endoscopy support device according to supplementary note 1, wherein the direction in which the endoscopic camera should be directed is the intestinal tract direction.
The endoscopy support device according to supplementary note 1, wherein the posture change estimation means estimates the change in posture using a machine learning model trained in advance to estimate depth and camera posture changes from endoscopic images.
The directions in which the endoscopic camera should be directed are the intestinal tract direction and the lesion direction, and
the endoscopy support device according to supplementary note 1, wherein the output means outputs a display image that displays the intestinal tract direction and the lesion direction in a distinguishable manner.
The endoscopy support device according to supplementary note 1, wherein the intestinal direction estimation means creates a model of the intestinal tract of the large intestine based on the change in posture and the distance, and estimates the intestinal tract direction based on the model of the intestinal tract.
The endoscopy support device according to supplementary note 5, wherein the output means outputs a display image in which the trajectory of the change in posture is superimposed on the model of the intestinal tract.
The endoscopy support device according to supplementary note 6, wherein the output means outputs a display image of the model of the intestinal tract viewed from a direction in which the trajectory of the change in posture overlaps less.
The endoscopy support device according to supplementary note 1, wherein the output means outputs a display image in which the trajectory of the change in posture and the direction in which the endoscopic camera should be directed are superimposed on the captured image.
An endoscopy support method comprising:
acquiring a captured image at the time of withdrawal of an endoscope;
estimating a relative change in posture of the endoscopic camera from the captured image;
estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
A recording medium recording a program that causes a computer to execute processing of:
acquiring a captured image at the time of withdrawal of an endoscope;
estimating a relative change in posture of the endoscopic camera from the captured image;
estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
2 Display device
3 Endoscope scope
11 Processor
12 Memory
13 Interface
21 Depth estimation unit
22 Camera posture estimation unit
23 Three-dimensional reconstruction unit
24 Operation direction estimation unit
25 Lesion detection unit
26 Display image generation unit
100 Endoscopy system
Claims (10)
- An image acquisition means for acquiring a captured image at the time of withdrawal of an endoscope;
a posture change estimation means for estimating a relative change in posture of an endoscopic camera from the captured image;
a distance estimation means for estimating a distance between a surface of the large intestine and the endoscopic camera from the captured image;
an intestinal direction estimation means for estimating an intestinal tract direction of the large intestine based on the change in posture and the distance;
a calculation means for calculating a direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
an output means for outputting a display image including the direction in which the endoscopic camera should be directed to a display device;
an endoscopy support device comprising the above. - The endoscopy support device according to claim 1, wherein the direction in which the endoscopic camera should be directed is the intestinal tract direction.
- The endoscopy support device according to claim 1, wherein the posture change estimation means estimates the change in posture using a machine learning model trained in advance to estimate depth and camera posture changes from endoscopic images.
- The directions in which the endoscopic camera should be directed are the intestinal tract direction and the lesion direction, and
the endoscopy support device according to claim 1, wherein the output means outputs a display image that displays the intestinal tract direction and the lesion direction in a distinguishable manner. - The endoscopy support device according to claim 1, wherein the intestinal direction estimation means creates a model of the intestinal tract of the large intestine based on the change in posture and the distance, and estimates the intestinal tract direction based on the model of the intestinal tract.
- The endoscopy support device according to claim 5, wherein the output means outputs a display image in which the trajectory of the change in posture is superimposed on the model of the intestinal tract.
- The endoscopy support device according to claim 6, wherein the output means outputs a display image of the model of the intestinal tract viewed from a direction in which the trajectory of the change in posture overlaps less.
- The endoscopy support device according to claim 1, wherein the output means outputs a display image in which the trajectory of the change in posture and the direction in which the endoscopic camera should be directed are superimposed on the captured image.
- An endoscopy support method comprising:
acquiring a captured image at the time of withdrawal of an endoscope;
estimating a relative change in posture of the endoscopic camera from the captured image;
estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
outputting a display image including the direction in which the endoscopic camera should be directed to a display device. - A recording medium recording a program that causes a computer to execute processing of:
acquiring a captured image at the time of withdrawal of an endoscope;
estimating a relative change in posture of the endoscopic camera from the captured image;
estimating the distance between the surface of the large intestine and the endoscopic camera from the captured image;
estimating the intestinal tract direction of the large intestine based on the change in posture and the distance;
calculating the direction in which the endoscopic camera should be directed based on the intestinal tract direction and the relative posture of the endoscopic camera; and
outputting a display image including the direction in which the endoscopic camera should be directed to a display device.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/555,166 US20250090236A1 (en) | 2022-08-01 | 2023-07-31 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
| JP2024539149A JP7768398B2 (ja) | 2022-08-01 | 2023-07-31 | Endoscopic examination support device, endoscopic examination support method, and program |
| US18/517,105 US20240081614A1 (en) | 2022-08-01 | 2023-11-22 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
| US18/519,453 US20240122444A1 (en) | 2022-08-01 | 2023-11-27 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/029450 WO2024028934A1 (ja) | 2022-08-01 | 2022-08-01 | Endoscopic examination support device, endoscopic examination support method, and recording medium |
| JPPCT/JP2022/029450 | 2022-08-01 |
Related Child Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/555,166 A-371-Of-International US20250090236A1 (en) | 2022-08-01 | 2023-07-31 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
| US18/517,105 Continuation US20240081614A1 (en) | 2022-08-01 | 2023-11-22 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
| US18/519,453 Continuation US20240122444A1 (en) | 2022-08-01 | 2023-11-27 | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024029502A1 true WO2024029502A1 (ja) | 2024-02-08 |
Family
ID=89848672
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/029450 Ceased WO2024028934A1 (ja) | 2022-08-01 | 2022-08-01 | Endoscopic examination support device, endoscopic examination support method, and recording medium |
| PCT/JP2023/028001 Ceased WO2024029502A1 (ja) | 2022-08-01 | 2023-07-31 | Endoscopic examination support device, endoscopic examination support method, and recording medium |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/029450 Ceased WO2024028934A1 (ja) | 2022-08-01 | 2022-08-01 | Endoscopic examination support device, endoscopic examination support method, and recording medium |
Country Status (3)
| Country | Link |
|---|---|
| US (3) | US20250090236A1 (ja) |
| JP (1) | JP7768398B2 (ja) |
| WO (2) | WO2024028934A1 (ja) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003093328A (ja) * | 2001-09-25 | 2003-04-02 | Olympus Optical Co Ltd | Endoscope insertion direction detection method and endoscope insertion direction detection device |
| US8795157B1 (en) * | 2006-10-10 | 2014-08-05 | Visionsense Ltd. | Method and system for navigating within a colon |
| WO2015049962A1 (ja) * | 2013-10-02 | 2015-04-09 | Olympus Medical Systems Corp. | Endoscope system |
| JP2018057799A (ja) * | 2016-09-29 | 2018-04-12 | Fujifilm Corp | Endoscope system and method for driving endoscope system |
| JP2019072259A (ja) * | 2017-10-17 | 2019-05-16 | Chiba University | Endoscope image processing program, endoscope system, and endoscope image processing method |
| WO2019207740A1 (ja) * | 2018-04-26 | 2019-10-31 | Olympus Corp | Movement support system and movement support method |
| US20210161604A1 (en) * | 2018-07-17 | 2021-06-03 | Bnaiahu Levin | Systems and methods of navigation for robotic colonoscopy |
- 2022
- 2022-08-01 WO PCT/JP2022/029450 patent/WO2024028934A1/ja not_active Ceased
- 2023
- 2023-07-31 US US18/555,166 patent/US20250090236A1/en active Pending
- 2023-07-31 WO PCT/JP2023/028001 patent/WO2024029502A1/ja not_active Ceased
- 2023-07-31 JP JP2024539149A patent/JP7768398B2/ja active Active
- 2023-11-22 US US18/517,105 patent/US20240081614A1/en active Pending
- 2023-11-27 US US18/519,453 patent/US20240122444A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| US20240081614A1 (en) | 2024-03-14 |
| US20250090236A1 (en) | 2025-03-20 |
| JP7768398B2 (ja) | 2025-11-12 |
| US20240122444A1 (en) | 2024-04-18 |
| JPWO2024029502A1 (ja) | 2024-02-08 |
| WO2024028934A1 (ja) | 2024-02-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20110032347A1 (en) | Endoscopy system with motion sensors | |
| JP6254053B2 (ja) | Endoscopic image diagnosis support device, system and program, and operating method of endoscopic image diagnosis support device | |
| JP7385731B2 (ja) | Endoscope system, operating method of image processing device, and endoscope | |
| US12299922B2 (en) | Luminal structure calculation apparatus, creation method for luminal structure information, and non-transitory recording medium recording luminal structure information creation program | |
| JP7189355B2 (ja) | Computer program, endoscope processor, and information processing method | |
| CN114980793A (zh) | Endoscopy support device, operating method of endoscopy support device, and program | |
| US12433478B2 (en) | Processing device, endoscope system, and method for processing captured image | |
| US20250281022A1 (en) | Endoscopy support device, endoscopy support method, and recording medium | |
| CN117255642A (zh) | Image processing device, endoscope device, and image processing method | |
| US20240057847A1 (en) | Endoscope system, lumen structure calculation system, and method for creating lumen structure information | |
| JP7768398B2 (ja) | Endoscopic examination support device, endoscopic examination support method, and program | |
| KR20200132174A (ko) | Augmented reality colonoscopy system and monitoring method using the same | |
| WO2024195100A1 (ja) | Endoscopic examination support device, endoscopic examination support method, and recording medium | |
| US20250089986A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| US20250089987A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| WO2025104800A1 (ja) | Endoscopic examination support device, endoscopic examination support method, and recording medium | |
| KR102875761B1 (ko) | Endoscope device and control method for acquiring upper gastrointestinal images | |
| US20250078348A1 (en) | Endoscopic examination support apparatus, endoscopic examination support method, and recording medium | |
| WO2025004206A1 (ja) | Endoscopic examination support device, endoscopic examination support method, and recording medium | |
| KR20250130110A (ko) | Endoscope device and control method for acquiring lower gastrointestinal images | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 18555166 Country of ref document: US |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23850056 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024539149 Country of ref document: JP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWP | Wipo information: published in national office |
Ref document number: 18555166 Country of ref document: US |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23850056 Country of ref document: EP Kind code of ref document: A1 |