WO2018043205A1 - Medical image processing device, medical image processing method, and program - Google Patents
Medical image processing device, medical image processing method, and program
- Publication number: WO2018043205A1 (PCT application PCT/JP2017/029919)
- Authority: WIPO (PCT)
- Prior art keywords: image, sub, camera, main, unit
- Prior art date
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
Definitions
- The present technology relates to a medical image processing apparatus, a medical image processing method, and a program.
- Patent Document 1 discloses the positional relationship between an endoscope, an instrument, and a patient during endoscopic surgery. In such surgery, an image of the inside of the body cavity is obtained by inserting a tube called a sheath SH into a through-hole created in the body wall in advance to create a path, and inserting a camera (generally called an endoscope) through that path.
- This type of endoscopic surgery improves the patient's QoL (quality of life): because the wounds on the body are small, the patient recovers quickly, and the burden on the patient is reduced by a shorter hospital stay.
- In such surgery, the treatment tool is inserted through a small hole, the endoscope is inserted through a hole provided at another location, and the treatment is performed while referring to the image of the inside of the body cavity; this demands a high level of surgical skill.
- The present technology has been made in view of such a situation and makes it possible to present a plurality of pieces of information in a form that is easy for the user to view.
- A first medical image processing apparatus uses one of the images captured by a plurality of imaging units as the main image and another image as the sub image, and includes a conversion unit that converts the sub image into an image for superimposition and a superimposition unit that superimposes the converted image at a predetermined position in the main image.
- A second medical image processing apparatus uses one of the images captured by a plurality of imaging units as the main image and another image as the sub image, and includes a superimposition unit that superimposes the sub image on the main image; the imaging unit that captures the sub image is displayed on the main image, and the sub image is displayed in the vicinity of that imaging unit.
- A first medical image processing method includes using one of the images captured by a plurality of imaging units as the main image and another image as the sub image, converting the sub image into an image for superimposition, and superimposing the converted image at a predetermined position in the main image.
- A second medical image processing method includes using one of the images captured by a plurality of imaging units as the main image and another image as the sub image, superimposing the sub image on the main image, displaying on the main image the imaging unit that captures the sub image, and displaying the sub image in the vicinity of that imaging unit.
- A first program causes a computer to execute processing that includes using one of the images captured by a plurality of imaging units as the main image and another image as the sub image, converting the sub image into an image for superimposition, and superimposing the converted image at a predetermined position in the main image.
- A second program causes a computer to execute processing that includes using one of the images captured by a plurality of imaging units as the main image and another image as the sub image, superimposing the sub image on the main image, displaying on the main image the imaging unit that captures the sub image, and displaying the sub image in the vicinity of that imaging unit.
- In the first medical image processing apparatus, medical image processing method, and program, one of the images captured by the plurality of imaging units is set as the main image, another image is set as the sub image, the sub image is converted into an image for superimposition, and the converted image is superimposed at a predetermined position in the main image.
- In the second medical image processing apparatus, medical image processing method, and program, one of the images captured by the plurality of imaging units is set as the main image, another image is set as the sub image, the sub image is superimposed on the main image, the imaging unit that captures the sub image is displayed on the main image, and the sub image is displayed in the vicinity of that imaging unit.
- the medical image processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- the program can be provided by being transmitted through a transmission medium or by being recorded on a recording medium.
- According to the present technology, a plurality of pieces of information can be provided in a form that is easy for the user to view.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- Here, an endoscopic surgery system will be described as an example, but the present technology can also be applied to a surgical operation system, a microscopic surgery system, and the like.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system 10 to which the technology according to the present disclosure can be applied.
- An endoscopic surgery system 10 includes an endoscope 20, other surgical tools 30, a support arm device 40 that supports the endoscope 20, and a cart 50 on which various devices for endoscopic surgery are mounted.
- Trocars 37a to 37d are punctured into the abdominal wall, and the lens barrel 21 of the endoscope 20 and the other surgical tools 30 are inserted into the body cavity of the patient 75 through the trocars 37a to 37d.
- an insufflation tube 31, an energy treatment tool 33, and forceps 35 are inserted into the body cavity of a patient 75 as other surgical tools 30.
- the energy treatment tool 33 is a treatment tool that performs tissue incision and peeling, blood vessel sealing, or the like by high-frequency current or ultrasonic vibration.
- the illustrated surgical tool 30 is merely an example, and as the surgical tool 30, various surgical tools generally used in endoscopic surgery, such as a lever and a retractor, may be used.
- the image of the surgical site in the body cavity of the patient 75 photographed by the endoscope 20 is displayed on the display device 53.
- the surgeon 71 performs a treatment such as excision of the affected area using the energy treatment tool 33 and the forceps 35 while viewing the image of the surgical site displayed on the display device 53 in real time.
- The insufflation tube 31, the energy treatment tool 33, and the forceps 35 are supported by the surgeon 71 or an assistant during the operation.
- the support arm device 40 includes an arm portion 43 extending from the base portion 41.
- The arm portion 43 includes joint portions 45a, 45b, 45c and links 47a, 47b, and is driven under control from the arm control device 57.
- The endoscope 20 is supported by the arm portion 43, and its position and posture are controlled. Thereby, stable fixation of the position of the endoscope 20 can be realized.
- the endoscope 20 includes a lens barrel 21 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 75, and a camera head 23 connected to the proximal end of the lens barrel 21.
- In the illustrated example, the endoscope 20 is configured as a so-called rigid scope having a rigid lens barrel 21, but the endoscope 20 may also be configured as a so-called flexible scope having a flexible lens barrel 21.
- An opening into which an objective lens is fitted is provided at the tip of the lens barrel 21.
- A light source device 55 is connected to the endoscope 20, and light generated by the light source device 55 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 21, and is irradiated toward the observation target in the body cavity of the patient 75 through the objective lens.
- The endoscope 20 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 23, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 51.
- the camera head 23 has a function of adjusting the magnification and the focal length by appropriately driving the optical system.
- the camera head 23 may be provided with a plurality of imaging elements.
- a plurality of relay optical systems are provided inside the lens barrel 21 in order to guide observation light to each of the plurality of imaging elements.
- the CCU 51 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 20 and the display device 53. Specifically, the CCU 51 performs various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example, on the image signal received from the camera head 23. The CCU 51 provides the display device 53 with the image signal subjected to the image processing. Further, the CCU 51 transmits a control signal to the camera head 23 to control its driving.
- the control signal can include information regarding imaging conditions such as magnification and focal length.
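- As a concrete illustration of the development (demosaic) processing performed by the CCU 51, the following Python sketch converts a Bayer-pattern RAW frame into a BGR image with OpenCV. The Bayer layout (`COLOR_BayerBG2BGR`), the 8-bit depth, and the gray-world white balance are assumptions for illustration; the patent does not specify the sensor format or the exact algorithm.

```python
import cv2
import numpy as np

def develop_raw_frame(raw: np.ndarray) -> np.ndarray:
    """Minimal development (demosaic) step of the kind performed by the CCU 51.

    Assumes an 8-bit Bayer-BG mosaic and a gray-world white balance;
    the actual sensor layout and processing are not specified in the patent.
    """
    bgr = cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)       # demosaic
    means = bgr.reshape(-1, 3).mean(axis=0)              # per-channel means
    gains = means.mean() / np.maximum(means, 1e-6)       # gray-world gains
    return np.clip(bgr * gains, 0, 255).astype(np.uint8)

# Example with a synthetic 4K-sized RAW frame.
raw = np.random.randint(0, 256, (2160, 3840), dtype=np.uint8)
image = develop_raw_frame(raw)
```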
- the display device 53 displays an image based on an image signal subjected to image processing by the CCU 51 under the control of the CCU 51.
- In a case where the endoscope 20 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels) and/or 3D display, a display device 53 capable of the corresponding high-resolution display and/or 3D display can be used.
- a more immersive feeling can be obtained by using a display device 53 having a size of 55 inches or more.
- a plurality of display devices 53 having different resolutions and sizes may be provided depending on the application.
- the light source device 55 is composed of a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 20 when photographing a surgical site.
- the arm control device 57 is configured by a processor such as a CPU, for example, and operates according to a predetermined program to control driving of the arm portion 43 of the support arm device 40 according to a predetermined control method.
- the input device 59 is an input interface for the endoscopic surgery system 10.
- the user can input various information and instructions to the endoscopic surgery system 10 via the input device 59.
- the user inputs various information related to the operation, such as the patient's physical information and information about the surgical technique, via the input device 59.
- For example, the user inputs, via the input device 59, an instruction to drive the arm portion 43, an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 20, an instruction to drive the energy treatment tool 33, and the like.
- the type of the input device 59 is not limited, and the input device 59 may be various known input devices.
- the input device 59 for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 69 and / or a lever can be applied.
- the touch panel may be provided on the display surface of the display device 53.
- The input device 59 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head-Mounted Display); in that case, various inputs are performed according to the user's gestures and line of sight detected by these devices.
- the input device 59 includes a camera capable of detecting the user's movement, and various inputs are performed according to the user's gesture and line of sight detected from the video captured by the camera.
- the input device 59 includes a microphone capable of collecting a user's voice, and various inputs are performed by voice through the microphone.
- Since the input device 59 is configured to accept various kinds of information without contact, a user belonging to the clean area (for example, the surgeon 71) can operate devices belonging to the unclean area without contact.
- In addition, since the user can operate the devices without taking his or her hands off the surgical tools being held, convenience for the user is improved.
- the treatment instrument control device 61 controls the driving of the energy treatment instrument 33 for tissue cauterization, incision, or blood vessel sealing.
- The pneumoperitoneum device 63 introduces gas into the body cavity via the insufflation tube 31.
- the recorder 65 is a device that can record various types of information related to surgery.
- the printer 67 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- the support arm device 40 includes a base portion 41 that is a base and an arm portion 43 that extends from the base portion 41.
- The arm portion 43 is composed of a plurality of joint portions 45a, 45b, 45c and a plurality of links 47a, 47b connected by the joint portion 45b.
- In FIG. 1, the structure of the arm portion 43 is shown in a simplified form.
- the shape, number and arrangement of the joint portions 45a to 45c and the links 47a and 47b, the direction of the rotation axis of the joint portions 45a to 45c, and the like are appropriately set so that the arm portion 43 has a desired degree of freedom.
- The arm portion 43 can preferably be configured to have six or more degrees of freedom.
- Thereby, the endoscope 20 can be moved freely within the movable range of the arm portion 43, so the lens barrel 21 of the endoscope 20 can be inserted into the body cavity of the patient 75 from a desired direction.
- the joints 45a to 45c are provided with actuators, and the joints 45a to 45c are configured to be rotatable around a predetermined rotation axis by driving the actuators.
- By controlling the driving of the actuators with the arm control device 57, the rotation angle of each of the joint portions 45a to 45c is controlled, and the driving of the arm portion 43 is controlled.
- the arm control device 57 can control the driving of the arm unit 43 by various known control methods such as force control or position control.
- The arm control device 57 appropriately controls the driving of the arm portion 43 in accordance with operation input, whereby the position and posture of the endoscope 20 are controlled. By this control, the endoscope 20 at the distal end of the arm portion 43 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the position after the movement.
- Note that the arm portion 43 may be operated by a so-called master-slave method. In this case, the arm portion 43 can be remotely operated by the user via the input device 59 installed at a location away from the operating room.
- When force control is applied, the arm control device 57 may perform so-called power assist control, in which the actuators of the joint portions 45a to 45c are driven in response to an external force from the user so that the arm portion 43 moves smoothly following that force. Thereby, when the user moves the arm portion 43 while directly touching it, the arm portion 43 can be moved with a relatively light force. Accordingly, the endoscope 20 can be moved more intuitively and with a simpler operation, improving convenience for the user.
- Generally, in endoscopic surgery, the endoscope 20 has been supported by a doctor called a scopist.
- In contrast, by using the support arm device 40, the position of the endoscope 20 can be fixed more reliably without relying on human hands, so an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
- The arm control device 57 does not necessarily have to be provided in the cart 50, and it is not necessarily a single device. For example, an arm control device 57 may be provided in each of the joint portions 45a to 45c of the arm portion 43 of the support arm device 40, and the driving of the arm portion 43 may be controlled by a plurality of arm control devices 57 cooperating with each other.
- the light source device 55 supplies irradiation light to the endoscope 20 when photographing a surgical site.
- The light source device 55 includes a white light source composed of, for example, an LED, a laser light source, or a combination thereof.
- When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so adjustments such as the white balance of the captured image can be made in the light source device 55.
- Further, the driving of the light source device 55 may be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the image sensor of the camera head 23 in synchronization with the timing of the change in light intensity, acquiring images in a time-division manner, and synthesizing the images, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
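- The following sketch illustrates one way the time-division images could be synthesized into a high-dynamic-range result. Mertens exposure fusion is used here as a stand-in, since the patent does not name the synthesis algorithm; the synthetic frames stand in for frames captured at different light intensities.

```python
import cv2
import numpy as np

def fuse_time_division_exposures(frames):
    """Fuse frames captured at alternating light intensities into one image
    without blocked-up shadows or blown-out highlights.

    Mertens exposure fusion is a stand-in for the unspecified synthesis.
    """
    fused = cv2.createMergeMertens().process(frames)     # float32 in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)

# Example: a dark and a bright capture of the same synthetic scene.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
dark = (scene * 0.25).astype(np.uint8)
bright = np.clip(scene.astype(np.float32) * 1.8, 0, 255).astype(np.uint8)
hdr = fuse_time_division_exposures([dark, bright])
```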
- the light source device 55 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is imaged with high contrast by irradiating light in a band narrower than the irradiation light used during normal observation (that is, white light), utilizing the wavelength dependence of light absorption in body tissue.
- Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiating excitation light.
- In fluorescence observation, the body tissue is irradiated with excitation light and the fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally administered to the body tissue and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescent image.
- the light source device 55 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
- FIG. 2 is a block diagram illustrating an example of functional configurations of the camera head 23 and the CCU 51 illustrated in FIG.
- the camera head 23 has a lens unit 25, an imaging unit 27, a drive unit 29, a communication unit 26, and a camera head control unit 28 as its functions.
- the CCU 51 includes a communication unit 81, an image processing unit 83, and a control unit 85 as its functions.
- the camera head 23 and the CCU 51 are connected to each other via a transmission cable 91 so that they can communicate with each other.
- the lens unit 25 is an optical system provided at a connection portion with the lens barrel 21. Observation light taken from the tip of the lens barrel 21 is guided to the camera head 23 and enters the lens unit 25.
- the lens unit 25 is configured by combining a plurality of lenses including a zoom lens and a focus lens. The optical characteristics of the lens unit 25 are adjusted so that the observation light is condensed on the light receiving surface of the image pickup device of the image pickup unit 27.
- the zoom lens and the focus lens are configured such that their positions on the optical axis are movable in order to adjust the magnification and focus of the captured image.
- the image pickup unit 27 is configured by an image pickup device, and is arranged at the rear stage of the lens unit 25.
- the observation light that has passed through the lens unit 25 is collected on the light receiving surface of the image sensor, and an image signal corresponding to the observation image is generated by photoelectric conversion.
- the image signal generated by the imaging unit 27 is provided to the communication unit 26.
- As the imaging element constituting the imaging unit 27, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor is used.
- As the imaging element, for example, an element capable of capturing a high-resolution image of 4K or more may be used.
- In addition, the imaging unit 27 may be configured to have a pair of image sensors for acquiring right-eye and left-eye image signals corresponding to 3D display. By performing 3D display, the surgeon 71 can more accurately grasp the depth of the living tissue in the surgical site.
- When the imaging unit 27 is configured as a multi-plate type, a plurality of lens units 25 are provided corresponding to the respective imaging elements.
- the imaging unit 27 is not necessarily provided in the camera head 23.
- the imaging unit 27 may be provided in the barrel 21 immediately after the objective lens.
- the drive unit 29 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 25 by a predetermined distance along the optical axis under the control of the camera head control unit 28. Thereby, the magnification and the focus of the image captured by the imaging unit 27 can be appropriately adjusted.
- the communication unit 26 includes a communication device for transmitting and receiving various types of information to and from the CCU 51.
- the communication unit 26 transmits the image signal obtained from the imaging unit 27 as RAW data to the CCU 51 via the transmission cable 91.
- the image signal is preferably transmitted by optical communication.
- This is because the surgeon 71 performs the surgery while observing the state of the affected area through the captured images, and for safer and more reliable surgery it is required that the moving image of the surgical site be displayed in as close to real time as possible.
- the communication unit 26 is provided with a photoelectric conversion module that converts an electrical signal into an optical signal.
- the image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 51 via the transmission cable 91.
- the communication unit 26 receives a control signal for controlling the driving of the camera head 23 from the CCU 51.
- The control signal includes information about imaging conditions, such as information designating the frame rate of the captured image, information designating the exposure value at the time of imaging, and/or information designating the magnification and focus of the captured image.
- the communication unit 26 provides the received control signal to the camera head control unit 28.
- control signal from the CCU 51 may also be transmitted by optical communication.
- the communication unit 26 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
- the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 28.
- the imaging conditions such as the frame rate, exposure value, magnification, and focus are automatically set by the control unit 85 of the CCU 51 based on the acquired image signal. That is, a so-called AE (Auto-Exposure) function, AF (Auto-Focus) function, and AWB (Auto-White Balance) function are mounted on the endoscope 20.
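- As an illustration of the detection processing that drives the AE, AF, and AWB functions, the following sketch computes simple statistics from a frame. The mid-gray target of 118, the Laplacian-variance focus score, and the gray-world white-balance gains are illustrative choices, not values from the patent.

```python
import cv2
import numpy as np

def detection_for_ae_af_awb(bgr: np.ndarray):
    """Compute simple per-frame statistics for AE, AF, and AWB control."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    ae_gain = 118.0 / max(float(gray.mean()), 1e-6)      # push toward mid-gray
    af_score = cv2.Laplacian(gray, cv2.CV_64F).var()     # sharpness measure
    means = bgr.reshape(-1, 3).mean(axis=0)
    awb_gains = means.mean() / np.maximum(means, 1e-6)   # gray-world gains
    return ae_gain, af_score, awb_gains
```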
- The camera head control unit 28 controls the driving of the camera head 23 based on the control signal from the CCU 51 received via the communication unit 26. For example, the camera head control unit 28 controls the driving of the imaging element of the imaging unit 27 based on information designating the frame rate of the captured image and/or information designating the exposure at the time of imaging. Also, for example, the camera head control unit 28 appropriately moves the zoom lens and the focus lens of the lens unit 25 via the drive unit 29 based on information designating the magnification and focus of the captured image.
- the camera head control unit 28 may further have a function of storing information for identifying the lens barrel 21 and the camera head 23.
- The camera head 23 can be made resistant to autoclave sterilization by arranging the lens unit 25, the imaging unit 27, and the like in a sealed structure with high airtightness and waterproofness.
- the communication unit 81 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 23.
- the communication unit 81 receives an image signal transmitted from the camera head 23 via the transmission cable 91.
- the image signal can be suitably transmitted by optical communication.
- the communication unit 81 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
- the communication unit 81 provides the image processing unit 83 with the image signal converted into an electrical signal.
- the communication unit 81 transmits a control signal for controlling the driving of the camera head 23 to the camera head 23.
- the control signal may also be transmitted by optical communication.
- the image processing unit 83 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 23.
- The image processing includes, for example, development processing, image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing, etc.), and/or enlargement processing (electronic zoom processing).
- the image processing unit 83 performs detection processing on the image signal for performing AE, AF, and AWB.
- the image processing unit 83 is configured by a processor such as a CPU and a GPU, and the above-described image processing and detection processing can be performed by the processor operating according to a predetermined program.
- When the image processing unit 83 is configured by a plurality of GPUs, the image processing unit 83 appropriately divides the information related to the image signal and performs image processing in parallel on the plurality of GPUs.
- The control unit 85 performs various controls related to imaging of the surgical site by the endoscope 20 and display of the captured image. For example, the control unit 85 generates a control signal for controlling the driving of the camera head 23. At this time, when imaging conditions are input by the user, the control unit 85 generates the control signal based on the user's input. Alternatively, when the endoscope 20 is equipped with the AE function, the AF function, and the AWB function, the control unit 85 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 83, and generates the control signal.
- The control unit 85 causes the display device 53 to display the image of the surgical site based on the image signal subjected to image processing by the image processing unit 83. At this time, the control unit 85 recognizes various objects in the surgical site image using various image recognition techniques.
- For example, the control unit 85 can recognize surgical tools such as forceps, specific biological parts, bleeding, mist when the energy treatment tool 33 is used, and the like by detecting the shape, color, and the like of the edges of objects included in the surgical site image.
- When displaying the image of the surgical site on the display device 53, the control unit 85 uses the recognition result to superimpose various types of surgery support information on the image of the surgical site. By superimposing the surgery support information and presenting it to the surgeon 71, the surgery can be performed more safely and reliably.
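- A minimal sketch of the kind of edge- and color-based recognition described above is shown below. The assumption that metallic instruments appear as low-saturation regions with strong edges, and all thresholds, are illustrative and not taken from the patent.

```python
import cv2
import numpy as np

def find_tool_candidates(bgr: np.ndarray):
    """Rough surgical-tool detection from edge and color cues.

    Metallic instruments are assumed to appear as low-saturation (grayish)
    regions with strong edges; thresholds are illustrative only.
    """
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    low_sat = cv2.inRange(hsv, (0, 0, 60), (180, 60, 255))
    edges = cv2.Canny(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 50, 150)
    mask = cv2.bitwise_and(low_sat, cv2.dilate(edges, np.ones((5, 5), np.uint8)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only regions large enough to be a tool tip.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
```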
- the transmission cable 91 connecting the camera head 23 and the CCU 51 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
- communication is performed by wire using the transmission cable 91, but communication between the camera head 23 and the CCU 51 may be performed wirelessly.
- When communication between the two is performed wirelessly, it is not necessary to lay the transmission cable 91 in the operating room, so the situation in which the movement of the medical staff in the operating room is hindered by the transmission cable 91 can be eliminated.
- the endoscopic surgery system 10 has been described here as an example, a system to which the technology according to the present disclosure can be applied is not limited to such an example.
- For example, the technology according to the present disclosure may be applied to a flexible endoscope system for examination or a microscope surgery system.
- FIG. 3 is a diagram illustrating an example of a screen displayed on the display device 53.
- an image 201 captured by the endoscope 20 and an image 202 captured by the sub camera are displayed.
- Here, the endoscope 20 is the main camera, and a camera different from the main camera is the sub camera.
- The present technology is applicable to an apparatus that images an affected area with a plurality of cameras in this way and presents the captured images to the user.
- the main camera and a plurality of sub cameras are inserted into the body cavity of the patient 75, and images from the plurality of cameras are displayed on the display device 53.
- At this time, the display is made easy for the surgeon 71 to view.
- the forceps 231, the forceps 232, and the thread 233 are captured in the image 201 captured by the endoscope 20.
- The image 202 captured by the sub camera is displayed within computer graphics replicating a mirror (hereinafter referred to as the mirror 211).
- the screen example shown in FIG. 3 shows a state where the forceps 231 and the thread 233 are reflected on the mirror 211.
- the forceps 231 is imaged by the sub camera.
- the imaged forceps 231 is displayed in the mirror 211.
- The forceps 231 displayed in the mirror 211 are denoted as forceps 231', with a prime. In the following description as well, objects displayed in the mirror 211 are denoted with a prime.
- In the mirror 211, the thread 233' is also shown. The thread 233 is in a state of being pinched by the forceps 232.
- the image from the sub-camera can be provided as an image that is easy for the user to visually recognize.
- Here, a picture replicating a mirror is used as the example, but it may be a picture replicating an actual tool other than a mirror, such as a dental mirror or a loupe.
- Hereinafter, a tool on which the image of the sub camera is displayed in this way is referred to as a virtual tool.
- An actual tool is displayed as a virtual tool on the screen, and an image from the sub camera is displayed on the virtual tool.
- When the virtual tool is the above-described mirror 211 and the angle of the mirror 211 is changed, the image from the sub camera displayed in the mirror also changes following the angle. For example, an image in which the positional relationship between the forceps 231' and the thread 233' has changed is displayed in the mirror 211.
- When the virtual tool is a loupe and the enlargement ratio is changed, for example by an operation such as moving the loupe closer to or away from the forceps 232, the image displayed in the loupe is enlarged or reduced according to the change. Note that the image displayed in the loupe in this case can be a part of the image captured by the main camera.
- That is, when the virtual tool is a loupe, the loupe is superimposed on the image captured by the main camera, and an image obtained by enlarging the main image within the loupe is displayed in the loupe.
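- The following sketch illustrates the loupe behavior just described: a part of the main image is electronically zoomed and drawn inside a circular loupe superimposed on the main image. The radius, zoom factor, and rim color are illustrative defaults, and the loupe is assumed to lie fully inside the frame.

```python
import cv2
import numpy as np

def draw_loupe(main_img, center, radius=80, zoom=2.0):
    """Superimpose a virtual loupe that magnifies part of the main image."""
    out = main_img.copy()
    cx, cy = center
    r = int(radius / zoom)                       # half-size of source crop
    crop = main_img[cy - r:cy + r, cx - r:cx + r]
    mag = cv2.resize(crop, (2 * radius, 2 * radius))
    mask = np.zeros((2 * radius, 2 * radius), dtype=np.uint8)
    cv2.circle(mask, (radius, radius), radius, 255, -1)
    roi = out[cy - radius:cy + radius, cx - radius:cx + radius]
    roi[mask > 0] = mag[mask > 0]                # paste the magnified disk
    cv2.circle(out, (cx, cy), radius, (220, 220, 220), 3)  # loupe rim
    return out

# Example: magnify the area around (400, 300) in a synthetic frame.
frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
screen = draw_loupe(frame, center=(400, 300))
```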
- In this way, optical tools that exist in the real world, such as dental mirrors and loupes, are displayed as virtual tools, and PinP (Picture-In-Picture) is performed by superimposing the image from the sub camera (the sub image) on them.
- Since the command for operating a virtual tool can be the same as the command for operating the corresponding tool in the real world, it is easy to convey to a person, a third party (a person other than the surgeon 71) can easily be given an instruction to operate it, and an operation performed by the third party can easily achieve a sufficient result.
- If the third party has used the tool in the real world, it is easy for him or her to operate the virtual tool, and the third party can easily perform the virtual tool operation desired by the surgeon 71. For example, the third party may even be able to act before the surgeon 71 gives an instruction, so operations such as surgery can proceed more smoothly.
- Changes in the display on a virtual tool can be handled by image processing of the sub camera image. For example, when the virtual tool is the mirror 211 and the angle of the mirror 211 is changed, the image 202 displayed in the mirror 211 also changes, but this change can be handled by image processing.
- By using image processing, it is possible to respond to user operations without moving the sub camera. By not moving the sub camera, contact between the sub camera and organs can be prevented, and safety can be improved.
- FIG. 4 is a diagram for explaining the attachment position of the sub camera in the endoscope.
- Into the body cavity of the patient 75, the endoscope 20a, the forceps 35, and the endoscope 20b are inserted.
- The endoscope 20a is the main camera, and the endoscope 20b is the sub camera. In this way, a plurality of endoscopes 20 can be inserted into the body cavity, with one endoscope 20 used as the main camera and another endoscope 20 used as the sub camera.
- The endoscope 20b serving as the sub camera may have a smaller diameter than the endoscope 20a serving as the main camera. Further, the resolution of the sub camera may be lower than that of the main camera; the two need not have the same resolution.
- The main camera and the sub camera may be switched according to an instruction from the surgeon 71. That is, in the case shown in FIG. 4, at a certain point in time the endoscope 20a is the main camera and the endoscope 20b is the sub camera, but a mechanism may be provided so that, when the user gives a switching instruction, the endoscope 20a becomes the sub camera and the endoscope 20b serves as the main camera.
- Although FIG. 4 shows the case where there is one sub camera, a plurality of sub cameras may be inserted into the body cavity.
- FIG. 5 is a diagram for explaining another mounting position of the sub camera in the endoscope.
- Into the body cavity of the patient 75, the endoscope 20, the forceps 35a, and the forceps 35b are inserted.
- In the example shown in FIG. 5, two forceps 35 are inserted into the body cavity, and the sub camera 251 is attached to one of them, the forceps 35b.
- the sub camera 251 is mounted at a position that does not hinder the function of the forceps 35b as a forceps.
- the sub camera 251 can be a camera that has been authenticated as a medical camera.
- an endoscope called a capsule endoscope may be used as the sub camera 251 and attached to the forceps 35b.
- the forceps 35b may be inserted into the body cavity while holding the sub camera 251.
- the sub camera 251 may be detachably attached to the forceps 35b, or may be configured as a part of the forceps 35b.
- the endoscope 20 is used as a main camera, and a camera attached to a surgical instrument other than the endoscope 20 is used as a sub camera.
- Although FIG. 5 shows a case where there is one sub camera 251, a plurality of sub cameras 251 may be attached to a plurality of surgical instruments and inserted into the body cavity.
- The main camera and the sub camera may be switched according to an instruction from the surgeon 71. That is, in the case shown in FIG. 5, at a certain point in time the endoscope 20 is the main camera and the sub camera 251 is the sub camera, but a mechanism may be provided so that, when the user gives a switching instruction, the sub camera 251 functions as the main camera.
- The present technology can also be applied to a medical robot, which has a configuration as shown in FIG. 6, for example.
- the medical robot includes an operation unit 281, a main body 282, and a monitor unit 283.
- the operation unit 281 is a device for operating the main body 282.
- the main body 282 includes, for example, three arms 291 to 293.
- the operation unit 281 is operated by the operator 71a to remotely operate the arms 291 to 293 of the main body 282.
- the surgeon 71a operates the arms 291 to 293 of the main body 282 while looking at the display provided in the operation unit 281.
- Attached to the arms 291 to 293 of the main body 282 are an electric scalpel, an endoscope, forceps, and the like.
- the monitor unit 283 is a monitor that is installed near the main body 282 and monitors the state of the operation.
- the surgeon 71b looks at the monitor unit 283 and supports surgery as necessary.
- the arm 292 is an arm camera having the same function as the above-described endoscope, and functions as a main camera.
- the arms 291 and 293 are forceps, a scalpel, and the like.
- a sub camera 252 and a sub camera 253 are attached to the arm 291 and the arm 293, respectively.
- The sub cameras 252 and 253 are mounted at positions that do not hinder the functions of the arms 291 and 293, for example the functions of the scalpel and the forceps.
- The sub cameras 252 and 253 can be cameras that have been certified as medical cameras; for example, endoscopes called capsule endoscopes can be used as the sub cameras 252 and 253.
- the sub cameras 252 and 253 may be detachably attached to the arms 291 and 293, or may be configured as part of the arms 291 and 293.
- the arm camera of the arm 292 is used as a main camera, and a camera attached to a surgical instrument other than the arm camera is used as a sub camera.
- Although FIG. 7 shows a case where there are two sub cameras, one sub camera 252 (253), or three or more, may be attached to each of a plurality of surgical instruments and inserted into the body cavity.
- The main camera and the sub camera may be switched according to an instruction from the surgeon 71. That is, in the case shown in FIG. 7, at a certain point in time the arm camera of the arm 292 is the main camera and the sub cameras 252 and 253 are the sub cameras, but a mechanism may be provided so that, when so instructed, one of the sub cameras 252 and 253 functions as the main camera.
- FIG. 8 shows the configuration of the image processing unit 83 (FIG. 2) that generates the screen as shown in FIG. 3 and controls the display on the display device 53.
- the case where the endoscope 20a as the main camera and the endoscope 20b as the sub camera are inserted into the body cavity as shown in FIG. 4 will be described as an example.
- the image processing unit 83 includes a virtual tool drawing superimposing unit 411 and an image conversion processing unit 412.
- the virtual tool drawing superimposing unit 411 is supplied with image data from the main camera, in this case, image data from the endoscope 20a.
- the image conversion processing unit 412 is supplied with image data from the sub camera, in this case, image data from the endoscope 20b.
- the position sensor 421 is a sensor that detects the position of the endoscope 20a or the endoscope 20b.
- a sensor using GPS (Global Positioning System) or the like can be used.
- The position sensor 421 is provided to grasp the positional relationship between the endoscope 20a and the endoscope 20b (the main camera and the sub camera), for example how far apart they are and what angle they form. As long as such information can be obtained, any sensor may be used as the position sensor 421.
- The position sensor 421 does not necessarily have to be provided.
- the processing in the image processing unit 83 will be described with reference to FIG.
- the upper left diagram in FIG. 9 shows an image captured by the endoscope 20a that is the main camera.
- the upper right diagram in FIG. 9 shows an image captured by the endoscope 20b as a sub camera.
- the lower part of FIG. 9 shows an image obtained by superimposing an image captured by the endoscope 20b on an image captured by the endoscope 20a.
- the lower diagram of FIG. 9 is the image shown in FIG. 3, and here, the case where the image described with reference to FIG. 3 is generated will be described as an example.
- the forceps 231, the forceps 232, and the thread 233 are imaged.
- image data of the image 201 is supplied to the virtual tool drawing superimposing unit 411.
- the forceps 231, the forceps 232, and the thread 233 are imaged.
- image data of the image 202 is supplied to the image conversion processing unit 412.
- Both the endoscope 20a and the endoscope 20b image the forceps 231, the forceps 232, and the thread 233, but the resulting images differ because the endoscopes are inserted at different positions.
- the endoscope 20a and the endoscope 20b are inserted into the body cavity in a positional relationship having an angle of 90 degrees. Therefore, when the image 201 captured by the endoscope 20a is an image captured from the front, the image 202 captured by the endoscope 20b is an image captured from the side.
- The image conversion processing unit 412 cuts out, enlarges, reduces, and otherwise converts the image 202 from the endoscope 20b serving as the sub camera into an image to be superimposed on the image 201 from the endoscope 20a serving as the main camera. That is, the image conversion processing unit 412 generates from the image 202 the image to be displayed in the mirror 211.
- the image conversion processing unit 412 performs processing using position information from the position sensor 421 as necessary when performing processing such as conversion.
- The virtual tool drawing superimposing unit 411 draws the mirror 211 on the image 201 from the main camera, and generates an image in which the image from the image conversion processing unit 412 is displayed within the drawn mirror 211. The image data generated by the virtual tool drawing superimposing unit 411 is supplied to the display device 53, and an image as shown in the lower diagram of FIG. 9 is displayed on the display device 53.
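- A minimal sketch of the superimposition performed by the virtual tool drawing superimposing unit 411 is shown below: the mirror 211 is drawn as an ellipse on the main image, and the converted sub image is composited inside it. The elliptical mask, the resize-based fitting, and the rim color are illustrative choices, not the patent's exact drawing method.

```python
import cv2
import numpy as np

def superimpose_mirror(main_img, sub_view, center, axes, angle_deg=45.0):
    """Draw the virtual mirror 211 on the main image 201 and composite the
    converted sub-camera image inside it."""
    out = main_img.copy()
    mask = np.zeros(out.shape[:2], dtype=np.uint8)
    cv2.ellipse(mask, center, axes, angle_deg, 0, 360, 255, -1)
    # Fit the converted sub image over the mirror's bounding box.
    x, y, w, h = cv2.boundingRect(cv2.findNonZero(mask))
    fitted = cv2.resize(sub_view, (w, h))
    region = mask[y:y + h, x:x + w] > 0
    out[y:y + h, x:x + w][region] = fitted[region]
    cv2.ellipse(out, center, axes, angle_deg, 0, 360, (230, 230, 230), 2)  # rim
    return out

# Example: composite a converted sub view on the right side of the screen.
main_img = np.zeros((1080, 1920, 3), dtype=np.uint8)
sub_view = np.full((400, 400, 3), 128, dtype=np.uint8)
screen = superimpose_mirror(main_img, sub_view, center=(1500, 540), axes=(160, 320))
```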
- The processing of the image processing unit 83 will be described with reference to FIG. 10, taking the processing during a suturing operation as an example. The processing during other operations is therefore processing suited to each operation.
- The processing of the flowchart shown in FIG. 10 is changed as appropriate according to the operation, such as an incision operation, an affected-area removal operation, or a cleaning operation.
- In step S101, it is determined whether or not a suturing operation is being performed. If it is determined in step S101 that a suturing operation is not being performed, the process proceeds to step S108. As described above, since the processing differs from operation to operation, when it is determined that the operation is not suturing, it may be sequentially determined whether it is another operation, for example an incision operation or an affected-area removal operation.
- The determination as to whether or not the operation is a suturing operation can be made, for example, by recognizing the utterances of the surgeon 71 by voice recognition.
- For example, the determination in step S101 can be made by determining whether or not the surgeon 71 has uttered a keyword related to suturing, such as "start suturing" or "thread and needle".
- Alternatively, it may be determined whether or not the operation is a suturing operation by using the fact that the flow of a surgery is patterned according to the surgical method.
- The process of determining whether or not the current step in the surgical flow is a suturing operation may be performed together with the determination based on the voice recognition described above, and the determination may be made by integrating the results.
- It may also be determined by image recognition whether or not a suturing operation is being performed. For example, since a thread and a needle are used during suturing, it may be determined whether or not a thread or a needle is detected in the image captured by the main camera or the sub camera, and it may be determined that a suturing operation is being performed when a thread or a needle is detected.
- It is also possible to make the determination in step S101 by combining voice recognition and image recognition. Further, the determination in step S101 may be made by another method not illustrated here.
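- As an illustration of the determination in step S101, the following sketch combines a keyword check on the recognized utterance with an image-recognition flag. The keyword list and the OR combination of the two cues are assumptions made for this sketch.

```python
SUTURE_KEYWORDS = ("start suturing", "thread", "needle")  # illustrative list

def is_suturing(transcript: str, thread_or_needle_detected: bool) -> bool:
    """Step S101: decide whether a suturing operation is in progress.

    Combines a keyword check on the recognized utterance of the surgeon
    with an image-recognition flag; the OR combination is an assumption.
    """
    said_keyword = any(k in transcript.lower() for k in SUTURE_KEYWORDS)
    return said_keyword or thread_or_needle_detected

print(is_suturing("let's start suturing now", False))  # True
```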
- If it is determined in step S101 that a suturing operation is being performed, the process proceeds to step S102.
- In step S102, default values are set. The default values to be set will be described with reference to FIG. 11.
- The mirror 211, which is a virtual tool, is displayed on the side where the sub camera (the endoscope 20b) is located. For example, when the sub camera is positioned on the right side, the mirror 211 is also displayed on the right side of the screen. First, in this way, the side on which the image of the sub camera is displayed is set as a default value.
- The position where the mirror 211 is displayed is, for example, a position away from the forceps on the side close to the sub camera (the endoscope 20b) by the length of the tip portion of the forceps.
- In the example shown in FIG. 11, the mirror 211 is displayed at a position away from the forceps 232 by the length of the distal end portion of the forceps (length d1).
- The length d1 between the forceps 232 and the mirror 211 may be the length between the center of the forceps 232 and the center of the mirror 211, or the length between the side surface of the forceps 232 (the side closer to the mirror 211) and the side surface of the mirror 211 (the side closer to the forceps 232). That is, which points of the forceps 232 and the mirror 211 are to be separated by the length of the forceps tip may be set in any way.
- In this way, the display position of the mirror 211, in other words its distance from the forceps, is set as a default value.
- the size of the mirror 211 is set as a default value.
- For example, the size of the mirror 211 can be determined from the size of the tip portion of the forceps.
- When the shape of the mirror 211 is a circle, it is a circular shape whose radius is the size of the distal end portion of the forceps (length d1 in FIG. 11).
- When the mirror 211 has an elliptical shape as shown in FIG. 11, it has a major axis or a minor axis that is twice the size of the distal end portion of the forceps (length d1).
- The display position and size of the virtual-tool mirror 211 are set on the basis of the size of the tip portion of the forceps because the forceps are a tool familiar to the surgeon 71, and using the size of such a familiar tool as a reference makes the size of the virtual tool easy to recognize.
- Here, the forceps are described as an example, but the size of the main subject in the main image (for example, the forceps) can be used as the reference.
- Further, the angle of the mirror 211 is set to 45 degrees as a default value.
- When the angle of the mirror 211 is 90 degrees, only the side surface of the mirror 211 can be seen, the mirror 211 being positioned perpendicular to the image 201. In other words, when the angle of the mirror 211 is 90 degrees, the mirror surface does not face the viewer.
- When the angle of the mirror 211 is 0 degrees, the mirror surface of the mirror 211 faces the viewer and is parallel to the image 201. In such a state, the image from the sub camera is not displayed in the mirror 211.
- Therefore, 45 degrees, which is between 0 degrees and 90 degrees, is set as the default value of the angle of the mirror 211.
- Although the default value is 45 degrees here, the default value for the angle may be set to something other than 45 degrees depending on the positional relationship between the endoscope 20a and the endoscope 20b (the main camera and the sub camera). In other words, the default value for the angle may be not a fixed value but a variable value set according to the situation at the time.
- Alternatively, an angle at which the side of the forceps 232 closest to the sub camera can be shown, in other words an angle at which the image captured from the direction perpendicular to the surface of the forceps 232 imaged by the main camera can be shown, may be set as a default value.
- the default value may be set by learning.
- Although the position, size, and angle of the mirror 211 are set as defaults, the position, size, angle, and the like of the mirror 211 may be changed by the surgeon 71 after the setting.
- When the surgeon 71 changes the display position, size, angle, and the like of the mirror 211, the history is recorded (learned), and frequently used positions, sizes, angles, and the like may then be set as the default values.
- Further, a default value may be set according to the surgical sequence: the default value may be set according to what kind of surgery is being performed, for example surgery to remove an affected area or surgery to cut a bone, and what kind of work process is being performed, for example an incision operation or a suturing operation.
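- The default values of step S102 described above can be bundled as in the following sketch: the display side is taken from the sub camera side, the offset and ellipse axes are derived from the forceps tip length d1, and the angle is 45 degrees. The data-structure layout is an assumption made for illustration.

```python
from dataclasses import dataclass

@dataclass
class MirrorDefaults:
    side: str          # side of the screen where the sub camera lies
    offset: float      # distance from the forceps (d1)
    minor_axis: float  # ellipse axes of the mirror 211
    major_axis: float
    angle_deg: float

def default_mirror_settings(sub_camera_side: str, forceps_tip_len: float) -> MirrorDefaults:
    """Step S102: derive the default values described above from the
    forceps tip length d1 and the side on which the sub camera lies."""
    d1 = forceps_tip_len
    return MirrorDefaults(side=sub_camera_side, offset=d1,
                          minor_axis=d1, major_axis=2 * d1, angle_deg=45.0)

defaults = default_mirror_settings("right", forceps_tip_len=40.0)
```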
- When the default values have been set in step S102, the process proceeds to step S103.
- In step S103, the sub camera image is converted. In other words, the image to be displayed on the mirror surface of the virtual tool (the mirror 211) is generated.
- FIG. 12 is a diagram for explaining the positional relationship between the main camera, the sub camera, and the mirror 211.
- In FIG. 12, the imaging surface of the main camera is shown as the main camera 501, and the imaging surface of the sub camera as the sub camera 502.
- FIG. 12 shows a case where the main camera 501 and the sub camera 502 are installed so as to form an angle of 90 degrees or less.
- the main camera 501 and the sub camera 502 image the object 511.
- the main camera 501 captures an image of the surface side of the object 511 including the x axis and the y axis.
- the sub camera 502 captures an image of the surface of the object 511 formed by the y ′ axis and the z ′ axis.
- the main camera 501 captures an image 201 as illustrated in FIG. 12 by capturing the object 511.
- a mirror 211 is superimposed on the image 201 for explanation.
- A point P in the mirror 211 corresponds to a point Q in the image 202 captured by the sub camera 502.
- The virtual wall xw is a wall created from the object 511: the edge of the object 511 is extracted, and the wall is a straight line passing through that edge.
- the point P of the mirror 211 reflects the point R on the virtual wall xw, and the point R (point P) corresponds to the point Q in the image 202 captured by the sub camera 502.
- the center point of the mirror 211 is a point C, and the x-axis coordinate is described as xc.
- the x-axis coordinate of the point P to be obtained in the mirror 211 is described as xp.
- The distance between the point C of the mirror 211 and the main camera 501 is a distance dxc, and the distance between the point P of the mirror 211 and the main camera 501 is a distance dxp.
- an angle formed by a line connecting the point P on the mirror 211 and the point R on the virtual plane xw and a perpendicular to the mirror 211 is defined as an angle ⁇ .
- the main camera 501 and the sub camera 502 are arranged at positions that satisfy the relationship of the angle ⁇ .
- the x-axis coordinate of the point R on the virtual plane xw is described as xw.
- The coordinate (z′-axis coordinate) of the point Q in the image 202 of the sub camera 502 corresponding to the point P on the mirror 211 is obtained by the following equations.
- the distance da between the point R on the virtual plane xw and the point P of the mirror 211 is obtained by the following equation (1).
- In this way, the portion of the image 202 captured by the sub camera 502 that is to be displayed in the mirror 211 is specified and cut out.
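- Since equation (1) itself is not reproduced in this excerpt, the following sketch only illustrates the geometric construction described above: the viewing ray is reflected at a point P on the mirror and intersected with the virtual wall to find the point R, which then corresponds to a point Q in the sub camera image. The 2D (x-z plane) formulation and the example angles are assumptions, not the patent's equations.

```python
import numpy as np

def reflect(direction, normal):
    """Reflect a 2D direction vector about a unit normal."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    d = np.asarray(direction, float)
    return d - 2 * np.dot(d, n) * n

def mirror_point_to_wall(p, view_dir, mirror_normal, x_wall):
    """Trace the reflected ray from mirror point P to the virtual wall
    x = x_wall and return the hit point R (2D, x-z plane).

    Assumes the reflected ray is not parallel to the wall (d[0] != 0).
    """
    d = reflect(view_dir, mirror_normal)
    t = (x_wall - p[0]) / d[0]              # parameter along the ray
    return np.asarray(p, float) + t * d     # point R on the wall

# Example: mirror at 45 degrees (normal (1, -1)), viewer looking along +z.
P = np.array([10.0, 0.0])
R = mirror_point_to_wall(P, view_dir=(0.0, 1.0),
                         mirror_normal=(1.0, -1.0), x_wall=30.0)
# R then corresponds to a point Q in the sub camera image via the sub
# camera's projection (omitted, as it depends on camera calibration).
```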
- When enlarging or reducing the image in the mirror 211, the mirror 211 can be treated as a convex mirror, and basically the same processing as described above can be performed by changing the perpendicular at each surface point.
- Further, image signal processing such as enlargement, reduction, brightness adjustment, and edge enhancement may be performed. Since the mirror 211 is a virtual tool, processing that cannot be performed with an actual mirror can be performed as image processing, and such image processing may be applied.
- The process then proceeds to step S104.
- In step S104, the virtual tool drawing superimposing unit 411 draws the virtual tool (the mirror 211) and superimposes it on the image 201 captured by the main camera. Further, the virtual tool drawing superimposing unit 411 superimposes the image generated by the image conversion processing unit 412 in step S103 on the drawn virtual tool.
- a screen as shown in the lower diagram of FIG. 9 (FIG. 3) is provided to the operator 71.
- In step S105, it is determined whether or not a control value has been input. For example, it is determined that a control value has been input when the surgeon 71 issues an instruction by voice input, such as "close the mirror", "turn off the mirror", or "end suturing".
- If it is determined in step S105 that no control value has been input, the process returns to step S103, and the subsequent processing is repeated. That is, in this case, processing such as conversion of the sub camera image and drawing of the virtual tool based on the control values set at that time is continued.
- If it is determined in step S105 that a control value has been input, the process proceeds to step S106, in which it is determined whether or not the suturing has ended. This determination is made by determining whether or not the control value input in step S105 is a control value indicating the end of suturing; for example, when control values are input by voice, it is made by determining whether or not a keyword indicating the end of suturing, such as "end suturing", has been uttered.
- If it is determined in step S106 that the suturing has not ended, the process proceeds to step S107.
- In step S107, changed values are set based on the input control value. After the changed values are set, the process returns to step S103, and the subsequent processing is repeated based on the changed values.
- On the other hand, if it is determined in step S106 that the end of suturing has been instructed, the process proceeds to step S108. The process also reaches step S108 when it is determined in step S101 that a suturing operation is not being performed. In step S108, it is determined whether or not the end of the surgery has been instructed.
- When it is determined in step S108 that the end of the surgery has not been instructed, the process returns to step S101, and the subsequent processing is repeated. On the other hand, if it is determined in step S108 that the end of the surgery has been instructed, the processing based on the flowchart shown in FIG. 10 ends.
- the tool used by the surgeon 71 in the real world is superimposed on the image from the main camera as a virtual tool, and the image from the sub camera is superimposed on the virtual tool.
- Thus, the image of the sub camera can be provided in a manner that is easy for the operator 71 to visually recognize.
- In the above description, the mirror 211 has been described as an example of the virtual tool.
- However, the virtual tool may be something other than a mirror.
- For example, the virtual tool may be a loupe, and an enlarged image may be displayed in the loupe.
- The virtual tool may also be a spotlight; for example, part of the image from the main camera or the image from the sub camera may be projected brightly. Displaying such a spotlight as a virtual tool is effective when it is desired to control the brightness only partially.
- The image that is displayed brightly, as if illuminated by the spotlight, can in this case be a part of the image captured by the main camera. That is, when the virtual tool is a spotlight, the spotlight is superimposed on the image captured by the main camera, and an image in which the portion of the main image illuminated by the spotlight appears bright is presented to the user.
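A minimal sketch of such a spotlight effect, assuming an OpenCV/NumPy environment (the function name, gain value, and soft-edged circular mask are illustrative assumptions, not taken from the patent):

```python
import cv2
import numpy as np

def apply_spotlight(main_image, center, radius, gain=1.6):
    """Brighten a circular region of the main camera image to simulate
    the spotlight virtual tool; the rest of the image is unchanged."""
    mask = np.zeros(main_image.shape[:2], dtype=np.float32)
    cv2.circle(mask, center, radius, 1.0, thickness=-1)
    mask = cv2.GaussianBlur(mask, (51, 51), 0)    # soft spotlight edge
    lit = np.clip(main_image.astype(np.float32) * gain, 0, 255)
    mask = mask[..., None]                        # broadcast over channels
    out = main_image.astype(np.float32) * (1 - mask) + lit * mask
    return out.astype(np.uint8)
```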
- Similarly, the virtual tool may be a lens filter, so that a special-light image is displayed within the lens filter, realizing a special-light multiplexed image. In this way, images with different characteristics can coexist naturally, and the location where the image is to be switched can be designated easily.
- FIG. 13 shows another screen example displayed on the display device 53.
- In the example of FIG. 13, the sub camera 502-1 and the sub camera 502-2 are captured in the image from the main camera.
- In this case, the image captured by each sub camera is displayed in the vicinity of that sub camera as it appears in the image from the main camera.
- Specifically, since the sub camera 502-1 is captured by the main camera, the image 202-1 captured by the sub camera 502-1 is displayed in its vicinity (on the left side of the sub camera 502-1 in FIG. 13). Likewise, since the sub camera 502-2 is captured by the main camera, the image 202-2 captured by the sub camera 502-2 is displayed in its vicinity (below the sub camera 502-2 in FIG. 13).
- In this way, the sub camera images may be displayed superimposed on the image from the main camera.
- Alternatively, a picture of the sub camera may be drawn by computer graphics, and the image captured by that sub camera may be displayed in its vicinity.
- The position at which the sub camera drawn by computer graphics is displayed reflects the positional relationship between the main camera and the sub camera and the positional relationship among the sub cameras. It is therefore assumed that information on the positions of the main camera and the sub cameras is acquired.
- For example, a position sensor 421 may be provided, and the positions of the main camera and the sub cameras may be acquired from information supplied by the position sensor 421. Alternatively, in the case of a robot such as the one shown in FIGS. 6 and 7, the positional relationship between the arms 291 to 293 can be grasped in advance from the arm position control information and the angle-of-view information (zoom magnification, etc.) of each camera, so such known information may also be used.
- When a sub camera is outside the range captured by the main camera, the frame of its image may be drawn with a dotted line or the like to indicate that the sub camera is outside the main camera's field of view.
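As an illustration of how a display position might be derived from such position information, the following sketch projects a sub camera's world position into the main camera image with a standard pinhole model. The calibration values and the behind-camera handling are hypothetical assumptions, not taken from the patent.

```python
import numpy as np

def sub_camera_screen_position(p_sub_world, R_main, t_main, K_main):
    """Project a sub camera's 3-D position (from the position sensor 421
    or from arm control information) into the main camera image, to decide
    where the CG picture of the sub camera and its image 202 are drawn."""
    p_cam = R_main @ p_sub_world + t_main   # world -> main camera frame
    if p_cam[2] <= 0:
        return None                         # outside / behind the main
                                            # camera: use a dotted frame
    uvw = K_main @ p_cam
    return uvw[:2] / uvw[2]                 # pixel coordinates (u, v)

# Hypothetical calibration values.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
pos = sub_camera_screen_position(np.array([0.05, 0.02, 0.45]),
                                 np.eye(3), np.zeros(3), K)
print(pos)
```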
- FIG. 14 shows another display example.
- The display example shown in FIG. 14 is basically the same as the display example shown in FIG. 13: when the forceps 35b and the arm 291 are captured by the main camera, the image from the sub camera attached to the arm 291 is displayed in their vicinity.
- In FIG. 14, the arm 291 and the arm 294 are captured by the camera of the arm 292, which serves as the main camera.
- the image 202-1 captured by the sub camera 252 attached to the arm 291 is displayed in the vicinity of the arm 291.
- Similarly, the image 202-2 captured by a sub camera (not shown) attached to the arm 294 is displayed in the vicinity of the arm 294.
- On the other hand, the arm 293 is not captured by the camera of the arm 292 (the main camera). As described above, an arm that is not captured by the main camera, such as the arm 293, is drawn by computer graphics. Then, in the vicinity of the drawn arm 293, the image 202-3 captured by the sub camera 253 attached to the arm 293 is displayed.
- A display that takes depth information into consideration may also be performed. For example, an image captured by a sub camera located farther away in the depth direction may be displayed smaller.
- When depth information on the arms is available, a display using that information may also be performed. For example, the image from a sub camera may be displayed at the depth position of its arm.
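A tiny sketch of the depth-dependent sizing suggested here; the reference depth and the pinhole-style falloff rule are assumptions, not specified in the patent.

```python
def sub_image_display_size(base_size, z_ref, z_arm):
    """Scale the displayed sub camera image with depth: a sub camera on
    an arm farther away (larger z) is shown smaller."""
    w, h = base_size
    scale = z_ref / max(z_arm, 1e-6)   # simple pinhole-style falloff
    return int(w * scale), int(h * scale)

# An arm twice as deep as the reference plane gets a half-size thumbnail.
print(sub_image_display_size((320, 240), z_ref=0.3, z_arm=0.6))  # (160, 120)
```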
- In the description of FIG. 4, the case where the main camera is the endoscope 20a and the sub camera is the endoscope 20b, which is different from the endoscope 20a, has been taken as an example.
- However, the scope of application of the present technology is not limited to the case where the main camera and the sub camera are mounted on different surgical instruments.
- the present technology described above can be applied even when the main camera and the sub camera are mounted on the same surgical instrument.
- For example, the endoscope 601 includes a main camera 601a at its tip.
- The endoscope 601 also includes a sub camera 602a and a sub camera 602b in a part of its housing.
- In this way, the endoscope 601 may be configured to include both the main camera 601a and the sub cameras 602.
- That is, the present technology can be applied even when a single surgical instrument such as the endoscope 601 is provided with both the main camera 601a and the sub cameras 602.
- the series of processes described above can be executed by hardware or can be executed by software.
- When the series of processes is executed by software, a program constituting the software is installed in a computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 16 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- In the computer, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are connected to one another by a bus 1004.
- An input / output interface 1005 is further connected to the bus 1004.
- An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input / output interface 1005.
- the input unit 1006 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 1007 includes a display, a speaker, and the like.
- the storage unit 1008 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 1009 includes a network interface.
- the drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer configured as described above, the CPU 1001 loads, for example, the program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
- the program executed by the computer (CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 1008 via the input / output interface 1005 by attaching the removable medium 1011 to the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
- The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- In this specification, a system represents an entire apparatus composed of a plurality of apparatuses.
- Note that the present technology can also take the following configurations.
- (1) A medical image processing apparatus including: a conversion unit that sets one of images captured by a plurality of imaging units as a main image and another image as a sub image, and converts the sub image into an image for superimposition; and a superimposing unit that superimposes the converted image on a predetermined position in the main image.
- (2) The medical image processing apparatus according to (1), wherein, when a display position is moved, the conversion unit converts the sub image into a sub image corresponding to the movement.
- (3) The medical image processing apparatus according to (1) or (2), wherein a display position and a display area of the sub image are set with reference to the size of a main subject in the main image.
- (4) The medical image processing apparatus according to any one of (1) to (3), wherein the sub image is displayed at a position satisfying the same positional relationship as the positional relationship between the imaging unit that captured the main image and the imaging unit that captured the sub image.
- (5) The medical image processing apparatus according to any one of (1) to (4), wherein the plurality of imaging units are endoscopes.
- (6) The medical image processing apparatus according to any one of (1) to (5), wherein, among the plurality of imaging units, an imaging unit that captures the sub image is provided in a surgical instrument.
- (7) The medical image processing apparatus according to any one of (1) to (5), wherein, among the plurality of imaging units, an imaging unit that captures the sub image is provided in an arm.
- (8) A medical image processing apparatus including a superimposing unit that sets one of images captured by a plurality of imaging units as a main image and another image as a sub image, and superimposes the sub image on the main image, wherein an imaging unit that captures the sub image is displayed in the main image, and the sub image is displayed in the vicinity of that imaging unit.
- (9) The medical image processing apparatus according to (8), wherein the imaging unit that captures the sub image is an imaging unit that is captured by the imaging unit that captures the main image.
- (10) The medical image processing apparatus according to (8) or (9), wherein the imaging unit that captures the sub image is displayed as a picture that mimics the surgical instrument provided with that imaging unit, and the picture that mimics the surgical instrument is displayed at a position satisfying the same positional relationship as the positional relationship between the imaging unit that captured the main image and the imaging unit that captured the sub image.
- (11) A medical image processing method including the steps of: setting one of images captured by a plurality of imaging units as a main image and another image as a sub image; converting the sub image into an image for superimposition; and superimposing the converted image on a predetermined position in the main image.
- (12) A medical image processing method including the steps of: setting one of images captured by a plurality of imaging units as a main image and another image as a sub image, and superimposing the sub image on the main image; and displaying an imaging unit capturing the sub image in the main image and displaying the sub image in the vicinity of that imaging unit.
- (13) A program for causing a computer to execute a process including the steps of: setting one of images captured by a plurality of imaging units as a main image and another image as a sub image; converting the sub image into an image for superimposition; and superimposing the converted image on a predetermined position in the main image.
- (14) A program for causing a computer to execute a process including the steps of: setting one of images captured by a plurality of imaging units as a main image and another image as a sub image, and superimposing the sub image on the main image; and displaying an imaging unit capturing the sub image in the main image and displaying the sub image in the vicinity of that imaging unit.
Abstract
The present invention relates to a medical image processing apparatus, a medical image processing method, and a program that allow a plurality of images to be displayed in a manner easily viewable by a user. The medical image processing apparatus is provided with: a conversion unit that sets one of a plurality of images, respectively captured by a plurality of imaging units, as a main image, sets another of the images as a sub-image, and converts the sub-image into an image for superimposition; and a superimposing unit that superimposes the converted image at a predetermined position in the main image. When a display position is moved, the conversion unit converts the sub-image into a sub-image corresponding to the movement. The sub-image display position and display area are set with reference to the size of a main subject in the main image. The present invention can be applied to an endoscope.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016172421 | 2016-09-05 | ||
| JP2016-172421 | 2016-09-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018043205A1 (fr) | 2018-03-08 |
Family
ID=61301764
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/029919 (WO2018043205A1, Ceased) | | 2016-09-05 | 2017-08-22 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018043205A1 (fr) |
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080030578A1 (en) * | 2006-08-02 | 2008-02-07 | Inneroptic Technology Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
| JP2011509715A (ja) * | 2008-01-10 | 2011-03-31 | タイコ ヘルスケア グループ リミテッド パートナーシップ | 外科用デバイスのための撮像システム |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021075306A1 (fr) * | 2019-10-17 | 2021-04-22 | ソニー株式会社 | Dispositif de traitement d'informations chirurgicales, procédé de traitement d'informations chirurgicales et programme de traitement d'informations chirurgicales |
| JPWO2021075306A1 (fr) * | 2019-10-17 | 2021-04-22 | ||
| JP7567804B2 (ja) | 2019-10-17 | 2024-10-16 | ソニーグループ株式会社 | 手術用情報処理装置、手術用情報処理方法及び手術用情報処理プログラム |
| US12150617B2 (en) | 2019-10-17 | 2024-11-26 | Sony Group Corporation | Medical information processing apparatus, medical information processing method, and medical information processing program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17846210; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17846210; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |