WO2014061554A1 - Image processing apparatus and image processing method - Google Patents
- Publication number
- WO2014061554A1 (PCT/JP2013/077613)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- image
- image processing
- unit
- capsule endoscope
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/2163—Partitioning the feature space
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/40—Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
- G06V2201/034—Recognition of patterns in medical or anatomical images of medical instruments
Definitions
- The present invention relates to an image processing apparatus and an image processing method for processing images acquired by a capsule endoscope in an examination in which the capsule endoscope is introduced into a subject and images the interior of the subject.
- a capsule endoscope is a device in which an imaging function, a wireless communication function, and the like are incorporated in a capsule-shaped casing formed in a size that can be introduced into the digestive tract of a subject.
- Capsule endoscopy is usually performed as follows. First, a medical worker such as a nurse attaches an antenna unit to the body surface of the patient who is the subject and connects a receiving device capable of wireless communication with the capsule endoscope to the antenna unit. The imaging function of the capsule endoscope is then turned on, and the patient swallows the capsule endoscope. The capsule endoscope is thereby introduced into the subject, performs imaging while moving through the digestive tract by peristaltic motion and the like, and wirelessly transmits the image data of the in-vivo images. The image data is received by the receiving device and stored in its built-in memory. Thereafter, as long as the patient carries the receiving device, the patient can act freely, for example by leaving the hospital, until the time designated by the medical staff.
- When the designated time arrives, the examination is provisionally ended, and the medical staff collects the receiving device from the patient and connects it to an image processing device constituted by a workstation or the like.
- The image data stored in the receiving device is downloaded (transferred) to the image processing device and subjected to predetermined image processing there to be converted into images.
- A medical worker observes the in-vivo images displayed on the screen of the image processing device and, after confirming that the swallowed capsule endoscope reached the large intestine without communication failure (antenna trouble) or battery exhaustion and that image data was generated by imaging the necessary parts of the subject, sends the patient home. The medical worker then cleans up the receiving device and the like and ends the work.
- Patent Document 1 discloses a technique for determining whether an imaging apparatus is inside a living body by analyzing the last image, or a plurality of images, received from the imaging apparatus.
- When performing a capsule endoscopy, the nurse must therefore wait until the image data download and image processing are complete before confirming on the screen that the capsule endoscope has reached the large intestine and that imaging within the subject has completed normally.
- Likewise, the doctor who diagnoses the patient must wait for the image processing to complete in order to grasp the range (organs) observed by the capsule endoscope and to check whether the in-vivo images necessary for the diagnosis have been acquired. If imaging did not complete normally and a necessary image was not obtained, a re-examination is required, so the patient must be kept waiting during that time.
- The present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus and an image processing method capable of shortening the waiting time until the user can make the necessary determination after the capsule endoscopy ends.
- To achieve this object, an image processing apparatus according to the present invention acquires and processes image data from a receiving device in a capsule endoscope system that includes a capsule endoscope, which is introduced into a subject and wirelessly transmits image data generated by imaging the interior of the subject, and the receiving device, which receives and accumulates the image data wirelessly transmitted from the capsule endoscope.
- The apparatus includes an image data acquisition unit that acquires the image data from the receiving device in order from the latest imaging time, an image processing unit that applies predetermined image processing to the image data acquired by the image data acquisition unit in the order of acquisition, and a display control unit that performs control to display the results obtained by the predetermined image processing on a screen.
- In the above image processing apparatus, the image processing unit generates images based on the image data, and the display control unit performs control to display the images generated by the image processing unit in the order in which they are generated.
- In the above image processing apparatus, the image data acquisition unit acquires only the image data generated within a period going back a predetermined time, in time series, from the latest imaging time of the capsule endoscope.
- In the above image processing apparatus, the image processing unit generates an image based on the image data and further determines whether a specific part of the subject appears in the image, and the display control unit performs control to display the result determined by the image processing unit.
- In the above image processing apparatus, the image processing unit first determines whether an organ of the subject appears in an image based on the image data and, when an organ is captured, then determines whether the organ shown in the image is the specific part.
- In the above image processing apparatus, the image data acquisition unit divides a series of image data captured in one examination by the capsule endoscope into a plurality of blocks and sequentially acquires the image data of each block starting from the later imaging times, the image processing unit sequentially generates images based on the image data, and the display control unit performs control to display a screen provided with a plurality of image display areas in which images based on the image data acquired from the respective blocks are displayed.
- the image data acquisition unit acquires part of the image data of each block while sequentially switching the blocks from which the image data is acquired among the plurality of blocks.
- the image data acquisition unit acquires the image data in parallel from the plurality of blocks.
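As a rough illustration (not taken from the patent itself; all names are hypothetical), the block-wise acquisition described above, where each block is consumed from its later imaging times while the acquisition source switches among the blocks, might be sketched as:

```python
def split_into_blocks(frames, n_blocks):
    """Divide a series of frames into n_blocks contiguous blocks
    (the last block may be shorter)."""
    size = -(-len(frames) // n_blocks)  # ceiling division
    return [frames[i:i + size] for i in range(0, len(frames), size)]

def interleaved_newest_first(blocks):
    """Yield one frame from each block in turn, each block being
    consumed from its latest imaging time backwards, so every image
    display area receives its most recent images first."""
    iterators = [iter(reversed(block)) for block in blocks]
    while iterators:
        remaining = []
        for it in iterators:
            frame = next(it, None)
            if frame is not None:
                yield frame
                remaining.append(it)
        iterators = remaining
```

With six frames numbered 0 to 5 split into three blocks, the yield order is 1, 3, 5, 0, 2, 4: the newest frame of each block first, then the next-newest of each, and so on.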
- An image processing method according to the present invention is a method for processing image data acquired from a receiving device in a capsule endoscope system that includes a capsule endoscope, which is introduced into a subject and wirelessly transmits image data generated by imaging the interior of the subject, and the receiving device, which receives and accumulates the image data wirelessly transmitted from the capsule endoscope. The method includes an image data acquisition step of acquiring the image data from the receiving device in order from the latest imaging time, an image processing step of applying predetermined image processing to the image data acquired in the image data acquisition step in the order of acquisition, and a display control step of performing control to display the results obtained by the image processing on a screen.
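The three steps of the method, acquisition from the latest imaging time, processing in acquisition order, and display of each result, could be sketched minimally as follows (an illustrative assumption, not the patent's implementation; `process` and `display` are hypothetical callables):

```python
from typing import Callable, Iterator

def acquire_newest_first(stored_frames: list) -> Iterator:
    """Image data acquisition step: yield frames from the receiving
    device starting at the latest imaging time, i.e. in the reverse
    of the imaging order."""
    yield from reversed(stored_frames)

def run_preview(stored_frames: list,
                process: Callable,
                display: Callable) -> None:
    """Image processing and display control steps: process each frame
    in the order of acquisition and display each result as soon as it
    is ready, so the user sees the most recently captured images
    without waiting for the full download."""
    for frame in acquire_newest_first(stored_frames):
        display(process(frame))
```

The point of the sketch is the ordering: display begins with the last-captured frame, which is what lets the user check early whether the examination reached the target organ.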
- According to the present invention, the image data stored in the receiving device is fetched in order from the latest imaging time, image processing is applied in the order of acquisition, and the results of the image processing are displayed on a screen. This makes it possible to shorten the time the user needs to make the necessary determination compared with the conventional case.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system including an image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram illustrating an internal configuration of the capsule endoscope and the receiving device illustrated in FIG.
- FIG. 3 is a block diagram showing a schematic configuration of the image processing apparatus shown in FIG.
- FIG. 4 is a flowchart showing the operation of the capsule endoscope system shown in FIG.
- FIG. 5 is a schematic diagram for explaining the operation of the image processing apparatus shown in FIG.
- FIG. 6 is a schematic diagram illustrating a display example of a screen displayed on the display device during acquisition of image data.
- FIG. 7 is a schematic diagram for explaining the operation of the image processing apparatus according to the first modification.
- FIG. 8 is a flowchart showing the operation of the capsule endoscope system including the image processing apparatus according to the second embodiment of the present invention.
- FIG. 9 is a schematic diagram illustrating an example of a notification screen indicating that the large intestine has been confirmed.
- FIG. 10 is a schematic diagram illustrating an example of a notification screen indicating that the large intestine is not confirmed.
- FIG. 11 is a schematic diagram illustrating an example of an input screen for inputting an instruction as to whether or not to continue image processing.
- FIG. 12 is a schematic diagram for explaining the operation of the image processing apparatus according to the third embodiment of the present invention.
- FIG. 13 is a schematic diagram illustrating an example of a screen displayed on the display device during acquisition of image data.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system including an image processing apparatus according to Embodiment 1 of the present invention.
- The capsule endoscope system 1 shown in FIG. 1 includes a capsule endoscope 2 that is introduced into a subject 10, images the interior of the subject 10 to generate image data, and transmits the image data superimposed on a radio signal; a receiving device 3 that receives, via a receiving antenna unit 4 attached to the subject 10, the radio signal transmitted from the capsule endoscope 2; and an image processing device 5 that processes the image data generated by the capsule endoscope 2.
- FIG. 2 is a block diagram showing the internal configuration of the capsule endoscope 2 and the receiving device 3.
- The capsule endoscope 2 is a device in which various components such as an image sensor are incorporated in a capsule-shaped casing sized so that the subject 10 can swallow it. As shown in FIG. 2, it includes an imaging unit 21 that images the interior of the subject 10, an illumination unit 22 that illuminates the interior of the subject 10 during imaging, a signal processing unit 23, a memory 24, a transmission unit 25, an antenna 26, and a battery 27.
- The imaging unit 21 includes, for example, an image sensor such as a CCD or CMOS sensor that generates image data of an image representing the interior of the subject 10 from an optical image formed on its light receiving surface, and an optical system such as an objective lens disposed on the light receiving surface side of the image sensor.
- the illumination unit 22 is realized by a semiconductor light emitting element (for example, an LED (Light Emitting Diode)) that emits light toward the subject 10 during imaging.
- the capsule endoscope 2 has a built-in circuit board (not shown) on which drive circuits and the like for driving the imaging unit 21 and the illumination unit 22 are formed.
- The imaging unit 21 and the illumination unit 22 are fixed to this circuit board in a state where their fields of view are directed outward from one end of the capsule endoscope 2.
- The signal processing unit 23 controls each unit in the capsule endoscope 2, A/D-converts the imaging signal output from the imaging unit 21 to generate digital image data, and further applies predetermined signal processing to the image data.
- The memory 24 temporarily stores data used in the various operations executed by the signal processing unit 23 and the image data that has undergone signal processing in the signal processing unit 23.
- the transmitting unit 25 and the antenna 26 superimpose image data stored in the memory 24 together with related information on a radio signal and transmit the image data to the outside.
- the battery 27 supplies power to each part in the capsule endoscope 2. It is assumed that the battery 27 includes a power supply circuit that boosts the power supplied from a primary battery such as a button battery or a secondary battery.
- After being swallowed, the capsule endoscope 2 moves through the digestive tract of the subject 10 by peristaltic motion of the organs and the like, sequentially imaging living body parts (esophagus, stomach, small intestine, large intestine, etc.) at predetermined time intervals (for example, 0.5-second intervals). The image data generated from the acquired imaging signals and the related information are then sequentially transmitted wirelessly to the receiving device 3.
- the related information includes identification information (for example, a serial number) assigned to identify the individual capsule endoscope 2.
- the receiving device 3 receives image data and related information wirelessly transmitted from the capsule endoscope 2 via the receiving antenna unit 4 having a plurality of (eight in FIG. 1) receiving antennas 4a to 4h.
- Each of the receiving antennas 4a to 4h is realized using, for example, a loop antenna, and is arranged at a predetermined position on the external surface of the subject 10 (for example, positions corresponding to the organs in the subject 10 along the passage path of the capsule endoscope 2).
- the reception device 3 includes a reception unit 31, a signal processing unit 32, a memory 33, a data transmission unit 34, an operation unit 35, a display unit 36, a control unit 37, and a battery 38.
- the receiving unit 31 receives the image data wirelessly transmitted from the capsule endoscope 2 via the receiving antennas 4a to 4h.
- the signal processing unit 32 performs predetermined signal processing on the image data received by the receiving unit 31.
- the memory 33 stores the image data subjected to signal processing in the signal processing unit 32 and related information.
- The data transmission unit 34 is an interface connectable to a communication line such as a USB, wired LAN, or wireless LAN line and, under the control of the control unit 37, transmits the image data and related information stored in the memory 33 to the image processing device 5. The operation unit 35 is used when the user inputs various setting information.
- the display unit 36 displays registration information (examination information, patient information, etc.) related to the examination, various setting information input by the user, and the like.
- the control unit 37 controls the operation of each unit in the receiving device 3.
- the battery 38 supplies power to each unit in the receiving device 3.
- The receiving device 3 is carried by the subject 10 while connected to the receiving antenna unit 4 attached to the subject 10 during the period in which the capsule endoscope 2 performs imaging (a predetermined time from the swallowing of the capsule endoscope 2). During this time, the receiving device 3 stores the image data received via the receiving antenna unit 4 in the memory 33 together with related information such as the reception intensity and reception time at the receiving antennas 4a to 4h. After the imaging by the capsule endoscope 2 is completed, the receiving device 3 is removed from the subject 10 and is this time connected to the image processing device 5, and the image data and related information stored in the memory 33 are transferred to the image processing device 5. In FIG. 1, the cradle 3a is connected to a USB port of the image processing device 5, and setting the receiving device 3 in the cradle 3a connects the receiving device 3 and the image processing device 5.
- FIG. 3 is a block diagram showing a schematic configuration of the image processing apparatus 5.
- The image processing device 5 is constituted by, for example, a workstation including a display device 5a such as a CRT display or liquid crystal display, and as shown in FIG. 3, includes an input unit 51, an image data acquisition unit 52, a storage unit 53, an image processing unit 54, a display control unit 55, and a control unit 56.
- the input unit 51 is realized by an input device such as a keyboard, a mouse, a touch panel, and various switches, for example, and accepts input of information and commands in accordance with a user operation.
- the image data acquisition unit 52 is an interface that can be connected to a communication line such as USB, wired LAN, or wireless LAN, and includes a USB port, a LAN port, and the like.
- the image data acquisition unit 52 acquires image data and related information from the receiving device 3 via an external device such as the cradle 3a connected to the USB port and various communication lines.
- The storage unit 53 is realized by a semiconductor memory such as a flash memory, RAM, or ROM, a recording medium such as an HDD, MO, CD-R, or DVD-R, and a writing/reading device that writes and reads information to and from the recording medium.
- the storage unit 53 stores a program and various information for operating the image processing apparatus 5 to execute various functions, image data acquired by capsule endoscopy, and the like.
- The image processing unit 54 is realized by hardware such as a CPU and, by reading a predetermined program stored in the storage unit 53, executes processing that applies predetermined image processing to the image data acquired via the image data acquisition unit 52 to generate in-vivo images, and generates an observation screen in a predetermined format including the in-vivo images.
- Specifically, the image processing unit 54 applies, to the image data stored in the storage unit 53, image processing for image generation that converts the stored image data into a format displayable as an image, such as white balance processing, demosaicing, color conversion, density conversion (gamma conversion, etc.), smoothing (noise removal, etc.), and sharpening (edge enhancement, etc.), and further applies, to the generated images, image processing according to purpose, such as position detection processing, average color calculation processing, lesion detection processing, red detection processing, organ detection processing, and predetermined feature detection processing.
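As one hedged example of the image-generation processing named above, density (gamma) conversion on 8-bit pixel values is commonly implemented with a 256-entry lookup table. The sketch below is illustrative only and is not the patent's implementation:

```python
def gamma_lut(gamma: float = 2.2) -> list:
    """Build a 256-entry lookup table for density (gamma) conversion
    of 8-bit pixel values; gamma=2.2 is an illustrative assumption."""
    return [round(255.0 * (v / 255.0) ** (1.0 / gamma)) for v in range(256)]

def apply_lut(pixels: list, lut: list) -> list:
    """Apply the lookup table to a flat sequence of 8-bit pixel values."""
    return [lut[p] for p in pixels]
```

A lookup table is the usual design choice here because the same 256-entry mapping is reused for every pixel of every frame, avoiding a power computation per pixel.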
- the display control unit 55 performs control to display the in-vivo image generated by the image processing unit 54 on the display device 5a in a predetermined format under the control of the control unit 56.
- The control unit 56 is realized by hardware such as a CPU and, by reading various programs stored in the storage unit 53, issues instructions and transfers data to the units constituting the image processing device 5 based on signals input via the input unit 51, image data acquired via the image data acquisition unit 52, and the like, thereby comprehensively controlling the overall operation of the image processing device 5.
- FIG. 4 is a flowchart showing the operation of the capsule endoscope system 1 including the image processing device 5.
- FIG. 5 is a schematic diagram for explaining the operation of the image processing apparatus 5.
- First, the examination is started when the subject 10, with the receiving antenna unit 4 attached and the receiving device 3 connected to the receiving antenna unit 4, swallows the capsule endoscope 2. Thereafter, when a predetermined time has elapsed, the user (medical worker) removes the receiving device 3 from the receiving antenna unit 4 and sets it in the cradle 3a.
- the predetermined time is set to a time sufficient for the capsule endoscope 2 to move within the subject 10 by the peristaltic motion and pass through the region to be examined such as the small intestine (for example, about 8 hours).
- the user keeps the subject 10 on standby.
- In step S10, the image data acquisition unit 52 starts acquiring the series of image data stored in the receiving device 3.
- At this time, the image data acquisition unit 52 acquires the image data starting from the latest imaging time, that is, in the reverse of the imaging order of the images.
- In the following step S11, the image processing unit 54 starts image processing for image generation, such as white balance processing and demosaicing, on the image data acquired by the image data acquisition unit 52, in the order of acquisition. Even after the image processing by the image processing unit 54 has started, the image data acquisition unit 52 continues to acquire the image data in the reverse of the imaging order.
- In step S12, the display control unit 55 starts displaying the images generated by the image processing unit 54 on the display device 5a in the order of generation.
- FIG. 6 is a schematic diagram illustrating a display example of a screen displayed on the display device 5a during acquisition of image data.
- a preview screen D1 shown in FIG. 6 is a screen for allowing the user to confirm whether or not an in-vivo image necessary for diagnosis of the subject 10 has been obtained.
- the image display area d1 is an area in which the in-vivo image generated by the image processing unit 54 is displayed in chronological order from the end of imaging.
- the OK button d2 and the NG button d3 are provided for the user to input a confirmation result using the input unit 51 formed of a mouse or the like.
- the user observes the in-vivo image displayed in the image display area d1 and determines whether an image necessary for diagnosis is obtained. For example, when the examination target site is the entire small intestine, if the image displayed in the image display area d1 starts from the large intestine, it is determined that the necessary entire small intestine image has been obtained. In this case, the user uses a mouse or the like to perform a predetermined pointer operation (for example, a click operation) on the OK button d2. In response to this, a signal (OK signal) indicating that the image has been confirmed is input to the control unit 56.
- In step S13, when an OK signal is input to the control unit 56 (step S13: Yes), the control unit 56 causes the display control unit 55 to end the display of the preview screen D1 (step S14). Thereafter, the acquisition of image data by the image data acquisition unit 52 continues in the background. The image processing for image generation may be temporarily stopped and resumed after all the image data has been acquired. Note that the user may send the subject 10 home once the images have been confirmed.
- In step S15, when the acquisition of all the image data is completed, the image processing unit 54 applies image processing for image generation and image processing according to purpose to the acquired image data in the imaging order (step S16).
- For image data from which images were already generated before the end of the image display (step S14), only the image processing according to purpose needs to be executed.
- As the image processing according to purpose, preset processing is executed from among position detection processing, average color calculation processing, lesion detection processing, red detection processing, organ detection processing, predetermined feature detection processing, and the like.
- The image processing here is performed in the same order as the imaging order. This is because some processes, such as position detection processing and similarity detection processing, use information from adjacent images, so the order of the images is important.
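To illustrate why order matters for such processes, a similarity check between consecutive frames can only be computed when the previous frame in imaging order has already been processed. The following sketch is an illustrative assumption (hypothetical names and threshold), not the patent's algorithm:

```python
def frame_difference(a, b):
    """Mean absolute difference between two equal-length grayscale frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def similarity_flags(frames, threshold=5.0):
    """Process frames in imaging order, flagging each frame whose
    difference from the immediately preceding frame is below the
    threshold. Running in imaging order guarantees the previous
    frame is always available, which is why the order is important."""
    flags = [False]  # the first frame has no predecessor
    for prev, cur in zip(frames, frames[1:]):
        flags.append(frame_difference(prev, cur) < threshold)
    return flags
```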
- Note that the user may finish the work by removing the receiving device 3 from the cradle 3a and cleaning it up once all the image data has been taken into the image processing device 5.
- the image processing apparatus 5 may further generate and display an observation screen including an in-vivo image in accordance with a signal input from the input unit 51 by a user operation.
- On the other hand, when an image necessary for diagnosis has not been obtained, the user performs a predetermined pointer operation (for example, a click operation) on the NG button d3 using a mouse or the like.
- A case where an image necessary for diagnosis has not been obtained is, for example, a case where the images displayed in the image display area d1 start from the middle of the small intestine even though the examination target site is the entire small intestine, or a case where the image quality is extremely poor due to the influence of noise or the like.
- In response, a signal (NG signal) indicating that the images could not be confirmed is input to the control unit 56.
- In step S13, when an NG signal is input to the control unit 56 (step S13: No), the control unit 56 causes the image data acquisition unit 52 to stop acquiring image data (step S17).
- the user can restart the examination.
- When resuming the examination (step S18: Yes), the user connects the receiving device 3 to the receiving antenna unit 4 again and attaches it to the subject 10, whereupon the examination is resumed (step S19). In response, the receiving device 3 receives the image data wirelessly transmitted from the capsule endoscope 2 via the receiving antenna unit 4. After sufficient time has elapsed since the restart of the examination, the receiving device 3 is removed from the receiving antenna unit 4 again and the examination ends (step S20). Thereafter, by setting the receiving device 3 in the cradle 3a again, the processing returns to step S10.
- On the other hand, when the examination is not resumed (step S18: No), the image processing device 5 resumes the acquisition of image data by the image data acquisition unit 52 (step S21).
- As described above, in-vivo images generated retroactively from the end of imaging are displayed on the preview screen, so after the examination the user can determine at an early stage whether the subject 10 needs a re-examination. It is therefore possible to shorten the waiting time of the subject 10 compared with the conventional case. In the unlikely event that a necessary image has not been obtained, the examination can be resumed on the spot, which reduces the burden on the subject 10 of, for example, undergoing the examination again at a later date. Furthermore, since the user can clean up the receiving device 3 as soon as the transfer of the image data is completed, the efficiency of work related to the examination can be improved.
- FIG. 7 is a schematic diagram for explaining the operation of the image processing apparatus according to the first modification.
- In the first embodiment described above, the image processing device 5 starts acquiring the image data stored in the receiving device 3 from the end of imaging and continues until an OK signal is input by a user operation on the preview screen D1.
- However, only the image data generated within a period going back a predetermined time in time series from the end of imaging (that is, within a predetermined number of images) may be acquired as the image data for preview.
- In this case, even when the acquisition of the image data for preview has been completed, if the user has not yet confirmed the images (when neither an OK signal nor an NG signal has been input), the image processing device 5 may wait for the input of an OK signal or NG signal while displaying, as a still image on the display device 5a, the last in-vivo image generated and displayed for preview.
- After the OK signal is input, the remaining image data is taken from the receiving device 3 into the image processing device 5, this time in the imaging order, from the start of imaging toward the end of imaging. By matching the acquisition order of the remaining image data with the order of the image processing in step S16 in this way, the image processing can be started before the acquisition of the image data is completed (see step S15), and the two processes can be executed in parallel.
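The time-window preview of this modification, taking only the frames captured within a fixed period before the end of imaging and showing them newest first, could be sketched as follows. This is an illustrative assumption (timestamps in seconds, hypothetical names), not the patent's implementation:

```python
def preview_window(frames_with_times, window_seconds):
    """Keep only the frames captured within window_seconds of the last
    imaging time, returned newest first for preview display.
    `frames_with_times` is a list of (timestamp, frame) pairs."""
    if not frames_with_times:
        return []
    end_time = max(t for t, _ in frames_with_times)
    recent = [(t, f) for t, f in frames_with_times
              if end_time - t <= window_seconds]
    recent.sort(key=lambda pair: pair[0], reverse=True)
    return [f for _, f in recent]
```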
- the image processing apparatus according to the second embodiment is characterized by automatically determining whether or not a series of in-vivo images captured by the capsule endoscope 2 includes in-vivo images necessary for diagnosis.
- the configuration of the image processing apparatus according to the second embodiment is the same as that of the image processing apparatus 5 shown in FIG.
- the configuration of the entire capsule endoscope system including the image processing apparatus is the same as that shown in FIG.
- FIG. 8 is a flowchart showing the operation of the capsule endoscope system 1 including the image processing apparatus 5 according to the second embodiment.
- FIGS. 9 to 11 are schematic diagrams showing screens displayed on the display device 5a. Note that steps S10 and S11 shown in FIG. 8 are the same as those in the first embodiment.
- In step S31 following step S11, the image processing unit 54 further performs image processing (part determination processing) for determining the part shown in the in-vivo image, on the image data that has already undergone the image processing for image generation.
- Any known method may be used as the part determination process.
- For example, an in-vivo image with a predominantly brown hue can be determined to show the large intestine, and an in-vivo image with a predominantly yellow hue can be determined to show the small intestine.
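The hue-based discrimination above can be sketched with average-color statistics. The thresholds and the exact classification rule below are illustrative assumptions only; the patent specifies the idea (brownish suggests large intestine, yellowish suggests small intestine), not these values.

```python
def average_rgb(pixels):
    """Mean R, G, B over an image given as a list of (r, g, b) tuples."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return r, g, b

def classify_part(pixels):
    """Illustrative rule: red-dominant (brownish) -> large intestine;
    red and green both high relative to blue (yellowish) -> small intestine."""
    r, g, b = average_rgb(pixels)
    if r > 1.5 * g and r > 2 * b:
        return "large intestine"   # strong brown/red cast
    if r > 1.5 * b and g > 1.5 * b:
        return "small intestine"   # strong yellow cast
    return "undetermined"

brownish = [(180, 90, 40)] * 100
yellowish = [(200, 180, 60)] * 100
print(classify_part(brownish))    # large intestine
print(classify_part(yellowish))   # small intestine
```

In practice such a rule would be tuned on labeled in-vivo images; any known part determination method may be substituted, as the text notes.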
- FIG. 9 is a schematic diagram illustrating a display example of a notification screen.
- a text message “The large intestine has been confirmed.” is displayed. The user may send the subject 10 home after confirming this display.
- The notification to the user is not limited to the display of a text message; it may also be given by, for example, a notification sound or a voice message. Moreover, the in-vivo image for which the part was determined may also be displayed.
- the acquisition of the image data by the image data acquisition unit 52 is continuously executed in the background.
- the acquisition of the image data may be continuously performed in the reverse order to the imaging order, or may be performed in the same order as the imaging order (that is, the same order as the subsequent image processing) as in the first modification. Further, the image processing for image generation may be temporarily stopped and resumed after all the image data is acquired.
- the image processing unit 54 performs the image processing for image generation and the image processing for a predetermined purpose on the acquired image data in imaging order (step S35).
- image processing may be started before the acquisition of image data is completed. Further, only necessary image processing may be performed on the image data that has been subjected to the part determination processing (step S31).
- The user may finish his or her work by removing the receiving device 3 from the cradle 3a and putting it away after all the image data has been taken into the image processing device 5.
- the image processing apparatus 5 may further generate and display an observation screen including an in-vivo image in accordance with a signal input from the input unit 51 by a user operation.
- FIG. 10 is a schematic diagram illustrating a display example of a notification screen.
- a text message “Cannot confirm the large intestine. Do you want to resume the examination?” is displayed.
- the in-vivo image that is the part discrimination target may be displayed. In this case, the user himself can confirm the part of the in-vivo image.
- the notification screen D3 is provided with a YES button d6 and a NO button d7 that are used when the user inputs a determination as to whether or not to resume the inspection.
- When the user performs a predetermined pointer operation (for example, a click operation) on the YES button d6 using a mouse or the like, a signal indicating that the examination is to be resumed is input to the control unit 56.
- When the user decides not to resume the examination, he or she performs a predetermined pointer operation on the NO button d7 using a mouse or the like, whereby a signal indicating that the examination is not to be resumed is input to the control unit 56.
- control unit 56 causes the image data acquisition unit 52 to stop acquiring the image data from the receiving device 3 (step S38).
- When the user connects the receiving device 3 to the receiving antenna unit 4 again and attaches it to the subject 10, the examination is resumed (step S39). In response, the receiving device 3 receives the image data wirelessly transmitted from the capsule endoscope 2 via the receiving antenna unit 4. After a sufficient time has elapsed since the resumption, the receiving device 3 is removed from the receiving antenna unit 4 again and the examination ends (step S40). Thereafter, by setting the receiving device 3 in the cradle 3a again, the processing returns to step S10.
- FIG. 11 is a schematic diagram illustrating a display example of the input screen.
- a text message “Do you want to continue image processing?” is displayed. The user confirms this display and decides whether the image processing device 5 should continue the image processing.
- the input screen D4 is provided with a YES button d9 and a NO button d10 that are used when the user inputs a determination as to whether or not the image processing apparatus 5 should continue image processing.
- When the user performs a predetermined pointer operation (for example, a click operation) on the YES button d9 using a mouse or the like, an instruction signal for continuing the image processing is input to the control unit 56.
- When the user performs a predetermined pointer operation on the NO button d10 using a mouse or the like, an instruction signal indicating that the image processing is not to be continued is input to the control unit 56.
- When an instruction signal for continuing the image processing is input to the control unit 56 (step S42: Yes), the operation of the image processing device 5 proceeds to step S34. In this case, the image processing device 5 continues to acquire the image data stored in the receiving device 3 and then performs the image processing.
- When an instruction signal indicating that the image processing is not to be continued is input to the control unit 56 (step S42: No), the control unit 56 causes the image data acquisition unit 52 to stop acquiring the image data from the receiving device 3 (step S43).
- Since the part determination process is performed on the in-vivo images generated going back from the imaging end point, the user can easily determine at an early stage whether the subject 10 needs to be reexamined. In addition, the user can decide whether to continue the image processing at his or her own discretion, according to the content of the individual examination.
- Images showing something other than organs may be skipped and excluded from the part determination process.
- the image to be skipped can be determined based on the average color of the image, for example.
- The part determination process may also be omitted for images that are clearly not usable for diagnosis, such as images in which the subject can hardly be discerned because of halation.
- the halation image can be determined based on, for example, the average luminance value of the image.
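The average-luminance screen mentioned above might look like the following sketch. The Rec. 601 luma weighting and the threshold of 230 are assumptions for illustration; the patent only says average luminance may be used.

```python
def average_luminance(pixels):
    """Mean luminance of an image (list of (r, g, b) tuples),
    using the common Rec. 601 weighting."""
    return sum(0.299 * r + 0.587 * g + 0.114 * b
               for r, g, b in pixels) / len(pixels)

def is_halation(pixels, threshold=230):
    """Illustrative screen: near-saturated mean brightness suggests
    halation (threshold is an assumed value on a 0-255 scale)."""
    return average_luminance(pixels) >= threshold

def images_for_part_determination(images):
    """Drop halation frames before running the part determination process."""
    return [img for img in images if not is_halation(img)]

normal = [(120, 80, 60)] * 10
washed_out = [(255, 250, 245)] * 10
print(is_halation(normal), is_halation(washed_out))   # False True
```

Skipping such frames keeps the part determination process from wasting time on images where no mucosa is visible.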
- Embodiment 3 Next, a third embodiment of the present invention will be described.
- the image processing apparatus according to Embodiment 3 is characterized in that a series of image data accumulated in the receiving apparatus 3 is divided into a plurality of blocks, and image data is acquired from each block and image processing is performed.
- the configuration of the image processing apparatus according to the third embodiment is the same as that of the image processing apparatus 5 shown in FIG. Further, the configuration and operation of the entire capsule endoscope system including the image processing apparatus are the same as those shown in FIGS.
- the image at the imaging end point shows the outside of the subject 10. For this reason, when all the images generated going back from the imaging end point are sequentially displayed on the preview screen, it takes time to reach the images of the inside of the subject 10. The user may also want to check whether an image of a specific part inside the subject 10 has been acquired, or whether there is a region for which no image was obtained because wireless transmission of the image data failed due to an antenna malfunction or the like.
- The series of image data stored in the receiving device 3 is therefore divided into a plurality of blocks, and previews of images at a plurality of locations in the subject 10 are displayed simultaneously so that the user can roughly grasp the entire series of in-vivo images obtained by the examination.
- In step S10 of FIG. 4, as shown in FIG. 12, the image data acquisition unit 52 acquires image data from each of the blocks 1 to 4, obtained by dividing the series of image data, starting from the final imaging times t1, t2, t3, and t4 of the respective blocks, in the reverse of the imaging order. At this time, only the image data generated within a predetermined time (i.e., within a predetermined number of images) traced back from the final imaging time of each block may be acquired as image data for the preview screen.
- the image processing unit 54 performs image processing for image generation in the order of acquisition on the image data acquired by the image data acquisition unit 52 from each of the blocks 1 to 4.
- FIG. 13 is a schematic diagram showing an example of a screen displayed on the display device 5a during acquisition of image data.
- The preview screen D5 shown in FIG. 13 is provided with four image display areas d11 to d14 for displaying images based on the image data acquired from the blocks 1 to 4, an OK button d15, and an NG button d16.
- The OK button d15 and the NG button d16 are the same as the OK button d2 and the NG button d3 shown in FIG. 6, and are used by the user who has observed the preview screen D5 to input the result of confirming whether the images necessary for diagnosis have been obtained.
- transfer of image data from each of the blocks 1 to 4 may be serial or parallel.
- The image data acquisition unit 52 acquires a fixed amount of image data at a time (for example, one image at a time) while moving between the blocks in the order of, for example, block 4 → block 3 → block 2 → block 1 → block 4 → ....
- In this case, the in-vivo images displayed in the image display areas d11 to d14 are switched one by one while going back through the imaging times in time series.
- Alternatively, the image data acquisition unit 52 simultaneously acquires a predetermined amount of image data from each of the blocks 1 to 4.
- the image processing unit 54 also performs image processing in parallel on the image data acquired from each of the blocks 1 to 4.
- On the preview screen D5, the in-vivo images displayed in the image display areas d11 to d14 are switched simultaneously while going back through the imaging times in time series.
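The round-robin acquisition order described above (block 4 → block 3 → block 2 → block 1 → block 4 → ..., each block traversed backwards from its final imaging time) can be sketched as follows. The function name, the per-visit amount of one image, and the block contents are illustrative assumptions.

```python
def round_robin_preview_order(blocks, per_visit=1):
    """Return the acquisition order: visit blocks cyclically (last block
    first), taking `per_visit` images per visit, each block traversed
    in reverse imaging order."""
    # One reverse iterator per block; block 4 is visited first.
    iters = [iter(reversed(b)) for b in reversed(blocks)]
    order = []
    while iters:
        alive = []
        for it in iters:
            chunk = [x for _, x in zip(range(per_visit), it)]
            if chunk:
                order.extend(chunk)
                alive.append(it)   # keep only blocks with data left
        iters = alive
    return order

# A series of 12 frames (indices in imaging order) split into 4 blocks.
blocks = [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10, 11]]
print(round_robin_preview_order(blocks))
# [11, 8, 5, 2, 10, 7, 4, 1, 9, 6, 3, 0]
```

Each "round" of this order yields one new image per display area, which is why the previews in d11 to d14 step backwards through time together.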
- the user can roughly grasp the entire series of in-vivo images obtained by the examination. Therefore, it is possible to easily and accurately determine whether an image necessary for diagnosis is obtained.
- a part determination process may be performed instead of previewing the images acquired from the blocks 1 to 4 (see step S31 in FIG. 8).
- The present invention is not limited to Embodiments 1 to 3 and their modifications; various inventions can be formed by appropriately combining the plurality of components disclosed in the embodiments and modifications. For example, some components may be excluded from all the components shown in each embodiment and modification, or components shown in different embodiments and modifications may be combined as appropriate.
Abstract
Description
FIG. 1 is a schematic diagram showing the overall configuration of a capsule endoscope system including the image processing apparatus according to Embodiment 1 of the present invention. The capsule endoscope system 1 shown in FIG. 1 includes: a capsule endoscope 2 that is introduced into a subject 10, generates image data by imaging the inside of the subject 10, and transmits the image data superimposed on a radio signal; a receiving device 3 that receives the radio signal transmitted from the capsule endoscope 2 via a receiving antenna unit 4 worn by the subject 10; and an image processing device 5 that acquires the image data generated by the capsule endoscope 2 from the receiving device 3 and performs predetermined image processing on it.
The signal processing unit 32 performs predetermined signal processing on the image data received by the receiving unit 31.
The memory 33 stores the image data that has undergone signal processing in the signal processing unit 32 and its related information.
The operation unit 35 is used when the user inputs various setting information and the like.
The control unit 37 controls the operation of each unit in the receiving device 3.
The battery 38 supplies power to each unit in the receiving device 3.
Note that the user may send the subject 10 home once the images have been confirmed.
Next, Modification 1 of Embodiment 1 of the present invention will be described. FIG. 7 is a schematic diagram for explaining the operation of the image processing apparatus according to Modification 1.
Next, Embodiment 2 of the present invention will be described.
The image processing apparatus according to Embodiment 2 is characterized by automatically determining whether a series of in-vivo images captured by the capsule endoscope 2 includes the in-vivo images necessary for diagnosis. The configuration of the image processing apparatus according to Embodiment 2 is the same as that of the image processing apparatus 5 shown in FIG. 3. The configuration of the entire capsule endoscope system including the image processing apparatus is also the same as that shown in FIG. 1.
Next, Modification 2 of Embodiment 2 of the present invention will be described.
If the capsule endoscope 2 continues imaging even after being excreted from the subject 10, the image at the imaging end point shows the outside of the subject 10. Therefore, if the part determination process is applied to all images generated going back in time series from the imaging end point, it takes time to reach the images of the inside of the subject 10.
Next, Embodiment 3 of the present invention will be described.
The image processing apparatus according to Embodiment 3 is characterized in that a series of image data accumulated in the receiving device 3 is divided into a plurality of blocks, and image data is acquired from each block and subjected to image processing. The configuration of the image processing apparatus according to Embodiment 3 is the same as that of the image processing apparatus 5 shown in FIG. 3. The configuration and operation of the entire capsule endoscope system including the image processing apparatus are the same as those shown in FIGS. 1 and 4.
2 Capsule endoscope
3 Receiving device
3a Cradle
4 Receiving antenna unit
4a-4h Receiving antennas
5 Image processing device
5a Display device
10 Subject
21 Imaging unit
22 Illumination unit
23 Signal processing unit
24, 33 Memory
25 Transmitting unit
26 Antenna
27, 38 Battery
31 Receiving unit
32 Signal processing unit
34 Data transmitting unit
35 Operation unit
36 Display unit
37 Control unit
51 Input unit
52 Image data acquisition unit
53 Storage unit
54 Image processing unit
55 Display control unit
56 Control unit
Claims (9)
- An image processing apparatus for use in a capsule endoscope system including a capsule endoscope that is introduced into a subject and wirelessly transmits image data generated by imaging the inside of the subject, and a receiving device that receives and accumulates the image data wirelessly transmitted from the capsule endoscope, the image processing apparatus acquiring the image data from the receiving device and processing the image data, the image processing apparatus comprising:
an image data acquisition unit that acquires the image data from the receiving device in order from the latest imaging time;
an image processing unit that performs predetermined image processing on the image data acquired by the image data acquisition unit, in the order in which the image data was acquired; and
a display control unit that performs control to display on a screen a result obtained by the predetermined image processing. - The image processing apparatus according to claim 1, wherein the image processing unit generates images based on the image data, and
the display control unit performs control to display the images generated by the image processing unit in the order in which the images were generated. - The image processing apparatus according to claim 1, wherein the image data acquisition unit acquires image data generated within a predetermined time traced back in time series from the latest imaging time of the capsule endoscope.
- The image processing apparatus according to claim 1, wherein the image processing unit generates an image based on the image data and further determines whether a specific part inside the subject is shown in the image, and
the display control unit performs control to display the result determined by the image processing unit. - The image processing apparatus according to claim 4, wherein the image processing unit determines whether an organ inside the subject is shown in the image based on the image data, and then, for an image determined to show an organ, determines whether the organ shown in the image is the specific part.
- The image processing apparatus according to claim 1, wherein the image data acquisition unit divides a series of image data captured in one examination by the capsule endoscope into a plurality of blocks and acquires the image data of each block in order from the latest imaging time,
the image processing unit sequentially generates images based on the image data, and
the display control unit performs control to display a screen provided with a plurality of image display areas in which images based on the image data acquired from the respective blocks are displayed. - The image processing apparatus according to claim 6, wherein the image data acquisition unit acquires the image data of each block a portion at a time while sequentially switching, among the plurality of blocks, the block from which the image data is acquired.
- The image processing apparatus according to claim 6, wherein the image data acquisition unit acquires the image data from the plurality of blocks in parallel.
- An image processing method in a capsule endoscope system including a capsule endoscope that is introduced into a subject and wirelessly transmits image data generated by imaging the inside of the subject, and a receiving device that receives and accumulates the image data wirelessly transmitted from the capsule endoscope, the method processing the image data acquired from the receiving device, the method comprising:
an image data acquisition step of acquiring the image data from the receiving device in order from the latest imaging time;
an image processing step of performing predetermined image processing on the image data acquired in the image data acquisition step, in the order in which the image data was acquired; and
a display control step of performing control to display on a screen a result obtained by the predetermined image processing.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP13846413.6A EP2910172A4 (en) | 2012-10-18 | 2013-10-10 | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD |
| JP2014519331A JP5593008B1 (ja) | 2012-10-18 | 2013-10-10 | 画像処理装置及び画像処理方法 |
| CN201380017933.5A CN104203073A (zh) | 2012-10-18 | 2013-10-10 | 图像处理装置和图像处理方法 |
| US14/276,247 US20140321724A1 (en) | 2012-10-18 | 2014-05-13 | Image processing apparatus and image processing method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012-231183 | 2012-10-18 | ||
| JP2012231183 | 2012-10-18 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/276,247 Continuation US20140321724A1 (en) | 2012-10-18 | 2014-05-13 | Image processing apparatus and image processing method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014061554A1 true WO2014061554A1 (ja) | 2014-04-24 |
Family
ID=50488121
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/077613 Ceased WO2014061554A1 (ja) | 2012-10-18 | 2013-10-10 | 画像処理装置及び画像処理方法 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20140321724A1 (ja) |
| EP (1) | EP2910172A4 (ja) |
| JP (1) | JP5593008B1 (ja) |
| CN (1) | CN104203073A (ja) |
| WO (1) | WO2014061554A1 (ja) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018074223A1 (ja) * | 2016-10-20 | 2018-04-26 | オリンパス株式会社 | 内視鏡システム、端末装置、サーバ、送信方法およびプログラム |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018008195A1 (ja) * | 2016-07-05 | 2018-01-11 | オリンパス株式会社 | 画像処理装置、画像処理システム、画像処理装置の作動方法、及び画像処理装置の作動プログラム |
| JP7013677B2 (ja) * | 2017-05-01 | 2022-02-01 | ソニーグループ株式会社 | 医用画像処理装置、医用画像処理装置の作動方法、及び、内視鏡システム |
| JP7138719B2 (ja) * | 2018-10-30 | 2022-09-16 | オリンパス株式会社 | 内視鏡システムに用いる画像処理装置、内視鏡システム及び内視鏡システムの作動方法 |
| US11514576B2 (en) * | 2018-12-14 | 2022-11-29 | Acclarent, Inc. | Surgical system with combination of sensor-based navigation and endoscopy |
| CN116668836B (zh) * | 2022-11-22 | 2024-04-19 | 荣耀终端有限公司 | 拍照处理方法和电子设备 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009297497A (ja) | 2008-04-30 | 2009-12-24 | Given Imaging Ltd | 処理の終了を決定するためのシステムおよび方法 |
| JP2010082241A (ja) * | 2008-09-30 | 2010-04-15 | Olympus Medical Systems Corp | 画像表示装置、画像表示方法、および画像表示プログラム |
| JP2010099139A (ja) * | 2008-10-21 | 2010-05-06 | Olympus Medical Systems Corp | 画像表示装置、画像表示方法、および画像表示プログラム |
| WO2011013475A1 (ja) * | 2009-07-29 | 2011-02-03 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、読影支援システムおよび読影支援プログラム |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3221085B2 (ja) * | 1992-09-14 | 2001-10-22 | 富士ゼロックス株式会社 | 並列処理装置 |
| JP4537803B2 (ja) * | 2004-08-27 | 2010-09-08 | オリンパス株式会社 | 画像表示装置 |
| IL171772A (en) * | 2004-11-04 | 2009-11-18 | Given Imaging Ltd | Device and method for selecting and integrating the absorption device |
| JP2006288612A (ja) * | 2005-04-08 | 2006-10-26 | Olympus Corp | 画像表示装置 |
| JP5271710B2 (ja) * | 2005-09-09 | 2013-08-21 | ギブン イメージング リミテッド | 生体内画像を同時に転送及び処理し、そしてリアルタイムに閲覧するシステム |
| JP2008301968A (ja) * | 2007-06-06 | 2008-12-18 | Olympus Medical Systems Corp | 内視鏡画像処理装置 |
| JP5291955B2 (ja) * | 2008-03-10 | 2013-09-18 | 富士フイルム株式会社 | 内視鏡検査システム |
| US20100097392A1 (en) * | 2008-10-14 | 2010-04-22 | Olympus Medical Systems Corp. | Image display device, image display method, and recording medium storing image display program |
| JP5576711B2 (ja) * | 2010-05-14 | 2014-08-20 | オリンパス株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
-
2013
- 2013-10-10 WO PCT/JP2013/077613 patent/WO2014061554A1/ja not_active Ceased
- 2013-10-10 EP EP13846413.6A patent/EP2910172A4/en not_active Withdrawn
- 2013-10-10 JP JP2014519331A patent/JP5593008B1/ja active Active
- 2013-10-10 CN CN201380017933.5A patent/CN104203073A/zh active Pending
-
2014
- 2014-05-13 US US14/276,247 patent/US20140321724A1/en not_active Abandoned
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009297497A (ja) | 2008-04-30 | 2009-12-24 | Given Imaging Ltd | 処理の終了を決定するためのシステムおよび方法 |
| JP2010082241A (ja) * | 2008-09-30 | 2010-04-15 | Olympus Medical Systems Corp | 画像表示装置、画像表示方法、および画像表示プログラム |
| JP2010099139A (ja) * | 2008-10-21 | 2010-05-06 | Olympus Medical Systems Corp | 画像表示装置、画像表示方法、および画像表示プログラム |
| WO2011013475A1 (ja) * | 2009-07-29 | 2011-02-03 | オリンパスメディカルシステムズ株式会社 | 画像表示装置、読影支援システムおよび読影支援プログラム |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2910172A4 |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018074223A1 (ja) * | 2016-10-20 | 2018-04-26 | オリンパス株式会社 | 内視鏡システム、端末装置、サーバ、送信方法およびプログラム |
| JP6368885B1 (ja) * | 2016-10-20 | 2018-08-01 | オリンパス株式会社 | 内視鏡システム、端末装置、サーバ、送信方法およびプログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5593008B1 (ja) | 2014-09-17 |
| EP2910172A1 (en) | 2015-08-26 |
| EP2910172A4 (en) | 2016-06-22 |
| JPWO2014061554A1 (ja) | 2016-09-05 |
| CN104203073A (zh) | 2014-12-10 |
| US20140321724A1 (en) | 2014-10-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7737529B2 (ja) | Endoscope information management system | |
| JP5593008B1 (ja) | Image processing apparatus and image processing method | |
| CN102753078B (zh) | Image display device and capsule endoscope system | |
| JP5044066B2 (ja) | Image display device and capsule endoscope system | |
| CN105283114B (zh) | Capsule endoscope system | |
| JP7289373B2 (ja) | Medical image processing device, endoscope system, diagnosis support method, and program | |
| JP5690447B2 (ja) | Information management device and capsule endoscopy system | |
| JP2009039449A (ja) | Image processing device | |
| US8982204B2 (en) | Inspection management apparatus, system, and method, and computer readable recording medium | |
| JP5242852B2 (ja) | Image display device and capsule endoscope system | |
| US20140336454A1 (en) | Apparatus and system for management of inspection information | |
| JP7314394B2 (ja) | Endoscopy support device, endoscopy support method, and endoscopy support program | |
| CN115279249B (zh) | Image selection support device, image selection support method, and recording medium | |
| KR101601410B1 (ko) | Portable endoscope system | |
| JP2011172965A (ja) | Image display system and image display terminal device | |
| US10726553B2 (en) | Image processing apparatus, image processing system, operation method of image processing apparatus, and computer-readable recording medium | |
| WO2023195103A1 (ja) | Inspection support system and inspection support method | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| ENP | Entry into the national phase |
Ref document number: 2014519331 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13846413 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| REEP | Request for entry into the european phase |
Ref document number: 2013846413 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2013846413 Country of ref document: EP |