WO2008044365A1 - Medical image processing device and medical image processing method - Google Patents
Medical image processing device and medical image processing method
- Publication number
- WO2008044365A1 (PCT/JP2007/061628)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- dimensional model
- boundary line
- image boundary
- correction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/507—Depth or shape recovery from shading
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
- G06T2207/30032—Colon polyp
Definitions
- The present invention relates to a medical image processing apparatus and a medical image processing method, and in particular to a medical image processing apparatus and a medical image processing method for estimating three-dimensional model data of a living tissue based on two-dimensional image data of an image of the living tissue.
- An endoscope device has, for example, an insertion portion that can be inserted into a body cavity. An image of the inside of the body cavity, formed by an objective optical system disposed at the distal end of the insertion portion, is picked up by imaging means such as a solid-state image pickup device and output as an imaging signal, and an image of the inside of the body cavity is displayed on display means such as a monitor based on that imaging signal.
- The user observes, for example, an organ in the body cavity based on the image displayed on the display means such as a monitor.
- Because the endoscope apparatus can directly capture images of the digestive tract mucosa, the user can comprehensively observe, for example, the color of the mucous membrane, the shape of a lesion, and the fine structure of the mucosal surface.
- Such an endoscope apparatus can also detect an image containing a lesion site such as a polyp by using, for example, the image processing method described in Japanese Patent Application Laid-Open No. 2005-192880 (Patent Document 1).
- The image processing method of Patent Document 1 extracts the contour of an input image and can detect a lesion having a locally raised shape in the image based on the shape of that contour.
- A technique is also known in which three-dimensional data is estimated from a two-dimensional image and colon polyps are detected using three-dimensional feature values (Shape Index / Curvedness) (Kimura, Hayashi, Kitasaka, Mori, Suenaga, "Examination of a method for detecting large intestine polyps from 3D abdominal CT images based on shape information," IEICE Technical Report MI2003-102, pp. 29-34, 2004).
- These three-dimensional feature values are obtained by calculating partial differential coefficients at a point of interest from the three-dimensional data and using those coefficients.
- Polyp candidates are then detected by applying threshold processing to the three-dimensional feature values.
- However, the "Shape From Shading" method, which has conventionally been used to estimate three-dimensional data, has the problem that its accuracy deteriorates in portions imaged as edges in the two-dimensional image.
- In a two-dimensional image, an edge region occurs at the boundary of a curved surface, or at the boundary between the visible range and an occluded (hidden, invisible) range of the three-dimensional object.
- The "Shape From Shading" method, which estimates three-dimensional data from a two-dimensional image, estimates three-dimensional positions based on the pixel values of the two-dimensional image. Since the pixel values in such an edge region are lower than those of adjacent regions, the method calculates estimated three-dimensional positions as if a "groove" (concavity) existed in the edge region.
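- As a rough illustration of why dark edge pixels turn into spurious grooves, consider a toy version of the intensity-to-depth idea behind "Shape From Shading": if the light source is assumed to sit at the camera and the surface is roughly Lambertian, brightness falls off with distance, so depth can be read off the pixel value. This sketch is only illustrative (the function name, the 1/√I model, and the constants are assumptions, not the patent's actual estimator):

```python
import numpy as np

def estimate_depth_from_shading(image, k=1.0, eps=1e-6):
    """Toy relative-depth estimate: with a light source at the camera
    and a roughly Lambertian surface, observed intensity I falls off
    roughly as 1/z^2, so z ~ k / sqrt(I).  Dark edge pixels therefore
    come out deeper than their neighbors, i.e. as spurious grooves."""
    intensity = image.astype(np.float64)
    intensity = intensity / (intensity.max() + eps)  # normalize to (0, 1]
    return k / np.sqrt(intensity + eps)  # darker pixel -> larger depth
```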
- When the coordinates of the three-dimensional data points are estimated by the "Shape From Shading" method and a three-dimensional surface is generated from those data points, a three-dimensional image such as that shown in FIG. 27 should ideally be generated.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a medical image processing apparatus and a medical image processing method that can accurately construct a three-dimensional image from a two-dimensional image and thereby improve detection accuracy when detecting a lesion having a locally raised shape.
- A medical image processing apparatus of the present invention includes: three-dimensional model estimation means for estimating a three-dimensional model of a living tissue from a two-dimensional image of the living tissue in a body cavity input from a medical imaging apparatus; image boundary line detection means for detecting an image boundary line of an image area constituting the two-dimensional image; line width calculation means for calculating the line width of the image boundary line; and correction means for correcting, based on the line width, the estimation result of the three-dimensional model at the image boundary line produced by the three-dimensional model estimation means.
- A medical image processing method of the present invention includes: a three-dimensional model estimation step of estimating a three-dimensional model of a living tissue from a two-dimensional image of the living tissue in a body cavity input from a medical imaging apparatus; an image boundary line detection step of detecting an image boundary line of an image area constituting the two-dimensional image; a line width calculation step of calculating the line width of the image boundary line; and a correction step of correcting, based on the line width, the estimation result of the three-dimensional model at the image boundary line produced in the three-dimensional model estimation step.
- FIG. 1 is a diagram showing an example of the overall configuration of an endoscope system in which a medical image processing apparatus according to Embodiment 1 of the present invention is used.
- FIG. 2 is a functional block diagram showing the functional configuration of the CPU in FIG.
- FIG. 3 is a diagram showing a storage information configuration of the hard disk in FIG.
- FIG. 4 is a flowchart showing the processing flow of the CPU of FIG.
- FIG. 5 is a flowchart showing a flow of correction processing of the three-dimensional model data of FIG.
- FIG. 6 is a first diagram for explaining the processing of FIG. 5.
- FIG. 7 is a second diagram for explaining the processing of FIG. 5.
- FIG. 8 is a third diagram for explaining the processing of FIG. 5.
- FIG. 9 is a fourth diagram for explaining the processing of FIG. 5.
- FIG. 10 is a flowchart showing the flow of the correction processing of the three-dimensional point sequence data between edge end points in the edge region of FIG. 5.
- FIG. 11 is a fifth diagram for explaining the processing of FIG. 10.
- FIG. 12 is a sixth diagram for explaining the processing of FIG. 10.
- FIG. 13 is a flowchart showing the processing flow of a first modification of FIG. 10.
- FIG. 14 is a flowchart showing the processing flow of a second modification of FIG. 10.
- FIG. 15 is a first diagram for explaining the processing of FIG. 14.
- FIG. 16 is a second diagram for explaining the processing of FIG. 14.
- FIG. 17 is a diagram showing a storage information configuration of a hard disk according to Embodiment 2 of the present invention.
- FIG. 18 is a first diagram for explaining the second embodiment.
- FIG. 19 is a second diagram for explaining the second embodiment.
- FIG. 20 is a third diagram for explaining the second embodiment.
- FIG. 21 is a flowchart showing the flow of processing of a CPU according to the second embodiment.
- FIG. 22 is a flowchart showing a flow of correction processing of the three-dimensional model data of FIG.
- FIG. 23 is a first diagram for explaining the processing of FIG. 22.
- FIG. 24 is a second diagram for explaining the processing of FIG. 22.
- FIG. 25 is a third diagram for explaining the processing of FIG. 22.
- FIG. 26 is a first diagram illustrating a problem of the conventional example.
- FIG. 27 is a second diagram for explaining the problem of the conventional example.
- FIG. 28 is a third diagram for explaining the problem of the conventional example.
- As shown in FIG. 1, an endoscope system 1 is mainly configured of a medical observation apparatus 2, a medical image processing apparatus 3, and a monitor 4.
- The medical observation apparatus 2 is an observation apparatus that captures an image of a subject and outputs a two-dimensional image of that subject.
- The medical image processing apparatus 3 is configured by a personal computer or the like; it performs image processing on the video signal of the two-dimensional image output from the medical observation apparatus 2, and outputs the processed video signal as an image signal.
- the monitor 4 is a display device that displays an image based on the image signal output from the medical image processing device 3.
- The medical observation apparatus 2 includes an endoscope 6, a light source device 7, a camera control unit (hereinafter abbreviated as CCU) 8, and a monitor 9.
- the endoscope 6 is inserted into a body cavity of a subject, images a subject such as a living tissue existing in the body cavity, and outputs it as an imaging signal.
- the light source device 7 supplies illumination light for illuminating a subject imaged by the endoscope 6.
- the CCU 8 performs various controls on the endoscope 6 and performs signal processing on the imaging signal output from the endoscope 6 to output it as a video signal of a two-dimensional image.
- the monitor 9 displays an image of the subject imaged by the endoscope 6 based on the video signal of the two-dimensional image output from the CCU 8.
- The endoscope 6 includes an insertion portion 11 to be inserted into the body cavity and an operation unit 12 provided on the proximal end side of the insertion portion 11.
- A light guide 13 for transmitting the illumination light supplied from the light source device 7 runs through the insertion portion 11 from its proximal end side to the distal end portion 14 on its distal end side.
- the light guide 13 has a distal end side disposed at the distal end portion 14 of the endoscope 6 and a rear end side connected to the light source device 7.
- With this configuration, the illumination light supplied from the light source device 7 is transmitted by the light guide 13 and emitted from an illumination window (not shown) provided on the distal end surface of the distal end portion 14 of the insertion portion 11, illuminating a living tissue or the like as the subject.
- The distal end portion 14 is provided with an imaging unit 17 having an objective optical system 15 attached to an observation window (not shown) adjacent to the illumination window, and an image pickup device 16, constituted by a CCD (charge coupled device) or the like, arranged at the imaging position of the objective optical system 15. With this configuration, the subject image formed by the objective optical system 15 is captured by the image pickup device 16 and output as an imaging signal.
- the image sensor 16 is not limited to a CCD, and may be composed of, for example, a C-MOS sensor.
- the image sensor 16 is connected to the CCU 8 via a signal line.
- the image sensor 16 is driven based on the drive signal output from the CCU 8 and outputs an image signal corresponding to the imaged subject to the CCU 8.
- the imaging signal input to the CCU 8 is converted and output as a video signal of a two-dimensional image by performing signal processing in a signal processing circuit (not shown) provided in the CCU 8.
- the video signal of the two-dimensional image output from the CCU 8 is output to the monitor 9 and the medical image processing device 3.
- the monitor 9 displays the subject image based on the video signal output from the CCU 8 as a two-dimensional image.
- The medical image processing apparatus 3 includes: an image input unit 21 that performs A/D conversion on the video signal of the two-dimensional image output from the medical observation apparatus 2; a CPU 22 as a central processing unit that performs image processing on the video signal output from the image input unit 21; a processing program storage unit 23 in which a processing program related to the image processing is written; an image storage unit 24 that stores the video signal output from the image input unit 21 and the like; and an analysis information storage unit 25 that stores calculation results of the image processing performed by the CPU 22.
- The medical image processing apparatus 3 further includes: a storage device interface (I/F) 26; a hard disk 27 that stores, via the storage device I/F 26, image data resulting from the image processing of the CPU 22 and data used by the CPU 22 for the image processing; a display processing unit 28 that performs display processing for displaying the image data on the monitor 4 and outputs the processed image data as an image signal; and an input operation unit 29, composed of a keyboard, a mouse, or another pointing device, with which the user can input parameters for the image processing performed by the CPU 22 and operation instructions to the medical image processing apparatus 3.
- the monitor 4 displays an image based on the image signal output from the display processing unit 28.
- The image input unit 21, the CPU 22, the processing program storage unit 23, the image storage unit 24, the analysis information storage unit 25, the storage device I/F 26, the display processing unit 28, and the input operation unit 29 are connected to one another via a data bus 30.
- The CPU 22 comprises the following functional units: a three-dimensional model estimation unit 22a, a detection target region setting unit 22b, a shape feature amount calculation unit 22c, a model correction unit 22d serving as the line width calculation means and the correction means, a three-dimensional shape detection unit 22e, and a polyp determination unit 22f.
- These functional units are realized by software executed by the CPU 22.
- the hard disk 27 has a plurality of storage areas for storing various data generated by each process performed by the CPU 22. Specifically, the hard disk 27 has an edge image storage area 27a, an edge thinned image storage area 27b, a 3D point sequence data storage area 27c, a correspondence table storage area 27d, and a detected lesion part storage area 27e. Details of the data stored in each of these storage areas will be described later.
- The operation of the present embodiment will be described with reference to the flowcharts of FIGS. 4, 5, and 10 and the explanatory diagrams of FIGS. 6 to 9, 11, and 12.
- After the user turns on the power of each part of the endoscope system 1, the user inserts the insertion portion 11 of the endoscope 6 into the body cavity of the subject, for example, into the large intestine.
- When the insertion portion 11 is inserted into the large intestine of the subject, an image of a subject such as a living tissue existing in the large intestine, for example the polyp 500 shown in FIG. 6, is captured by the imaging unit 17 provided in the distal end portion 14. The subject image captured by the imaging unit 17 is then output to the CCU 8 as an imaging signal.
- The CCU 8 performs signal processing on the imaging signal output from the image pickup device 16 of the imaging unit 17 in a signal processing circuit (not shown), thereby converting the imaging signal into a video signal of a two-dimensional image and outputting it. Based on the video signal output from the CCU 8, the monitor 9 displays the subject image captured by the imaging unit 17 as a two-dimensional image. The CCU 8 also outputs the video signal of the two-dimensional image obtained by this signal processing to the medical image processing apparatus 3.
- The video signal of the two-dimensional image output to the medical image processing apparatus 3 is A/D converted in the image input unit 21 and then input to the CPU 22.
- In step S1, the three-dimensional model estimation unit 22a of the CPU 22 applies, for example, the "Shape From Shading" method to the two-dimensional image output from the image input unit 21, estimating the three-dimensional model corresponding to the two-dimensional image through processing such as geometric conversion based on the luminance information of the two-dimensional image.
- The three-dimensional model estimation unit 22a also generates a correspondence table that associates the three-dimensional data point sequence representing the intestinal surface of the large intestine with the two-dimensional image data, and stores it in the correspondence table storage area 27d of the hard disk 27.
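- Conceptually, the correspondence table maps each 2D pixel to its estimated 3D data point. A minimal sketch of one possible layout (the array format and names here are illustrative assumptions, not the patent's actual structure):

```python
import numpy as np

def build_correspondence_table(depth):
    """Build an H x W x 3 table whose entry (y, x) holds the 3D point
    (x, y, z) estimated for pixel (x, y), so 2D image coordinates can
    be mapped directly to their 3D data points and back."""
    h, w = depth.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    return np.stack([xs, ys, depth], axis=-1).astype(np.float64)

# Usage: table = build_correspondence_table(depth_map)
#        point_3d = table[y, x]   # 3D data point for pixel (x, y)
```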
- In step S2, the model correction unit 22d of the CPU 22 performs model correction processing, described later, on the estimated three-dimensional model, and stores the coordinates of each data point of the corrected three-dimensional model in the three-dimensional point sequence data storage area 27c of the hard disk 27 via the storage device I/F 26.
- In step S3, the detection target region setting unit 22b of the CPU 22 sets, based on the color tone change of the two-dimensional image output from the image input unit 21 and on the three-dimensional model corrected by the processing in step S2, a target region to which the processing for detecting a lesion having a raised shape in the three-dimensional model is applied.
- Specifically, the detection target region setting unit 22b of the CPU 22 separates the two-dimensional image output from the image input unit 21 into R (red), G (green), and B (blue) plane images, detects a raised change based on the data of the three-dimensional model estimated from the R image, and detects a color tone change based on the chromaticity of the R image and the G image. The detection target region setting unit 22b then sets, as the target region, an area in which both the raised change and the color tone change are detected.
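- As a rough sketch, this target-region test can be expressed as a per-pixel combination of a raised-change mask and a chromaticity test on the R and G planes; the chromaticity measure and threshold below are assumptions for illustration, not values from the patent:

```python
import numpy as np

def set_target_region(r, g, b, raised_mask, chroma_thresh=0.45, eps=1e-6):
    """Combine a raised-shape mask (derived from the 3D model estimated
    on the R image) with a color tone change computed from the R/G
    chromaticity; a pixel joins the target region only if both hold."""
    total = r.astype(np.float64) + g + b + eps
    r_chroma = r / total                  # chromaticity of the R plane
    g_chroma = g / total                  # chromaticity of the G plane
    color_change = (r_chroma - g_chroma) > chroma_thresh
    return raised_mask & color_change
```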
- In step S4, the shape feature amount calculation unit 22c of the CPU 22 calculates local partial differential coefficients in the target region. Specifically, for a local region (curved surface) containing a three-dimensional position of interest (x, y, z) on the estimated three-dimensional shape, it calculates the first-order partial differential coefficients fx, fy, fz and the second-order partial differential coefficients fxx, fyy, fzz, fxy, fyz, fxz of the R pixel value f.
- In step S5, the shape feature amount calculation unit 22c of the CPU 22 performs processing for calculating a Shape Index value and a Curvedness value as shape feature amounts of the three-dimensional shape for each data point in the processing target region of the three-dimensional model, based on the local partial differential coefficients.
- From these coefficients, the shape feature amount calculation unit 22c of the CPU 22 calculates a Gaussian curvature K and a mean curvature H.
- The Shape Index SI and Curvedness CV, which are feature amounts representing the curved surface shape in this case, are respectively defined from the principal curvatures k1 and k2 (obtained from K and H as k1, k2 = H ± √(H² − K)) by SI = 1/2 − (1/π)·arctan((k1 + k2)/(k1 − k2)) and CV = √((k1² + k2²)/2), up to the sign convention chosen for the curvatures.
- The shape feature amount calculation unit 22c of the CPU 22 calculates the Shape Index SI and Curvedness CV of each three-dimensional curved surface as three-dimensional shape information and stores them in the analysis information storage unit 25.
- The Shape Index value indicates the state of unevenness at each data point of the three-dimensional model, expressed as a numerical value in the range 0 to 1. Specifically, at each data point of the three-dimensional model, a Shape Index value close to 0 suggests the presence of a concave shape, and a Shape Index value close to 1 suggests the presence of a convex shape.
- The Curvedness value indicates the curvature at each data point of the three-dimensional model. Specifically, at each data point of the three-dimensional model, a smaller Curvedness value suggests a more gently curved surface, and a larger Curvedness value suggests a more sharply curved surface.
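- A small sketch of this computation using the standard Shape Index / Curvedness definitions; the sign convention is chosen so that convex bumps give SI near 1, and the thresholds in the second function are placeholders, not the patent's T1 and T2:

```python
import numpy as np

def shape_index_curvedness(K, H):
    """Shape Index SI in [0, 1] and Curvedness CV at each data point,
    from the Gaussian curvature K and mean curvature H.  Principal
    curvatures: k1, k2 = H +/- sqrt(H^2 - K)."""
    root = np.sqrt(np.maximum(H * H - K, 0.0))
    k1, k2 = H + root, H - root
    si = 0.5 + (1.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
    cv = np.sqrt((k1 * k1 + k2 * k2) / 2.0)
    return si, cv

def detect_raised_points(K, H, t1=0.9, t2=0.05):
    """Step S6 in miniature: keep points whose Shape Index and
    Curvedness both exceed their thresholds (illustrative values)."""
    si, cv = shape_index_curvedness(K, H)
    return (si > t1) & (cv > t2)
```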
- In step S6, the three-dimensional shape detection unit 22e of the CPU 22 compares the Shape Index value and the Curvedness value at each data point in the target region of the three-dimensional model with preset threshold values, and detects the matching data points as a data group having a raised shape.
- For example, the CPU 22 detects, from the data points in the processing target region of the three-dimensional model, the plurality of data points whose Shape Index value is larger than a threshold T1 and whose Curvedness value is larger than a threshold T2 as a data group having a raised shape.
- In step S7, the polyp determination unit 22f of the CPU 22 performs raised shape discrimination processing to discriminate whether each of the plurality of data points detected in the three-dimensional model as the data group having the raised shape is a data point corresponding to a raised shape derived from a lesion such as a polyp.
- In step S8, the polyp determination unit 22f of the CPU 22 determines a region having a data group consisting of data points corresponding to the raised shape derived from a lesion to be a polyp region, thereby detecting the polyp as a lesion region.
- The CPU 22 stores the detection result, in association with the endoscopic image being examined, in the detected lesion part storage area 27e of the hard disk 27 of FIG. 3, and the result is displayed, for example, side by side with that endoscopic image on the monitor 4 via the display processing unit 28.
- In this way, the monitor 4 displays an image of the three-dimensional model of the subject so that the user can easily recognize the position where a raised shape derived from a lesion such as a polyp exists.
- Next, the model correction processing performed by the model correction unit 22d in step S2 will be described.
- In step S21, the model correction unit 22d of the CPU 22 performs edge extraction processing on the input two-dimensional image to generate an edge image, and stores it in the edge image storage area 27a of the hard disk 27.
- the model correction unit 22d performs thinning processing on the generated edge image to generate an edge thinned image, and stores it in the edge thinned image storage area 27b of the hard disk 27.
- the edge thinned image is an image obtained by thinning each edge region of the edge image to a line width of 1 pixel by thinning processing.
- In step S23, the model correction unit 22d of the CPU 22 initializes the parameters i and j.
- In step S24, the model correction unit 22d of the CPU 22 obtains the i-th edge line Li of the edge thinned image, and in step S25 it obtains the j-th edge point Pi,j (the point of interest) on the edge line Li.
- In step S26, the model correction unit 22d of the CPU 22 generates an edge orthogonal line Hi,j that passes through the j-th edge point Pi,j and is orthogonal to the i-th edge line Li. Subsequently, in step S27, it acquires from the correspondence table the three-dimensional data point sequence corresponding to the points (xi,j, yi,j, zi,j) on the edge orthogonal line Hi,j in the two-dimensional image. As described above, this correspondence table was stored in the correspondence table storage area 27d of the hard disk 27 by the three-dimensional model estimation unit 22a of the CPU 22 in step S1.
- In step S28, the model correction unit 22d of the CPU 22 determines the edge end points Ai,j and Bi,j (see the enlarged view of FIG. 6) on the edge orthogonal line Hi,j. Specifically, in this embodiment, as shown in FIG. 7, the pixel value at each point on the two-dimensional image is acquired, and a point is judged to be on the edge according to whether its pixel value satisfies a predetermined threshold; the edge end points Ai,j and Bi,j are obtained by applying this judgment one way in each direction outward from the point on the thinned edge.
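- A minimal sketch of this end point search, walking outward from the thinned-edge point along the orthogonal direction while the pixels still satisfy the edge test; the assumption that edge pixels lie below the threshold follows the earlier observation that edge regions image dark, and all names are illustrative:

```python
import numpy as np

def find_edge_endpoints(image, p, direction, thresh, max_steps=50):
    """Walk one way in each direction (+/- the unit vector `direction`
    orthogonal to the edge line) from thinned-edge point p = (x, y),
    while pixel values stay below `thresh` (i.e. still in the dark
    edge band).  The last such pixels on the two sides are A and B."""
    h, w = image.shape
    endpoints = []
    for sign in (+1, -1):
        last = np.asarray(p, dtype=float)
        for step in range(1, max_steps):
            q = np.asarray(p) + sign * step * np.asarray(direction)
            qi = np.round(q).astype(int)
            if not (0 <= qi[0] < w and 0 <= qi[1] < h):
                break                      # ran off the image
            if image[qi[1], qi[0]] >= thresh:
                break                      # left the dark edge band
            last = q
        endpoints.append(last)
    return endpoints[0], endpoints[1]      # A_ij, B_ij
```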
- In step S29, the model correction unit 22d of the CPU 22 calculates the distance Di,j between the edge end points Ai,j and Bi,j. Then, in step S30, it determines whether the distance Di,j is smaller than a predetermined value D0 (Di,j < D0).
- If the distance Di,j is smaller than the predetermined value D0, the model correction unit 22d of the CPU 22 determines in step S31 that the area between the edge end points Ai,j and Bi,j is a portion expressed as an edge region, and corrects the three-dimensional point sequence data. Details of the correction processing of the three-dimensional point sequence data in step S31 will be described later.
- On the other hand, if the distance Di,j is equal to or greater than the predetermined value D0, the model correction unit 22d of the CPU 22 determines in step S32 that the area between the edge end points Ai,j and Bi,j is a portion expressed as an occlusion area, deletes the three-dimensional point sequence data between the edge end points Ai,j and Bi,j, and proceeds to step S33.
- That is, as shown in FIG. 8, the three-dimensional point sequence data between the edge end points Ai,j and Bi,j on the three-dimensional point sequence data line representing the surface is deleted as shown in FIG. 9, generating a corrected data line in which that section is treated as an occlusion area.
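- Putting steps S30 to S32 together, the per-point correction rule reduces to a width test; a sketch under the same assumptions as above, where `width` is the distance Di,j computed between the end points on the image, `points_3d` is a list of 3D points along the edge orthogonal line, and `d0` and `smooth_fn` are placeholders:

```python
def correct_between_endpoints(points_3d, a_idx, b_idx, width, d0, smooth_fn):
    """Narrow edge (width < d0): treat as a spurious groove in an edge
    region and smooth it (step S31).  Wide edge: treat as an occlusion
    boundary and delete the points strictly between A and B (step S32).
    Assumes a_idx < b_idx index the end points A and B in points_3d."""
    if width < d0:
        return smooth_fn(points_3d, a_idx, b_idx)
    return points_3d[:a_idx + 1] + points_3d[b_idx:]
```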
- In step S33, the model correction unit 22d of the CPU 22 determines whether the parameter j is less than the number Nj of all points on the edge line Li. If so, the parameter j is incremented in step S34 and the process returns to step S25; steps S25 to S34 are thus repeated for all points on the edge line Li.
- Similarly, in step S35 the model correction unit 22d of the CPU 22 determines whether the parameter i is less than the number Ni of all edge lines. If so, the parameter i is incremented in step S36 and the process returns to step S24; steps S24 to S36 are thus repeated for all edge lines.
- Next, the correction processing of the three-dimensional point sequence data in step S31 will be described.
- In step S41, the model correction unit 22d of the CPU 22 forms, in the three-dimensional space containing the three-dimensional point sequence data, an N×N×N cubic region centered on the edge point Pi,j (the point of interest), and calculates the average value of the coordinates of the three-dimensional point sequence data included in this cubic region.
- The model correction unit 22d of the CPU 22 then smoothes the coordinate data of the three-dimensional point sequence data on the point sequence data line of the edge portion using the calculated average value.
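- A compact sketch of this N×N×N box-average smoothing over a 3D point set; the neighborhood test and the fallback for an empty cube are illustrative choices:

```python
import numpy as np

def smooth_point_nxnxn(points, center, n):
    """Step S41 in miniature: average the coordinates of all 3D points
    lying inside an n x n x n cube centered on the point of interest,
    and return that average as its smoothed replacement."""
    pts = np.asarray(points, dtype=np.float64)
    c = np.asarray(center, dtype=np.float64)
    inside = np.all(np.abs(pts - c) <= n / 2.0, axis=1)
    return pts[inside].mean(axis=0) if inside.any() else c
```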
- The correction processing of the three-dimensional point sequence data is not limited to the process shown in FIG. 10; the three-dimensional point sequence data may instead be corrected as in the following modifications 1 and 2.
- In modification 1, the model correction unit 22d of the CPU 22 calculates the average value of the pixel values (tone values) of the coordinate points of the two-dimensional image included in a 5×5 square area.
- In modification 2, as shown in FIG. 15, the model correction unit 22d of the CPU 22 in step S41b obtains approximate straight lines from the three-dimensional point sequence data on the sides of the edge end points Ai,j and Bi,j, together with their intersection point Qi,j.
- In step S42b, the model correction unit 22d of the CPU 22 compares, for example, the y coordinate of each point in the edge region with the y coordinate of the intersection point Qi,j, and determines the approximate straight line used for correction according to the magnitude relationship. Specifically, as shown in FIG. 16, when the y coordinate of a point in the edge region is smaller than the y coordinate of the intersection Qi,j, the approximate straight line fitted using coordinate values smaller than the y coordinate of Qi,j (in FIG. 16, the approximate straight line on the edge end point Ai,j side) is used, and when the y coordinate of a point in the edge region is larger than the y coordinate of the intersection Qi,j, the approximate straight line fitted using coordinate values larger than the y coordinate of Qi,j (in FIG. 16, the approximate straight line on the edge end point Bi,j side) is used.
- In this way, the threshold value is corrected using the position (z coordinate) of the point of interest in the three-dimensional data.
- As a result, a threshold that excludes the influence of secondary light can be used in the polyp detection processing, and the detection accuracy of polyp candidates can be improved. This in turn helps the user improve the polyp candidate discovery rate in colonoscopy.
- FIGS. 17 to 25 relate to the second embodiment of the present invention.
- In the second embodiment, as shown in FIG. 17, the hard disk 27 includes, in addition to the edge image storage area 27a, the edge thinned image storage area 27b, the three-dimensional point sequence data storage area 27c, the correspondence table storage area 27d, and the detected lesion part storage area 27e, a morphology parameter map storage area 27f.
- Other configurations are the same as those in the first embodiment.
- This morphological transformation is a transformation method that outputs the locus of the center of a sphere as the sphere is rolled over the three-dimensional surface: dilation processing rolls the sphere over the intestinal tract surface, and erosion processing rolls the sphere along the back of the intestinal tract surface.
- The diameter of the sphere used in the morphological transformation is determined by the magnitude of the noise to be removed. For example, as shown in FIG. 18, when a sphere 300a with a diameter of 5 is rolled over a surface containing noise with a width of 1 pixel, the locus of the center is a straight line, so the noise can be filled in by rolling a sphere of the same size along the back side of the surface formed by that locus.
- Ordinary noise is small in size, but the size of spike noise or edge noise occurring at an edge depends on the thickness of the edge. Therefore, at positions where an edge exists in the two-dimensional image, it is necessary to change the diameter of the sphere used for smoothing, that is, the smoothing parameter.
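- As a rough sketch of such edge-width-dependent smoothing, grayscale closing (dilation followed by erosion) of the depth map with a disk-shaped structuring element plays the role of the rolling sphere; the scipy calls are real, but applying closing per width value and the width-to-radius mapping are assumptions of this sketch:

```python
import numpy as np
from scipy import ndimage

def disk(radius):
    """Boolean disk used as the structuring element (the cross-section
    of the 'rolling ball') for grayscale morphology on a depth map."""
    r = int(radius)
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    return yy**2 + xx**2 <= r**2

def smooth_grooves(depth, edge_width_map, base_radius=1):
    """Grayscale closing of the depth map, with a larger ball where the
    2D edge is thicker, so a spurious groove is fully filled in."""
    out = depth.copy()
    for w in np.unique(edge_width_map[edge_width_map > 0]):
        radius = base_radius + int(w)     # assumed width -> radius mapping
        closed = ndimage.grey_closing(depth, footprint=disk(radius))
        sel = edge_width_map == w
        out[sel] = closed[sel]
    return out
```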
- In the second embodiment, after the model correction processing in step S2 described in the first embodiment, the model correction unit 22d of the CPU 22 executes morphological processing with the morphology transformation parameter W in step S10, and then proceeds to step S3 described in the first embodiment.
- As shown in FIG. 22, in steps S100 and S101 the model correction unit 22d of the CPU 22 divides the three-dimensional space shown in FIG. 23 into cubic small regions at regular intervals and creates a map that can store a morphology parameter in each small region; a correspondence table between the edge line width Di,j shown in FIG. 24 and the morphology transformation parameter Wi,j shown in FIG. 25 is stored in the morphology parameter map storage area 27f of the hard disk 27.
- In step S100, the model correction unit 22d selects one edge of the edge thinned image as a processing target and calculates its edge line width Di,j. Subsequently, the model correction unit 22d obtains the morphology transformation parameter W using the correspondence table of the edge line width Di,j and the morphology transformation parameter Wi,j shown in FIG. 25.
- In step S101, the model correction unit 22d acquires, from the correspondence table recorded on the hard disk 27, the three-dimensional coordinates corresponding to the selected point on the edge line processed in step S100, and obtains the coordinate position of the three-dimensional small region map corresponding to those three-dimensional coordinates.
- The model correction unit 22d then adds the morphology transformation parameter W to the value stored at that coordinate position of the three-dimensional small region map, and increments the count value at that position by one.
- The edge line width Di,j indicates the width of the corresponding edge in the edge image when a line orthogonal to the edge line is drawn through a point on the thinned edge of the edge thinned image.
- The morphology transformation parameter Wi,j indicates the diameter of the sphere used at that edge in the edge image (see FIGS. 18 to 20).
- In step S10, the model correction unit 22d determines the morphology transformation parameter W for the morphological transformation based on the three-dimensional coordinates, and executes noise removal processing by successively performing the dilation processing and erosion processing that constitute the morphological transformation.
- The morphology transformation parameter W at each coordinate position of the three-dimensional small region map is averaged by dividing by the count value at that coordinate position. Through this processing, the optimum smoothing parameter can be obtained by referring to the three-dimensional small region map based on the three-dimensional coordinate position.
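- A small sketch of this accumulate-and-average parameter map; the cell size, dictionary layout, and default value are illustrative assumptions:

```python
import numpy as np
from collections import defaultdict

class MorphParamMap:
    """3D space divided into cubic small regions: each cell accumulates
    the morphology parameters W added to it plus a count, and lookup
    returns their mean as the smoothing parameter for that cell."""
    def __init__(self, cell_size=1.0):
        self.cell_size = cell_size
        self.w_sum = defaultdict(float)
        self.count = defaultdict(int)

    def _cell(self, point):
        p = np.asarray(point, dtype=float) / self.cell_size
        return tuple(np.floor(p).astype(int))

    def add(self, point, w):
        c = self._cell(point)
        self.w_sum[c] += w
        self.count[c] += 1

    def lookup(self, point, default=1.0):
        c = self._cell(point)
        n = self.count.get(c, 0)
        return self.w_sum[c] / n if n else default
```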
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Endoscopes (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to an endoscope system substantially constructed from a medical observation device, a medical image processing device, and a monitor. A central processing unit (22) of the medical image processing device has the following functional sections: a three-dimensional model estimation section (22a), a detection target region setting section (22b), a shape feature amount calculation section (22c) as shape feature amount calculation means, a model correction section (22d), a three-dimensional shape detection section (22e), and a polyp decision section (22f).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2006-279236 | 2006-10-12 | ||
| JP2006279236A JP2008093213A (ja) | 2006-10-12 | 2006-10-12 | Medical image processing apparatus and medical image processing method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2008044365A1 (fr) | 2008-04-17 |
Family
ID=39282576
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2007/061628 Ceased WO2008044365A1 (fr) | 2007-06-08 | 2006-10-12 | Medical image processing device and medical image processing method |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2008093213A (fr) |
| WO (1) | WO2008044365A1 (fr) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2008136098A1 (fr) * | 2007-04-24 | 2008-11-13 | Olympus Medical Systems Corp. | Medical image processing device and medical image processing method |
| JP5658931B2 (ja) * | 2010-07-05 | 2015-01-28 | オリンパス株式会社 | Image processing device, image processing method, and image processing program |
- 2006-10-12: JP JP2006279236A patent/JP2008093213A/ja active Pending
- 2007-06-08: WO PCT/JP2007/061628 patent/WO2008044365A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11337845A (ja) * | 1998-05-25 | 1999-12-10 | Mitsubishi Electric Corp | 内視鏡装置 |
| JP2005506140A (ja) * | 2001-10-16 | 2005-03-03 | ザ・ユニバーシティー・オブ・シカゴ | コンピュータ支援の3次元病変検出方法 |
| JP2005192880A (ja) * | 2004-01-08 | 2005-07-21 | Olympus Corp | 画像処理方法 |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112788978A (zh) * | 2019-03-28 | 2021-05-11 | Hoya Corporation | Endoscope processor, information processing device, endoscope system, program, and information processing method |
| US11869183B2 (en) | 2019-03-28 | 2024-01-09 | Hoya Corporation | Endoscope processor, information processing device, and endoscope system |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2008093213A (ja) | 2008-04-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8165370B2 (en) | Medical image processing apparatus and medical image processing method | |
| US8515141B2 (en) | Medical image processing apparatus and method for detecting locally protruding lesion | |
| US7830378B2 (en) | Medical image processing apparatus and medical image processing method | |
| JP5276225B2 (ja) | Medical image processing apparatus and method of operating medical image processing apparatus | |
| EP1994876A1 (fr) | Dispositif et procédé de traitement vidéo d'image médicale | |
| US8165367B2 (en) | Medical image processing apparatus and medical image processing method having three-dimensional model estimating | |
| WO2007119297A1 (fr) | Image processing device for medical use and image processing method for medical use | |
| US8121369B2 (en) | Medical image processing apparatus and medical image processing method | |
| JPWO2008136098A1 (ja) | Medical image processing apparatus and medical image processing method | |
| WO2012153568A1 (fr) | Medical image processing device and medical image processing method | |
| EP1992273B1 (fr) | Dispositif et procede de traitement d'images medicales | |
| WO2008044365A1 (fr) | Medical image processing device and medical image processing method | |
| EP1992274B1 (fr) | Dispositif et procede de traitement d'images medicales | |
| JP5148096B2 (ja) | Medical image processing apparatus and method of operating medical image processing apparatus | |
| JP2008023266A (ja) | Medical image processing apparatus and medical image processing method | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07767068; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 07767068; Country of ref document: EP; Kind code of ref document: A1 |