WO2013061939A1 - Endoscope apparatus and focus control method - Google Patents
Endoscope apparatus and focus control method
- Publication number
- WO2013061939A1 (PCT/JP2012/077282)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focus
- target
- unit
- focus position
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2423—Optical details of the distal end
- G02B23/243—Objectives for endoscopes
- G02B23/2438—Zoom objectives
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00188—Optical arrangements with focusing or zooming features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/007—Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light
- G02B26/008—Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light in the form of devices for effecting sequential colour changes, e.g. colour wheels
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
Definitions
- the present invention relates to an endoscope apparatus, a focus control method, and the like.
- an imaging apparatus such as an endoscope
- a pan-focus image is required in order not to interfere with the doctor's diagnosis.
- the endoscope achieves such performance by using an optical system having a relatively large F number to increase the depth of field.
- in recent years, image sensors with high pixel counts on the order of several hundred thousand pixels have come into use.
- because the permissible circle of confusion shrinks along with the pixel pitch in such high-pixel-count sensors, the F-number must be reduced, and the depth of field of the imaging apparatus becomes narrow.
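- the relationship above can be made concrete with the standard thin-lens depth-of-field approximation; the formula choice and all numeric values below are illustrative assumptions, not taken from this publication.

```python
def depth_of_field(f_number, coc_mm, magnification):
    # Standard thin-lens approximation: DOF ~= 2*N*c*(m+1)/m^2,
    # where N is the F-number, c the permissible circle of confusion,
    # and m the imaging magnification (all values illustrative).
    return 2 * f_number * coc_mm * (magnification + 1) / magnification ** 2

# Halving the permissible circle of confusion (a finer pixel pitch)
# halves the depth of field at the same F-number and magnification.
dof_coarse_pitch = depth_of_field(8, 0.006, 0.1)
dof_fine_pitch = depth_of_field(8, 0.003, 0.1)
assert abs(dof_coarse_pitch / dof_fine_pitch - 2.0) < 1e-9
```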
- an endoscope apparatus has been proposed that provides an in-focus object position driving unit for driving the in-focus object position of the objective optical system in the imaging unit of the endoscope and performs autofocus (hereinafter referred to as AF) on the subject.
- according to some aspects of the present invention, an endoscope apparatus and a focus control method can be provided that easily and quickly perform a focusing operation on a region of interest by performing focus control based on discrete in-focus object position control after the region of interest is detected.
- one aspect of the present invention relates to an endoscope apparatus including: an imaging unit that acquires an image signal; a switching unit that switches an in-focus object position, which is determined by the state of the imaging unit, to any one of a plurality of discretely set target in-focus position candidates; and a focus control unit that selects one of the plurality of target in-focus position candidates as a target in-focus position and controls the switching unit so that the selected target in-focus position becomes the in-focus object position.
- the focus control unit includes a target region detection unit that detects a region of interest on the subject from the image signal, and determines whether or not the detected region of interest is in focus. When the region of interest is determined to be in focus, the target in-focus position candidate corresponding to the timing of that determination is selected as the target in-focus position; when it is determined not to be in focus, the switching unit is controlled so that the in-focus object position is switched to a target in-focus position candidate different from the candidate corresponding to the timing of that determination.
- in this aspect, control is performed so that the in-focus object position corresponding to the state of the imaging unit becomes a target in-focus position selected from a plurality of discrete target in-focus position candidates.
- the target in-focus position candidate to be used as the target in-focus position is selected based on the determination of whether or not the region of interest detected from the image signal is in focus.
- another aspect of the present invention relates to a focus control method for an objective optical system in which a plurality of target in-focus position candidates are discretely set. An image signal is acquired from an imaging unit, a region of interest on the subject is detected from the image signal, and it is determined whether or not the detected region of interest is in focus. When the region of interest is determined to be in focus, the target in-focus position candidate corresponding to the timing of that determination is selected as the target in-focus position; when it is determined not to be in focus, the in-focus object position is switched to a target in-focus position candidate different from the candidate corresponding to the timing of that determination.
- FIG. 1 is a configuration example of an endoscope apparatus according to the present embodiment.
- FIG. 2 is a configuration example of a rotating color filter.
- FIG. 3 shows an example of spectral characteristics of the rotating color filter.
- FIG. 4 is a configuration example of the image processing unit.
- FIG. 5 is a configuration example of the focus control unit.
- FIG. 6 is a configuration example of a contrast value calculation unit.
- FIG. 7 shows a configuration example of the attention area detection unit.
- FIG. 8 shows an example of setting an evaluation area by dividing an area.
- FIG. 9 shows another configuration example of the focus control unit.
- FIG. 10 is a diagram for explaining a relationship between an imaging unit and a subject at the time of screening, a point away from the imaging unit corresponding to a focused object position by the best subject distance, and a range in which focus is achieved in the depth direction in the object scene.
- FIG. 11 is a diagram for explaining the relationship between the imaging unit and the subject at the time of close-up observation, a point away from the imaging unit corresponding to the focused object position by the best subject distance, and a range focused in the depth direction in the object scene.
- FIG. 12 shows an example of four-focus switching.
- FIG. 13 shows an example of bifocal switching.
- FIG. 14 is a flowchart for explaining a focused object position switching process in the first embodiment.
- FIG. 15 is a flowchart illustrating focused object position switching processing according to the second embodiment.
- the depth of field tends to be further reduced by increasing the imaging magnification of the imaging unit or shortening the distance from the imaging unit to the subject.
- because the focusing operation is very difficult for the user under such conditions, it is desirable that the system perform the focusing operation automatically.
- the normal AF has a structure that continuously switches the in-focus object position (for example, a structure that continuously switches the focus lens position). For this reason, the degree of freedom of selection of the target in-focus position is high and flexible handling is possible, but on the other hand, the time required for setting the target in-focus position becomes long, and in some cases, the mechanism becomes complicated. If the mechanism is complicated, the imaging unit itself becomes large, which is not preferable in an endoscope apparatus in which the imaging unit is inserted into a living body.
- the in-focus object position is the relative position (object point) of the object with respect to a reference position when the system including the object, the optical system, the image plane, and the like is in a focused state.
- in this embodiment, the image plane coincides with the plane of the imaging element included in the imaging unit; therefore, when the plane of the imaging element is fixed, the in-focus object position is determined by the state of the optical system.
- the target in-focus position is a position where it is desired to focus at a certain point in time (becomes an in-focus target).
- it indicates the relative position (object point) of the object to be focused on at a certain point in time.
- examples of the reference position include the position of the surface of the image sensor and the position of the tip of the optical system.
- the present applicant proposes a method for discretely controlling the focused object position. For example, one of a plurality of target focus position candidates set discretely may be selected as the target focus position, and control may be performed so that the selected target focus position becomes the focus object position.
- since processing (for example, contrast value calculation processing) only needs to be performed for a limited number of candidate positions, the focusing operation can be executed at high speed.
- the mechanism can be simplified compared with the case of continuous in-focus object position control, and downsizing of the imaging unit can be expected.
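- the speed advantage of discrete control can be sketched as follows: the contrast value only has to be evaluated at a handful of candidates instead of along a continuous sweep. The function name and the contrast readings are hypothetical.

```python
def select_target_focus_position(candidates, contrast_at):
    # Evaluate the contrast value only at the discrete candidates and
    # pick the best one; a continuous mechanism would have to sweep.
    return max(candidates, key=contrast_at)

# Hypothetical contrast readings at four discrete lens positions A-D.
readings = {"A": 12.0, "B": 35.5, "C": 28.1, "D": 9.7}
best = select_target_focus_position(list(readings), readings.get)
assert best == "B"  # found with only four evaluations
```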
- with this configuration, the mechanism can be simplified as described above. Also, when none of the target in-focus position candidates can be determined to be in focus (for example, when the contrast value is small), unlike the first embodiment, the determination result of the determination unit 325 is used to determine the target in-focus position.
- FIG. 1 shows a configuration example of an endoscope apparatus according to a first embodiment.
- the endoscope apparatus includes a light source unit 100, an imaging unit 200, a control device 300 (processor unit), a display unit 400, and an external I / F unit 500.
- the light source unit 100 includes a white light source 110, a light source stop 120, a light source stop driving unit 130 for driving the light source stop 120, and a rotating color filter 140 having a plurality of spectral transmittance filters.
- the light source unit 100 also includes a rotation driving unit 150 that drives the rotation color filter 140 and a condenser lens 160 that condenses the light transmitted through the rotation color filter 140 onto the incident end face of the light guide fiber 210.
- the light source aperture driving unit 130 adjusts the amount of light by opening and closing the light source aperture 120 based on a control signal from the control unit 340 of the control device 300.
- FIG. 2 shows a detailed configuration example of the rotation color filter 140.
- the rotation color filter 140 includes a red (hereinafter abbreviated as R) filter 701, a green (hereinafter abbreviated as G) filter 702, and a blue (hereinafter abbreviated as B) filter 703 for the three primary colors, and a rotary motor 704.
- FIG. 3 shows an example of spectral characteristics of these color filters 701 to 703.
- the rotation driving unit 150 rotates the rotation color filter 140 at a predetermined number of rotations in synchronization with the imaging period of the imaging element 260 based on a control signal from the control unit 340. For example, if the rotating color filter 140 is rotated 20 times per second, each color filter crosses incident white light at 1/60 second intervals. In this case, the image sensor 260 completes imaging and transfer of image signals at 1/60 second intervals.
- the image sensor 260 is, for example, a monochrome single-plate image sensor, and is configured by, for example, a CCD or a CMOS image sensor. That is, in the present embodiment, frame sequential imaging is performed in which images of light of each of the three primary colors (R, G, or B) are captured at 1/60 second intervals.
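- the 1/60-second interval follows directly from the rotation rate and the three filter sectors; a quick arithmetic check:

```python
rotations_per_second = 20  # rotation rate of the color filter wheel
filters_per_rotation = 3   # R, G, and B sector filters

exposures_per_second = rotations_per_second * filters_per_rotation
interval_s = 1 / exposures_per_second  # time between color exposures

assert exposures_per_second == 60
assert interval_s == 1 / 60
```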
- the imaging unit 200 is formed to be elongate and bendable, for example, to enable insertion into a body cavity.
- the imaging unit 200 includes a light guide fiber 210 for guiding the light condensed by the light source unit 100, and an illumination lens 220 that diffuses the light guided to the tip by the light guide fiber 210 and irradiates the observation target.
- the imaging unit 200 also includes an objective lens 230 that collects the reflected light returning from the observation target, a focus lens 240 for adjusting the in-focus object position, a switching unit 250 that switches the position of the focus lens 240 among discrete positions, and an image sensor 260 for detecting the collected reflected light.
- the switching unit 250 is, for example, a VCM (Voice Coil Motor) and is connected to the focus lens 240.
- the switching unit 250 adjusts the focused object position by switching the position of the focus lens 240 at a plurality of discrete positions.
- FIG. 12 shows the relationship between the position of the focus lens 240 and the best subject distance corresponding to the focused object position at this time in the present embodiment.
- the best subject distance is the distance, in the depth direction of the object scene, from the imaging unit 200 to the subject when the subject image is focused on the image sensor (ideally, when a light beam emitted from one point on the subject converges to one point on the image sensor). That is, the best subject distance corresponds to the distance from the imaging unit 200 to the in-focus object position.
- the “subject distance” in the present embodiment refers to a distance from the imaging unit 200 to the subject, and is not limited to a distance from a point having optical characteristics such as a principal point and a focal point.
- the focus lens 240 takes discrete positions A, B, C, and D, and switches the best subject distance corresponding to the focused object position at this time in four stages.
- the points away from the imaging unit by the best subject distance correspond one-to-one to the focus lens positions A to D, in order from the point close to the imaging unit 200 (hereinafter, the near point) to the point far away from it (hereinafter, the far point).
- the depth of field required for endoscopic observation is achieved by switching the position of the focus lens 240.
- the in-focus range includes a range of 2 to 70 mm from the imaging unit.
- when the focus lens is at a position that focuses on the near point, as shown in FIG. 11, the setting is suitable for close observation of a subject with little depth.
- when the point away from the imaging unit by the best subject distance is on the far-point side, the depth of field becomes deep; therefore, as shown in FIG. 10, the setting is suitable for screening a luminal subject.
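- the four-position scheme can be sketched as a lookup table; the millimetre values below are made-up placeholders (the publication only states an overall in-focus range of about 2 to 70 mm), and position A is taken as the far point as in FIG. 12.

```python
# Illustrative mapping only: positions A-D correspond one-to-one to best
# subject distances, with A taken as the far point (position A in FIG. 12)
# and D as the near point. The distances are placeholders within the
# stated 2-70 mm overall in-focus range.
BEST_SUBJECT_DISTANCE_MM = {"A": 65.0, "B": 30.0, "C": 12.0, "D": 4.0}

def lens_position_for(mode):
    # Screening a lumen favors the far point (deepest depth of field);
    # close-up observation favors the near point.
    return "A" if mode == "screening" else "D"

assert lens_position_for("screening") == "A"
assert lens_position_for("close-up") == "D"
assert BEST_SUBJECT_DISTANCE_MM["A"] > BEST_SUBJECT_DISTANCE_MM["D"]
```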
- the control device 300 controls each part of the endoscope device and performs image processing.
- the control device 300 includes an A / D conversion unit 310, a focus control unit 320, an image processing unit 330, and a control unit 340.
- the image signal converted into a digital signal by the A / D conversion unit 310 is transferred to the image processing unit 330.
- the image signal processed by the image processing unit 330 is transferred to the focus control unit 320 and the display unit 400.
- the focus control unit 320 changes the position of the focus lens 240 by transferring a control signal to the switching unit 250.
- the control unit 340 controls each unit of the endoscope apparatus. Specifically, the control unit 340 synchronizes the light source aperture driving unit 130, the focus control unit 320, and the image processing unit 330. Further, it is connected to the external I / F unit 500, and controls the focus control unit 320 and the image processing unit 330 based on the input from the external I / F unit 500.
- the display unit 400 is a display device capable of displaying a moving image, and includes, for example, a CRT or a liquid crystal monitor.
- the external I / F unit 500 is an interface for performing an input from an operator to the endoscope apparatus.
- the external I / F unit 500 includes, for example, a power switch for turning on / off the power, a mode switching button for switching a photographing mode and other various modes.
- the external I / F unit 500 transfers the input information to the control unit 340.
- FIG. 4 shows a detailed configuration example of the image processing unit 330 according to the first embodiment.
- the image processing unit 330 includes a preprocessing unit 331, a synchronization unit 332, and a postprocessing unit 333.
- the A / D conversion unit 310 is connected to the preprocessing unit 331.
- the preprocessing unit 331 is connected to the synchronization unit 332.
- the synchronization unit 332 is connected to the post-processing unit 333 and the focus control unit 320.
- the post-processing unit 333 is connected to the display unit 400.
- the control unit 340 is bi-directionally connected to the pre-processing unit 331, the synchronization unit 332, and the post-processing unit 333, and performs these controls.
- the preprocessing unit 331 performs OB clamp processing, gain correction processing, and WB correction processing on the image signal input from the A/D conversion unit 310, using the OB clamp value, gain correction value, and WB coefficient value stored in advance in the control unit 340.
- the preprocessing unit 331 transfers the preprocessed image signal to the synchronization unit 332.
- the synchronization unit 332 synchronizes the frame-sequential image signals processed by the preprocessing unit 331, based on the control signal of the control unit 340. Specifically, the synchronization unit 332 accumulates the image signals of each color light (R, G, or B) input in frame order, one frame at a time, and simultaneously reads out the accumulated image signals of the color lights. The synchronization unit 332 transfers the synchronized image signal to the post-processing unit 333 and the focus control unit 320.
- the post-processing unit 333 performs tone conversion processing, color processing, and contour enhancement processing on the synchronized image signal, using the tone conversion coefficient, color conversion coefficient, and edge enhancement coefficient stored in advance in the control unit 340.
- the post-processing unit 333 transfers the post-processed image signal to the display unit 400.
- FIG. 5 shows a detailed configuration example of the focus control unit 320 in the first embodiment.
- the focus control unit 320 includes an attention area detection unit 321 (abnormal part detection unit), an area setting unit 322, a contrast value calculation unit 323, and a switching control unit 324.
- the image processing unit 330 is connected to the attention area detection unit 321 and the area setting unit 322.
- the attention area detection unit 321 is connected to the area setting unit 322.
- the region setting unit 322 is connected to the contrast value calculation unit 323.
- the contrast value calculation unit 323 is connected to the switching control unit 324.
- the switching control unit 324 is connected to the switching unit 250.
- the control unit 340 is connected to the attention area detection unit 321, the area setting unit 322, the contrast value calculation unit 323, and the switching control unit 324, and performs these controls.
- the attention area detection unit 321 detects the attention area of the subject from the image signal transferred from the image processing unit 330.
- the attention area detection unit 321 transfers area information indicating the position of the attention area to the area setting unit 322.
- when no attention area is detected, a control signal indicating that the attention area does not exist is transferred to the switching control unit 324.
- the attention area may be an abnormal part representing a lesion part or the like.
- the attention area detection unit 321 is realized as an abnormal part detection unit. The processing performed by the abnormal part detection unit will be described in detail later.
- the region setting unit 322 sets an evaluation region for calculating a contrast value for the image signal transferred from the image processing unit 330 when the attention region is detected by the attention region detection unit 321.
- the attention area may be set as the evaluation area as it is, or, as shown in FIG. 8, the image signal may be divided in advance into a plurality of areas and the area containing the largest part of the attention area set as the evaluation area. Thereafter, the region setting unit 322 transfers the set evaluation region information and the image signal to the contrast value calculation unit 323.
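- the FIG. 8 variant can be sketched as follows; the grid size and the pixel-list representation of the attention area are assumptions for illustration.

```python
# Sketch: divide the image into a fixed grid and pick the cell that
# overlaps the attention area the most (the FIG. 8 variant).
def pick_evaluation_cell(attention_pixels, width, height, grid=4):
    counts = {}
    for (x, y) in attention_pixels:
        cell = (x * grid // width, y * grid // height)
        counts[cell] = counts.get(cell, 0) + 1
    # Return the grid cell containing the most attention pixels.
    return max(counts, key=counts.get)

# A small 8x8 image: attention pixels concentrated in the top-left quadrant.
pixels = [(0, 0), (1, 0), (1, 1), (6, 7)]
assert pick_evaluation_cell(pixels, 8, 8, grid=2) == (0, 0)
```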
- the contrast value calculation unit 323 calculates the contrast value of the evaluation area from the evaluation area information and the image signal.
- the contrast value may be calculated for an arbitrary channel of the image signal transferred from the region setting unit 322.
- a luminance signal may be generated from pixel values of three channels of R, G, and B, and a contrast value may be calculated for the pixel value of the generated luminance signal.
- the contrast value calculation unit 323 may perform arbitrary high-pass filter processing on all the pixels included in the evaluation region and calculate the contrast value by summing the high-pass filter output values of all the pixels. Further, for example, as shown in FIG. 6, the contrast value calculation unit 323 may include a bright spot removal unit 3231 before the high-frequency extraction unit 3232 that performs the high-pass filter processing.
- the bright spot removal unit 3231 performs threshold processing on an arbitrary channel, or on the pixel values of a luminance signal, for all the pixels included in the evaluation region, and determines that a pixel whose pixel value is equal to or larger than the threshold is a bright spot.
- for pixels determined to be bright spots, the influence of the bright spots on the contrast value can be reduced by transferring a control signal that sets the output value of the subsequent high-pass filter processing to 0. Thereafter, the contrast value calculation unit 323 transfers the contrast value of the evaluation region to the switching control unit 324.
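- the bright-spot-suppressed contrast calculation can be sketched on a single row of pixels; the threshold and the 1-D Laplacian-like kernel are illustrative assumptions.

```python
# Sketch of the contrast pipeline: force the high-pass output to 0 around
# bright-spot pixels, apply a simple 1-D high-pass filter, and sum the
# absolute responses. Threshold and kernel are illustrative assumptions.
def contrast_value(row, bright_threshold=250):
    hp = []
    for i in range(1, len(row) - 1):
        if max(row[i - 1], row[i], row[i + 1]) >= bright_threshold:
            hp.append(0)  # bright spot: suppress the filter output
        else:
            hp.append(abs(-row[i - 1] + 2 * row[i] - row[i + 1]))
    return sum(hp)

flat = [10] * 8                            # no detail -> zero contrast
edgy = [10, 10, 80, 10, 10, 80, 10, 10]    # texture -> positive contrast
glare = [10, 10, 255, 10, 10, 10, 10, 10]  # specular highlight suppressed

assert contrast_value(flat) == 0
assert contrast_value(edgy) > 0
assert contrast_value(glare) == 0
```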
- the switching control unit 324 controls the switching unit 250 based on the contrast value calculated by the contrast value calculation unit 323.
- the attention area detection unit 321 first attempts to detect an attention area (abnormal part) (S101). When a control signal indicating that no attention area is present is transferred from the attention area detection unit 321, the switching control unit 324 transfers to the switching unit 250 a control signal that moves the focus lens to the position corresponding to the point with the longest subject distance from the imaging unit (the farthest point) (S102). This is because the depth of field is deepest at the farthest point, so the in-focus range is widest.
- when an attention area is detected, the region setting unit 322 sets an evaluation region corresponding to the detected attention area (S103). The contrast value calculation unit 323 then calculates the contrast value of the set evaluation region (S104), and the switching control unit 324 determines whether the calculated contrast value is larger than a threshold Tcon (S105). When the contrast value is equal to or less than the threshold Tcon, the current focus lens position and contrast value are stored in a memory (not shown) in the switching control unit, the switching start flag F is set to 1, and the switching process is performed (S106 to S108).
- the position of the focus lens 240 is switched to the position corresponding to the point with the longest subject distance from the imaging unit (the farthest point; position A in FIG. 12) (S107). Thereafter, the switching start flag is set to 0, the stored focus lens positions and contrast values are erased, and the process ends.
- the focus lens 240 is moved to the focus lens position (to one of them, when there are a plurality of such focus lens positions) (S108).
- when the contrast value is larger than the threshold Tcon, the switching control unit 324 sets the switching start flag F to 0 without moving the focus lens position, erases the stored focus lens position and contrast value, and ends the process (S109).
- the position of the focus lens 240 is switched to a position corresponding to the point (farthest point) with the longest subject distance from the imaging unit.
- alternatively, the focus lens position may be switched to a predetermined position designated by the surgeon, or to the focus lens position with the highest contrast value among those stored in the memory (not shown) in the switching control unit.
- the focus lens position is set discretely, and the focus lens position and the target focus position candidate have a correspondence relationship.
- as described above, the endoscope apparatus of this embodiment selects one of a plurality of discretely set target in-focus position candidates as the target in-focus position of the objective optical system.
- since the focus lens position is set discretely, discrete target in-focus position candidates can be set by performing discrete focus lens position control. That is, the focus lens positions in FIG. 14 correspond to A to D in FIG. 12, or to E and F in FIG. 13.
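- the S101 to S109 loop of FIG. 14 can be sketched as a single control-step function; the names, threshold value, and candidate ordering are assumptions, and only the overall branching follows the flowchart.

```python
# Rough sketch of one iteration of the S101-S109 flow (FIG. 14).
FAR_POINT = "A"  # deepest depth of field (position A in FIG. 12)

def focus_step(attention_area, contrast, position, untried, t_con=20.0):
    """Return the next focus lens position for one control iteration."""
    if attention_area is None:
        return FAR_POINT            # S102: no region of interest detected
    if contrast > t_con:
        return position             # S109: in focus, stay put
    if untried:
        return untried[0]           # S108: try another candidate position
    return FAR_POINT                # S107: all candidates exhausted

assert focus_step(None, 0.0, "C", ["B"]) == "A"
assert focus_step("lesion", 35.0, "C", ["B"]) == "C"
assert focus_step("lesion", 5.0, "C", ["B", "D"]) == "B"
assert focus_step("lesion", 5.0, "C", []) == "A"
```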
- FIG. 7 shows a detailed configuration example of the attention area detection unit 321 in the first embodiment.
- the attention area detection unit 321 includes a brightness / color signal calculation unit 3211, a reference signal creation unit 3212, a difference calculation unit 3213, an area division unit 3214, a candidate region detection unit 3215, and a feature amount calculation unit 3216.
- FIG. 7 is a configuration example assuming that the attention area detection unit 321 is realized as an abnormal part detection unit that detects an abnormal part as the attention area, and the following description likewise uses detection of an abnormal part as an example.
- the present invention is not limited to this.
- the image processing unit 330 is connected to a brightness / color signal calculation unit 3211 and an area division unit 3214.
- the brightness / color signal calculation unit 3211 is connected to the reference signal creation unit 3212 and the difference calculation unit 3213.
- the reference signal creation unit 3212 is connected to the difference calculation unit 3213.
- the difference calculation unit 3213 is connected to the candidate area detection unit 3215.
- the area dividing unit 3214 is connected to the candidate area detecting unit 3215.
- the candidate area detection unit 3215 is connected to the feature amount calculation unit 3216 and the switching control unit 324.
- the feature amount calculation unit 3216 is connected to the region setting unit 322.
- the control unit 340 is bidirectionally connected to the brightness/color signal calculation unit 3211, the reference signal creation unit 3212, the difference calculation unit 3213, the region division unit 3214, the candidate region detection unit 3215, and the feature amount calculation unit 3216, and performs these controls.
- the brightness / color signal calculation unit 3211 calculates a brightness signal and a color signal based on the image signal transferred from the image processing unit 330.
- the color signal is a signal indicating the redness of the image signal.
- the brightness signal Y and the color signal C are calculated from the image signals of the respective color lights included in the image signal using the following expressions (1) and (2).
- R, G, and B indicate image signals of each color light included in the image signal. Further, a is a constant, and a value input in advance from the outside is used.
- the calculated brightness signal and color signal are transferred to the reference signal creation unit 3212 and the difference calculation unit 3213.
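- equations (1) and (2) are not reproduced in this excerpt, so the forms below are purely hypothetical stand-ins with the stated properties: Y acts as a brightness signal, C grows with the redness of the pixel, and a is an externally supplied constant.

```python
# Hypothetical stand-ins for equations (1) and (2), which are not shown
# in this excerpt: Y is an assumed brightness signal and C an assumed
# redness measure; `a` is the externally supplied constant.
def brightness_color(r, g, b, a=1.0):
    y = (r + g + b) / 3.0   # assumed brightness signal
    c = r / (g + b + a)     # assumed redness measure
    return y, c

y_red, c_red = brightness_color(200, 40, 40)
y_gray, c_gray = brightness_color(90, 90, 90)
assert c_red > c_gray  # redder pixels score higher on the color signal
```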
- the reference signal creation unit 3212 creates a reference signal based on the brightness signal and the color signal transferred from the brightness / color signal calculation unit 3211. Specifically, low-pass filter processing is performed on the brightness signal and the color signal, respectively.
- the low-pass filter for example, a known method such as an average value filter, a Gaussian filter, or an edge-preserving smoothing filter is used.
- the brightness signal and the color signal subjected to the low-pass filter processing are transferred to the difference calculation unit 3213 as the brightness reference signal and the color reference signal, respectively.
- in this example, the reference signal is an image subjected to low-pass filter processing; however, the present invention is not limited to this, and an approximate function may be calculated for each of the brightness signal and the color signal, with the approximate values calculated from those functions used as the reference signals.
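- reference-signal creation with a moving-average (average value) low-pass filter, one of the filters named above, can be sketched in one dimension; the window size is an assumption.

```python
# Sketch of reference-signal creation with a moving-average low-pass
# filter; the difference from the reference then highlights local spikes.
def reference_signal(signal, window=3):
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))  # local mean
    return out

sig = [10, 10, 40, 10, 10]           # a spike over a flat background
ref = reference_signal(sig)
diff = [s - r for s, r in zip(sig, ref)]
assert max(diff) == sig[2] - ref[2]  # the spike dominates the difference
```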
- the difference calculation unit 3213 calculates the difference between the brightness signal, the color signal, and each reference signal.
- the calculated difference signals are transferred to the candidate area detection unit 3215 as a brightness difference signal and a color difference signal, respectively.
- the region dividing unit 3214 divides the image signal transferred from the image processing unit 330 into local regions. Specifically, first, a contour pixel is detected from the image signal using a known contour tracking method. The area divided by the contour pixels is subjected to a known labeling process and transferred to the candidate area detection unit 3215.
- the candidate area detection unit 3215 detects a candidate area for the abnormal part (an attention candidate area, which is a candidate for the attention area in a broad sense) based on the difference signals for each local area. Specifically, for the brightness difference signal, a pixel whose difference is equal to or greater than a threshold value TmaxY (including that value) is first extracted as an outlier pixel. Excluding the outlier pixels, the variance σY of the brightness difference is calculated for each area. An area where the brightness variance σY is larger than a threshold value TsY is extracted as an abnormal brightness area (corresponding, for example, to an uneven lesion).
- similarly, for the color difference signal, a pixel having a difference equal to or greater than the threshold value TmaxC is extracted as an outlier pixel, and, excluding the outlier pixels, the variance σC of the color difference is calculated for each area. An area where the color variance σC is larger than a threshold value TsC is extracted as an abnormal color area (corresponding, for example, to a redness / faded-color lesion).
- an area extracted as an abnormal brightness area or an abnormal color area is transferred to the feature amount calculation unit 3216 as a candidate area for the abnormal part, and a control signal indicating that an abnormal part is present is transferred to the switching control unit 324. If no candidate area is extracted, a control signal indicating that there is no abnormal part is transferred to the switching control unit 324.
- the threshold values TmaxY, TsY, TmaxC, and TsC are constants whose values are input in advance from the outside. The calculated brightness variance σY and color variance σC are transferred to the feature amount calculation unit 3216.
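The outlier-exclusion and variance-threshold test described above can be sketched as follows. The threshold names follow the text (TmaxY, TsY for brightness; TmaxC, TsC for color), but the values and region data are illustrative assumptions, and the standard-library `pvariance` plays the role of the per-region variance σ.

```python
# Hedged sketch of the candidate-area test: within one local region,
# discard outlier difference values (|d| >= t_max), then flag the region
# as abnormal when the variance of the remaining differences exceeds t_s.

from statistics import pvariance

def is_abnormal_region(diffs, t_max, t_s):
    inliers = [d for d in diffs if abs(d) < t_max]
    if len(inliers) < 2:
        return False  # not enough samples to estimate a variance
    return pvariance(inliers) > t_s

# A textured (uneven) region has a large brightness-difference variance,
# while a flat region does not (all numbers are illustrative):
uneven = [-6.0, 5.0, -4.0, 6.0, -5.0, 4.0]
flat = [0.1, -0.2, 0.1, 0.0, -0.1, 0.1]
print(is_abnormal_region(uneven, t_max=50.0, t_s=4.0))  # True
print(is_abnormal_region(flat, t_max=50.0, t_s=4.0))    # False
```

The same function would be applied to the color difference signal with the TmaxC / TsC thresholds to extract abnormal color areas.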
- the feature amount calculation unit 3216 calculates feature amounts for the plurality of candidate areas transferred from the candidate area detection unit 3215, and detects the abnormal part based on the calculated feature amounts. Specifically, the degree of abnormality Ei of each candidate area is calculated from its brightness variance σY and color variance σC by the following equation (3).
- here, the weighting may be set so that an abnormal brightness area (uneven lesion) is more easily detected as the abnormal part than an abnormal color area (redness / faded-color lesion). This is because the area the operator wants to pay more attention to is assumed to be an uneven lesion, which has a higher severity as a lesion than a redness / faded-color lesion.
- the candidate area with the highest calculated degree of abnormality Ei is transferred to the area setting unit 322 as the abnormal part.
- here, the feature amount is the degree of abnormality E, a weighted addition of the brightness variance σY and the color variance σC, but the feature amount is not limited thereto; for example, the area of the candidate region may be used. In that case, the candidate area with the largest area is transferred to the area setting unit 322 as the abnormal part.
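Since equation (3) is described as a weighted addition of the brightness variance and the color variance, a minimal sketch of the degree-of-abnormality ranking might look like the following. The weights (αe = 2.0, βe = 1.0) and the candidate values are assumptions chosen only to illustrate the mechanism.

```python
# Sketch of the degree-of-abnormality ranking: E_i is a weighted addition
# of the brightness variance and the color variance of candidate area i,
# and the candidate with the highest E_i becomes the abnormal part.

def degree_of_abnormality(sigma_y, sigma_c, alpha_e=2.0, beta_e=1.0):
    # alpha_e > beta_e so that uneven (brightness) lesions are favored,
    # as the text suggests; both weights are illustrative assumptions.
    return alpha_e * sigma_y + beta_e * sigma_c

# candidates: name -> (brightness variance, color variance)
candidates = {"region_a": (3.0, 1.0), "region_b": (1.0, 6.0)}
scores = {k: degree_of_abnormality(sy, sc) for k, (sy, sc) in candidates.items()}
abnormal_part = max(scores, key=scores.get)
print(abnormal_part)  # region_b  (2*1 + 6 = 8 vs 2*3 + 1 = 7)
```

Setting either weight to zero reproduces the brightness-only or color-only patterns mentioned later in the text, and replacing the score with the candidate's pixel count gives the area-based variant.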
- in this embodiment, the in-focus object position is switched among discrete positions. This makes the speed at which the subject is brought into focus higher than in an AF operation in which the in-focus object position is changed continuously. For this reason, it is possible to always obtain an image focused on the operator's region of interest (abnormal part).
- as described above, the endoscope apparatus according to this embodiment includes an imaging unit 200 that acquires an image signal, a switching unit 250 that switches the in-focus object position, which is determined by the state of the imaging unit 200 (in a narrow sense, the state of an objective optical system including, for example, the focus lens 240), among a plurality of discrete target in-focus position candidates, and a focus control unit 320 that controls the switching unit 250 to perform focusing control.
- the focus control unit 320 selects any one of a plurality of target focus position candidates as the target focus position, and outputs the selection result to the switching unit 250.
- the focus control unit 320 includes an attention area detection section 321 that detects an attention area on the subject from the image signal.
- the focus control unit 320 determines whether or not the attention area detected by the attention area detection unit 321 is in focus. When it is determined that the attention area is in focus, the target in-focus position candidate at the timing at which the determination process was performed (for example, a position determined according to the position of the focus lens 240 at the time of the determination process) is selected as the target in-focus position. When it is determined that the attention area is not in focus, the switching unit 250 is controlled to switch the in-focus object position.
- here, the target in-focus position candidate is a candidate for the target in-focus position; one of the plurality of target in-focus position candidates is selected as the target in-focus position, and focusing control is performed toward it. For example, the black circles shown in FIGS. 12A to 12D are points corresponding to the target in-focus position candidates, each away from the imaging unit by the best subject distance.
- the method of setting a plurality of discrete target in-focus position candidates is arbitrary, but it can be realized, for example, by driving the focus lens 240 discretely. Since the in-focus object position is determined by the position of the focus lens 240, the in-focus object position can also be controlled discretely by controlling the focus lens 240 discretely.
- the attention area is an area in which the priority of observation for the user is relatively higher than in other areas. For example, when the user is a doctor and desires treatment, it refers to an area in which a mucous membrane portion or a lesion portion is imaged. As another example, if what the doctor desires to observe is bubbles or stool, the attention area is an area in which the bubble or stool portion is imaged. That is, the object the user should pay attention to depends on the purpose of observation, but in any case, an area for which the priority of observation for the user is relatively higher than for other areas is the attention area.
- in typical AF (for example, contrast AF), the in-focus object position can be controlled continuously, so there are many target in-focus positions for which an AF evaluation value (for example, a contrast value) must be calculated. As a result, the amount of calculation until focusing is large, and the time required is long.
- in contrast, with discrete N-focus switching (where N is a small number, for example 4 or 2), the number of AF evaluation value calculations can be suppressed to at most N, so the amount of calculation is small and the focusing operation can be performed at high speed. If the attention area is in focus, the target in-focus position candidate at that time becomes the target in-focus position; since the focusing operation is completed at that point, the number of focus determination processes can in some cases be reduced to fewer than N. Further, the processing can be switched based on whether or not the attention area has been detected and, if detected, whether or not the attention area is in focus, so the processing is automatically switched on and off according to the detection / focusing state of the attention area. In other words, the focusing operation can be performed appropriately without an instruction from the user (for example, operation of an AF start button), so a system that is easy for the user to use can be realized.
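A minimal sketch of this discrete N-focus operation, with early termination once the attention area is judged in focus and a fall-back to the highest-contrast candidate otherwise. The candidate labels, threshold, and `measure_contrast` stand-in are illustrative assumptions, not values from the patent.

```python
# Sketch of discrete N-focus switching: step through at most N target
# in-focus position candidates; stop as soon as the attention area's
# contrast exceeds the threshold; otherwise fall back to the candidate
# with the highest contrast (the least-blur position).

def discrete_focus(candidates, measure_contrast, t_con):
    best_pos, best_val = None, float("-inf")
    for pos in candidates:          # at most N contrast evaluations
        c = measure_contrast(pos)
        if c > t_con:
            return pos              # in focus: focusing completes here
        if c > best_val:
            best_pos, best_val = pos, c
    return best_pos                 # nothing in focus: least-blur fallback

# Four candidates (N = 4) with illustrative contrast values:
contrasts = {"A": 5.0, "B": 40.0, "C": 12.0, "D": 3.0}
print(discrete_focus("ABCD", contrasts.get, t_con=30.0))  # B (stops early)
```

With `t_con=100.0` no candidate passes the threshold, and the loop returns "B" as the highest-contrast fallback, matching the behavior described for the all-below-threshold case.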
- here, after the in-focus object position is switched, the attention area detection unit 321 may detect the attention area on the subject from the image signal again, and the focus control unit 320 may determine whether or not the newly detected attention area is in focus.
- the focus control unit 320 may include a contrast value calculation unit 323 that calculates the contrast value of the region corresponding to the region of interest. Then, the focus control unit 320 determines whether or not the attention area is focused based on the contrast value, and controls the switching unit 250.
- the area (evaluation area) on the image whose contrast value is to be calculated is set in correspondence with the attention area.
- the evaluation area may be matched with the attention area, but is not limited thereto.
- the calculation may be facilitated by using a rectangular area including the attention area.
- the focus determination based on the contrast value may be performed, for example, by comparing the contrast value with a given threshold and determining that the area is in focus when the contrast value is greater than the threshold.
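One possible contrast measure for the rectangular evaluation area is a sum of absolute horizontal differences, sketched below. The patent does not fix a specific contrast formula, so this measure, the image data, and the threshold are assumptions for illustration only.

```python
# Hedged sketch of a contrast value over a rectangular evaluation area:
# the sum of absolute differences between horizontally adjacent pixels.
# A sharply focused region has strong local variation; a blurred one
# does not.

def contrast_value(image, top, left, bottom, right):
    total = 0.0
    for y in range(top, bottom):
        row = image[y]
        for x in range(left, right - 1):
            total += abs(row[x + 1] - row[x])
    return total

blurred = [[10, 10, 10, 10]] * 4   # flat: no local variation
sharp = [[0, 50, 0, 50]] * 4       # strong alternation: high contrast
print(contrast_value(blurred, 0, 0, 4, 4))      # 0.0
print(contrast_value(sharp, 0, 0, 4, 4) > 100)  # True -> "in focus"
```

The rectangle (top, left, bottom, right) would be set to enclose the attention area, as described above for the evaluation area.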
- when the contrast value is equal to or less than the threshold, the focus control unit 320 determines that the attention area is not in focus, and may control the switching unit 250 to switch the in-focus object position of the objective optical system.
- at this time, the in-focus object position is switched by controlling the switching unit 250. Specifically, a position different from the current position may be selected from the plurality of target in-focus position candidates, and focus determination may be performed at the selected position. To avoid duplicate processing, a target in-focus position candidate for which focus determination has not yet been performed in the current series of focusing operations (the processing from the previous focus completion to the next focus completion) is selected.
- the focus control unit 320 may also select a predetermined target in-focus position candidate as the target in-focus position of the objective optical system.
- as the predetermined candidate, a target in-focus position candidate corresponding to the point with the longest in-focus subject distance from the imaging unit may be used, or the candidate may be set based on information input from an external interface unit (corresponding to the external I/F unit 500 in FIG. 1).
- the in-focus subject distance refers to the distance from the imaging unit 200 to the subject when the subject image is focused on the image sensor 260. Therefore, if an optical condition is determined by selecting a certain target in-focus position candidate, the in-focus subject distance can also be obtained in association with the target in-focus position candidate.
- strictly speaking, because of the depth of field there is a range of subject distances over which the subject appears in focus; the in-focus subject distance here may be a value having such a width, but in a narrow sense it may indicate the above-mentioned best subject distance.
- the focus control unit 320 may select, as the target in-focus position, the target in-focus position candidate having the highest contrast value when the plurality of contrast values respectively calculated for the plurality of target in-focus position candidates are all lower than a predetermined threshold value.
- this makes it possible to set the target in-focus position to a given position even when no candidate is determined to be in focus.
- if the target in-focus position candidate with the highest contrast value is used, the target in-focus position can be determined as the position with the least blur compared to the other positions, even though it cannot be said to be in focus.
- alternatively, if the target in-focus position candidate with the longest in-focus subject distance is used, a wide depth of field can be secured compared to other positions, because the depth of field optically increases as the subject distance increases (strictly speaking, the depth of field changes not according to the distance from the imaging unit 200 to the subject but according to the distance from the front focal point to the subject). This increases the possibility that the in-focus area on the image can be enlarged, and if the depth of field is wide, manual focusing is also easy to perform. The predetermined target in-focus position candidates are not limited to the above two, and may be set arbitrarily based on, for example, an input value from an external device or the user.
- the attention area detection unit 321 may include a reference signal creation unit 3212 that creates a reference signal from the image signal, a candidate area detection unit 3215 that detects an attention candidate area based on the image signal and the reference signal, and a feature amount calculation unit 3216 that calculates a feature amount of the attention candidate area. The attention area detection unit 321 then detects the attention area based on the feature amount.
- the reference signal is a signal obtained based on the image signal; for example, it may be obtained by subjecting the image signal to filter processing (specifically, low-pass filter processing) to extract a given spatial frequency component.
- the attention candidate area is an area that is a candidate for the attention area, and is detected based on the image signal and the reference signal. Specifically, it may be detected from the difference value between the image signal and the reference signal. In the above example, areas detected based on the difference between the luminance signals and areas detected based on the difference between the color signals are both treated as attention candidate areas.
- by detecting the attention candidate area in this way, an abnormal part such as a lesion can be detected. A lesion characterized by its shape (unevenness, etc.) can be detected based on the difference between the luminance component (luminance signal) of the image signal and the reference signal obtained from the luminance signal (for example, by taking the variance of the difference). Similarly, a lesion characterized by its color (redness / faded color, etc.) can be detected based on the difference between the color component (color signal) of the image signal and the reference signal obtained from the color signal (for example, by taking the variance of the difference). Therefore, in this example, an area suspected of being a lesion due to features such as unevenness or redness / faded color is detected as an attention candidate area. Since a plurality of attention candidate areas may be detected, which candidate area to select as the attention area is determined by the feature amount; by switching the feature amount calculation method, the characteristics of the area detected as the attention area can be switched.
- the feature amount calculation unit 3216 may calculate the feature amount of the candidate region of interest based on the difference between the image signal and the reference signal. Then, the attention area detection unit 321 detects an attention candidate area having the largest feature amount as the attention area.
- as the image signal used here, the same signal as that used for detection of the attention candidate area may be used, or a different signal may be used. That is, any of R, G, and B may be used as the image signal, Y and C in the above equations (1) and (2) may be used, or other image signals may be used.
- the feature amount calculation unit 3216 may use a luminance signal difference and a color signal difference as the differences. Then, a value obtained by weighted addition of the luminance signal difference and the color signal difference may be calculated as the feature amount.
- this makes it possible to use the luminance signal difference and the color signal difference as the differences between the image signal and the reference signal.
- as described above, the luminance signal difference corresponds to an uneven lesion or the like, and the color signal difference corresponds to a redness / faded-color lesion or the like.
- Ei in the above equation (3) may be used to calculate the feature amount at this time.
- either one of the weighting coefficients αe and βe may be zero. That is, the value obtained by weighted addition of the luminance signal difference and the color signal difference in the present embodiment includes the pattern of only the luminance signal difference and the pattern of only the color signal difference.
- the feature amount calculation unit 3216 may calculate the feature amount using a value larger than the weight of the color signal difference as the weight of the luminance signal difference.
- the feature amount calculation unit 3216 may calculate the feature amount of the attention candidate region based on the area of the attention candidate region. Then, the attention area detection unit 321 detects an attention candidate area having the largest feature amount as the attention area.
- a large area means that an object to be noticed (for example, a lesion or bubbles; what is to be noticed changes depending on the detection method of the attention candidate area) is distributed over a wide range, so the priority of observation is considered to be high.
- the attention area detection unit 321 may detect an abnormal part (for example, a lesion part) of the subject as the attention area. In-vivo observation with an endoscope apparatus is assumed to be used mainly in the medical field, where the object to be observed is typically a lesion (such as a tumor or a dense blood vessel area); therefore, the abnormal part is detected as the attention area.
- the number of target in-focus position candidates to be switched as the in-focus object position by the switching unit 250 may be 4 or less (including that value).
- if N is a large value, the load of the focusing process (for example, the number of times the in-focus object position is switched and the number of contrast value calculations) increases, and the focusing operation cannot be performed at a higher speed than continuous focus switching. Therefore, in order to use the method of this embodiment effectively, the value of N needs to be small to some extent; specifically, N ≤ 4 may be satisfied.
- the number of target focus position candidates is not limited to four as long as the focus operation can be performed at a higher speed than continuous focus switching.
- the number of candidates may also be determined from a condition on the in-focus ranges in the depth direction in the object scene.
- specifically, the first to Nth target in-focus position candidates correspond to a plurality of points away from the imaging unit by the best subject distance, and first to Nth object scenes correspond to the respective candidates.
- the target in-focus position candidates can be set so that the in-focus range in the depth direction in the i-th (1 ≤ i ≤ N − 2) object scene overlaps the in-focus range in the depth direction in the (i + 1)-th object scene, but does not overlap the in-focus range in the depth direction in the (i + 2)-th object scene.
- by doing so, the in-focus ranges in the depth direction overlap between adjacent target in-focus position candidates (a position near the end point of the in-focus range in the i-th object scene is also within the in-focus range in the (i + 1)-th object scene). As a result, in the range of distances between the end point of the in-focus range in the depth direction in the first object scene (for example, in FIG. 12) and the end point of the in-focus range in the depth direction in the Nth object scene (for example, the left end of the in-focus range in the depth direction in the object scene at D in FIG. 12), an arbitrary position is included in at least one of the in-focus ranges in the depth direction in the first to Nth object scenes.
- within this distance range, it is therefore always possible to focus by appropriately setting the in-focus object position, and it is not necessary to finely adjust the distance between the imaging unit 200 and the subject.
- in addition, the in-focus range in the depth direction in the i-th object scene does not overlap that in the (i + 2)-th object scene. This is because the predetermined distance range can be covered by overlap between neighbors alone, so the number of target in-focus position candidates does not need to be increased further. If the in-focus range in the depth direction in the i-th object scene overlapped that in the (i + 2)-th object scene, an arbitrary position within the in-focus range in the depth direction in the (i + 1)-th object scene between them would be included in at least one of the in-focus ranges in the depth direction in the i-th and (i + 2)-th object scenes, so there would be little advantage in setting the (i + 1)-th target in-focus position candidate.
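The spacing condition above (adjacent in-focus ranges overlap; ranges two apart do not) can be checked mechanically. The depth ranges below are illustrative (near_end, far_end) distances invented for the sketch, not values from the patent.

```python
# Sketch of the candidate-spacing condition: for ranges sorted from
# near to far, each range must overlap its neighbor (no coverage gaps)
# but must NOT overlap the range two positions away (no redundant
# candidates).

def overlaps(a, b):
    """True when half-open-style intervals a and b share any distance."""
    return a[0] < b[1] and b[0] < a[1]

def valid_candidate_ranges(ranges):
    for i in range(len(ranges) - 1):
        if not overlaps(ranges[i], ranges[i + 1]):  # neighbors must overlap
            return False
    for i in range(len(ranges) - 2):
        if overlaps(ranges[i], ranges[i + 2]):      # i and i+2 must not
            return False
    return True

four_focus = [(2, 5), (4, 9), (8, 17), (16, 33)]  # N = 4, well spaced
too_dense = [(2, 6), (4, 9), (5, 17)]             # range 0 overlaps range 2
print(valid_candidate_ranges(four_focus))  # True
print(valid_candidate_ranges(too_dense))   # False
```

For a valid set, any distance between the near end of the first range and the far end of the last falls inside at least one candidate's in-focus range, which is exactly the coverage property described above.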
- the width of the region may be limited (for example, an upper limit value is set).
- Second Embodiment A configuration example of an endoscope apparatus according to the second embodiment has the same configuration as that of the first embodiment. Only the parts different from the first embodiment will be described below.
- the function of the switching unit 250 is different.
- in the second embodiment, the focus lens position is switched in two stages: point E, corresponding to a point (far point) where the best subject distance from the imaging unit is long, and point F, corresponding to a point (near point) where the best subject distance from the imaging unit is short.
- when the focus lens position is at point E, the depth of field is suitable for screening a luminal subject in a body cavity; when the focus lens position is at point F, the depth of field is suitable for closely approaching the subject for detailed examination.
- FIG. 9 shows a detailed configuration example of the focus control unit 320 in the second embodiment.
- compared with the first embodiment, a determination unit 325 is added.
- the image processing unit 330 is connected to the region setting unit 322, the attention region detection unit 321, and the determination unit 325.
- the determination unit 325 is connected to the switching control unit 324.
- the determination unit 325 performs determination related to focusing on the attention area.
- the determination unit 325 may include an estimation unit 3251, and the estimation unit 3251 estimates rough distance information about the attention area transferred from the attention area detection unit 321. The determination unit 325 then determines whether the attention area is at the near point or the far point based on the estimation result of the estimation unit 3251. Specifically, first, the brightness Ye of the area corresponding to the attention area is calculated from the image signal. Next, the brightness Ye of the attention area is corrected by the light source aperture opening L and the gain correction value Gain; the following equation (4) is used for the correction.
- here, L is the opening of the light source aperture 120 obtained through the control unit 340, Gain is a gain correction value obtained through the control unit 340, and αy and βy are weighting constants.
- the determination unit 325 transfers the determination result to the switching control unit 324.
- S201 to S206 are the same as S101 to S106 in FIG. 14, and S208 to S209 are the same as S108 to S109 in FIG.
- the second embodiment differs from the first in the processing performed when the contrast value is equal to or lower than Tcon (including that value) at all focus lens positions (the process of S107 in FIG. 14 in the first embodiment); this is replaced by the processing of S210 to S212.
- in this processing, the determination unit 325 compares the brightness information Yc of the attention area with a threshold value Tr (S210). If Yc is larger than Tr, the switching control unit 324 moves the focus lens position to the position for focusing on the near point (S211). On the other hand, if Yc is equal to or smaller than Tr (including that value), the switching control unit 324 moves the focus lens position to the position for focusing on the far point (S212).
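The S210 to S212 branch can be sketched as follows. The exact form of the correction in equation (4) is not reproduced in this text, so the normalization by L and Gain below is an assumed form, as are all numeric constants.

```python
# Hedged sketch of brightness-based near/far selection: correct the raw
# brightness Ye of the attention area by the light-source aperture
# opening L and the gain correction value Gain, then compare the result
# Yc against the threshold Tr.

def corrected_brightness(ye, aperture_l, gain, alpha_y=1.0, beta_y=1.0):
    # Assumed normalization (NOT the patent's equation (4)): a wide-open
    # aperture or high gain means the scene was dim, so the raw
    # brightness is scaled down accordingly.
    return ye / (alpha_y * aperture_l + beta_y * gain)

def choose_focus_point(ye, aperture_l, gain, t_r):
    yc = corrected_brightness(ye, aperture_l, gain)
    return "near" if yc > t_r else "far"   # S210 -> S211 / S212

# Bright with little amplification -> subject is close -> near point:
print(choose_focus_point(ye=200.0, aperture_l=1.0, gain=1.0, t_r=50.0))  # near
# Same raw brightness only because aperture and gain are high -> far point:
print(choose_focus_point(ye=200.0, aperture_l=8.0, gain=4.0, t_r=50.0))  # far
```

This mirrors the rationale given below: a bright attention area suggests a short distance to the subject, once the brightness contributed by the light source opening and the gain has been factored out.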
- as described above, in the second embodiment the focus control unit 320 includes the estimation unit 3251 that estimates distance information between the subject corresponding to the attention area and the imaging unit 200. If the plurality of contrast values calculated for the plurality of target in-focus position candidates are all lower than the predetermined threshold, the focus control unit 320 selects the in-focus object position of the objective optical system based on the estimation result of the estimation unit 3251.
- thereby, even when no candidate can be determined to be in focus from the contrast values, the target in-focus position to be selected can be obtained by distance estimation, without using a distance measuring sensor or a complicated image processing method. This is because mounting a distance measuring sensor or the like would increase the size of the imaging unit 200, and complicated image processing would increase the processing load and make a high-speed focusing operation difficult.
- the estimation unit 3251 may estimate the distance information with respect to the imaging unit 200 based on the brightness information of the attention area.
- This enables distance estimation based on brightness information. The idea is simply that if the image is bright, the distance between the imaging unit 200 and the subject is short, and if it is dark, the distance is long, so the processing load can be very light. This is particularly effective for N-focus switching as in this embodiment (especially when N is small, such as N = 2). For example, in the case of bifocal switching, it is only necessary to determine whether the estimate by the estimation unit 3251 is near or far, so even with low estimation accuracy, as in estimation based on brightness information, the target in-focus position can be selected.
- the switching unit 250 may switch between two points as the target in-focus position candidates of the objective optical system: a first target in-focus position candidate having a long in-focus subject distance from the imaging unit, and a second target in-focus position candidate having a shorter in-focus subject distance than the first.
- the focused subject distance is the same as that described above in the first embodiment, and may be the best subject distance in a narrow sense.
- the position of the subject corresponding to the in-focus subject distance at the first target in-focus position candidate is the far point, and the position of the subject corresponding to the in-focus subject distance at the second target in-focus position candidate is the near point.
- the present invention is not limited to the first and second embodiments and their modifications as they are; the constituent elements can be modified and embodied without departing from the spirit of the invention.
- Various inventions can be formed by appropriately combining a plurality of the constituent elements disclosed in the above-described first and second embodiments and modifications. For example, some constituent elements may be deleted from all the constituent elements described in the first and second embodiments and modifications. Furthermore, constituent elements described in different embodiments and modifications may be combined as appropriate.
- a term described together with a different term having a broader meaning or the same meaning at least once in the specification or the drawings can be replaced with the different term anywhere in the specification or the drawings.
Landscapes
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Astronomy & Astrophysics (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
Abstract
The invention relates to an endoscope device and a focus control method for detecting a target region, controlling focus on the basis of discrete in-focus object positions, and performing the focusing operation on the target region easily and at high speed. An endoscope device comprises: an imaging unit (200) for capturing image signals; a switching unit (250) that switches the in-focus object position to any one of a plurality of candidate target focus positions set at discrete positions; and a focus controller (320) that selects any one of the plurality of candidate target focus positions as the target focus position, controls the switching unit (250), and controls the focus. The focus controller (320) has a target region detector (321); it selects the candidate target focus position as the target focus position when the detected target region is determined to be in focus, and controls the switching unit (250) to switch the in-focus object position when the candidate target focus position is selected as the target focus position and the region is determined not to be in focus.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011235247 | 2011-10-26 | ||
| JP2011-235247 | 2011-10-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013061939A1 true WO2013061939A1 (fr) | 2013-05-02 |
Family
ID=48167769
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/077282 Ceased WO2013061939A1 (fr) | 2011-10-26 | 2012-10-23 | Dispositif endoscopique et procédé de commande de foyer |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2013061939A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018116225A (ja) * | 2017-01-20 | 2018-07-26 | キヤノン株式会社 | 焦点調節装置、その制御方法およびプログラム、並びに撮像装置 |
| CN110062596A (zh) * | 2016-12-20 | 2019-07-26 | 奥林巴斯株式会社 | 自动焦点控制装置、内窥镜装置以及自动焦点控制装置的工作方法 |
| WO2021149141A1 (fr) * | 2020-01-21 | 2021-07-29 | オリンパス株式会社 | Dispositif de commande de mise au point, système endoscopique, et procédé d'utilisation du dispositif de commande de mise au point |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS55166609A (en) * | 1979-06-12 | 1980-12-25 | Olympus Optical Co Ltd | Method of focusing of optical system having lighting device |
| JP2002153421A (ja) * | 2000-08-23 | 2002-05-28 | Toshiba Corp | 内視鏡装置 |
| JP2006246491A (ja) * | 2006-03-10 | 2006-09-14 | Olympus Corp | ビデオマイクロスコープ |
| JP2006288432A (ja) * | 2005-04-05 | 2006-10-26 | Olympus Medical Systems Corp | 電子内視鏡 |
| JP2008307229A (ja) * | 2007-06-14 | 2008-12-25 | Olympus Corp | 画像処理装置および画像処理プログラム |
| JP2011193983A (ja) * | 2010-03-18 | 2011-10-06 | Olympus Corp | 内視鏡システム、撮像装置及び制御方法 |
-
2012
- 2012-10-23 WO PCT/JP2012/077282 patent/WO2013061939A1/fr not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS55166609A (en) * | 1979-06-12 | 1980-12-25 | Olympus Optical Co Ltd | Method of focusing of optical system having lighting device |
| JP2002153421A (ja) * | 2000-08-23 | 2002-05-28 | Toshiba Corp | 内視鏡装置 |
| JP2006288432A (ja) * | 2005-04-05 | 2006-10-26 | Olympus Medical Systems Corp | 電子内視鏡 |
| JP2006246491A (ja) * | 2006-03-10 | 2006-09-14 | Olympus Corp | ビデオマイクロスコープ |
| JP2008307229A (ja) * | 2007-06-14 | 2008-12-25 | Olympus Corp | 画像処理装置および画像処理プログラム |
| JP2011193983A (ja) * | 2010-03-18 | 2011-10-06 | Olympus Corp | 内視鏡システム、撮像装置及び制御方法 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110062596A (zh) * | 2016-12-20 | 2019-07-26 | 奥林巴斯株式会社 | 自动焦点控制装置、内窥镜装置以及自动焦点控制装置的工作方法 |
| JP2018116225A (ja) * | 2017-01-20 | 2018-07-26 | キヤノン株式会社 | 焦点調節装置、その制御方法およびプログラム、並びに撮像装置 |
| WO2021149141A1 (fr) * | 2020-01-21 | 2021-07-29 | オリンパス株式会社 | Dispositif de commande de mise au point, système endoscopique, et procédé d'utilisation du dispositif de commande de mise au point |
| US12402781B2 (en) | 2020-01-21 | 2025-09-02 | Olympus Corporation | Focus control device, operation method of focus control device, and storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6013020B2 (ja) | 内視鏡装置及び内視鏡装置の作動方法 | |
| JP5953187B2 (ja) | 合焦制御装置、内視鏡システム及び合焦制御方法 | |
| JP5948076B2 (ja) | フォーカス制御装置、内視鏡装置及びフォーカス制御方法 | |
| JP6137921B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
| JP5951211B2 (ja) | 合焦制御装置及び内視鏡装置 | |
| US20160128545A1 (en) | Endoscope apparatus and method for controlling endoscope apparatus | |
| US9154745B2 (en) | Endscope apparatus and program | |
| JP6049518B2 (ja) | 画像処理装置、内視鏡装置、プログラム及び画像処理装置の作動方法 | |
| JP5973708B2 (ja) | 撮像装置及び内視鏡装置 | |
| CN107005646B (zh) | 对焦控制装置、内窥镜装置以及对焦控制装置的控制方法 | |
| JP5698476B2 (ja) | 内視鏡システム、内視鏡システムの作動方法及び撮像装置 | |
| JP6453905B2 (ja) | フォーカス制御装置、内視鏡装置及びフォーカス制御装置の制御方法 | |
| US20120120305A1 (en) | Imaging apparatus, program, and focus control method | |
| CN105579880B (zh) | 内窥镜用摄像系统、内窥镜用摄像系统的工作方法 | |
| JP6533284B2 (ja) | フォーカス制御装置、撮像装置、内視鏡システム、フォーカス制御装置の制御方法 | |
| JP6120491B2 (ja) | 内視鏡装置及び内視鏡装置のフォーカス制御方法 | |
| JP5996218B2 (ja) | 内視鏡装置及び内視鏡装置の作動方法 | |
| JP2013043007A (ja) | 焦点位置制御装置、内視鏡装置及び焦点位置制御方法 | |
| JP2013076823A (ja) | 画像処理装置、内視鏡システム、画像処理方法及びプログラム | |
| WO2013061939A1 (fr) | Dispositif endoscopique et procédé de commande de foyer | |
| US20250134345A1 (en) | Endoscope system and method of operating the same | |
| JP6177387B2 (ja) | 内視鏡装置の合焦制御装置、内視鏡装置及び内視鏡制御装置の作動方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12843943 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12843943 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |