WO2014175220A1 - Image acquisition device, and method and system for creating a focus map of a sample - Google Patents
Image acquisition device, and method and system for creating a focus map of a sample
- Publication number
- WO2014175220A1 (application PCT/JP2014/061182)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sample
- objective lens
- focus
- image
- respect
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/002—Scanning microscopes
- G02B21/0024—Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
- G02B21/0052—Optical details of the image generation
- G02B21/006—Optical details of the image generation focusing arrangements; selection of the plane to be imaged
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/24—Base structure
- G02B21/241—Devices for focusing
- G02B21/244—Devices for focusing using image analysis techniques
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
Definitions
- the present invention relates to an image acquisition apparatus, a method and a system for creating a focus map of a sample.
- Image acquisition apparatuses using various focus methods, such as a dynamic focus method that images a sample while acquiring in-focus information and a pre-focus method that acquires in-focus information before imaging the sample, have been developed.
- This apparatus includes an image sensor that has a plurality of pixel lines and is driven in a rolling shutter mode. The light source is made to emit light intermittently at intervals longer than the light-receiving time of a pixel line, and focal position information of the microscope optical system is acquired based on images captured while moving the stage.
- In another known apparatus, the focal position of the objective lens is continuously reciprocated while the visual field position of the objective lens with respect to the sample is moved, and image data of the sample is acquired using a line scan camera.
- The focal position for the sample is then calculated for each predetermined region based on the magnitude of the contrast values of the acquired image data, and a focus map of the entire sample is created.
- When the above-described image acquisition device is used as a microscope device that performs imaging at a high magnification of, for example, 20 to 40 times, the depth of field is small and the field of view of the microscope optical system is very small with respect to the sample. For this reason, in order to acquire in-focus information for the entire sample, it is necessary to perform imaging while moving the field of view of the microscope optical system. However, in the apparatus described in Patent Document 1, the visual field position of the objective lens relative to the sample is not moved, so it takes time to acquire in-focus information over a wide range of the sample. In addition, since light from different positions on the sample enters the pixel row read out first and the pixel row read out last, the accuracy of the in-focus information calculated by comparing contrast values may also be insufficient.
- the present invention has been made to solve the above-described problems, and an object of the present invention is to provide an image acquisition apparatus capable of quickly and accurately creating a focus map, and a method and system for creating a sample focus map.
- An image acquisition device according to the present invention includes: a stage on which a sample is placed; a light guide optical system including an objective lens arranged to face the sample on the stage; an imaging device that captures an optical image of the sample guided by the light guide optical system; an in-focus calculation unit that calculates in-focus information of the sample based on image data from the imaging device; a focus map creation unit that creates a focus map of the sample based on the in-focus information; a first drive unit that moves the visual field position of the objective lens with respect to the sample; a second drive unit that changes the focal position of the objective lens with respect to the sample; and a control unit that controls the imaging device, the first drive unit, and the second drive unit. The imaging device is a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout.
- The control unit controls the first drive unit and the second drive unit so that the focal position of the objective lens with respect to the sample reciprocates in the optical axis direction of the objective lens while the visual field position of the objective lens with respect to the sample is moved, and acquires the image data by rolling readout of the imaging device. The focus map creation unit creates the focus map based on a plurality of pieces of in-focus information calculated by the in-focus calculation unit.
- the focal position of the objective lens with respect to the sample is reciprocated in the optical axis direction of the objective lens (the height direction of the sample) while moving the visual field position of the objective lens with respect to the sample. Accordingly, it is possible to sequentially acquire the contrast information of the image data at the visual field position along with the movement of the visual field position of the objective lens with respect to the sample.
- In addition, the image data is acquired by rolling readout of the imaging device, so the number of pieces of in-focus information calculated per scan of the focal position of the objective lens remains stable. Therefore, the focus map can be created quickly and accurately.
- The control unit may include a range determination unit that determines the conditions of the reciprocation, in the optical axis direction of the objective lens, of the focal position of the objective lens with respect to the sample driven by the second drive unit.
- When calculating one piece of in-focus information, the range determination unit may determine the focal position of the objective lens at the turn-back position of the reciprocation based on in-focus information calculated before that piece of in-focus information. In this case, the reciprocation of the focal position of the objective lens can be prevented from deviating from the surface of the sample even when the unevenness of the surface of the sample or the warp of the slide glass on which the sample is placed cannot be ignored. Therefore, the in-focus information can be calculated more reliably.
- The range determination unit may vary the change width of the focal position of the objective lens in the optical axis direction of the objective lens for each forward path and each return path of the reciprocation. In this case, even when the unevenness of the surface of the sample or the warp of the slide glass on which the sample is placed cannot be ignored, determining the change width of the focal position of the objective lens according to the shape prevents the reciprocation of the focal position from deviating from the surface of the sample. Therefore, the in-focus information can be calculated more reliably.
- The focus map creation unit may create the focus map based on at least one of the in-focus information calculated in the forward path and the in-focus information calculated in the return path of the reciprocation of the focal position of the objective lens. When only one of them is used, drive accuracy is required in only one direction, so the second drive unit can be easily controlled; when both are used, more in-focus information can be calculated, so the accuracy of the focus map can be sufficiently secured.
- The focus map creation unit may also create the focus map based on both the in-focus information calculated in the forward path and the in-focus information calculated in the return path of the reciprocation of the focal position of the objective lens. In this case, more in-focus information can be calculated, so the accuracy of the focus map can be sufficiently secured.
- The control unit may include an imaging line setting unit that sets a two-dimensional image acquisition region for the sample and sets a plurality of imaging lines extending in one direction in the image acquisition region. The control unit may select one or a plurality of imaging lines from the imaging lines set by the imaging line setting unit, and control the first drive unit so that the visual field position of the objective lens with respect to the sample moves along the selected imaging lines.
- In this case, unnecessary in-focus information is not calculated for portions where the sample does not exist, so the focus map can be created more quickly. The processing can be further speeded up by selecting the imaging lines.
- the focus map creation unit may create a focus map based on in-focus information calculated over a plurality of imaging lines. In this case, the accuracy of the focus map can be improved.
- The control unit may synchronize the movement of a predetermined part of the sample within the field of view of the objective lens by the first drive unit with the rolling readout of the imaging device so that an optical image of the predetermined part of the sample is exposed in each pixel row of the imaging device.
- In this case, the image data from each pixel row of the imaging device contains contrast information obtained while the focal position of the objective lens is changed at the same part of the sample, and the in-focus information can be calculated quickly and accurately based on this information.
- An image acquisition method according to the present invention is an image acquisition method in an image acquisition device that includes: a stage on which a sample is placed; a light guide optical system including an objective lens disposed to face the sample on the stage; an imaging device that captures an optical image of the sample guided by the light guide optical system; an in-focus calculation unit that calculates in-focus information of the sample based on image data from the imaging device; a focus map creation unit that creates a focus map of the sample based on the in-focus information; a first drive unit that moves the visual field position of the objective lens with respect to the sample; a second drive unit that changes the focal position of the objective lens with respect to the sample; and a control unit that controls the imaging device, the first drive unit, and the second drive unit. A two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout is used as the imaging device. The control unit controls the first drive unit and the second drive unit so that the focal position of the objective lens with respect to the sample reciprocates in the optical axis direction of the objective lens while the visual field position of the objective lens with respect to the sample is moved, and acquires the image data by rolling readout of the imaging device.
- A focus map creation method according to the present invention is a method for creating a focus map of a sample using a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout. While the visual field position of an objective lens with respect to the sample held on a stage is moved, and while at least one of the stage and the objective lens is moved so that the focal position of the objective lens with respect to the sample moves in the optical axis direction of the objective lens, image data is acquired by rolling readout of the imaging device; in-focus information of the sample is acquired based on the image data; and a focus map is created based on a plurality of pieces of in-focus information on the sample.
- In this focus map creation method, the focal position of the objective lens with respect to the sample is moved in the optical axis direction of the objective lens (the height direction of the sample) while the visual field position of the objective lens with respect to the sample is moved. Accordingly, the contrast information of the image data at each visual field position can be acquired sequentially as the visual field position of the objective lens moves with respect to the sample.
- In addition, the image data is acquired by rolling readout of the imaging device, so the number of pieces of in-focus information calculated per scan of the focal position of the objective lens remains stable. Therefore, the focus map can be created quickly and accurately.
- the movement of the focal position of the objective lens with respect to the sample in the optical axis direction (the height direction of the sample) of the objective lens may be reciprocating. In this case, it is possible to quickly obtain in-focus information at different predetermined parts of the sample.
- A system for creating a focus map of a sample according to the present invention includes: a stage that holds the sample; a light guide optical system including an objective lens disposed to face the sample on the stage; an imaging device constituted by a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout, which captures an optical image of the sample guided by the light guide optical system; an in-focus calculation unit that calculates in-focus information of the sample based on image data from the imaging device; and a focus map creation unit that creates a focus map based on a plurality of pieces of in-focus information on the sample. The imaging device acquires the image data by rolling readout while the visual field position of the objective lens with respect to the sample is moved and while at least one of the stage and the objective lens is moved so that the focal position of the objective lens with respect to the sample moves in the optical axis direction of the objective lens.
- the focal position of the objective lens with respect to the sample is moved in the optical axis direction of the objective lens (the height direction of the sample) while moving the visual field position of the objective lens with respect to the sample. Accordingly, it is possible to sequentially acquire the contrast information of the image data at the visual field position along with the movement of the visual field position of the objective lens with respect to the sample.
- In addition, the image data is acquired by rolling readout of the imaging device, so the number of pieces of in-focus information calculated per scan of the focal position of the objective lens remains stable. Therefore, the focus map can be created quickly and accurately.
- the movement of the focal position of the objective lens with respect to the sample in the optical axis direction (the height direction of the sample) of the objective lens may be reciprocating. As a result, it is possible to quickly obtain in-focus information at predetermined portions of the sample.
- a focus map can be created quickly and accurately.
- FIG. 1 is a diagram showing an embodiment of the image acquisition device according to the present invention. Further drawings show an example of the image sensor, where (a) shows the light-receiving surface of the image sensor and (b) shows rolling readout in the image sensor; an example of the scan of the image acquisition region; examples of the conditions of the reciprocation of the focal position of the objective lens; and an example of the contrast information processed by the in-focus calculation unit. A flowchart illustrates the operation of the image acquisition device shown in FIG. 1.
- FIG. 1 is a diagram showing an embodiment of an image acquisition apparatus according to the present invention.
- The image acquisition device 1 includes a stage 2 on which the sample S is placed, a light source 3 that irradiates light toward the sample, a light guide optical system 5 including an objective lens 25 disposed so as to face the sample S on the stage 2, and an image sensor 6 that captures an optical image of the sample S guided by the light guide optical system 5.
- The image acquisition device 1 also includes a stage drive unit (first drive unit) 11 that moves the visual field position of the objective lens 25 with respect to the sample S, an objective lens drive unit (second drive unit) 12 that changes the focal position of the objective lens 25 with respect to the sample S, a control unit 13 that controls the image sensor 6, the stage drive unit 11, and the objective lens drive unit 12, and an image processing unit 14 that processes the image data of the sample S captured by the image sensor 6.
- the sample S to be observed with the image acquisition device 1 is a biological sample such as a tissue cell, for example, and is placed on the stage 2 in a state of being sealed in a slide glass.
- the light source 3 is disposed on the bottom surface side of the stage 2.
- As the light source 3, a laser diode (LD), a light-emitting diode (LED), a super luminescent diode (SLD), or a lamp-type light source such as a halogen lamp is used.
- the light guide optical system 5 includes an illumination optical system 21 disposed between the light source 3 and the stage 2 and a microscope optical system 22 disposed between the stage 2 and the image sensor 6.
- the illumination optical system 21 has a Koehler illumination optical system composed of, for example, a condenser lens 23 and a projection lens 24.
- the illumination optical system 21 guides light from the light source 3 and irradiates the sample S with uniform light.
- The microscope optical system 22 includes the objective lens 25 and an imaging lens 26 disposed on the rear side (the image sensor 6 side) of the objective lens 25, and guides the optical image of the sample S to the image sensor 6.
- The optical image of the sample S is an image formed by transmitted light in the case of bright-field illumination, by scattered light in the case of dark-field illumination, and by emitted light (fluorescence) in the case of light-emission measurement. An image formed by light reflected from the sample S may also be used.
- As the light guide optical system 5, an optical system suited to acquiring a transmitted-light image, a scattered-light image, or an emission (fluorescence) image of the sample S can be employed.
- the image sensor 6 is a two-dimensional image sensor having a plurality of pixel rows and capable of rolling readout.
- An example of such an image sensor 6 is a CMOS image sensor.
- the image sensor 6 outputs a reset signal, a readout start signal, and a readout end signal based on the drive cycle of the drive clock, thereby exposing and reading out each pixel column 31.
- the exposure period of one pixel column 31 is a period from discharge of charge accompanying a reset signal to reading of charge accompanying a read start signal.
- the readout period of one pixel column 31 is a period from the start of readout of charge accompanying a readout start signal to the end of readout of charge accompanying a readout end signal. Note that a read start signal for the next pixel column can also be used as a read end signal.
- readout start signals output for each pixel column 31 are sequentially output with a predetermined time difference. For this reason, unlike global reading in which all pixel columns are read simultaneously, reading for each pixel column 31 is sequentially performed with a predetermined time difference.
- the reading speed in the rolling reading is controlled by the time interval of the reading start signal for reading each pixel column 31. If the time interval of the read start signal is shortened, the read speed is increased, and if the time interval of the read start signal is increased, the read speed is decreased.
- The readout interval between adjacent pixel columns 31 can be adjusted by, for example, adjusting the frequency of the drive clock, setting a delay period within the readout period, or changing the number of clocks that defines the readout start signal.
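- For readers who want a concrete picture of the rolling-readout timing described above, the following is a minimal sketch assuming purely hypothetical parameter values (row count, exposure length, readout-start interval); none of these names or numbers come from the embodiment, and the sketch only illustrates how the readout-start interval determines the readout speed.

```python
# Minimal sketch (assumed parameters): rolling-readout timing per pixel row.
# Each row's readout begins a fixed interval after the previous row's readout,
# and its exposure (reset-to-readout) window ends when its readout starts.

def rolling_schedule(n_rows: int, exposure: float, row_interval: float):
    """Return (exposure_start, readout_start) in seconds for each pixel row,
    relative to the readout start of row 0."""
    schedule = []
    for i in range(n_rows):
        readout_start = i * row_interval           # rows are read out with a fixed time offset
        exposure_start = readout_start - exposure  # reset occurs 'exposure' seconds before readout
        schedule.append((exposure_start, readout_start))
    return schedule

if __name__ == "__main__":
    # Hypothetical values: 2000 rows, 1 ms exposure, 10 us between readout starts.
    for row, (t_exp, t_read) in enumerate(rolling_schedule(2000, 1e-3, 10e-6)[:3]):
        print(f"row {row}: exposure {t_exp * 1e3:.3f} ms -> readout {t_read * 1e3:.3f} ms")
```

Halving row_interval in this sketch doubles the effective readout speed, mirroring the drive-clock and delay-period adjustments mentioned above.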
- the stage drive unit 11 is configured by a motor or actuator such as a stepping motor (pulse motor) or a piezoelectric actuator.
- The stage drive unit 11 moves the stage 2 in the XY directions within a plane having a predetermined angle (for example, 90 degrees) with respect to the optical axis of the objective lens 25, based on control by the control unit 13.
- the objective lens driving unit 12 is configured by a motor or actuator such as a stepping motor (pulse motor) or a piezo actuator, for example, like the stage driving unit 11.
- the objective lens driving unit 12 moves the objective lens 25 in the Z direction along the optical axis of the objective lens 25 based on the control by the control unit 13. Thereby, the focal position of the objective lens 25 with respect to the sample S moves.
- The control unit 13 may change the position of the stage 2 in the Z direction relative to the objective lens 25 by means of the stage drive unit 11, thereby changing the distance between the objective lens 25 and the stage 2. In this case, since the stage drive unit 11 serves as a drive unit that moves the focal position of the objective lens 25 with respect to the sample S, it performs the same function as the objective lens drive unit 12.
- the sample S is imaged at a high magnification such as 20 times or 40 times.
- The field of view V of the objective lens 25 is small with respect to the sample S, and, as shown in the drawing, the region whose image can be acquired in one imaging operation is also small with respect to the sample S. Therefore, in order to image the entire sample S, it is necessary to scan the field of view V of the objective lens 25 with respect to the sample S.
- the image acquisition apparatus 1 employs an image acquisition method called a tiling scan method.
- In this method, the control unit 13 (described later) sets the image acquisition region 32 so as to include the sample S with respect to the sample container (for example, a slide glass) holding the sample S.
- In the image acquisition region 32, a plurality of divided regions (tiles) 33 are set based on the field of view V of the objective lens 25 on the sample S. Each partial image (tile image) of the sample S corresponding to a divided region 33 is captured, and the partial image data are combined by the image processing unit 14 to generate image data of the entire sample S.
- In the image acquisition device 1, before the partial images of the sample S corresponding to the divided regions 33 are captured, a focus map of the objective lens 25 with respect to the sample S is created in advance for each divided region 33.
- The control unit 13 of the image acquisition device 1 includes an imaging line setting unit 15 and a range determination unit 16. The image acquisition device 1 further includes an in-focus calculation unit 17 that calculates in-focus information of the sample S based on the image data from the image sensor 6, and a focus map creation unit 18 that creates a focus map of the sample S based on the in-focus information.
- the imaging line setting unit 15 is a part that sets a two-dimensional image acquisition region 32 for the sample S and sets a plurality of imaging lines extending in one direction in the image acquisition region 32.
- imaging lines L (L1, L2, L3... Ln) are set for each column of the divided areas 33 in the image acquisition area 32.
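- As a rough illustration of the tiling scan and the imaging lines described above, the sketch below divides a rectangular image acquisition region 32 into divided regions 33 the size of the field of view V and derives one imaging line per column of tiles. The region size, field-of-view size, and return format are assumptions made only for this example.

```python
# Minimal sketch (assumed geometry): set divided regions (tiles) over the image
# acquisition region and one imaging line per tile column, as in the tiling scan.
import math

def make_tiles_and_lines(region_w: float, region_h: float, fov_w: float, fov_h: float):
    n_cols = math.ceil(region_w / fov_w)
    n_rows = math.ceil(region_h / fov_h)
    # Top-left corner of each divided region (tile).
    tiles = [(c * fov_w, r * fov_h) for r in range(n_rows) for c in range(n_cols)]
    # One imaging line per tile column: a vertical segment through the column center.
    lines = [((c + 0.5) * fov_w, 0.0, (c + 0.5) * fov_w, region_h) for c in range(n_cols)]
    return tiles, lines

if __name__ == "__main__":
    tiles, lines = make_tiles_and_lines(region_w=20.0, region_h=15.0, fov_w=0.5, fov_h=0.5)  # mm, hypothetical
    print(len(tiles), "divided regions,", len(lines), "imaging lines")
```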
- The control unit 13 controls the stage drive unit 11 and the objective lens drive unit 12 so that the focal position of the objective lens with respect to the sample S reciprocates in the height direction of the sample while the visual field position V of the objective lens 25 with respect to the sample S is moved.
- the control unit 13 performs rolling reading by the image sensor 6 in accordance with the reciprocation of the focal position of the objective lens 25.
- It is preferable that the control unit 13 synchronize the movement of the sample S within the field of view V of the objective lens 25 by the stage drive unit 11 with the rolling readout of the image sensor 6 so that the optical image of the same part of the sample S is exposed in each pixel row 31 of the image sensor 6.
- the predetermined portions Sa of the sample S used for calculating the in-focus information appear at regular intervals.
- At least one in-focus information calculation position P is formed for each divided region 33 included in the imaging line L.
- The range determination unit 16 determines the conditions of the reciprocation, in the optical axis direction of the objective lens 25 (the height direction of the sample S), of the focal position of the objective lens 25 with respect to the sample S driven by the objective lens drive unit 12. Based on the conditions determined by the range determination unit 16, the control unit 13 controls the driving of the objective lens 25 by the objective lens drive unit 12 so that the distance (interval) in the Z direction between the objective lens 25 and the stage 2 repeatedly increases and decreases (a specific example will be described later).
- The in-focus calculation unit 17 calculates the in-focus information of the sample S based on the image data from the image sensor 6. More specifically, the in-focus information of the sample S is calculated based on the image data from each pixel row 31 of the image sensor 6.
- Examples of the in-focus information include position information in the Z direction of the objective lens 25 or the stage 2 at which the sample S coincides with the focal position of the objective lens 25. For example, the position of the objective lens 25 in the Z direction, the height (distance) of the objective lens 25 with respect to the sample S (stage 2), the position of the stage 2 in the Z direction, or the height (distance) of the sample S (stage 2) with respect to the objective lens 25 may be used.
- The control unit 13 moves the predetermined part Sa of the sample S within the field of view V of the objective lens 25 by means of the stage drive unit 11 while changing the focal position of the objective lens 25 by means of the objective lens drive unit 12, and performs rolling readout of the image sensor 6.
- More preferably, the control unit 13 synchronizes the movement of the predetermined part Sa of the sample S within the field of view V of the objective lens 25 by the stage drive unit 11 (for example, the movement of the stage 2 by the stage drive unit 11) with the rolling readout of the image sensor 6 so that the optical image of the predetermined part Sa of the sample S is exposed in each pixel row 31 of the image sensor 6.
- In this case, each pixel row 31 exposes an optical image of the predetermined part Sa of the sample S, so the image data from the image sensor 6 obtained when acquiring the focal position contains contrast information obtained while the focal position of the objective lens 25 is changed at the predetermined part Sa of the sample S.
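- To make the synchronization described above concrete, the following sketch estimates the stage speed at which the image of one sample part Sa advances exactly one pixel-row pitch on the sensor per row-readout interval, so that successive pixel rows 31 expose the same part Sa at successive focal positions. The pixel pitch, magnification, and row interval are hypothetical example values, not parameters taken from the embodiment.

```python
# Minimal sketch (assumed values): stage speed that keeps the image of a sample
# part Sa aligned with the rolling exposure of the image sensor.

def stage_speed_um_per_s(pixel_pitch_um: float, magnification: float, row_interval_s: float) -> float:
    """Stage speed (um/s) such that the sample image moves one row pitch per row interval."""
    sample_step_um = pixel_pitch_um / magnification  # one sensor row corresponds to this step on the sample
    return sample_step_um / row_interval_s

if __name__ == "__main__":
    # Hypothetical: 6.5 um pixels, 20x objective, 10 us between row readout starts.
    v = stage_speed_um_per_s(6.5, 20.0, 10e-6)
    print(f"stage speed ~ {v:.0f} um/s ({v / 1000.0:.1f} mm/s)")
```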
- FIG. 6 is a diagram illustrating an example of contrast information processed by the in-focus calculation unit.
- In this example, the contrast values of the image data from the first pixel row 31 to the n-th pixel row 31 in the imaging region are shown, and the contrast value of the image data in the i-th pixel row 31 is the peak value.
- In this case, the in-focus calculation unit 17 generates in-focus information taking, as the in-focus position, the focal position of the objective lens 25 at the time when the predetermined part Sa of the sample S was exposed in the i-th pixel row 31.
- The contrast value may be the contrast value of a specific pixel among the pixels included in each pixel row 31, or the average of the contrast values of all or some of the pixels included in each pixel row 31 may be used.
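- The peak search described for FIG. 6 can be pictured with the sketch below: given a frame acquired during one focal scan and the focal position of the objective lens at the moment each pixel row was exposed, the row with the peak contrast gives the in-focus position F. The contrast metric (per-row intensity variance) and the synthetic test data are assumptions for illustration only; the embodiment merely requires some contrast value per pixel row.

```python
# Minimal sketch (assumed contrast metric): estimate the in-focus position F from
# per-row contrast values, in the spirit of the peak search described for FIG. 6.
import numpy as np

def row_contrast(image: np.ndarray) -> np.ndarray:
    """One contrast value per pixel row; here simply the intensity variance of each row."""
    return image.astype(float).var(axis=1)

def in_focus_position(image: np.ndarray, z_of_row: np.ndarray) -> float:
    """Return the focal position of the pixel row whose contrast is the peak value."""
    contrast = row_contrast(image)
    i = int(np.argmax(contrast))   # i-th pixel row with the peak contrast value
    return float(z_of_row[i])      # focal position when that row was exposed

if __name__ == "__main__":
    rows, cols = 200, 256
    z = np.linspace(0.0, 20.0, rows)                 # um, hypothetical scan range
    rng = np.random.default_rng(0)
    sharpness = np.exp(-((z - 8.0) ** 2) / 4.0)      # synthetic frame, sharpest near z = 8 um
    frame = 100.0 + rng.normal(0.0, 1.0, (rows, cols)) * sharpness[:, None]
    print("estimated in-focus position:", round(in_focus_position(frame, z), 2), "um")
```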
- FIG. 7 is a diagram showing an example of the reciprocating condition of the focal position of the objective lens 25 determined by the range determining unit 16 described above.
- In the example shown in FIG. 7, the change width W1 in the forward direction of the focal position of the objective lens 25 (here, the direction in which the height with respect to the sample S decreases) and the change width W2 in the backward direction (here, the direction in which the height with respect to the sample S increases) are the same.
- The start position a1 when the focal position of the objective lens 25 is changed in the forward direction coincides with the end position b2 when it is changed in the backward direction, and the start position b1 when it is changed in the backward direction coincides with the end position a2 when it is changed in the forward direction.
- In this case, the focal position of the objective lens 25 is changed with the same width in the forward direction and in the backward direction for each in-focus information calculation position P on the sample S. A position within the change range of the focal position of the objective lens 25 that coincides with the surface of the sample S is then calculated as the in-focus position F (a calculation method will be described later).
- FIG. 8 is a diagram showing another example of conditions for reciprocal movement of the focal position of the objective lens 25 determined by the range determining unit 16.
- In this example, the turn-back position of the reciprocation of the focal position of the objective lens 25 at one in-focus information calculation position P is determined based on the in-focus information already calculated at the in-focus information calculation position P preceding that position. More specifically, in the example shown in FIG. 8, when the focal position of the objective lens 25 is changed in the forward direction or the backward direction at one in-focus information calculation position P, the start position a1 or b1 is determined with reference to the in-focus position F calculated at the preceding in-focus information calculation position P, so that the in-focus position F is substantially at the center of the change widths W1 and W2.
- FIG. 9 is a diagram showing still another example of the reciprocating condition of the focal position of the objective lens 25 determined by the range determining unit 16.
- In the example shown in FIG. 9, the change width of the focal position of the objective lens 25 in the optical axis direction of the objective lens 25 (the height direction of the sample S) is variable for each forward path and each return path of the reciprocation. More specifically, as in the example shown in FIG. 8, the turn-back position of the reciprocation of the focal position of the objective lens 25 is determined based on the in-focus information calculated at the preceding in-focus information calculation position P, and the change width W3 used when the focal position of the objective lens 25 is first changed with respect to the sample S is larger than the subsequent change width W1 in the forward direction and change width W2 in the return direction of the focal position of the objective lens 25. After the first scan, the change widths W1 and W2 of the focal position of the objective lens 25 are made smaller than the change width W3, so that the in-focus information at each in-focus information calculation position P can be calculated efficiently.
- The in-focus information calculation position P at which the change width is varied is not limited to the first in-focus information calculation position P. If there is an arbitrary position at which the influence of the unevenness of the sample or the warp of the slide glass can be predicted, the change width of the focal position of the objective lens 25 can be varied at that in-focus information calculation position P.
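- The range determination of FIGS. 8 and 9 can be sketched as follows: the first scan uses a wide change width (W3), and each subsequent forward or return scan uses a narrower width (W1 or W2) centered on the in-focus position F found at the preceding in-focus information calculation position P. The concrete widths and the function interface are illustrative assumptions, not values from the embodiment.

```python
# Minimal sketch (assumed widths): determine the reciprocation range of the focal
# position of the objective lens per in-focus information calculation position P,
# in the spirit of FIGS. 8 and 9.
from typing import Optional, Tuple

def next_scan_range(previous_focus: Optional[float],
                    wide_width: float = 20.0,    # W3: first scan width (um, hypothetical)
                    narrow_width: float = 5.0,   # W1/W2: subsequent scan width (um, hypothetical)
                    default_center: float = 0.0) -> Tuple[float, float]:
    """Return (start, end) of the next scan of the focal position."""
    if previous_focus is None:
        center, width = default_center, wide_width    # first position P: scan wide to find the surface
    else:
        center, width = previous_focus, narrow_width  # later positions P: re-center on the previous F
    return center - width / 2.0, center + width / 2.0

if __name__ == "__main__":
    previous = None
    for measured_focus in [8.2, 8.9, 9.4]:  # pretend in-focus positions F found along an imaging line
        lo, hi = next_scan_range(previous)
        print(f"scan from {lo:.1f} to {hi:.1f} um")
        previous = measured_focus
```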
- The focus map creation unit 18 creates a focus map based on a plurality of pieces of in-focus information calculated by the in-focus calculation unit 17. More specifically, the focus map creation unit 18 creates the focus map based on the in-focus information calculated over the plurality of imaging lines L. In creating the focus map, the in-focus position F itself at each in-focus information calculation position P may be used, or focus planes may be obtained by applying the least squares method to the in-focus positions F at the in-focus information calculation positions P and then combined.
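- One way to picture the least-squares focus plane mentioned above is the sketch below, which fits a plane z = a·x + b·y + c to the in-focus positions F measured at the calculation positions P. Fitting a single plane and the variable names are assumptions for illustration; the description only states that the least squares method is applied to the in-focus positions F.

```python
# Minimal sketch (assumed model): fit a focus plane z = a*x + b*y + c to in-focus
# positions F measured at in-focus information calculation positions P.
import numpy as np

def fit_focus_plane(xy: np.ndarray, z: np.ndarray) -> np.ndarray:
    """xy: (N, 2) stage coordinates, z: (N,) in-focus positions. Returns [a, b, c]."""
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(z))])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)   # least squares solution
    return coeffs

def focus_at(coeffs: np.ndarray, x: float, y: float) -> float:
    a, b, c = coeffs
    return a * x + b * y + c

if __name__ == "__main__":
    # Hypothetical in-focus positions measured on a gently tilted slide (mm, um).
    xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)
    z = np.array([8.0, 8.6, 8.3, 8.9, 8.45])
    plane = fit_focus_plane(xy, z)
    print("focus at (7, 3) mm:", round(focus_at(plane, 7.0, 3.0), 3), "um")
```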
- the focus map created by the focus map creating unit 18 is output to the control unit 13 and is referred to when controlling the focal position of the objective lens 25 with respect to the sample S when taking a partial image of the sample S.
- The focus map creation unit 18 may create the focus map based on only one of the in-focus information calculated in the forward path and the in-focus information calculated in the return path of the reciprocation of the focal position of the objective lens 25, or based on both.
- When only one of them is used, drive accuracy is required in only one of the forward path and the return path of the reciprocation of the objective lens 25, so the objective lens 25 can be easily controlled. When both are used, more in-focus information can be calculated, so the accuracy of the focus map can be sufficiently secured.
- FIG. 10 is a flowchart showing the operation of the image acquisition apparatus 1.
- the image acquisition region 32 is set for the sample S placed on the stage 2 (step S01).
- a plurality of imaging lines L are set for the set image acquisition region 32 (step S02).
- Next, while the focal position of the objective lens with respect to the sample S is reciprocated in the height direction of the sample based on the set reciprocation conditions, the movement of the predetermined part Sa of the sample S within the field of view V and the rolling readout of the image sensor 6 are performed, and the in-focus information for each imaging line L is calculated (step S03).
- At this time, it is preferable to synchronize the movement of the predetermined part Sa of the sample S within the field of view V of the objective lens 25 with the rolling readout of the image sensor 6 so that the optical image of the predetermined part Sa of the sample S is exposed in each pixel row 31 of the image sensor 6.
- a focus map is created based on the calculated focal point information (step S04). Then, based on the created focus map, an image (partial image) of each divided region 33 of the image acquisition region 32 is acquired while adjusting the focal position of the objective lens 25 to the sample S (step S05). An entire image of the sample S is created by combining the partial images (step S06).
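- The flow of FIG. 10 (steps S01 to S06) can be summarized in the outline below. Every method name is a placeholder standing in for the units described above (imaging line setting unit, range determination unit, in-focus calculation unit, focus map creation unit, image processing unit); none of them is an actual API of the device, and the outline is only a structural sketch of the steps.

```python
# Minimal sketch (placeholder methods only): the overall flow of FIG. 10.

def acquire_whole_slide_image(device):
    region = device.set_image_acquisition_region()              # step S01
    lines = device.set_imaging_lines(region)                    # step S02
    focus_info = []
    for line in lines:                                          # step S03
        # Move the field of view V along the line while reciprocating the focal
        # position of the objective lens and rolling-reading the image sensor.
        focus_info.extend(device.scan_line_for_focus(line))
    focus_map = device.create_focus_map(focus_info)             # step S04
    tiles = [device.capture_tile(t, focus_map) for t in region.tiles]   # step S05
    return device.stitch(tiles)                                 # step S06
```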
- the focal position of the objective lens 25 with respect to the sample S is reciprocated in the height direction of the sample while moving the visual field position V of the objective lens 25 with respect to the sample S.
- the contrast information of the image data at the visual field position V can be sequentially acquired along with the movement of the visual field position V of the objective lens 25 with respect to the sample S.
- In addition, rolling readout by the image sensor 6 is performed in accordance with the reciprocation of the focal position of the objective lens 25, so the number of pieces of in-focus information calculated per scan of the focal position of the objective lens 25 remains stable. Therefore, the focus map can be created quickly and accurately.
- noise at the time of acquiring image data can be reduced, and the accuracy of the focus map can be ensured.
- In the image acquisition device 1, the range determination unit 16 determines the conditions of the reciprocation, in the optical axis direction of the objective lens 25 (the height direction of the sample S), of the focal position of the objective lens 25 with respect to the sample S driven by the objective lens drive unit 12. Even when the unevenness of the surface of the sample S or the warp of the slide glass on which the sample S is placed cannot be ignored, determining the reciprocation conditions of the focal position of the objective lens 25 in this way allows the in-focus information to be calculated more reliably for a variety of samples S.
- In the image acquisition device 1, a two-dimensional image acquisition region 32 is set for the sample S, a plurality of imaging lines L extending in one direction are set in the image acquisition region 32, and the stage drive unit 11 is controlled so that the visual field position V of the objective lens 25 with respect to the sample S moves along the imaging lines L. This control prevents unnecessary in-focus information from being calculated for portions where the sample S does not exist, so the focus map can be created more quickly.
- Furthermore, it is preferable to synchronize the movement of the predetermined part Sa of the sample S within the field of view V of the objective lens 25 by the stage drive unit 11 with the rolling readout of the image sensor 6 so that the optical image of the predetermined part Sa of the sample S is exposed in each pixel row 31 of the image sensor 6.
- In this case, the image data from each pixel row 31 of the image sensor 6 contains the contrast information obtained when the focal position of the objective lens 25 is changed at the same part of the sample S, and the in-focus information can be calculated quickly and accurately based on this information.
- the present invention is not limited to the above embodiment.
- For example, in the above embodiment, the focal position of the objective lens 25 is scanned only once in either the forward direction or the return direction at each in-focus information calculation position P, but the position P may be scanned a plurality of times.
- The thickness of the sample S is, for example, about 10 μm. Therefore, when the moving distance of the focal position of the objective lens 25 per pixel row 31 is set to about 0.1 μm, contrast information covering the entire thickness of the sample S can be acquired in about 100 pixel rows.
- Since a two-dimensional image sensor such as a CMOS image sensor has, for example, several thousand pixel rows, contrast information can be acquired a plurality of times within one frame. Therefore, by scanning the objective lens 25 a plurality of times in the height direction, a plurality of pieces of in-focus information can be calculated for one in-focus information calculation position P, and the focus map can be created with higher accuracy.
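- The arithmetic behind this remark can be checked with the example numbers in the text (sample thickness of about 10 μm and a focal step of about 0.1 μm per pixel row); the 4000-row sensor is an assumed stand-in for "several thousand pixel rows", not a specification.

```python
# Minimal sketch: how many focal scans fit into one frame with the example numbers above.
sample_thickness_um = 10.0   # from the text
step_per_row_um = 0.1        # from the text
rows_per_scan = sample_thickness_um / step_per_row_um    # about 100 pixel rows per scan
total_rows = 4000            # assumed: "several thousand pixel rows"
print(int(total_rows // rows_per_scan), "scans per frame")   # about 40 scans per frame
```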
- Moreover, in the above embodiment, partial images of the sample S are captured after the focus map has been created for the entire sample S; however, it is also possible to acquire in-focus information and create a focus map for one imaging line L, acquire the corresponding partial images of the sample S, and repeat this for all the imaging lines L. Furthermore, instead of acquiring in-focus information for all the imaging lines L, one or a plurality of imaging lines L may be selected by the control unit 13 from the set imaging lines L, and in-focus information may be acquired along the selected imaging lines L.
- The manner of selecting the imaging lines L is not particularly limited; a plurality of adjacent imaging lines L may be selected, or imaging lines L may be selected every other row or every several rows. In this case, the creation of the focus map can be speeded up.
- the tiling scan method is exemplified as the image acquisition method, but the present invention can also be applied to the line scan method.
- For example, a beam splitter that splits the light from the objective lens 25 may be disposed between the objective lens 25 and the imaging lens 26, and the light split off by the beam splitter may be imaged with a line sensor or a CCD image sensor capable of TDI (Time Delay Integration) charge transfer.
- SYMBOLS 1 ... Image acquisition apparatus, 2 ... Stage, 3 ... Light source, 5 ... Light guide optical system, 6 ... Image pick-up element, 11 ... Stage drive part (1st drive part), 12 ... Objective lens drive part (2nd drive) Part), 13 ... control part, 15 ... imaging line setting part, 16 ... range determination part, 17 ... focusing point calculation part, 18 ... focus map creation part, 25 ... objective lens, 31 ... pixel row, 32 ... image acquisition area , L: imaging line, S: sample, Sa: predetermined part of sample, V: field of view of objective lens.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Signal Processing (AREA)
- Microscopes, Condenser (AREA)
- Automatic Focus Adjustment (AREA)
- Focusing (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (10)
- 1. An image acquisition device comprising:
a stage on which a sample is placed;
a light guide optical system including an objective lens arranged to face the sample on the stage;
an imaging device that captures an optical image of the sample guided by the light guide optical system;
an in-focus calculation unit that calculates in-focus information of the sample based on image data from the imaging device;
a focus map creation unit that creates a focus map of the sample based on the in-focus information;
a first drive unit that moves a visual field position of the objective lens with respect to the sample;
a second drive unit that changes a focal position of the objective lens with respect to the sample; and
a control unit that controls the imaging device, the first drive unit, and the second drive unit,
wherein the imaging device is a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout,
the control unit controls the first drive unit and the second drive unit so that the focal position of the objective lens with respect to the sample reciprocates in an optical axis direction of the objective lens while moving the visual field position of the objective lens with respect to the sample, and acquires the image data by rolling readout of the imaging device, and
the focus map creation unit creates the focus map based on a plurality of pieces of in-focus information calculated by the in-focus calculation unit.
- 2. The image acquisition device according to claim 1, wherein the control unit has a range determination unit that determines conditions of the reciprocation, in the optical axis direction of the objective lens, of the focal position of the objective lens with respect to the sample by the second drive unit.
- 3. The image acquisition device according to claim 2, wherein, when calculating one piece of in-focus information, the range determination unit determines the focal position of the objective lens at a turn-back position of the reciprocation based on in-focus information calculated before the one piece of in-focus information.
- 4. The image acquisition device according to claim 2 or 3, wherein the range determination unit makes a change width of the focal position of the objective lens in the optical axis direction of the objective lens variable for each forward path and each return path of the reciprocation.
- 5. The image acquisition device according to any one of claims 1 to 4, wherein the focus map creation unit creates the focus map based on at least one of in-focus information calculated in a forward path and in-focus information calculated in a return path of the reciprocation of the focal position of the objective lens.
- 6. The image acquisition device according to any one of claims 1 to 5, wherein the control unit has an imaging line setting unit that sets a two-dimensional image acquisition region for the sample and sets a plurality of imaging lines extending in one direction in the image acquisition region, and
the control unit selects one or a plurality of the imaging lines from the imaging lines set by the imaging line setting unit and controls the first drive unit so that the visual field position of the objective lens with respect to the sample moves for the selected imaging lines.
- 7. The image acquisition device according to claim 6, wherein the focus map creation unit creates the focus map based on the in-focus information calculated over the plurality of imaging lines.
- 8. The image acquisition device according to any one of claims 1 to 7, wherein the control unit synchronizes movement of a predetermined part of the sample within a field of view of the objective lens by the first drive unit with rolling readout of the imaging device so that an optical image of the predetermined part of the sample is exposed in each pixel row of the imaging device.
- 9. A method for creating a focus map of a sample using a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout, the method comprising:
acquiring image data by rolling readout of the imaging device while moving a visual field position of an objective lens with respect to the sample held on a stage and while moving at least one of the stage and the objective lens so that a focal position of the objective lens with respect to the sample moves in an optical axis direction of the objective lens;
acquiring in-focus information of the sample based on the image data; and
creating a focus map based on a plurality of pieces of in-focus information on the sample.
- 10. A system for creating a focus map of a sample, the system comprising:
a stage that holds the sample;
a light guide optical system including an objective lens arranged to face the sample on the stage;
an imaging device constituted by a two-dimensional imaging device having a plurality of pixel rows and capable of rolling readout, the imaging device capturing an optical image of the sample guided by the light guide optical system;
an in-focus calculation unit that calculates in-focus information of the sample based on image data from the imaging device; and
a focus map creation unit that creates a focus map based on a plurality of pieces of in-focus information on the sample,
wherein the imaging device acquires the image data by rolling readout while a visual field position of the objective lens with respect to the sample is moved and while at least one of the stage and the objective lens is moved so that a focal position of the objective lens with respect to the sample moves in an optical axis direction of the objective lens.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/786,581 US10348954B2 (en) | 2013-04-26 | 2014-04-21 | Image acquisition device and method and system for creating focus map for specimen |
| EP14787817.7A EP2990849B1 (en) | 2013-04-26 | 2014-04-21 | Image acquisition device and method and system for creating focus map for specimen |
| CN201480023710.4A CN105143953B (zh) | 2013-04-26 | 2014-04-21 | 图像取得装置、制作试样的焦点图的方法以及系统 |
| JP2015513749A JP6266601B2 (ja) | 2013-04-26 | 2014-04-21 | 画像取得装置、試料のフォーカスマップを作成する方法及びシステム |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013094079 | 2013-04-26 | ||
| JP2013-094079 | 2013-04-26 | ||
| JP2013-221175 | 2013-10-24 | ||
| JP2013221175 | 2013-10-24 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014175220A1 true WO2014175220A1 (ja) | 2014-10-30 |
Family
ID=51791794
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/061182 Ceased WO2014175220A1 (ja) | 2013-04-26 | 2014-04-21 | 画像取得装置、試料のフォーカスマップを作成する方法及びシステム |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US10348954B2 (ja) |
| EP (1) | EP2990849B1 (ja) |
| JP (1) | JP6266601B2 (ja) |
| CN (1) | CN105143953B (ja) |
| HU (1) | HUE052489T2 (ja) |
| WO (1) | WO2014175220A1 (ja) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019088030A1 (ja) * | 2017-11-02 | 2019-05-09 | 富士フイルム株式会社 | 撮影制御装置、撮影制御装置の作動方法、及び撮影制御プログラム |
| CN111656163A (zh) * | 2018-01-29 | 2020-09-11 | 加利福尼亚大学董事会 | 用于在荧光显微成像期间扩展景深的方法和装置 |
| WO2020203700A1 (ja) * | 2019-04-02 | 2020-10-08 | 富士フイルム株式会社 | 観察制御装置、方法およびプログラム |
| JP2021502593A (ja) * | 2017-11-28 | 2021-01-28 | ライカ バイオシステムズ イメージング インコーポレイテッドLeica Biosystems Imaging, Inc. | デュアルプロセッサ画像処理 |
| JP2021524049A (ja) * | 2018-09-28 | 2021-09-09 | 麦克奥迪▲実▼▲業▼集▲団▼有限公司 | デジタルスライドスキャナのモデル化のスピードアップ方法 |
| JP2024093373A (ja) * | 2022-12-27 | 2024-07-09 | 浜松ホトニクス株式会社 | 試料観察装置及び試料観察方法 |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6455829B2 (ja) * | 2013-04-01 | 2019-01-23 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
| DE112016006056T5 (de) * | 2016-01-27 | 2018-09-06 | Hitachi High-Technologies Corporation | Betrachtungsvorrichtung |
| DE102016121649A1 (de) * | 2016-11-11 | 2018-05-17 | Cl Schutzrechtsverwaltungs Gmbh | Verfahren zur automatisierbaren bzw. automatisierten Ermittlung der Fokuslage eines von einer Belichtungseinrichtung erzeugten Laserstrahls |
| US10613308B2 (en) * | 2016-11-30 | 2020-04-07 | Yuefeng YIN | Method and microscope for measuring and calculating heights on curved surface of microscope slide |
| EP3396430B1 (en) | 2017-04-27 | 2023-08-16 | Euroimmun Medizinische Labordiagnostika AG | Optical scanning arrangement and method |
| CN114779459A (zh) | 2017-09-29 | 2022-07-22 | 徕卡生物系统成像股份有限公司 | 实时自动聚焦扫描 |
| EP3760947B1 (en) | 2018-03-30 | 2024-10-23 | Daikin Industries, Ltd. | Refrigeration device |
| US11523046B2 (en) * | 2019-06-03 | 2022-12-06 | Molecular Devices, Llc | System and method to correct for variation of in-focus plane across a field of view of a microscope objective |
| CN110996002B (zh) * | 2019-12-16 | 2021-08-24 | 深圳市瑞图生物技术有限公司 | 显微镜聚焦方法、装置、计算机设备和存储介质 |
| CN111314571A (zh) * | 2020-01-21 | 2020-06-19 | 许之敏 | 一种扫描成像方法、计算机设备和存储介质 |
| WO2025189030A1 (en) * | 2024-03-07 | 2025-09-12 | Stellaromics, Inc. | Methods and systems for volumetric imaging |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006343573A (ja) * | 2005-06-09 | 2006-12-21 | Olympus Corp | 顕微鏡システム、観察方法および観察プログラム |
| JP2009069197A (ja) * | 2007-09-10 | 2009-04-02 | Sanyo Electric Co Ltd | 自動焦点調節装置 |
| US7518652B2 (en) | 2000-05-03 | 2009-04-14 | Aperio Technologies, Inc. | Method and apparatus for pre-focus in a linear array based slide scanner |
| JP2012042970A (ja) * | 2005-03-17 | 2012-03-01 | Hamamatsu Photonics Kk | 顕微鏡画像撮像装置 |
| JP2012108184A (ja) | 2010-11-15 | 2012-06-07 | Sony Corp | 焦点位置情報検出装置、顕微鏡装置及び焦点位置情報検出方法 |
Family Cites Families (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08320430A (ja) | 1995-05-23 | 1996-12-03 | Nikon Corp | 自動焦点検出装置 |
| US6677565B1 (en) | 1998-08-18 | 2004-01-13 | Veeco Tucson Inc. | High speed autofocus and tilt for an optical imaging system |
| WO2000056302A2 (en) * | 1999-03-19 | 2000-09-28 | Parker Hughes Institute | Vanadium (iv) complexes containing catacholate ligand and having spermicidal activity |
| US7243370B2 (en) * | 2001-06-14 | 2007-07-10 | Microsoft Corporation | Method and system for integrating security mechanisms into session initiation protocol request messages for client-proxy authentication |
| TW552803B (en) * | 2002-01-18 | 2003-09-11 | Nucam Corp | Image pickup apparatus and exposure control method therefor |
| US20040051030A1 (en) * | 2002-09-17 | 2004-03-18 | Artur Olszak | Method and apparatus for acquiring images from a multiple axis imaging system |
| US20050025833A1 (en) * | 2003-07-16 | 2005-02-03 | Chaim Aschkenasy | Pharmaceutical composition and method for transdermal drug delivery |
| US7813579B2 (en) | 2004-05-24 | 2010-10-12 | Hamamatsu Photonics K.K. | Microscope system |
| US7232980B2 (en) * | 2004-05-24 | 2007-06-19 | Hamamatsu Photonics K.K. | Microscope system |
| EP1787157B1 (en) | 2004-07-23 | 2014-09-24 | GE Healthcare Niagara Inc. | Apparatus for fluorescent confocal microscopy |
| US7297910B2 (en) | 2005-12-30 | 2007-11-20 | General Electric Company | System and method for utilizing an autofocus feature in an automated microscope |
| JP2009526272A (ja) | 2006-02-10 | 2009-07-16 | モノジェン インコーポレイテッド | 顕微鏡媒体ベースの標本からデジタル画像データを収集するための方法および装置およびコンピュータプログラム製品 |
| JP4917331B2 (ja) | 2006-03-01 | 2012-04-18 | 浜松ホトニクス株式会社 | 画像取得装置、画像取得方法、及び画像取得プログラム |
| JP2008052140A (ja) | 2006-08-25 | 2008-03-06 | Olympus Corp | 観察装置 |
| JP2008221299A (ja) * | 2007-03-14 | 2008-09-25 | Hitachi Via Mechanics Ltd | レーザ加工装置 |
| US8153949B2 (en) * | 2008-12-18 | 2012-04-10 | Palo Alto Research Center Incorporated | Obtaining sensing results indicating time variation |
| US8743195B2 (en) | 2008-10-24 | 2014-06-03 | Leica Biosystems Imaging, Inc. | Whole slide fluorescence scanner |
| JP2010256530A (ja) | 2009-04-23 | 2010-11-11 | Olympus Corp | 顕微鏡装置 |
| JP2011081211A (ja) | 2009-10-07 | 2011-04-21 | Olympus Corp | 顕微鏡システム |
| JP5829621B2 (ja) | 2009-12-30 | 2015-12-09 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 顕微鏡センサ |
| US10061108B2 (en) | 2010-05-18 | 2018-08-28 | Koninklijke Philips N.V. | Autofocus imaging for a microscope |
| JP5938401B2 (ja) | 2010-06-24 | 2016-06-22 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 差分測定に基づき顕微鏡検査をスキャンするためのオートフォーカス |
| US10027855B2 (en) | 2010-06-30 | 2018-07-17 | Ge Healthcare Bio-Science Corp. | System for synchronization in a line scanning imaging microscope |
| JP2012073285A (ja) | 2010-09-27 | 2012-04-12 | Olympus Corp | 撮像方法および顕微鏡装置 |
| JP5751986B2 (ja) | 2010-12-08 | 2015-07-22 | キヤノン株式会社 | 画像生成装置 |
| JP5668227B2 (ja) | 2011-02-17 | 2015-02-12 | 株式会社ミツトヨ | 画像測定装置 |
| GB201113071D0 (en) | 2011-07-29 | 2011-09-14 | Ffei Ltd | Method and apparatus for image scanning |
| US9488823B2 (en) * | 2012-06-07 | 2016-11-08 | Complete Genomics, Inc. | Techniques for scanned illumination |
| US9632303B2 (en) * | 2012-11-26 | 2017-04-25 | Osaka University | Optical microscope, and autofocus device for optical microscope |
| US9134523B2 (en) | 2013-07-19 | 2015-09-15 | Hong Kong Applied Science and Technology Research Institute Company Limited | Predictive focusing for image scanning systems |
-
2014
- 2014-04-21 US US14/786,581 patent/US10348954B2/en active Active
- 2014-04-21 WO PCT/JP2014/061182 patent/WO2014175220A1/ja not_active Ceased
- 2014-04-21 EP EP14787817.7A patent/EP2990849B1/en active Active
- 2014-04-21 JP JP2015513749A patent/JP6266601B2/ja active Active
- 2014-04-21 HU HUE14787817A patent/HUE052489T2/hu unknown
- 2014-04-21 CN CN201480023710.4A patent/CN105143953B/zh active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7518652B2 (en) | 2000-05-03 | 2009-04-14 | Aperio Technologies, Inc. | Method and apparatus for pre-focus in a linear array based slide scanner |
| JP2012042970A (ja) * | 2005-03-17 | 2012-03-01 | Hamamatsu Photonics Kk | 顕微鏡画像撮像装置 |
| JP2006343573A (ja) * | 2005-06-09 | 2006-12-21 | Olympus Corp | 顕微鏡システム、観察方法および観察プログラム |
| JP2009069197A (ja) * | 2007-09-10 | 2009-04-02 | Sanyo Electric Co Ltd | 自動焦点調節装置 |
| JP2012108184A (ja) | 2010-11-15 | 2012-06-07 | Sony Corp | 焦点位置情報検出装置、顕微鏡装置及び焦点位置情報検出方法 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2990849A4 |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019088030A1 (ja) * | 2017-11-02 | 2019-05-09 | 富士フイルム株式会社 | 撮影制御装置、撮影制御装置の作動方法、及び撮影制御プログラム |
| JPWO2019088030A1 (ja) * | 2017-11-02 | 2020-11-19 | 富士フイルム株式会社 | 撮影制御装置、撮影制御装置の作動方法、及び撮影制御プログラム |
| JP6993423B2 (ja) | 2017-11-02 | 2022-01-13 | 富士フイルム株式会社 | 撮影制御装置、撮影制御装置の作動方法、及び撮影制御プログラム |
| JP2021502593A (ja) * | 2017-11-28 | 2021-01-28 | ライカ バイオシステムズ イメージング インコーポレイテッドLeica Biosystems Imaging, Inc. | デュアルプロセッサ画像処理 |
| CN111656163A (zh) * | 2018-01-29 | 2020-09-11 | 加利福尼亚大学董事会 | 用于在荧光显微成像期间扩展景深的方法和装置 |
| JP2021524049A (ja) * | 2018-09-28 | 2021-09-09 | 麦克奥迪▲実▼▲業▼集▲団▼有限公司 | デジタルスライドスキャナのモデル化のスピードアップ方法 |
| JP7037262B2 (ja) | 2018-09-28 | 2022-03-16 | 麦克奥迪▲実▼▲業▼集▲団▼有限公司 | デジタルスライドスキャナのモデル化のスピードアップ方法 |
| WO2020203700A1 (ja) * | 2019-04-02 | 2020-10-08 | 富士フイルム株式会社 | 観察制御装置、方法およびプログラム |
| JP2024093373A (ja) * | 2022-12-27 | 2024-07-09 | 浜松ホトニクス株式会社 | 試料観察装置及び試料観察方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2990849B1 (en) | 2020-09-02 |
| EP2990849A1 (en) | 2016-03-02 |
| HUE052489T2 (hu) | 2021-04-28 |
| CN105143953A (zh) | 2015-12-09 |
| JPWO2014175220A1 (ja) | 2017-02-23 |
| CN105143953B (zh) | 2018-04-10 |
| EP2990849A4 (en) | 2017-01-04 |
| US20160080632A1 (en) | 2016-03-17 |
| US10348954B2 (en) | 2019-07-09 |
| JP6266601B2 (ja) | 2018-01-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6266601B2 (ja) | 画像取得装置、試料のフォーカスマップを作成する方法及びシステム | |
| JP6433888B2 (ja) | 画像取得装置、試料の合焦点情報を取得する方法及びシステム | |
| US10602087B2 (en) | Image acquisition device, and imaging device | |
| WO2019123869A1 (ja) | 画像取得装置及び画像取得方法 | |
| WO2014174919A1 (ja) | 画像取得装置及び画像取得装置のフォーカス方法 | |
| JP6134249B2 (ja) | 画像取得装置及び画像取得装置の画像取得方法 | |
| JP2005275199A (ja) | 3次元共焦点顕微鏡システム | |
| JP6496772B2 (ja) | 画像取得装置及び画像取得方法 | |
| WO2025152202A1 (zh) | 线扫共聚焦扫描光场显微成像装置及方法 | |
| JP7773351B2 (ja) | 顕微鏡システム、及び、顕微鏡制御装置 | |
| JP6240056B2 (ja) | 画像取得装置及び撮像装置 | |
| JP6010506B2 (ja) | 画像取得装置及び画像取得装置のフォーカス方法 | |
| JP5770958B1 (ja) | 画像取得装置及び撮像装置 | |
| JP2014240887A (ja) | 画像取得装置及び画像取得装置のフォーカス方法 | |
| JP6475307B2 (ja) | 画像取得装置、撮像装置、及び算出ユニット | |
| WO2014174920A1 (ja) | 画像取得装置及び画像取得装置のフォーカス方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 201480023710.4 Country of ref document: CN |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14787817 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2015513749 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2014787817 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 14786581 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |