WO2020218393A1 - Cell tracking method, image processing device, and program - Google Patents
Cell tracking method, image processing device, and program
- Publication number
- WO2020218393A1 (application PCT/JP2020/017426)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cell
- tracking
- state
- region
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- C—CHEMISTRY; METALLURGY
- C12—BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
- C12M—APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
- C12M41/00—Means for regulation, monitoring, measurement or control, e.g. flow regulation
- C12M41/30—Means for regulation, monitoring, measurement or control, e.g. flow regulation of concentration
- C12M41/36—Means for regulation, monitoring, measurement or control, e.g. flow regulation of concentration of biomass, e.g. colony counters or by turbidity measurements
-
- C—CHEMISTRY; METALLURGY
- C12—BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
- C12M—APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
- C12M41/00—Means for regulation, monitoring, measurement or control, e.g. flow regulation
- C12M41/48—Automatic or computerized control
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Definitions
- The present invention relates to a cell tracking method, an image processing device, and a program.
- The present application claims priority based on Japanese Patent Application No. 2019-85323 filed in Japan on April 26, 2019, the contents of which are incorporated herein by reference.
- A cell tracking technique is known that, in image analysis of an image in which a cell is imaged, measures the position of the imaged cell at each time point (Patent Document 1).
- Conventionally, when tracking a cell that undergoes division, tracking may continue for one of the daughter cells after cell division while tracking is newly started for the other.
- During cell division, the center of gravity moves almost instantaneously, which can introduce noise into, for example, the analysis of the movement state of cells by tracking. It is therefore required to improve the accuracy of analysis of the cell migration state.
- One aspect of the present invention is a cell tracking method for tracking a cell based on a plurality of cell images captured in time series, the method comprising: an extraction step of extracting, for each of the plurality of cell images, a tracking region corresponding to the cell; a position calculation step of calculating a change in the position of the tracking region extracted in the extraction step based on the plurality of cell images, and tracking the cell based on the calculated change in the position of the tracking region; a determination step of determining, based on the cell images, whether or not the cell being tracked is in a cell division state; and a stop step of stopping the tracking of the cell when the cell is determined in the determination step to be in the cell division state.
- One aspect of the present invention is an image processing apparatus that performs cell tracking processing based on a plurality of cell images captured in time series, the apparatus comprising: a region extraction unit that extracts, for each of the plurality of cell images, a tracking region corresponding to the cell; a position calculation unit that calculates a change in the position of the tracking region extracted by the region extraction unit based on the plurality of cell images and tracks the cell based on the calculated change in the position of the tracking region; a state determination unit that determines, based on the cell images, whether or not the cell being tracked is in a cell division state; and a stop control unit that stops the tracking of the cell when the state determination unit determines that the cell is in the cell division state.
- One aspect of the present invention is a program that causes a computer performing cell tracking based on a plurality of cell images captured in time series to execute: an extraction step of extracting, for each of the plurality of cell images, a tracking region corresponding to the cell; a position calculation step of calculating a change in the position of the tracking region extracted in the extraction step based on the plurality of cell images and tracking the cell based on the calculated change in the position of the tracking region; a determination step of determining, based on the cell images, whether or not the cell being tracked is in a cell division state; and a stop step of stopping the tracking of the cell when the cell is determined to be in the cell division state.
- FIG. 1 is a diagram showing an example of the configuration of the image processing device according to the first embodiment.
- FIG. 2 is a diagram showing an example of an image used in the image analysis according to the first embodiment.
- FIG. 3 is a diagram schematically depicting the image of FIG. 2 to make it easier to understand.
- FIG. 4 is a diagram showing an example of cell tracking according to the first embodiment.
- FIG. 5 is a diagram showing an example of the image processing according to the first embodiment.
- FIG. 6 is a diagram showing an example of the movement state calculation process according to the first embodiment.
- FIG. 7 is a diagram showing an example of a histogram of cell migration rates according to the first embodiment.
- FIG. 8 is a diagram showing an example of the evaluation result of the proportion of cells with a high migration rate according to the first embodiment.
- FIG. 9 is a diagram showing an example of the evaluation result of the variation in cell migration rate according to the first embodiment.
- FIG. 10 is a diagram showing an example of the configuration of the image processing device according to the second embodiment.
- FIG. 11 is a diagram showing an example of the movement state calculation process according to the second embodiment.
- FIG. 12 is a diagram showing an example of an image used for measuring the floating time according to the second embodiment.
- FIG. 13 is a diagram schematically depicting the image of FIG. 12 according to the second embodiment to make it easier to understand.
- FIG. 1 is a diagram showing an example of the configuration of the image processing device 1 according to the present embodiment.
- The image processing device 1 analyzes an image P, which is a set of images in which cells C are detected at a plurality of times (for example, from time T1 to time T7 described later), by tracking the cells C.
- Tracking a cell C includes, for example, calculating the trajectory of the cell C in the image P based on specific conditions.
- The cells C are, for example, adherent cells (e.g., mesenchymal stem cells, nerve cells, epithelial cells, etc.).
- The tracking process performed by the image processing device 1 in the present embodiment may be executed for one cell C at each time point, or may be executed in parallel at each time point for a plurality of cells C (e.g., a first cell, a second cell, a third cell, etc.).
- The automatic culture observation device 200 in the present embodiment includes a microscope 2 and a culture chamber (culture device) 20.
- The microscope 2 is an optical microscope, for example a phase contrast microscope. As an example, the microscope 2 performs phase-contrast observation in dark contrast, detects one or more cells, and captures the detected cells as an image P.
- The culture chamber 20 is a chamber whose internal temperature and humidity are controlled by a control unit or the like in order to culture the cells C stored in a container. In the automatic culture observation device 200 of the present embodiment, the microscope 2 and the culture chamber 20 may be arranged separately, or the microscope 2 may be arranged inside the culture chamber 20; that is, the microscope 2 may be configured as an apparatus separate from the culture chamber 20.
- The image P is, for example, a moving image composed of a plurality of frames.
- Hereinafter, the i-th frame of the image P may be referred to as the image Pi.
- The image P may also be a time-lapse image captured at a plurality of shooting times.
- The image processing device 1 includes an image acquisition unit 10, a control unit 11, an output unit 12, and a storage unit 13.
- The image processing device 1 is, for example, a computer. In the present embodiment, as an example, the case where the image processing device 1 is provided independently of the microscope 2 will be described, but the present invention is not limited to this.
- The image processing device 1 may be provided integrally with the microscope 2.
- The image acquisition unit 10 receives and acquires the image P output from the microscope 2.
- The control unit 11 includes a tracking area extraction unit (area extraction unit) 110, a position calculation unit 111, a state determination unit 112, a stop control unit 113, and a movement state calculation unit 114.
- The control unit 11 is realized by a CPU (Central Processing Unit); the tracking area extraction unit 110, the position calculation unit 111, the state determination unit 112, the stop control unit 113, and the movement state calculation unit 114 are each modules realized by the CPU reading a program from a ROM (Read Only Memory) and executing processing.
- The tracking area extraction unit 110 uses luminance information to extract a cell region R (e.g., region R1, region R2, etc.) from the image P in which the cells are imaged.
- The region R includes at least one of a tracking region TR, in which the luminance state indicated by the luminance information is a first state X, and a tracking stop region SR, in which the luminance state indicated by the luminance information is a second state Y.
- The luminance information is, for example, a luminance value based on the phase difference of the observation light of the cells in a phase contrast microscope, and the luminance state indicated by the luminance information is, for example, the state indicated by this luminance value (e.g., a large value, a small value, etc.).
- The luminance information may be a value indicating luminance other than a luminance value; it may be, for example, a contrast, or a phase difference before being converted into luminance (e.g., brightness).
- A predetermined region (e.g., region R) of the image P is defined by one pixel or a plurality of pixels among all the pixels constituting the image P.
- The luminance information of a predetermined region (e.g., region R) in the image P may be the luminance value of each of the plurality of pixels or their average luminance value.
- The first state X is, for example, a state in which the luminance value of a specific portion (e.g., a tracking region) of the image P is smaller than a predetermined value used for comparison.
- As the predetermined value, for example, the luminance value of the background BG of the first frame of the image P (e.g., background BG1 described later) or a preset threshold for the luminance value is used.
- The luminance value of the background BG may be the average of the luminance values acquired from the entire background BG, or a luminance value acquired from a specific portion of the background BG.
- In the following, as an example, the average of the luminance values acquired from the entire background BG of the first frame of the image P is used as the predetermined value, commonly for all the frames included in the image P.
- The first state X includes, for example, a state in which the contrast is smaller than a predetermined value in dark contrast.
- The contrast is, for example, a difference in luminance value based on a phase difference of the observation light, a difference in brightness, or a shade in an image.
- In the present embodiment, the first state X is, for example, a state in which the luminance value based on the phase difference is smaller than the luminance value of the background BG of the image P.
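- As a minimal illustration of how the predetermined value above might be computed, the following sketch takes the mean luminance of the background BG of the first frame, assuming a Boolean background mask is available; the function and variable names are illustrative and not part of the patent.

```python
import numpy as np

def background_value_from_first_frame(first_frame: np.ndarray,
                                      background_mask: np.ndarray) -> float:
    """Predetermined value: mean luminance of the background BG of the
    first frame, shared by all frames (assumes a given background mask)."""
    return float(first_frame[background_mask].mean())
```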
- The tracking region TR includes a region of the image P in which the cell C to be tracked is imaged.
- The tracking area extraction unit 110 extracts the region R based on the luminance indicated by the luminance information.
- The tracking area extraction unit 110 first extracts, from the image P, the pixels whose luminance value is smaller than the luminance value of the background BG.
- The tracking area extraction unit 110 then detects boundaries (edges) based on the extracted pixels.
- Finally, the tracking area extraction unit 110 extracts the tracking region TR by determining a continuous curve indicating the boundary of the tracking region TR based on the detected edges.
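- A minimal sketch of this extraction step, assuming a grayscale frame given as a NumPy array and using connected-component labeling in place of the edge-based boundary determination described above; all names are illustrative. It also extracts the brighter tracking-stop regions SR described for step S110 below.

```python
import numpy as np
from scipy import ndimage

def extract_tracking_regions(frame: np.ndarray, background_value: float):
    """Extract candidate tracking regions TR: connected regions whose
    luminance is smaller than the background value (first state X)."""
    dark_mask = frame < background_value            # pixels darker than background BG
    labels, num_regions = ndimage.label(dark_mask)  # one label per candidate region
    return labels, num_regions

def extract_stop_regions(frame: np.ndarray, background_value: float):
    """Extract candidate tracking-stop regions SR: connected regions whose
    luminance is larger than the background value (second state Y)."""
    bright_mask = frame > background_value
    labels, num_regions = ndimage.label(bright_mask)
    return labels, num_regions
```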
- The region R in this embodiment is a region corresponding to one cell C.
- The cell C is, for example, an adherent cell, and becomes a floating cell at the time of division.
- The tracking region TR, in which the luminance state indicated by the luminance information is the first state X, is, for example, a region corresponding to an adherent cell.
- The tracking stop region SR, in which the luminance state indicated by the luminance information is the second state Y, is, for example, a region corresponding to a floating cell.
- The position calculation unit 111 calculates the time change TC of the position of the tracking region TR extracted by the tracking area extraction unit 110, based on a plurality of frames of the image P.
- The position calculation unit 111 calculates the position of the tracking region TR based on a representative point G of the tracking region TR.
- The representative point G is a point included in the tracking region TR that represents the position of the tracking region TR in the image P.
- The representative point G is, for example, the center of gravity G1 of the tracking region TR containing the cell.
- Here, a point corresponds to one pixel of the image P.
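- A minimal sketch of computing the representative point G as the center of gravity G1 of one labeled region, continuing the illustrative extraction sketch above.

```python
from scipy import ndimage

def region_centroid(labels, region_id: int):
    """Representative point G: center of gravity G1 (row, column) of one
    labeled tracking region."""
    return ndimage.center_of_mass(labels == region_id)
```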
- The state determination unit 112 determines whether the luminance state indicated by the luminance information of the tracking region TR extracted by the tracking area extraction unit 110 is a second state Y different from the first state X, that is, whether the luminance state of the tracking region TR being tracked has changed from the first state X to the second state Y.
- The second state Y is, for example, a state in which the luminance value of a specific portion (e.g., a tracking region) of the image P is larger than the predetermined value.
- The example in which the luminance value is larger than the predetermined value in the second state Y applies to the case of dark contrast.
- The second state Y includes, for example, a state in which the luminance value of the specific portion is relatively or absolutely high with respect to the predetermined value, and, for example, a state in which the contrast is larger than a predetermined value in dark contrast. In the present embodiment, the second state Y is, for example, a state in which the luminance value based on the phase difference is larger than the luminance value of the background BG.
- The stop control unit 113 causes the position calculation unit 111 to stop the calculation of the time change TC based on the determination result of the state determination unit 112. In this case, for example, the stop control unit 113 stops the calculation of the time change TC in the position calculation unit 111 based on a determination result indicating that the luminance state indicated by the luminance information of the tracking region TR has become the second state Y. A region whose luminance state is determined by the state determination unit 112 to be the second state Y is referred to as a tracking stop region SR.
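- A minimal sketch of this state determination, assuming dark contrast: a region whose mean luminance exceeds the background value is judged to be in the second state Y; names are illustrative. A trajectory for which this returns True would no longer be extended, corresponding to the stop control by the stop control unit 113.

```python
import numpy as np

def is_second_state(frame: np.ndarray, region_mask: np.ndarray,
                    background_value: float) -> bool:
    """Second state Y (dark contrast): the region is brighter than the
    background BG, i.e. a floating cell immediately before division."""
    return float(frame[region_mask].mean()) > background_value
```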
- The movement state calculation unit 114 calculates the movement state M of the tracking region TR based on the calculation result of the position calculation unit 111 and the determination result of the state determination unit 112.
- The movement state M includes the mobility of the tracking region TR.
- The mobility includes, for example, the movement distance, which is the distance traveled by the representative point G of the tracking region TR, the movement speed, which is the speed of the representative point G, the movement acceleration, which is the acceleration of the representative point G, the average movement speed, the average movement acceleration, and so on.
- That is, the state determination unit 112 determines the luminance state (first state, second state, etc.) using the magnitude relationship of the luminance values in the tracking region TR, and the movement state calculation unit 114 calculates the movement state M of the tracking region TR based on the luminance state determined by the state determination unit 112.
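- A minimal sketch of the mobility calculation from a centroid trajectory, assuming a fixed frame interval in seconds; names and units are illustrative.

```python
import numpy as np

def movement_state(trajectory, frame_interval_s: float):
    """Movement state M: total distance and mean speed of the
    representative point G along its locus TC."""
    points = np.asarray(trajectory, dtype=float)          # shape (n_frames, 2)
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    distance = float(steps.sum())
    mean_speed = distance / (frame_interval_s * len(steps)) if len(steps) else 0.0
    return distance, mean_speed
```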
- The output unit 12 includes the movement state M calculated by the movement state calculation unit 114 in an analysis result A and outputs it to the presentation unit 3.
- The storage unit 13 stores various types of information used by the control unit 11 for processing. In addition, the storage unit 13 stores the above-mentioned calculation result, determination result, and/or movement state M for each cell. In this case, the storage unit 13 is a database that stores and manages the cell tracking information (e.g., the calculation result of the position calculation unit 111, the determination result of the state determination unit 112, and/or the movement state M of the tracking region TR).
- The presentation unit 3 presents the analysis result A output from the image processing device 1.
- The presentation unit 3 is, for example, a display.
- FIG. 2 is a diagram showing an example of the image P used in the image analysis according to the present embodiment.
- FIG. 3 schematically depicts the image of FIG. 2 to make it easier to understand; each part of FIG. 3 corresponds to a part of FIG. 2 (e.g., FIG. 2(A)).
- The images P1 to P7 shown in FIGS. 2 and 3 are the frames of the image P at times T1 to T7, respectively.
- The tracking area extraction unit 110 extracts a tracking region TR1 from the image P1 as the tracking region TR.
- The luminance value of the tracking region TR1 in the image P1 is smaller than the luminance value of the background BG1 (hereinafter also referred to as the background BG) of the image P1 used as the predetermined value (threshold), so the luminance of the tracking region TR1 is in the first state X.
- The position calculation unit 111 calculates a locus TC1 as the time change TC of the center of gravity G1 of the tracking region TR1, based on a plurality of frames of the image P. In FIG. 3(A), the locus TC1 is calculated based on the frames of the image P up to time T1, including the image P1. In this way, the position calculation unit 111 performs the calculation for the tracking region TR1 specified by the tracking area extraction unit 110 using an image P comprising a plurality of frames (e.g., the image P1 and the frames before time T1).
- In the image P2, the tracking region TR1 of the image P1 has changed to a tracking region TR2, and the center of gravity G1 has accordingly moved to a center of gravity G2. A locus TC2 is calculated by the position calculation unit 111 as the time change TC.
- In the image P3, the tracking region TR2 of the image P2 has changed to a tracking region TR3.
- The luminance value of the tracking region TR3 in the image P3 is larger than the luminance value of the background BG.
- The state determination unit 112 therefore determines that the luminance state indicated by the luminance information of the tracking region TR3 is the second state Y.
- The stop control unit 113 causes the position calculation unit 111 to stop the calculation of the time change TC based on the determination result that the luminance state of the tracking region TR3 is the second state Y.
- The tracking regions TR4 to TR7 in the images P4 to P7 of FIGS. 3(D) to 3(G) each show subsequent changes of the tracking region TR3 of the image P3.
- The luminance values of the tracking regions TR4 and TR5 are larger than the luminance value of the background BG.
- The tracking regions TR4 and TR5 are therefore imaged as regions brighter than the background BG in dark contrast.
- The luminance values of the tracking regions TR6 and TR7 are smaller than the luminance value of the background BG.
- The tracking regions TR6 and TR7 are therefore imaged as regions darker than the background BG in dark contrast.
- FIG. 4 is a diagram showing an example of the cell C tracking process according to the present embodiment.
- The cells C1 and C2 correspond, for example, to the tracking region TR1 of the image P1 and the tracking region TR2 of the image P2 in FIG. 3, and are tracked from the first frame, in which tracking of the cell C starts, to the second frame.
- The cell C3 corresponds, for example, to the tracking region TR3 of the image P3 in FIG. 3, and is a cell floating immediately before cell division.
- Floating cells appear during the division of adherent cells.
- Floating cells are imaged in the image P as regions brighter than the background BG. In the cell tracking process according to the present embodiment, tracking is stopped for cells determined to be floating.
- The cells C41 and C42 correspond, for example, to the tracking region TR6 of the image P6 in FIG. 3, and are cells that have divided from the cell C3.
- The tracking process of the present embodiment is newly started independently for each of the cells C41 and C42.
- The cells C41 and C51 are tracked in the first and second frames counted from when the new tracking process is started.
- Likewise, the cells C42 and C52 are tracked in the first and second frames counted from when the new tracking process is started.
- In this way, cell migration and cell division are distinguished in order to improve the accuracy of analysis of the migration of a single cell.
- FIG. 18 is a diagram showing an example of a conventional cell tracking process.
- Cells C11, C12, C13, C141, C151, C142, and C152 are the same cells as cells C1, C2, C3, C41, C51, C42, and C52 of FIG. 4, respectively.
- The cells C11, C12, C13, C141, and C151 are tracked in the first to fifth frames.
- That is, the cell C13, which is floating immediately before division, is continuously tracked without being distinguished from the cells C11 and C12, which are in a state of cell migration.
- The cells C141 and C151 after division are tracked continuously from the cells C11, C12, and C13.
- On the other hand, tracking is newly started for the cells C142 and C152, which are also cells after division.
- FIG. 5 is a diagram showing an example of the image processing according to the present embodiment.
- Step S10: The image acquisition unit 10 acquires the image P output from the microscope 2.
- The image acquisition unit 10 supplies the acquired image P to the control unit 11.
- The image P is composed of n frames, from the first to the n-th, obtained by capturing a sample (e.g., a cell) as a moving image or by continuously capturing still images (e.g., time-lapse photography).
- Step S20: The control unit 11 executes the movement state calculation process.
- The movement state calculation process is a process of calculating the movement state M of the cell C imaged in the image P.
- The movement state calculation process is executed in units of individual frames of the image P (for example, the first frame and the second frame described above).
- The control unit 11 processes one frame in each execution of the movement state calculation process of step S20.
- First, the control unit 11 takes the first frame of the image P as the processing target.
- The frame being processed in the current movement state calculation process is referred to as the i-th frame.
- The frame processed in the movement state calculation process immediately before the current one is referred to as the (i-1)-th frame.
- FIG. 6 is a diagram showing an example of the movement state calculation process according to the present embodiment. Steps S110 to S160 of FIG. 6 are executed as step S20 of FIG. 5.
- Step S110: The tracking area extraction unit 110 extracts from the image P, as the tracking region TR, a region R whose luminance value is smaller than the luminance value of the background BG of the image P. In other words, the tracking area extraction unit 110 extracts from the image P the tracking region TR whose luminance state indicated by the luminance information is the first state X.
- The tracking area extraction unit 110 also extracts from the image P, as the tracking stop region SR, a region whose luminance value is larger than the luminance value of the background BG of the image P (the second state Y).
- Here, as an example, the tracking area extraction unit 110 extracts one tracking region TR and one tracking stop region SR.
- The tracking area extraction unit 110 may extract a plurality of tracking regions TR and a plurality of tracking stop regions SR based on the above-mentioned luminance values.
- When the image P contains neither a region R whose luminance value is larger than that of the background BG nor a region R whose luminance value is smaller than that of the background BG, the tracking area extraction unit 110 need not extract any region.
- As described above, the tracking region TR is a region with lower luminance than the background BG, and the tracking stop region SR is a region with higher luminance than the background BG.
- Step S120: The position calculation unit 111 extracts the center of gravity G1 of the tracking region TR as the representative point G of the tracking region TR extracted by the tracking area extraction unit 110. When there are a plurality of tracking regions TR, the position calculation unit 111 extracts the center of gravity G1 for each of them. The position calculation unit 111 stores center-of-gravity position information GI1, which indicates the position of the extracted center of gravity G1, in the storage unit 13.
- Instead of the center of gravity G1, the position calculation unit 111 may extract as the representative point G a point selected from the tracking region TR based on a predetermined criterion.
- The predetermined criterion is, for example, that the luminance value of the tracking region TR is larger or smaller than a predetermined value. The position calculation unit 111 may also extract any one point inside the tracking region TR as the representative point G, based on a predetermined position or an input instruction from the user.
- Step S130: The position calculation unit 111 calculates the time change TC of the center of gravity G1 of the tracking region TR extracted in step S120, based on a plurality of frames of the image P.
- As an example of the time change TC of the center of gravity G1, the position calculation unit 111 calculates the locus TC1 of the center of gravity G1.
- When there are a plurality of tracking regions TR, the position calculation unit 111 calculates the time change TC of the center of gravity G1 for each of them based on the plurality of frames of the image P.
- The position calculation unit 111 calculates the locus TC1 of the center of gravity G1 by connecting the center of gravity G1i and the center of gravity G1i-1 with a straight line or a curve. That is, the position calculation unit 111 calculates the locus TC1 by calculating the center of gravity G1 of the tracking region TR extracted by the tracking area extraction unit 110 for each of the plurality of frames of the image P.
- When a plurality of tracking regions TR are extracted in step S110, the position calculation unit 111 associates the plurality of tracking regions TRi extracted in the current i-th frame with the plurality of tracking regions TRi-1 extracted in the (i-1)-th frame.
- This association processing is referred to as region association processing.
- For example, the position calculation unit 111 associates, by the region association processing, the tracking region TRi-1,k, which is the k-th of the plurality of tracking regions TRi-1 extracted in the (i-1)-th frame, with the tracking region TRi,j, which is the j-th of the tracking regions TR extracted in the i-th frame.
- For the region association processing, an image processing technique for associating a plurality of regions between different frames is used; for example, tracking regions TR located closest to each other between different frames (i.e., regions with a small mutual distance) are associated.
- That is, the position calculation unit 111 selects, from the tracking regions TRi extracted in the i-th frame, the region corresponding to the tracking region TRi-1,k extracted in the (i-1)-th frame.
- The position calculation unit 111 executes this selection process for all of the tracking regions TRi-1 extracted in the (i-1)-th frame.
- The plurality of tracking regions TRi may include a tracking region TRi that cannot be associated with any of the plurality of tracking regions TRi-1 extracted in the (i-1)-th frame.
- A tracking region TRi that is not associated with any of the plurality of tracking regions TRi-1 is referred to as a tracking region TRi,u.
- The position calculation unit 111 sets the center of gravity G1 of the tracking region TRi,u as the starting point of its locus TC1.
- The position of a tracking region TR is used to calculate the distance between tracking regions TR; as an example, the center of gravity G1 calculated in step S120 is used as the position of the tracking region TR.
- The position calculation unit 111 reads the center-of-gravity position information GI1 from the storage unit 13 and calculates the above-mentioned locus TC1 based on it. The position calculation unit 111 then stores the calculated locus TC1 in the storage unit 13.
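- A minimal sketch of the region association processing, using a greedy nearest-neighbour match between the centroids of consecutive frames; the distance gate and all names are illustrative assumptions rather than values specified in the patent.

```python
import numpy as np

def associate_regions(prev_centroids, curr_centroids, max_distance: float):
    """Associate each region of frame i with the closest region of frame
    i-1; regions with no match within max_distance start new loci (TRi,u)."""
    matches, new_regions = {}, []
    for j, c in enumerate(curr_centroids):
        if prev_centroids:
            dists = [np.linalg.norm(np.subtract(c, p)) for p in prev_centroids]
            k = int(np.argmin(dists))
            if dists[k] <= max_distance:
                matches[j] = k          # current region j continues previous region k
                continue
        new_regions.append(j)           # unmatched: starting point of a new locus TC1
    return matches, new_regions
```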
- Step S140: The state determination unit 112 determines whether the luminance value of the tracking region TR extracted by the tracking area extraction unit 110 is larger than the luminance value of the background BG of the image P. When there are a plurality of tracking regions TR, the state determination unit 112 makes this determination for each of them.
- Here, the tracking region TR extracted in the current i-th frame is denoted as the tracking region TRi.
- The tracking stop region SR extracted in the current i-th frame is denoted as the tracking stop region SRi.
- The tracking region TR extracted in the (i-1)-th frame is denoted as the tracking region TRi-1.
- The tracking stop region SR extracted in the (i-1)-th frame is denoted as the tracking stop region SRi-1.
- The state determination unit 112 associates the tracking stop region SRi extracted in the i-th frame with the tracking region TRi-1 or the tracking stop region SRi-1 extracted in the (i-1)-th frame by the region association processing described above. Here, the state determination unit 112 performs this association based on the calculated locus TC1.
- That is, the state determination unit 112 associates the tracking stop region SRi with the tracking region TRi-1 or the tracking stop region SRi-1, and the associated regions correspond to the same cell.
- To perform this association, the state determination unit 112 causes the position calculation unit 111 to calculate the center of gravity G1 of the tracking stop region SR.
- When the tracking region TRi-1 is associated with the tracking stop region SRi, the state determination unit 112 determines that the tracking region TRi-1 has changed to the tracking stop region SRi. That is, the state determination unit 112 determines that the luminance value of the tracking region TRi-1 of the (i-1)-th frame has changed, in the i-th frame, to a state larger than the luminance value of the background BG of the image P.
- Step S150: The stop control unit 113 causes the position calculation unit 111 to stop the calculation of the time change TC based on the determination result of the state determination unit 112, and ends the tracking of the cell C being tracked. That is, the position calculation unit 111 stops calculating the time change TC for a tracking region TR whose luminance has become larger than the luminance value of the background BG of the image P.
- Auxiliary conditions may additionally be used for this determination (a sketch evaluating them follows below). The first auxiliary condition is, for example, that the area of the tracking region TR is equal to or less than a predetermined value.
- The second auxiliary condition is, for example, that the distance in the image P between the tracking region TRi of the i-th frame and the tracking region TRi-1 of the (i-1)-th frame is equal to or larger than a predetermined value.
- The third auxiliary condition is, for example, that the number of frames used to calculate the locus TC1 is equal to or less than a predetermined value.
- The predetermined value for the number of frames is, for example, two frames.
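- A minimal sketch evaluating the three auxiliary conditions above; the patent does not specify how they are combined, so this sketch simply reports each independently, with illustrative threshold names.

```python
def auxiliary_conditions(area: float, displacement: float, n_frames: int,
                         max_area: float, min_jump: float, max_frames: int = 2):
    """Auxiliary evidence that a region is a dividing (floating) cell."""
    return {
        "small_area": area <= max_area,          # first auxiliary condition
        "large_jump": displacement >= min_jump,  # second auxiliary condition
        "few_frames": n_frames <= max_frames,    # third auxiliary condition (example: two frames)
    }
```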
- Step S160: The movement state calculation unit 114 calculates the movement state M of the tracking region TR based on the calculation result of the position calculation unit 111.
- The calculation result of the position calculation unit 111 is the result obtained up to the point where the calculation of the time change TC was stopped by the stop control unit 113 based on the determination result of the state determination unit 112; the movement state M is therefore also based on the determination result of the state determination unit 112.
- The movement state calculation unit 114 calculates, for example, the movement distance of the center of gravity G1 and the movement speed of the center of gravity G1 as the movement state M.
- The movement state calculation unit 114 supplies the calculated movement state M to the output unit 12. With this, the control unit 11 ends the movement state calculation process for the tracking region TR containing the cell C.
- Step S30: The control unit 11 determines whether or not an end condition is satisfied.
- The end condition is a condition for ending the repetition of the movement state calculation process of step S20.
- The end condition is, for example, that the movement state calculation process has been executed for a predetermined number of frames of the image P.
- The predetermined number is, for example, the total number of frames (n) constituting the image P.
- The predetermined number may also be less than the total number of frames constituting the image P.
- When the control unit 11 determines in step S30 that the end condition is satisfied (step S30: YES), it executes the process of step S40.
- When the end condition is not satisfied (step S30: NO), the control unit 11 changes the frame of the image P targeted by the movement state calculation process from the current frame to the next frame. For example, when the i-th frame was the processing target in the immediately preceding movement state calculation process, the control unit 11 changes the next processing target to the (i+1)-th frame. The control unit 11 then returns to step S20 and executes the movement state calculation process again.
- Step S40: The output unit 12 outputs to the presentation unit 3 the analysis result A, which includes at least one of the movement distance traveled by the representative point G of the tracking region TR, the movement speed of the representative point G, the movement acceleration of the representative point G, the average movement speed, the average movement acceleration, and the like.
- That is, the output unit 12 includes the movement state M calculated by the movement state calculation unit 114 in the analysis result A and outputs it to the presentation unit 3.
- With the above, the image processing device 1 ends the image processing (movement state calculation processing).
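- Bringing the illustrative helpers above together, the following is a minimal end-to-end sketch of the per-frame loop (steps S110 to S150 inside step S20, repeated until the end condition of step S30); it assumes the hypothetical functions sketched earlier, and the distance gate is an assumed value.

```python
import numpy as np

def run_tracking(frames, background_value: float, max_distance: float = 20.0):
    """End-to-end sketch: extract regions, associate them across frames,
    extend or start loci, and implicitly stop loci whose region vanished
    from the dark (first state X) extraction."""
    trajectories = []                 # each entry: list of centroids (a locus TC1)
    active = {}                       # trajectory index -> last centroid
    for frame in frames:
        labels, n = extract_tracking_regions(frame, background_value)
        centroids = [region_centroid(labels, r) for r in range(1, n + 1)]
        owners = sorted(active)       # trajectory indices of previous centroids
        matches, new_ids = associate_regions(
            [active[t] for t in owners], centroids, max_distance)
        next_active = {}
        for j, k in matches.items():  # continue existing loci (greedy)
            t = owners[k]
            if t in next_active:      # locus already claimed by another region
                new_ids.append(j)
                continue
            trajectories[t].append(centroids[j])
            next_active[t] = centroids[j]
        for j in new_ids:             # unmatched regions start new loci
            trajectories.append([centroids[j]])
            next_active[len(trajectories) - 1] = centroids[j]
        # Trajectories absent from next_active are stopped: in the patent this
        # corresponds to the region turning bright (second state Y) and the
        # stop control unit 113 ending the calculation of the time change TC.
        active = next_active
    return trajectories
```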
- As described above, when the end condition is not satisfied in step S30, the control unit 11 returns to step S20 and executes the movement state calculation process again, so that the process of extracting the tracking region TR in step S110 and the process of calculating the time change TC in step S130 are executed again.
- That is, after the stop control unit 113 has stopped the calculation of the time change TC by the position calculation unit 111, the tracking area extraction unit 110 again extracts tracking regions TR in the first state X from the plurality of frames of the image P.
- After the stop control unit 113 has stopped the calculation, the position calculation unit 111 calculates the time change TC of the position of the tracking region TR newly extracted by the tracking area extraction unit 110, based on the plurality of frames of the image P.
- As described above, the position calculation unit 111 stops calculating the time change TC for a tracking region TR whose luminance has become larger than the luminance value of the background BG of the image P. The center of gravity G1 of a tracking region TR whose luminance is larger than the luminance value of the background BG is excluded from the time change TC calculated by the position calculation unit 111. That is, the position calculation unit 111 calculates the time change TC while excluding the center of gravity G1 when the luminance state of the tracking region TR is the second state Y.
- In this way, the time during which the cell C is not migrating can be excluded by the movement state calculation process.
- The accuracy of the analysis of the movement state M of the cell C can therefore be improved as compared with the case where the time during which the cell C is not migrating is not excluded.
- The state in which the cell C is not migrating includes the state in which the cell C is suspended immediately before cell division.
- Next, the analysis result A obtained when the movement speeds of a plurality of cells C imaged in the image P are analyzed using the image processing of the image processing device 1 described above will be described.
- In this example, the influence of the culture environment, such as a coating agent or nutritional factors contained in the medium, on migration ability is evaluated using an index related to the migration ability of the cells C.
- FIG. 7 is a diagram showing an example of histograms of the migration rate of the cells C according to the present embodiment.
- The distributions of the migration rates of the plurality of cells C are shown as histograms H1a, H1b, and H1c.
- Each histogram shows the number Z of cells C for each migration rate.
- FIG. 7(A) shows the histogram H1a of the migration rates of a plurality of cells C under reference culture conditions.
- FIG. 7(B) shows the histogram H1b of the migration rates of a plurality of cells C cultured under a first culture condition changed from the reference culture conditions.
- FIG. 7(C) shows the histogram H1c of the migration rates of a plurality of cells C cultured under a second culture condition changed from the reference culture conditions.
- FIG. 8 is a diagram showing an example of the evaluation result of the proportion of cells C with a high migration rate according to the present embodiment.
- Here, the proportion of cells C whose movement speed is greater than a predetermined value (e.g., a threshold) is used as the evaluation index.
- FIG. 9 is a diagram showing an example of the evaluation result of the variation in the migration speed of the cells C according to the present embodiment.
- The variations DSa, DSb, and DSc of the movement speed are shown for the histograms H1a, H1b, and H1c of FIG. 7, respectively.
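- A minimal sketch of the evaluation indices illustrated in FIGS. 7 to 9; the bin count and threshold are illustrative, and interpreting the "variation" as a standard deviation is an assumption.

```python
import numpy as np

def migration_statistics(speeds, speed_threshold: float, bins: int = 20):
    """Histogram (FIG. 7), fraction of fast cells (FIG. 8), and variation
    as standard deviation (FIG. 9) of cell migration rates."""
    speeds = np.asarray(speeds, dtype=float)
    hist, edges = np.histogram(speeds, bins=bins)
    fast_fraction = float((speeds > speed_threshold).mean())
    variation = float(speeds.std())
    return hist, edges, fast_fraction, variation
```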
- Note that the microscope 2 may capture the image P in bright contrast.
- In that case, the first state X is a state in which the luminance value is larger than the predetermined value, and the second state Y is a state in which the luminance value is smaller than the predetermined value.
- In the above description, a single predetermined value is shared in determining the first state X and the second state Y, but the present invention is not limited to this.
- Two predetermined values may be used to determine the first state X and the second state Y.
- In that case, the first state X is, for example, a state in which the luminance value is smaller than a first predetermined value, and the second state Y is, for example, a state in which the luminance value is larger than a second predetermined value.
- Here, the first predetermined value is smaller than the second predetermined value.
- The first predetermined value is, for example, the luminance value of the background BG.
- The second predetermined value is, for example, a luminance value preset by a user or the like, or a luminance value preset based on preliminary experimental results.
- A region R whose luminance value lies between the first predetermined value and the second predetermined value is, for example, either determined to be in neither the first state X nor the second state Y, or determined to be in the first state X.
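- A minimal sketch of this two-threshold variant; the handling of the in-between case follows the text above (undetermined, or optionally treated as the first state X), and all names are illustrative.

```python
def classify_region(mean_luminance: float, first_value: float,
                    second_value: float, undetermined_as_first: bool = False):
    """Two predetermined values: first state X below the first value,
    second state Y above the second (first_value < second_value)."""
    assert first_value < second_value
    if mean_luminance < first_value:
        return "first_state_X"
    if mean_luminance > second_value:
        return "second_state_Y"
    return "first_state_X" if undetermined_as_first else "undetermined"
```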
- In the above description, the average of the luminance values acquired from the entire background BG of the first frame of the image P is used as the predetermined luminance value, commonly for all frames included in the image P, but the present invention is not limited to this.
- For example, a luminance value acquired from a specific portion of the background BG may be used as the predetermined value common to all frames included in the image P.
- A luminance value acquired from the background BG of a frame other than the first frame of the image P (e.g., an i-th frame other than the first and last frames) may also be used.
- The average of the luminance values acquired from the backgrounds BG of one or more frames included in the image P may also be used commonly for all frames included in the image P.
- The predetermined value commonly used for all frames included in the image P may also be a value preset by a user or the like. The preset value may be set independently of the image P, for example based on the results of preliminary experiments.
- Furthermore, the predetermined value may differ for each frame included in the image P.
- As a predetermined value that differs for each frame, for example, the luminance value acquired from the background BG of each frame of the image P may be used for that frame.
- Alternatively, a plurality of preset values may be used, one for each frame.
- As described above, the image processing device 1 according to the present embodiment includes a region extraction unit (in this example, the tracking area extraction unit 110), the position calculation unit 111, the state determination unit 112, and the movement state calculation unit 114.
- The region extraction unit (in this example, the tracking area extraction unit 110) extracts, from a plurality of images in which cells C are detected at a plurality of times (in this example, the plurality of frames of the image P), the tracking region TR whose luminance indicated by the luminance information is in the first state X (in this example, a luminance value smaller than the luminance value of the background BG of the image P in dark contrast).
- The position calculation unit 111 calculates the time change TC of the position of the tracking region TR extracted by the region extraction unit (in this example, the tracking area extraction unit 110), based on the plurality of images (in this example, the plurality of frames of the image P).
- The state determination unit 112 determines whether the luminance state (in this example, the luminance value) indicated by the luminance information of the tracking region TR extracted by the region extraction unit is the first state X (in this example, a luminance value smaller than that of the background BG of the image P in dark contrast) or the second state Y (in this example, a luminance value larger than that of the background BG of the image P).
- The movement state calculation unit 114 calculates the movement state M (in this example, the movement speed) of the tracking region TR based on the calculation result of the position calculation unit 111 and the determination result of the state determination unit 112.
- The calculation result of the position calculation unit 111 includes, for example, the result of calculating the time change TC of the center of gravity G1 of the tracking region TR.
- The determination result of the state determination unit 112 includes, for example, the result of determining that the luminance value of the tracking region TR is larger than the luminance value of the background BG.
- With this configuration, the image processing device 1 according to the present embodiment tracks single cells using the determination result of whether or not a cell is suspended immediately before cell division. Since the tracking accuracy can thereby be improved, the accuracy of the analysis of the cell migration state M can be improved as compared with the case where this determination result is not used.
- In the image processing device 1 according to the present embodiment, the position calculation unit 111 calculates the time change TC by calculating the position of the tracking region TR (in this example, the center of gravity G1) extracted by the region extraction unit (in this example, the tracking area extraction unit 110) for each of the plurality of images (in this example, the plurality of frames of the image P).
- The image processing device 1 according to the present embodiment can thus analyze the cell migration state M based on the time change TC, so that the accuracy of the analysis of the cell migration state M can be improved as compared with the case where the time change TC is not used.
- In the image processing device 1 according to the present embodiment, the time change TC is calculated while excluding the position of the tracking region TR (in this example, the center of gravity G1) when the luminance state (in this example, the luminance value) indicated by the luminance information of the tracking region TR is the second state Y (in this example, in dark contrast, a luminance value larger than the luminance value of the background BG of the image P).
- During cell division, the center of gravity of the cell moves instantaneously. Therefore, if cells undergoing division are tracked in a migration ability analysis based on the tracking process, the average speed may be overestimated, which introduces noise into the analysis result. Further, as described above, in the conventional cell tracking process the analysis conditions cannot be unified, so the numerical results of migration ability analyses may vary. In the image processing device 1 according to the present embodiment, cell migration and cell division are distinguished in the tracking process, so that this noise can be removed from the analysis result and the accuracy of the analysis of the cell migration state M can be improved.
- In the image processing device 1 according to the present embodiment, the movement state M includes the mobility of the tracking region TR.
- With this configuration, the mobility of the tracking region TR can be calculated, so that the accuracy of the analysis of the mobility of the cells can be improved as compared with the case where the determination of whether or not a cell is suspended immediately before cell division is not used.
- In the image processing device 1 according to the present embodiment, the movement state M includes the mobility (in this example, the movement speed) of the tracking region TR.
- With this configuration, the accuracy of tracking a plurality of cells imaged in the image P can be improved, so that the accuracy of the analysis of the migration state M of a plurality of cells can be improved as compared with the case where the determination of whether or not a cell is floating immediately before cell division is not used.
- In the image processing device 1 according to the present embodiment, the first state X is a state in which the luminance (in this example, the luminance value) indicated by the luminance information is smaller than a predetermined value (in this example, the luminance value of the background BG), and the second state Y is a state in which the luminance indicated by the luminance information is larger than the predetermined value.
- Alternatively (e.g., in bright contrast), the first state X is a state in which the luminance indicated by the luminance information is larger than the predetermined value (in this example, the luminance value of the background BG), and the second state Y is a state in which it is smaller than the predetermined value.
- With either configuration, the accuracy of the analysis of the cell migration state M can be improved.
- In the image processing device 1 according to the present embodiment, the predetermined value includes the luminance (in this example, the luminance value) indicated by the luminance information of the background BG of the image P.
- In the image processing device 1 according to the present embodiment, the position calculation unit 111 calculates the position of the tracking region TR based on the representative point G of the tracking region TR.
- With this configuration, cell tracking can be performed based on the representative point G of the tracking region TR, so that the accuracy of the analysis of the cell migration state M can be improved as compared with the case where the representative point G is not used.
- In the image processing device 1 according to the present embodiment, the representative point G is the center of gravity G1 of the tracking region TR.
- The image processing device 1 according to the present embodiment can thus execute the tracking process based on the center of gravity of a cell, so that the movement state M of the cell can be analyzed based on its center of gravity.
- The image processing device 1 further includes a region extraction unit (in this example, the tracking area extraction unit 110), the position calculation unit 111, the state determination unit 112, and the stop control unit 113.
- The stop control unit 113 causes the position calculation unit 111 to stop the calculation of the time change TC based on the determination result of the state determination unit 112.
- With this configuration, the tracking of a dividing cell can be stopped based on the determination result of whether or not the luminance state (in this example, the luminance value) indicated by the luminance information of the tracking region TR is the second state Y (in this example, in dark contrast, a state in which the luminance value is larger than the luminance value of the background BG of the image P), so that the accuracy of the analysis of the cell migration state M can be improved as compared with the case where the tracking of dividing cells is not stopped.
- In the image processing device 1 according to the present embodiment, after the stop control unit 113 stops the calculation of the time change TC by the position calculation unit 111, the region extraction unit (in this example, the tracking area extraction unit 110) again extracts, from the plurality of images (in this example, the plurality of frames of the image P), tracking regions TR in the first state X (in this example, a luminance value smaller than that of the background BG of the image P in dark contrast).
- The time change TC of the position of the re-extracted tracking region TR is then calculated based on the plurality of images (in this example, the plurality of frames of the image P).
- With this configuration, the tracking process can be restarted for cells whose tracking was stopped, so that the accuracy of the analysis of the cell movement state M can be improved as compared with the case where the tracking process is not restarted.
- That is, the image processing device 1 according to the present embodiment, using the image processing described above, can start tracking the new cells after division by re-extracting tracking regions TR in the first state X after cell division.
- FIG. 10 is a diagram showing an example of the configuration of the image processing device 1a according to the present embodiment. Comparing the image processing device 1a (FIG. 10) according to the present embodiment with the image processing device 1 (FIG. 1) according to the first embodiment, the control unit 11a is different.
- The functions provided by the other components (the image acquisition unit 10, the output unit 12, and the storage unit 13) are the same as in the first embodiment. Description of functions identical to those of the first embodiment is omitted; in the second embodiment, the parts that differ from the first embodiment are mainly described.
- The control unit 11a includes the tracking area extraction unit 110, the position calculation unit 111, the state determination unit 112, the stop control unit 113, the movement state calculation unit 114, and a division time measurement unit (time measurement unit) 115a.
- The functions provided by the tracking area extraction unit 110, the position calculation unit 111, the state determination unit 112, the stop control unit 113, and the movement state calculation unit 114 are the same as in the first embodiment.
- The division time measurement unit 115a measures the floating time TD based on the determination result of the state determination unit 112.
- The floating time TD is the length of time during which the luminance state indicated by the luminance information of the tracking region TR is the second state Y.
- The division time measurement unit 115a measures the floating time TD as the number of frames of the image P determined to be in the second state Y.
- The division time measurement unit 115a may also express the floating time TD as a time by converting the number of frames of the image P that contain a tracking region TR with a high luminance value, based on the frame interval.
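As a minimal sketch of this conversion (Python; the function and parameter names are illustrative, and the frame interval is assumed to be known from the imaging settings):

```python
def floating_time_seconds(n_state_y_frames: int, frame_interval_s: float) -> float:
    """Convert a count of frames determined to be in the second state Y
    into a floating time TD expressed in seconds."""
    return n_state_y_frames * frame_interval_s

# Example: 2 frames at a 10-minute time-lapse interval -> 1200.0 s
print(floating_time_seconds(2, 600.0))
```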
- FIG. 11 is a diagram showing an example of the movement state calculation process according to the present embodiment.
- The movement state calculation process of FIG. 11 is executed in step S20 of the image processing of FIG. 5.
- The processes of steps S210, S220, S230, S240, S260, and S270 are the same as those of steps S110, S120, S130, S140, S150, and S160 in FIG. 6, respectively, so their description is omitted.
- Step S250: The division time measurement unit 115a measures the floating time TD based on the determination result of the state determination unit 112.
- When it is determined that the luminance value of the tracking region TR extracted by the tracking area extraction unit 110 is larger than the luminance value of the background BG of the image P, the division time measurement unit 115a measures the floating time TD of that tracking region TR over a plurality of frames.
- As in step S140 of FIG. 6, when the state determination unit 112 associates the tracking region TRi-1 in the (i-1)-th frame with the tracking stop region SRi in the i-th frame, it determines that the tracking region TRi-1 has changed into the tracking stop region SRi. Based on this determination result, the division time measurement unit 115a starts the region association processing described above (here also referred to as dividing-cell association processing) on the tracking stop region SR.
- The division time measurement unit 115a associates the tracking stop region SRi in the i-th frame with the tracking stop region SRi+1 in the (i+1)-th frame by the region association processing.
- When no tracking stop region SRi+2 in the (i+2)-th frame can be associated with the tracking stop region SRi+1, the division time measurement unit 115a ends the region association processing.
- The division time measurement unit 115a measures, as the floating time TD, the number of frames from which the mutually associated tracking stop regions SR were extracted.
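A rough sketch of this dividing-cell association, under two illustrative assumptions not prescribed by the text (each frame yields a list of tracking-stop-region centroids, and association is by nearest centroid within a radius):

```python
import math

def count_floating_frames(stop_centroids_per_frame, start_frame, start_centroid,
                          max_dist=20.0):
    """Chain tracking stop regions SR across consecutive frames by nearest
    centroid and return the number of frames spanned, i.e. the floating
    time TD expressed as a frame count."""
    n_frames = 1                      # the frame in which the chain starts
    centroid = start_centroid
    for frame in range(start_frame + 1, len(stop_centroids_per_frame)):
        candidates = stop_centroids_per_frame[frame]
        if not candidates:
            break                     # no SR extracted -> association ends
        nearest = min(candidates, key=lambda c: math.dist(c, centroid))
        if math.dist(nearest, centroid) > max_dist:
            break                     # nothing close enough to associate
        centroid = nearest
        n_frames += 1
    return n_frames

# Frames i-1..i+2: an SR is present only in frames i and i+1 (cf. FIGS. 12-13).
frames = [[], [(10.0, 10.0)], [(11.0, 10.5)], []]
print(count_floating_frames(frames, start_frame=1, start_centroid=(10.0, 10.0)))  # 2
```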
- FIG. 12 is a diagram showing an example of the images used for measuring the floating time TD according to the present embodiment.
- FIG. 13 schematically depicts the images of FIG. 12 to make them easier to understand.
- The image P of the present embodiment is, as in the first embodiment, a plurality of images (e.g., image Pi-1 and image Pi) in which the cell C is detected at a plurality of times.
- The images Pi-1, Pi, Pi+1, and Pi+2 shown in FIGS. 12 and 13 are the (i-1)-th, i-th, (i+1)-th, and (i+2)-th frames of the image P, respectively.
- The tracking area extraction unit 110 extracts the tracking region TRi-1, the tracking stop region SRi, the tracking stop region SRi+1, and the tracking region TRi+2 from the image Pi-1, the image Pi, the image Pi+1, and the image Pi+2, respectively.
- The state determination unit 112 associates the tracking region TRi-1 with the tracking stop region SRi and determines that the tracking region TRi-1 has changed into the tracking stop region SRi.
- The division time measurement unit 115a starts the region association processing based on the determination result of the state determination unit 112.
- The division time measurement unit 115a associates the tracking stop region SRi with the tracking stop region SRi+1. Since no tracking stop region SR is extracted from the image Pi+2, the division time measurement unit 115a ends the region association processing.
- The division time measurement unit 115a counts the number of frames from which the tracking stop region SRi and the tracking stop region SRi+1 associated in the region association processing were extracted.
- In this example, the division time measurement unit 115a measures 2, the number of the images Pi and Pi+1, as the floating time TD.
- The division time measurement unit 115a may convert the measured floating time TD from a frame count (e.g., 2, the number of the images Pi and Pi+1 above) into a time, based on the frame interval used when capturing the image P.
- The output unit 12 outputs the analysis result Aa based on the floating time TD to the presentation unit 3. The output unit 12 may also output an analysis result that includes both the movement state M of the tracking region TR and the floating time TD to the presentation unit 3.
- Here, the floating time TD is considered to correspond to the length of the M phase (mitotic phase) of the cell cycle.
- The division time measurement unit 115a estimates and measures the length of the M phase of the cell cycle by tracking the floating cells that appear when adherent cells divide. By measuring the length of the M phase, the control unit 11a can indirectly measure the duration of the M-phase checkpoint and evaluate effects on the cells.
- FIG. 14 is a diagram showing an example of histograms of the floating time TD according to the present embodiment.
- The distribution of the floating time TD over a plurality of cells C is shown as the histogram H2a and the histogram H2b.
- The number Z of cells C is shown for each floating time TD.
- FIG. 14A shows a histogram H2a of the floating times TD of a plurality of cells C under a reference culture condition.
- FIG. 14B shows a histogram H2b of the floating times TD of a plurality of cells C cultured under a first culture condition changed from the reference culture condition.
- The peak PKb of the histogram H2b lies at a longer floating time TD than the peak PKa of the histogram H2a; under the first culture condition, the floating time TD tends to be longer than under the reference culture condition.
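Purely as an illustration with synthetic numbers (the patent does not prescribe any plotting tool), such a comparison could be rendered along these lines:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical floating times TD in minutes for two culture conditions.
td_reference = rng.normal(45, 8, 200)    # reference condition (H2a)
td_condition1 = rng.normal(70, 12, 200)  # first culture condition (H2b)

bins = np.linspace(0, 120, 25)
plt.hist(td_reference, bins=bins, alpha=0.5, label="reference (H2a)")
plt.hist(td_condition1, bins=bins, alpha=0.5, label="first condition (H2b)")
plt.xlabel("floating time TD [min]")
plt.ylabel("number of cells Z")
plt.legend()
plt.show()
```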
- Based on the analysis result Aa described above, the image processing apparatus 1a can evaluate influences such as transformation into cancer cells.
- The image processing apparatus 1a includes a time measurement unit (in this example, the division time measurement unit 115a).
- The time measurement unit measures, based on the determination result of the state determination unit 112, the time (the floating time TD) during which the luminance state indicated by the luminance information of the tracking region TR is the second state Y.
- With this configuration, the length of the M phase of the cell cycle can be estimated and measured using the floating time TD, so the state of the cell can be analyzed based on the length of the M phase.
- Hereinafter, the third embodiment will be described in detail with reference to the drawings.
- The image processing device according to the present embodiment is referred to as the image processing device 1b.
- FIG. 15 is a diagram showing an example of the configuration of the image processing device 1b according to the present embodiment. A comparison of the image processing device 1b (FIG. 15) according to the present embodiment with the image processing device 1 (FIG. 1) according to the first embodiment shows that the difference lies in the control unit 11b.
- The functions of the other components (the image acquisition unit 10, the output unit 12, and the storage unit 13) are the same as those of the first embodiment. Description of functions identical to those of the first embodiment is omitted; the third embodiment is described mainly with respect to the differences from the first embodiment.
- The control unit 11b includes a tracking area extraction unit 110, a position calculation unit 111, a state determination unit 112, a stop control unit 113, a movement state calculation unit 114, and a labeling unit 116b.
- The functions provided by the tracking area extraction unit 110, the position calculation unit 111, the state determination unit 112, the stop control unit 113, and the movement state calculation unit 114 are the same as those in the first embodiment.
- The labeling unit 116b associates regions R with one another across the different frames described above and assigns, to the associated regions R, a label L (e.g., a group number or an identifier) for identifying them.
- The regions R to which the labeling unit 116b assigns the label L include tracking regions TR and tracking stop regions SR.
- FIG. 16 is a diagram showing an example of the movement state calculation process according to the present embodiment.
- The movement state calculation process of FIG. 16 is executed in step S20 of the image processing of FIG. 5.
- The processes of steps S310, S320, S330, S360, and S370 are the same as those of steps S110, S120, S130, S150, and S160 in FIG. 6, respectively, so their description is omitted.
- Step S340: The labeling unit 116b performs a labeling process that assigns the label L to the associated regions R.
- The labeling unit 116b associates the region Ri-1 extracted in the (i-1)-th frame with the region Ri extracted in the i-th frame by the region association processing described above.
- FIG. 17 is a diagram showing an example of the region association processing according to the present embodiment.
- FIG. 17 shows an image Pi-1, which is the (i-1)-th frame, an image Pi, which is the i-th frame, an image Pi+1, which is the (i+1)-th frame, and an image Pi+2, which is the (i+2)-th frame.
- The labeling unit 116b associates the region R1i+2 extracted in the (i+2)-th frame with the region R1i+1 extracted in the (i+1)-th frame by the region association processing. Similarly, the labeling unit 116b associates the region R2i+2 extracted in the (i+2)-th frame with the region R2i+1 extracted in the (i+1)-th frame.
- The labeling unit 116b associates the region R1i+1 extracted in the (i+1)-th frame with the region Ri extracted in the i-th frame by the region association processing. Similarly, the labeling unit 116b associates the region R2i+1 extracted in the (i+1)-th frame with the region Ri extracted in the i-th frame. Here, the region Ri and the region R1i+1 are already associated with each other, but the labeling unit 116b further associates the region R2i+1 with the region Ri. That is, the labeling unit 116b may associate a plurality of regions from among the regions Ri+1 extracted in the (i+1)-th frame with one region Ri extracted in the i-th frame.
- The labeling unit 116b associates the region Ri extracted in the i-th frame with the region Ri-1 extracted in the (i-1)-th frame by the region association processing.
- The labeling unit 116b assigns the same label L to the mutually associated region Ri-1, region Ri, region R1i+1, region R2i+1, region R1i+2, and region R2i+2. That is, the labeling unit 116b assigns the same label L (e.g., a first label L1) to all of the associated regions R to form one group.
- Here, assigning the label L to a region R means associating the region R with the label L.
- The labeling unit 116b may further assign one common label L (e.g., a second label L21) to the region R1i+1 and the region R1i+2 to form one group, and another common label L (e.g., a second label L22) to the region R2i+1 and the region R2i+2 to form another group.
- In the labeling process, the labeling unit 116b may use the result of the region association processing executed by the position calculation unit 111 in step S330.
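A minimal sketch of such labeling (data structures and names are hypothetical; the text specifies the behavior, not an implementation):

```python
# Hypothetical association result: region -> regions it maps to in the
# next frame (region "Ri" splits into "R1i+1" and "R2i+1").
children = {
    "Ri-1": ["Ri"],
    "Ri": ["R1i+1", "R2i+1"],
    "R1i+1": ["R1i+2"],
    "R2i+1": ["R2i+2"],
}

def assign_labels(children, root, group_label="L1"):
    """Assign one group label to every region associated with `root`,
    plus a distinct branch label to each branch created by a split,
    which is what supports genealogy (lineage) analysis."""
    labels = {root: {group_label}}
    counter = 0
    stack = [(root, set())]           # (region, branch labels inherited)
    while stack:
        region, branches = stack.pop()
        kids = children.get(region, [])
        for kid in kids:
            kid_branches = set(branches)
            if len(kids) > 1:         # a division: open a new branch label
                counter += 1
                kid_branches.add(f"L2{counter}")
            labels[kid] = {group_label} | kid_branches
            stack.append((kid, kid_branches))
    return labels

print(assign_labels(children, "Ri-1"))
# e.g. {'Ri-1': {'L1'}, 'Ri': {'L1'}, 'R1i+1': {'L1', 'L21'}, ...}
```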
- Step S350: The state determination unit 112 makes its determination by combining the condition that the luminance value of the tracking region TR extracted by the tracking area extraction unit 110 is larger than the luminance value of the background BG of the image P with the result of the labeling process in step S340. For example, the state determination unit 112 determines from the label L that the tracking region TRi has split into the region R1i+1 and the region R2i+1. When the tracking region TRi has split into two regions and its luminance value is larger than the luminance value of the background BG of the image P, the state determination unit 112 sets the tracking region TRi so determined as the tracking stop region SRi.
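Sketched as a predicate, with the association result and a per-region mean luminance assumed to be available (both are illustrative assumptions):

```python
children = {"TRi": ["R1i+1", "R2i+1"]}  # hypothetical association result
mean_luminance = {"TRi": 180.0}          # hypothetical mean luminance of TRi

def becomes_tracking_stop(region_id, children, mean_luminance, background_luminance):
    """A tracking region becomes a tracking stop region SR when it has
    split into two or more associated regions in the next frame AND it is
    brighter than the image background (second state Y, dark contrast)."""
    has_split = len(children.get(region_id, [])) >= 2
    is_bright = mean_luminance[region_id] > background_luminance
    return has_split and is_bright

print(becomes_tracking_stop("TRi", children, mean_luminance, 120.0))  # True
```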
- The labeling unit 116b may include information indicating the assigned labels L in the analysis result Ab. That is, the analysis result Ab includes information for genealogy (lineage) analysis of the cells.
- The output unit 12 outputs the analysis result Ab including the label L to the presentation unit 3.
- With this configuration, in cell tracking the image processing apparatus 1b can determine from the label L that a cell has divided, so the accuracy of the analysis of the cell movement state M can be improved compared with the case where the label L is not used. Further, in the image processing apparatus 1b according to the present embodiment, it can be determined from the label L that the tracking region TRi extracted in the i-th frame has split into the region R1i+1 and the region R2i+1 extracted in the (i+1)-th frame, so the genealogy of the cells can be analyzed.
- The microscope 2 may be a differential interference microscope. In that case, the luminance indicated by the luminance information of the tracking region TR is a luminance value based on the optical path difference (e.g., the difference in optical path length) of the observed light.
- In one example, the first state X is a state in which the optical path difference is smaller than a predetermined value and the luminance value is accordingly smaller than a predetermined value, and the second state Y is a state in which the optical path difference is larger than the predetermined value and the luminance value is accordingly larger than the predetermined value.
- As described above, the luminance indicated by the luminance information of the tracking region TR is a luminance value based on a phase difference or an optical path difference. Because the contrast of the tracking region TR derives from a phase difference or an optical path difference, the image processing apparatus 1 of each of the above embodiments can determine the state of the tracking region TR using the phase difference or the optical path difference as contrast, and can therefore analyze a phase contrast image or an image captured by a differential interference microscope.
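A small illustrative sketch of this threshold-based state decision (using the background luminance as the threshold mirrors the dark-contrast example of the first embodiment; names and values are illustrative):

```python
import numpy as np

def classify_state(region_pixels: np.ndarray, threshold: float) -> str:
    """Classify a tracking region from its mean luminance for dark-contrast
    imaging: darker than the threshold (e.g. the background luminance) is
    the first state X, brighter is the second state Y."""
    mean = float(region_pixels.mean())
    if mean < threshold:
        return "X"   # adherent, trackable
    if mean > threshold:
        return "Y"   # floating / dividing
    return "undetermined"

region = np.array([[130.0, 140.0], [150.0, 160.0]])  # hypothetical pixel values
print(classify_state(region, threshold=120.0))        # 'Y'
```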
- A part of the image processing devices 1, 1a, and 1b in the above embodiments (for example, the tracking area extraction unit 110, the position calculation unit 111, the state determination unit 112, the stop control unit 113, the movement state calculation unit 114, the division time measurement unit 115a, and the labeling unit 116b) may be realized by a computer such as a server or a client.
- A program for realizing this control function may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed.
- The "computer system" referred to here is a computer system built into the image processing devices 1, 1a, and 1b, and includes an OS and hardware such as peripheral devices.
- The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system.
- Further, the "computer-readable recording medium" may include a medium that dynamically holds the program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a certain time in that case, such as a volatile memory inside a computer system serving as a server or a client.
- The above program may be a program for realizing a part of the functions described above, or may be a program that realizes the functions described above in combination with a program already recorded in the computer system.
- The program causes a computer to execute: a region extraction step of extracting, from a plurality of images in which cells are captured, a tracking region TR whose luminance state indicated by luminance information is the first state X; a position calculation step of calculating the time change of the position of the tracking region TR extracted in the region extraction step, based on the plurality of images; a state determination step of determining that the luminance state indicated by the luminance information of the tracking region TR extracted in the region extraction step is a second state Y different from the first state X; and a movement state calculation step of calculating the movement state of the tracking region TR based on the calculation result of the position calculation step and the determination result of the state determination step.
- Another program causes a computer to execute: a region extraction step of extracting, from the images in which cells are captured, a tracking region TR whose luminance state indicated by luminance information is the first state X; a position calculation step of calculating the time change of the position of the tracking region TR based on the plurality of images; a state determination step of determining that the luminance state indicated by the luminance information of the tracking region TR extracted in the region extraction step is a second state Y different from the first state X; and a stop control step of stopping the calculation of the time change based on the determination result of the state determination step.
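Stitched together, a minimal end-to-end sketch of these steps might look as follows (the segmentation margin, nearest-centroid association, and distance threshold are all illustrative assumptions, not the patent's prescribed method):

```python
import numpy as np
from scipy import ndimage

def regions_with_luminance(frame: np.ndarray, background: float, margin: float = 10.0):
    """Region extraction step: label connected pixels deviating from the
    background and return (centroid, mean luminance) per region."""
    labeled, n = ndimage.label(np.abs(frame - background) > margin)
    out = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labeled == i)
        out.append((np.array([ys.mean(), xs.mean()]), float(frame[ys, xs].mean())))
    return out

def track_one_cell(frames, background, start, max_dist=15.0):
    """Position calculation step: follow the nearest centroid frame to frame.
    State determination + stop control steps: stop when the associated region
    turns brighter than the background (second state Y).
    Movement state calculation step: mean step length per frame."""
    trajectory = [np.asarray(start, dtype=float)]
    for frame in frames[1:]:
        regions = regions_with_luminance(frame, background)
        if not regions:
            break
        centroid, lum = min(regions, key=lambda r: np.linalg.norm(r[0] - trajectory[-1]))
        if np.linalg.norm(centroid - trajectory[-1]) > max_dist:
            break                      # lost the cell
        if lum > background:
            break                      # second state Y -> stop tracking
        trajectory.append(centroid)
    steps = np.diff(np.array(trajectory), axis=0)
    speed = float(np.linalg.norm(steps, axis=1).mean()) if len(steps) else 0.0
    return trajectory, speed
```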
- A part or all of the image processing devices 1, 1a, and 1b in the above embodiments may be realized as an integrated circuit such as an LSI (Large Scale Integration).
- Each functional block of the image processing devices 1, 1a, and 1b may be implemented as an individual processor, or some or all of them may be integrated into one processor.
- The method of circuit integration is not limited to LSI and may be realized by a dedicated circuit or a general-purpose processor. Further, if an integrated circuit technology that replaces LSI emerges with advances in semiconductor technology, an integrated circuit based on that technology may be used.
Description
This application claims priority based on Japanese Patent Application No. 2019-85323 filed in Japan on April 26, 2019, the contents of which are incorporated herein by reference.
Hereinafter, the first embodiment will be described in detail with reference to the drawings. FIG. 1 is a diagram showing an example of the configuration of the image processing device 1 according to the present embodiment. The image processing device 1 analyzes, by tracking a cell C, an image P that is a plurality of images in which the cell C is detected at a plurality of times (e.g., times T1 to T7 described later). Tracking of the cell C includes, as an example, calculating and obtaining the trajectory of the cell C in the image P based on specific conditions. The cell C is, as an example, an adherent cell (e.g., a mesenchymal stem cell, a nerve cell, an epithelial cell, or the like). The tracking process of the present embodiment by the image processing device 1 may be executed for one cell C at each time, or may be executed in parallel at each time for a plurality of cells C (e.g., a first cell, a second cell, a third cell, and so on).
The image P is, as an example, a moving image composed of a plurality of frames. In the following, the i-th frame of the image P may be referred to as the image Pi or the like. The image P may also be a time-lapse image captured at a plurality of capture times.
The control unit 11 includes a tracking area extraction unit (area extraction unit) 110, a position calculation unit 111, a state determination unit 112, a stop control unit 113, and a movement state calculation unit 114.
In this way, the state determination unit 112 determines the luminance state (the first state, the second state, and so on) using the magnitude relation of the luminance values in the tracking region TR, and the movement state calculation unit 114 calculates the movement state M of the tracking region TR based on the luminance state determined by the state determination unit 112.
The storage unit 13 stores various kinds of information used by the control unit 11 for processing. The storage unit 13 also stores the above calculation results, determination results, and/or movement state M for each cell. In this case, the storage unit 13 is a database that stores and manages the stored cell tracking information (e.g., the calculation results of the position calculation unit 111, the determination results of the state determination unit 112, and/or the movement state M of the tracking region TR).
The presentation unit 3 presents the analysis result A output from the image processing device 1. The presentation unit 3 is, as an example, a display.
FIG. 2 is a diagram showing an example of the image P used in the image analysis according to the present embodiment. FIG. 3 is an example of the image analysis according to the present embodiment; to make the images shown in FIG. 2 easier to understand, FIG. 3 schematically depicts the images of FIG. 2 in correspondence with the respective views of FIG. 2 (e.g., FIG. 2(A)). The images P1 to P7 shown in FIGS. 2 and 3 are the frames of the image P at times T1 to T7, respectively.
In the cell tracking process according to the present embodiment, cell migration and cell division are distinguished in order to increase the accuracy of the analysis of the movement of a single cell.
Step S10: The image acquisition unit 10 acquires the image P output from the microscope 2. The image acquisition unit 10 supplies the acquired image P to the control unit 11. The image P is assumed to consist of n frames, from the first to the n-th, obtained by moving-image capture of a sample (e.g., cells) or continuous still-image capture (e.g., time-lapse capture).
The movement state calculation process is executed for each frame of the image P (e.g., the first frame, the second frame, and so on) as a unit. In one execution of the movement state calculation process of step S20, the control unit 11 performs the movement state calculation process on one frame. In the first execution of the movement state calculation process of step S20, the control unit 11 processes the first frame of the image P.
When neither a region R whose luminance value is larger than the luminance value of the background BG nor a region R whose luminance value is smaller than the luminance value of the background BG exists in the image P, the tracking area extraction unit 110 need not extract any region.
That is, the position calculation unit 111 calculates the trajectory TC1 by calculating the center of gravity G1 of the tracking region TR extracted by the tracking area extraction unit 110 for each of the plurality of frames of the image P.
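As an illustrative sketch (NumPy; the segmentation result is assumed to arrive as one boolean mask per frame):

```python
import numpy as np

def centroid(mask: np.ndarray) -> np.ndarray:
    """Center of gravity G1 of a tracking region given as a boolean mask."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

# Hypothetical masks of one tracking region TR in three consecutive frames.
masks = [np.zeros((8, 8), dtype=bool) for _ in range(3)]
masks[0][2:4, 2:4] = True
masks[1][3:5, 3:5] = True
masks[2][4:6, 4:6] = True

trajectory_TC1 = np.array([centroid(m) for m in masks])
print(trajectory_TC1)  # one G1 per frame -> the trajectory TC1
```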
For example, the position calculation unit 111 associates, by the region association processing, the tracking region TRi-1,k, which is the k-th tracking region TR among the plurality of tracking regions TRi-1 extracted in the (i-1)-th frame, with the tracking region TRi,j, which is the j-th tracking region TR among the plurality of tracking regions TRi extracted in the i-th frame.
The position calculation unit 111 selects the region corresponding to the tracking region TRi-1,k extracted in the (i-1)-th frame from among the tracking regions TRi extracted in the i-th frame. The position calculation unit 111 executes this selection process for all of the tracking regions TRi-1 extracted in the (i-1)-th frame.
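A minimal sketch of this region association, under the assumption (one of several possibilities; the text does not fix the criterion) that correspondence is decided by nearest centroid:

```python
import numpy as np

def associate_regions(prev_centroids, curr_centroids, max_dist=20.0):
    """Associate each tracking region of frame i-1 with one of frame i.

    prev_centroids, curr_centroids: arrays of shape (n, 2) and (m, 2).
    Returns a dict {k: j} mapping region k of frame i-1 to region j of
    frame i, or to None when nothing lies within max_dist.
    """
    mapping = {}
    for k, p in enumerate(prev_centroids):
        d = np.linalg.norm(curr_centroids - p, axis=1)
        j = int(d.argmin()) if len(d) else None
        mapping[k] = j if j is not None and d[j] <= max_dist else None
    return mapping

prev = np.array([[10.0, 10.0], [40.0, 40.0]])
curr = np.array([[12.0, 11.0], [43.0, 39.0]])
print(associate_regions(prev, curr))  # {0: 0, 1: 1}
```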
Note that, as for the center of gravity G1 of the tracking stop region SR, the state determination unit 112 causes the position calculation unit 111 to calculate it in step S140.
The first auxiliary condition is, as an example, that the area of the tracking region TR is equal to or smaller than a predetermined value. The second auxiliary condition is, as an example, that the distance in the image P between the tracking region TRi in the i-th frame and the tracking region TRi-1 in the (i-1)-th frame is equal to or larger than a predetermined value. The third auxiliary condition is, as an example, that the number of frames used for calculating the trajectory TC1 is equal to or smaller than a predetermined value. In the third auxiliary condition, the predetermined value for the number of frames is, as an example, two frames.
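Expressed as a predicate, with placeholder thresholds (the text fixes only the third condition's value of two frames):

```python
import math

def auxiliary_conditions_met(area, centroid_prev, centroid_curr, n_frames_used,
                             max_area=150.0, min_jump=25.0, max_frames=2):
    """Check the three auxiliary conditions described above:
    1) the region area is at most a predetermined value,
    2) the inter-frame centroid distance is at least a predetermined value,
    3) the trajectory so far uses at most a predetermined number of frames.
    """
    cond1 = area <= max_area
    cond2 = math.dist(centroid_prev, centroid_curr) >= min_jump
    cond3 = n_frames_used <= max_frames
    return cond1 and cond2 and cond3

print(auxiliary_conditions_met(120.0, (10, 10), (40, 45), 2))  # True
```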
The movement state calculation unit 114 calculates, as the movement state M, for example the movement distance of the center of gravity G1 and the movement speed of the center of gravity G1. The movement state calculation unit 114 supplies the calculated movement state M to the output unit 12.
With this, the control unit 11 ends the movement state calculation process for the tracking region TR including the cell C.
Step S30: The control unit 11 determines whether a termination condition is satisfied. Here, the termination condition is a condition for ending the repetition of the movement state calculation process of step S20. The termination condition is, as an example, that the movement state calculation process has been executed for a predetermined number of frames of the image P. The predetermined number is, for example, the number of all frames constituting the image P (n frames). The predetermined number may also be smaller than the number of all frames constituting the image P.
With this, the image processing device 1 ends the image processing (movement state calculation process).
That is, the position calculation unit 111 calculates the time change TC while excluding the center of gravity G1 for the case where the luminance state indicated by the luminance information of the tracking region TR is the second state Y. In the image processing device 1 according to the present embodiment, the time during which the cell C is not migrating can be excluded from the analysis of the movement state M of the cell C by the movement state calculation process, so the accuracy of the analysis of the movement state M of the cell C can be increased compared with the case where that time is not excluded. Here, the state in which the cell C is not migrating includes the state in which the cell C is floating immediately before cell division.
When two kinds of predetermined values are used, in one dark-contrast example, the first state X is, for example, a state in which the luminance value is smaller than a first predetermined value, and the second state Y is, for example, a state in which the luminance value is larger than a second predetermined value. Here, the first predetermined value is smaller than the second predetermined value. The first predetermined value is, for example, the luminance value of the background BG. The second predetermined value is, for example, a luminance value preset by a user or the like, or a luminance value preset based on results of prior experiments or the like.
Note that, when two kinds of predetermined values are used and the luminance value of the region R is an intermediate value between the first predetermined value and the second predetermined value, the region R is, for example, either determined to be in neither the first state X nor the second state Y, or determined to be in the first state X.
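A brief sketch of this two-threshold decision (the threshold values are illustrative):

```python
def classify_two_thresholds(luminance: float, first_value: float, second_value: float,
                            intermediate_is_x: bool = False) -> str:
    """Classify a region with two predetermined values (first < second):
    below the first value -> first state X; above the second value ->
    second state Y; in between -> undetermined, or X if so configured."""
    assert first_value < second_value
    if luminance < first_value:
        return "X"
    if luminance > second_value:
        return "Y"
    return "X" if intermediate_is_x else "undetermined"

print(classify_two_thresholds(100.0, first_value=120.0, second_value=160.0))  # 'X'
print(classify_two_thresholds(140.0, 120.0, 160.0))                           # 'undetermined'
```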
As described above, as the predetermined value used in common for all frames included in the image P, a luminance value acquired from a specific part of the background BG may be used. As the predetermined value used in common for all frames included in the image P, a luminance value acquired from the background BG of a frame other than the first frame of the image P (e.g., an i-th frame other than the first, or the last frame) may also be used. Alternatively, the average of luminance values acquired from the backgrounds BG of one or more frames included in the image P may be used in common for all frames included in the image P.
The predetermined value used in common for all frames included in the image P may also be a value preset by a user or the like. The preset value may be set without reference to the image P, or may be set based on results of prior experiments or the like.
The area extraction unit (in this example, the tracking area extraction unit 110) extracts, from a plurality of images in which the cell C is detected at a plurality of times (in this example, the plurality of frames of the image P), a tracking region TR whose luminance (in this example, luminance value) indicated by luminance information is in the first state X (in this example, the state in which, in dark contrast, the luminance value is smaller than the luminance value of the background BG of the image P).
The position calculation unit 111 calculates the time change TC of the position of the tracking region TR extracted by the area extraction unit (in this example, the tracking area extraction unit 110) based on the plurality of images (in this example, the plurality of frames of the image P).
The state determination unit 112 determines that the luminance (in this example, luminance value) state indicated by the luminance information of the tracking region TR extracted by the area extraction unit (in this example, the tracking area extraction unit 110) is a second state Y (in this example, the state in which, in dark contrast, the luminance value is larger than the luminance value of the background BG of the image P) different from the first state X (in this example, the state in which, in dark contrast, the luminance value is smaller than the luminance value of the background BG of the image P).
The movement state calculation unit 114 calculates the movement state M (in this example, the movement speed) of the tracking region TR based on the calculation result of the position calculation unit 111 and the determination result of the state determination unit 112. Here, the calculation result of the position calculation unit 111 includes, as an example, the result of calculating the time change TC of the center of gravity G1 of the tracking region TR. The determination result of the state determination unit 112 includes, as an example, the result of determining that the luminance value of the tracking region TR is larger than the luminance value of the background BG.
With this configuration, the image processing device 1 according to the present embodiment can analyze the movement state M of a cell based on the time change TC, so the accuracy of the analysis of the cell movement state M can be increased compared with the case where the analysis is not based on the time change TC.
With this configuration, the image processing device 1 according to the present embodiment can perform tracking while excluding cells floating immediately before cell division from the analysis targets of the movement state M, so the accuracy of the analysis of the cell movement state M can be increased compared with the case where cells floating immediately before cell division are not excluded.
In the image processing device 1 according to the present embodiment, distinguishing the movement distance of a cell from its division in the tracking process removes noise from the results of migration capacity analysis, so the accuracy of the analysis of the cell movement state M can be increased.
With this configuration, the image processing device 1 according to the present embodiment can calculate the mobility of the tracking region TR, so the accuracy of the analysis of cell mobility can be increased compared with the case where the analysis is not based on the determination result of whether a cell is floating immediately before cell division.
With this configuration, the image processing device 1 according to the present embodiment can increase the tracking accuracy for the plurality of cells captured in the image P, so the accuracy of the analysis of the movement states M of the plurality of cells can be increased compared with the case where the analysis is not based on the determination result of whether a cell is floating immediately before cell division.
With this configuration, the image processing device 1 according to the present embodiment can track cells based on the magnitude relation between the luminance indicated by the luminance information (in this example, the luminance value) and a predetermined value (in this example, the luminance value of the background BG), so the accuracy of the analysis of the cell movement state M can be increased compared with the case where tracking is not based on this magnitude relation.
With this configuration, the image processing device 1 according to the present embodiment does not need a predetermined value to be set for executing the image processing described above. When the image processing described above is executed using a plurality of image processing devices 1, a predetermined value does not have to be set for each device, which improves versatility when, for example, analysis results A are shared or integrated among the plurality of image processing devices 1.
With this configuration, the image processing device 1 according to the present embodiment can track cells based on the representative point G of the tracking region TR, so the accuracy of the analysis of the cell movement state M can be increased compared with the case where tracking is not based on the representative point G. If cell tracking were performed without using the representative point G of the tracking region TR, one might, for example, select an arbitrary point of the tracking region TR in each frame and calculate the trajectory from the selected points; in that case, the calculated trajectory could bend more finely than the actual trajectory of the cell.
With this configuration, the image processing device 1 according to the present embodiment can execute the tracking process based on the center of gravity of a cell, so the movement state M of the cell can be analyzed from its center of gravity.
With this configuration, the image processing device 1 according to the present embodiment can stop tracking a dividing cell based on the determination result of whether the luminance (in this example, luminance value) state indicated by the luminance information of the tracking region TR is the second state Y (in this example, the state in which, in dark contrast, the luminance value is larger than the luminance value of the background BG of the image P), so the accuracy of the analysis of the cell movement state M can be increased compared with the case where tracking of the dividing cell is not stopped.
With this configuration, the image processing device 1 according to the present embodiment can restart the tracking process for a cell whose tracking has been stopped, so the accuracy of the analysis of the cell movement state M can be increased compared with the case where the tracking process is not restarted. Further, the image processing device 1 according to the present embodiment using the image processing described above can start tracking a new cell after division by re-extracting a tracking region TR that is in the first state X after cell division.
Hereinafter, the second embodiment will be described in detail with reference to the drawings.
In the first embodiment described above, the case where the image processing device calculates the movement state of a tracking region in the first state was described. In the present embodiment, the case where the image processing device measures the time during which the luminance state indicated by the luminance information of the tracking region is the second state will be described.
The image processing device according to the present embodiment is referred to as the image processing device 1a.
Note that the processes of steps S210, S220, S230, S240, S260, and S270 are the same as those of steps S110, S120, S130, S140, S150, and S160 in FIG. 6, respectively, so their description is omitted.
Note that the division time measurement unit 115a may convert the measured floating time TD from a frame count (e.g., 2, the number of the images Pi and Pi+1 above) into a time, based on the frame interval used when capturing the image P.
Here, the floating time TD is considered to correspond to the length of the M phase (mitotic phase) of the cell cycle. The division time measurement unit 115a estimates and measures the length of the M phase of the cell cycle by tracking the floating cells that appear when adherent cells divide. By measuring the length of the M phase, the control unit 11a can indirectly measure the duration of the M-phase checkpoint and evaluate effects on the cells.
The peak PKb of the histogram H2b lies at a longer floating time TD than the peak PKa of the histogram H2a; under the first culture condition, the floating time TD tends to be longer than under the reference culture condition.
With this configuration, the image processing device 1a according to the present embodiment can estimate and measure the length of the M phase of the cell cycle using the floating time TD, so the state of the cell can be analyzed based on the length of the M phase.
Hereinafter, the third embodiment will be described in detail with reference to the drawings.
In the present embodiment, the case where, when a region in an image splits into a plurality of regions, the image processing device associates the region before the split with the plurality of regions after the split will be described.
The image processing device according to the present embodiment is referred to as the image processing device 1b.
Note that the processes of steps S310, S320, S330, S360, and S370 are the same as those of steps S110, S120, S130, S150, and S160 in FIG. 6, respectively, so their description is omitted.
FIG. 17 shows an image Pi-1, which is the (i-1)-th frame, an image Pi, which is the i-th frame, an image Pi+1, which is the (i+1)-th frame, and an image Pi+2, which is the (i+2)-th frame.
That is, the labeling unit 116b may associate a plurality of regions from among the regions Ri+1 extracted in the (i+1)-th frame with one region Ri extracted in the i-th frame.
The state determination unit 112 determines, for example from the label L, that the tracking region TRi has split into the region R1i+1 and the region R2i+1. When the tracking region TRi has split into two regions and its luminance value is larger than the luminance value of the background BG of the image P, the state determination unit 112 sets the tracking region TRi so determined as the tracking stop region SRi.
In step S40 of FIG. 5, the output unit 12 outputs the analysis result Ab including the label L to the presentation unit 3.
Claims (12)
1. A cell tracking method for tracking a cell based on a plurality of cell images captured in time series, the method comprising:
an extraction step of extracting, for each of the plurality of cell images, a tracking region corresponding to the cell;
a position calculation step of calculating a change in the position of the tracking region extracted in the extraction step based on the plurality of cell images, and tracking the cell based on the calculated change in the position of the tracking region;
a determination step of determining, based on the cell images, whether the cell being tracked is in a cell division state; and
a stopping step of stopping the tracking of the cell when the determination step determines that the cell is in the cell division state.
2. The cell tracking method according to claim 1, wherein, in the determination step, a state change of the cell immediately before cell division is detected based on the plurality of cell images and the cell is determined to be in the cell division state, and, in the stopping step, when the determination step determines that the cell is in the cell division state, the calculation operation of the tracking step is stopped as the process of stopping the tracking of the cell.
3. The cell tracking method according to claim 1, wherein, in the extraction step, when it is detected based on the plurality of cell images that the cell is in a state after cell division, new tracking regions are extracted for the plurality of cells after the cell division in order to start tracking of the cells.
4. The cell tracking method according to claim 1, further comprising a storage step of storing, for each cell, as tracking information of the cell, the center-of-gravity position of the tracking region extracted in the extraction step, the calculation result of the tracking step, and the determination result of the determination step.
5. An image processing device that performs a cell tracking process based on a plurality of cell images captured in time series, the device comprising:
a region extraction unit that extracts, for each of the plurality of cell images, a tracking region corresponding to the cell;
a position calculation unit that calculates a change in the position of the tracking region extracted by the region extraction unit based on the plurality of cell images and tracks the cell based on the calculated change in the position of the tracking region;
a state determination unit that determines, based on the cell images, whether the cell being tracked is in a cell division state; and
a stop control unit that stops the tracking of the cell when the state determination unit determines that the cell is in the cell division state.
6. The image processing device according to claim 5, wherein the state determination unit detects a state change of the cell immediately before cell division based on the plurality of cell images and determines that the cell is in the cell division state, and the control unit stops the calculation operation of the position calculation unit as the process of stopping the tracking of the cell when the state determination unit determines that the cell is in the cell division state.
7. The image processing device according to claim 5, wherein, when it is detected based on the plurality of cell images that the cell is in a state after cell division, the region extraction unit extracts new tracking regions for the plurality of cells after the cell division in order to start tracking of the cells.
8. The image processing device according to claim 5, further comprising a labeling unit that assigns a label for identifying the tracking region extracted by the region extraction unit, wherein the labeling unit assigns the same label to the tracking regions by region association processing between frames of different capture times before and after the cell division of the cell being tracked.
9. A program for causing a computer that performs cell tracking based on a plurality of cell images captured in time series to execute:
an extraction step of extracting, for each of the plurality of cell images, a tracking region corresponding to the cell;
a position calculation step of calculating a change in the position of the tracking region extracted in the extraction step based on the plurality of cell images, and tracking the cell based on the calculated change in the position of the tracking region;
a determination step of determining, based on the cell images, whether the cell being tracked is in a cell division state; and
a stopping step of stopping the tracking of the cell when the determination step determines that the cell is in the cell division state.
10. The program according to claim 9, wherein, in the determination step, a state change of the cell immediately before cell division is detected based on the plurality of cell images and the cell is determined to be in the cell division state, and, in the stopping step, when the determination step determines that the cell is in the cell division state, the calculation operation of the position calculation step is stopped as the process of stopping the tracking of the cell.
11. The program according to claim 9, wherein, in the extraction step, when it is detected based on the plurality of cell images that the cell is in a state after cell division, new tracking regions are extracted for the plurality of cells after the cell division in order to start tracking of the cells.
12. The program according to claim 9, further causing the computer to execute a storage step of storing, for each cell, as tracking information of the cell, the center-of-gravity position of the tracking region extracted in the extraction step, the calculation result of the position calculation step, and the determination result of the determination step.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP20796277.0A EP3964562B1 (en) | 2019-04-26 | 2020-04-23 | Cell tracking method, image processing device, and program |
| JP2021516190A JP7375815B2 (ja) | 2019-04-26 | 2020-04-23 | Cell tracking method, image processing device, and program |
| US17/509,882 US12469312B2 (en) | 2019-04-26 | 2021-10-25 | Cell tracking method, image processing device, and program |
| JP2023183169A JP7694629B2 (ja) | 2019-04-26 | 2023-10-25 | Cell tracking method, image processing device, and program |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019085323 | 2019-04-26 | ||
| JP2019-085323 | 2019-04-26 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/509,882 Continuation US12469312B2 (en) | 2019-04-26 | 2021-10-25 | Cell tracking method, image processing device, and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020218393A1 true WO2020218393A1 (ja) | 2020-10-29 |
Family
ID=72942124
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/017426 Ceased WO2020218393A1 (ja) | 2019-04-26 | 2020-04-23 | Cell tracking method, image processing device, and program |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12469312B2 (ja) |
| EP (1) | EP3964562B1 (ja) |
| JP (2) | JP7375815B2 (ja) |
| WO (1) | WO2020218393A1 (ja) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2589763B (en) * | 2018-06-28 | 2023-05-24 | Nikon Corp | Device, system, and program |
| JP7532754B2 (ja) * | 2019-09-04 | 2024-08-14 | 株式会社ニコン | Image analysis device, cell culture observation device, image analysis method, and program |
| JP7530709B2 (ja) * | 2019-10-11 | 2024-08-08 | 株式会社島津製作所 | Cell image analysis method and cell analysis device |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH02171866A * | 1988-10-28 | 1990-07-03 | Carl Zeiss:Fa | Method and apparatus for evaluating cell images |
| JP2002218995A * | 2000-11-22 | 2002-08-06 | Masahito Taya | Cell proliferation capability evaluation method, device, and program |
| JP2006238802A | 2005-03-03 | 2006-09-14 | Olympus Corp | Cell observation device, cell observation method, microscope system, and cell observation program |
| JP2007020422A * | 2005-07-12 | 2007-02-01 | Olympus Corp | Biological sample culture observation device, biological sample culture observation method, and biological sample culture observation program |
| JP2007327843A * | 2006-06-07 | 2007-12-20 | Olympus Corp | Image processing device and image processing program |
| JP2009089630A * | 2007-10-05 | 2009-04-30 | Nikon Corp | Cell observation device and observation method |
| JP2019085323A | 2017-11-06 | 2019-06-06 | 東洋ガラス株式会社 | Yellow-green glass and yellow-green glass container |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7960135B2 (en) * | 2006-03-20 | 2011-06-14 | Northwestern University | Compositions and methods for cell characterization |
| CN102428498A (zh) * | 2009-05-19 | 2012-04-25 | 通用电气健康护理生物科学股份公司 | Method for dynamic cell tracking in a sample |
| JP5762764B2 (ja) | 2011-02-09 | 2015-08-12 | オリンパス株式会社 | Cell image analysis system |
| JP6000699B2 (ja) | 2012-07-05 | 2016-10-05 | オリンパス株式会社 | Cell division process tracking device and cell division process tracking program |
| JP6045292B2 (ja) | 2012-10-19 | 2016-12-14 | オリンパス株式会社 | Cell counting device and cell counting program |
| EP2917719B1 (en) | 2012-11-09 | 2024-04-17 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Receptacle and system for optically analyzing a sample without optical lenses |
| JP2017023055A (ja) | 2015-07-22 | 2017-02-02 | 大日本印刷株式会社 | Cell management system, program, and cell management method |
| JP6977293B2 (ja) * | 2017-03-31 | 2021-12-08 | ソニーグループ株式会社 | Information processing device, information processing method, program, and observation system |
| US10510150B2 (en) | 2017-06-20 | 2019-12-17 | International Business Machines Corporation | Searching trees: live time-lapse cell-cycle progression modeling and analysis |
| WO2019069446A1 (ja) | 2017-10-06 | 2019-04-11 | 株式会社ニコン | Image processing device, image processing method, and image processing program |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20230035169A (ko) * | 2021-09-03 | 2023-03-13 | 연세대학교 원주산학협력단 | Multi-cell tracking method using dielectrophoresis |
| KR102705807B1 (ko) * | 2021-09-03 | 2024-09-11 | 연세대학교 원주산학협력단 | Multi-cell tracking method using dielectrophoresis |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3964562B1 (en) | 2025-11-05 |
| US12469312B2 (en) | 2025-11-11 |
| EP3964562A4 (en) | 2023-05-24 |
| US20220114819A1 (en) | 2022-04-14 |
| EP3964562A1 (en) | 2022-03-09 |
| JP2024010094A (ja) | 2024-01-23 |
| JPWO2020218393A1 (ja) | 2020-10-29 |
| JP7375815B2 (ja) | 2023-11-08 |
| JP7694629B2 (ja) | 2025-06-18 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20796277; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2021516190; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2020796277; Country of ref document: EP; Effective date: 20211126 |
| | WWG | Wipo information: grant in national office | Ref document number: 2020796277; Country of ref document: EP |