WO2014066231A1 - Cell identification method and device, and urine analyzer - Google Patents
Cell identification method and device, and urine analyzer
- Publication number
- WO2014066231A1 WO2014066231A1 PCT/US2013/065879 US2013065879W WO2014066231A1 WO 2014066231 A1 WO2014066231 A1 WO 2014066231A1 US 2013065879 W US2013065879 W US 2013065879W WO 2014066231 A1 WO2014066231 A1 WO 2014066231A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- filter
- frequency
- cell
- cell identification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/431—Frequency domain transformation; Autocorrelation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
Definitions
- the present invention relates to the technical field of cell identification. More particularly, it relates to a method and device capable of identifying red blood cells and white blood cells in urine sediment with greater accuracy and speed, and further to a urine analyzer comprising the device.
- the urine of a normal person may contain very small amounts of red blood cells, white blood cells, epithelial cells, crystals and mucus strands, and in rare cases transparent casts. However, it is also possible for excessive amounts of red blood cells, abnormal epithelial cells, casts, bacteria, trichomonas, tumor cells and viral inclusion bodies to be present.
- the urine sediment examinations we carry out are generally examinations aimed at quantifying the types of urine sediment mentioned above.
- the object of a urine sediment examination is to identify various pathological components of urine such as cells, crystals, bacteria and parasites; urine sediment can generally reflect quite accurately the actual situation regarding cell components, casts, epithelial cells and crystals in urine. Therefore, urine sediment testing is an important routine test item which assists in diagnosing, locating and distinguishing urinary system diseases and making prognoses for them. In cases where pathological changes cannot be found in ordinary examinations of physical properties or in chemical tests, minute changes can be discerned by means of a sediment examination.
- Urine sediment examination indices generally include testing for red blood cells and white blood cells, etc.
- the background of microscope images introduces noise, while wide differences are apparent in cell size, shape and texture.
- Fig. 1 shows examples of different classes of object requiring identification. It can be seen from Fig. 1 that within each class there are several more groups of cells or particles. For instance, the class of red blood cells further comprises four particular forms. For this reason, the identification of red blood cells (red blood corpuscles) and white blood cells (white blood corpuscles) in urine sediment is a difficult task.
- the original image is first segmented to extract target objects (e.g. the cells to be identified).
- Cells are then classified by way of feature extraction.
- Common segmentation methods include a Sobel, Roberts or Canny kernel used in combination with an active contour or level set method.
- active contour and level set methods are extremely time-consuming due to the iterative curve evolution step, and neither these contour methods nor the Sobel, Roberts or Canny kernel is able to remove defocusing interference.
- Complicated methods include SIFT (scale-invariant feature transform) and local grayscale invariant methods, which are extremely time-consuming because they involve difference-of-Gaussian (DoG) scale-space construction.
- although the Haar feature (AdaBoost training) method is comparatively simple in theory, its training process is extremely time-consuming because it uses only simple features (e.g. rectangular features).
- a cell identification method comprising the following steps:
- an image acquisition step for acquiring an original image
- a defocusing interference removal step for transforming the original image to the frequency domain, acquiring image high-frequency information by means of a first filter, acquiring image edge information by means of a second filter, and performing an inverse transformation to the time domain and extracting image energy, so as to obtain a denoised high-frequency edge image comprising only high-frequency edges;
- a segmentation step for subjecting the high-frequency edge image to Gaussian blur processing, then choosing a suitable threshold to perform binarization, marking out a cell area bounded by edges, and retrieving cell detail information covered by the marked area from the original image, so as to remove interference from noise points outside the focal plane; and
- a classification step for calculating multiple features for each cell, and classifying each target on the basis of the multiple features.
- the first filter is a logGabor filter
- the second filter is a complex-valued monogenic filter
- the transfer function of the logGabor filter is
- the multiple features comprise at least one of the following features: circularity, rectangularity, gray-level co-occurrence matrix contrast property, gray-level co-occurrence matrix homogeneity property, gray-level co-occurrence matrix energy property, and mutual information between a target object image and an average template.
- phase feature pf is obtained by the following formula:
- F(k) is the result of the original image undergoing a Fourier transform.
- a cell identification device comprising :
- an image acquisition unit for acquiring an original image
- a defocusing interference removal unit for subjecting the original image to denoising processing to obtain a high-frequency edge image, and comprising:
- a Fourier transform component for transforming the original image to the frequency domain
- a first filter for acquiring image high-frequency information
- a second filter for acquiring image edge information
- an image energy extraction component for extracting image energy, so as to obtain a denoised high-frequency edge image comprising only high-frequency edges
- a segmentation unit for subjecting the high-frequency edge image to Gaussian blur processing, then choosing a suitable threshold to perform binarization, marking out a cell area bounded by edges, and retrieving cell detail information covered by the marked area from the original image, so as to remove interference from noise points outside the focal plane;
- a classification unit for calculating multiple features for each cell, and classifying each target on the basis of the multiple features.
- the first filter is a logGabor filter
- the second filter is a complex-valued monogenic filter
- the transfer function of the logGabor filter is
- the multiple features comprise at least one of the following features: circularity, rectangularity, gray-level co-occurrence matrix contrast property, gray-level co-occurrence matrix homogeneity property, gray-level co-occurrence matrix energy property, and mutual information between a target object image and an average template.
- the mutual information between the target object image and the average template is obtained by matching of phase features, and the phase feature pf is obtained by the following formula:
- F(k) is the result of the original image undergoing a Fourier transform.
- a urine analyzer comprising any one of the above cell identification devices.
- AdaBoost training is accelerated by first eliminating defocusing interference in the image background and then using a well-chosen set of features; since defocusing interference is removed from the original image background before segmentation, a good foundation is laid for subsequent processing. Furthermore, in the method for extracting features of red blood cells and white blood cells, the present invention proposes a set of new combined features, thereby enabling genuine (but not typical) red blood cells and white blood cells to be distinguished more effectively from amongst urine sediment objects.
- Fig. 1 shows examples of particular forms of red blood cells, white blood cells and crystals.
- Fig. 2 is a flow chart showing the procedure of the cell identification method according to the embodiments of the present invention.
- Fig. 3 shows an example of an original image.
- Fig. 4 shows an image obtained by subjecting the original image to denoising and segmentation processing.
- Figs. 5A - 5F show 3 types of bowl-shaped red blood cells and their corresponding phase features.
- Fig. 6 is a block diagram showing the configuration of the cell identification device according to the embodiments of the present invention.
DETAILED DESCRIPTION OF THE DRAWINGS
- Fig. 2 is a flow chart showing the procedure of the cell identification method according to the embodiments of the present invention. As Fig. 2 shows, the cell identification method comprises the following steps:
- in step S201, an original image f(x) is acquired.
- Fig. 3 shows an example of an original image so acquired. It can be seen from Fig. 3 that very strong defocusing (fuzzy) noise is present in the original image.
- in step S202, the original image is transformed to the frequency domain by means of a Fourier transform.
- the 2-dimensional Fourier transform F(k) of the original image f(x) is obtained by the following formula:
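- the formula itself does not survive this text extraction; a standard two-dimensional discrete Fourier transform consistent with the notation F(k) and f(x), offered only as a sketch of the intended definition for an M × N image, is $F(k_1,k_2)=\sum_{x_1=0}^{M-1}\sum_{x_2=0}^{N-1} f(x_1,x_2)\,e^{-j2\pi\left(\frac{k_1 x_1}{M}+\frac{k_2 x_2}{N}\right)}$.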
- in step S203, image high-frequency information is acquired by means of a first filter.
- in step S204, image edge information is acquired by means of a second filter.
- in step S205, an inverse transformation to the time domain is performed.
- in step S206, image energy is extracted, so as to obtain a denoised, high-frequency edge image comprising only high-frequency edges.
- the first filter may be a logGabor filter (also called a Log-Gabor filter); its transfer function is given below, after the following symbol definitions:
- ω is frequency
- ω₀ is the center frequency of the logGabor filter
- σ is a constant. It must be pointed out that the value of σ should be chosen so as to keep the value of σ/ω₀ constant. For instance, when the value of σ/ω₀ is 0.74, 0.55 and 0.41, the bandwidth of the logGabor filter is approximately 1, 2 and 3 octaves, respectively.
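- with these symbols, the standard log-Gabor transfer function (offered as a sketch rather than a quotation of the patent's own formula, which does not survive this text) is $G(\omega)=\exp\!\left(-\dfrac{\bigl(\log(\omega/\omega_0)\bigr)^{2}}{2\bigl(\log(\sigma/\omega_0)\bigr)^{2}}\right)$; fixing σ/ω₀ at 0.74, 0.55 or 0.41 in this form yields the approximately 1-, 2- and 3-octave bandwidths mentioned above.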
- the second filter may for example be a complex-valued monogenic filter.
- the transfer function of the complex-valued monogenic filter is:
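- the source formula is likewise not reproduced here; in the monogenic-signal literature the two Riesz-transform kernels are commonly combined into a single complex-valued frequency response of the form $H(u_1,u_2)=\dfrac{i\,u_1+u_2}{\sqrt{u_1^{2}+u_2^{2}}}$, where $(u_1,u_2)$ are the two frequency coordinates; this is offered only as an assumption about the intended filter.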
- the steps of inverse-transforming the image back to the time domain and extracting image energy specifically comprise: (i) calculating the original image restored to the time domain after undergoing wavelet filtering
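- as a non-authoritative illustration of steps S202 - S206, the following NumPy sketch applies a log-Gabor band-pass filter and a Riesz-kernel (monogenic) filter in the frequency domain and then extracts local energy; the function name and the parameter values omega0 and sigma_ratio are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def defocus_removal(img, omega0=0.1, sigma_ratio=0.55):
    """Sketch of steps S202-S206: frequency-domain filtering and energy
    extraction.  omega0 (centre frequency) and sigma_ratio (sigma/omega0,
    roughly a 2-octave bandwidth) are illustrative values only."""
    rows, cols = img.shape
    # S202: Fourier transform of the original image
    F = np.fft.fft2(img)

    # Normalised frequency-coordinate grids matching the FFT layout
    U, V = np.meshgrid(np.fft.fftfreq(cols), np.fft.fftfreq(rows))
    radius = np.sqrt(U**2 + V**2)
    radius[0, 0] = 1.0                      # avoid log(0) / division by zero at DC

    # S203: log-Gabor radial filter selects the high-frequency band
    log_gabor = np.exp(-(np.log(radius / omega0)**2) /
                       (2 * np.log(sigma_ratio)**2))
    log_gabor[0, 0] = 0.0                   # suppress the DC component

    # S204: complex-valued monogenic (Riesz) filter for edge information
    riesz = (1j * U + V) / radius

    # S205: inverse transformation back to the time (spatial) domain
    band = np.fft.ifft2(F * log_gabor)           # band-pass (even) response
    mono = np.fft.ifft2(F * log_gabor * riesz)   # monogenic (odd) response

    # S206: local energy of the filtered responses
    return np.sqrt(np.real(band)**2 + np.abs(mono)**2)
```

The returned energy map plays the role of the denoised high-frequency edge image that step S207 operates on.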
- in step S207, the high-frequency edge image is subjected to Gaussian blur processing; a suitable threshold is then chosen to perform binarization, and a cell area bounded by edges is marked; cell detail information covered by the marked area is retrieved from the original image, so as to remove interference from noise points outside the focal plane.
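- a minimal SciPy sketch of step S207 is given below; the helper name, the Gaussian sigma and the mean-plus-one-standard-deviation stand-in for the patent's unspecified "suitable threshold" are all illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def segment_cells(edge_image, original, sigma=2.0, thresh=None):
    """Sketch of step S207: blur, binarize, mark cell areas and pull the
    corresponding detail back out of the original image."""
    # Gaussian blur of the high-frequency edge image
    blurred = ndimage.gaussian_filter(edge_image, sigma=sigma)

    # Binarization with a chosen threshold (mean + 1 std as a stand-in)
    if thresh is None:
        thresh = blurred.mean() + blurred.std()
    binary = blurred > thresh

    # Mark out connected cell areas bounded by edges
    labels, num = ndimage.label(binary)

    # Retrieve cell detail information from the original image inside each area
    cells = [np.where(labels == i, original, 0) for i in range(1, num + 1)]
    return labels, cells
```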
- Fig. 4 shows an image obtained by subjecting the original image to denoising and segmentation processing. It can be seen from Fig. 4 that defocusing noise has been completely eliminated from the original image, leaving behind only the objects of interest.
- in step S208, multiple features are calculated for each cell, and each target is classified on the basis of these multiple features.
- AdaBoost is taken as an example of a classification method.
- those skilled in the art should appreciate that other classification methods are also possible.
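- purely as an illustration of such a classifier, the sketch below uses scikit-learn's AdaBoostClassifier on per-object feature vectors; the feature layout, label coding, random placeholder data and n_estimators value are assumptions for demonstration, not details taken from the patent.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def train_cell_classifier(features, labels, n_estimators=100):
    """Train an AdaBoost classifier on per-object feature vectors
    (e.g. circularity, rectangularity, GLCM contrast/homogeneity/energy,
    template mutual information)."""
    clf = AdaBoostClassifier(n_estimators=n_estimators)
    clf.fit(features, labels)
    return clf

# Illustrative usage with random placeholder data: 6 features per object,
# labels 0 = other sediment, 1 = red blood cell, 2 = white blood cell.
rng = np.random.default_rng(0)
X = rng.random((300, 6))
y = rng.integers(0, 3, 300)
model = train_cell_classifier(X, y)
print(model.predict(X[:5]))
```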
- Extraction of features of red blood cells and white blood cells for the construction of a training cluster comprises: using basic image object properties, such as area, circularity, rectangularity, low image brightness to area ratio, gray-level co-occurrence matrix properties (mainly contrast, homogeneity and energy) and mutual information with a small set of average templates (a normal red blood cell average template, a wrinkled red blood cell average template and a white blood cell average template), to distinguish red blood cells and white blood cells from objects of all other types in the urine sediment.
- S is the area of a cell or particle
- L is the diameter thereof.
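- the circularity formula itself does not survive this text; if L denotes the equivalent diameter as stated, a natural circularity measure is $C = 4S/(\pi L^{2})$ (the ratio of the object area to the area of a circle of diameter L), while if L in fact denotes the boundary length the common compactness form is $C = 4\pi S/L^{2}$; both readings are offered only as assumptions.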
- the object area is used to separate small cells (such as red blood cells, white blood cells and crystals) from minute single yeast cells and large cells (such as epithelial cells and casts).
- rectangularity may be used to further distinguish square crystals from round crystals.
- the rectangularity R is calculated by the following formula:
- W is the width of an object
- H is the height thereof.
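- the formula again does not survive this text; a common rectangularity definition consistent with these symbols, offered as an assumption, is the ratio of the object area S to the area of its bounding box, $R = S/(W \cdot H)$.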
- P(x,y) is the joint probability.
- P(x,y) is the number of times x and y appear at the same time divided by the total number of points (samples) in the image;
- P(x) is the number of times x appears divided by the total number of points (samples) in the image;
- P(y) is the number of times y appears divided by the total number of points (samples) in the image. If X and Y are unrelated, the value of MI(X;Y) is 0.
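- a standard formulation consistent with these definitions, and with MI(X;Y) being 0 for unrelated X and Y, is $MI(X;Y)=\sum_{x}\sum_{y}P(x,y)\log\dfrac{P(x,y)}{P(x)\,P(y)}$; it is stated here as an assumption rather than as a quotation of the patent's formula.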
- the phase feature is a robust texture feature that is insensitive to strong illumination from different directions.
- Figs. 5A - 5F show 3 types of bowl-shaped red blood cells and their corresponding phase features. It can be seen from Fig. 5A that when the intensity of illumination varies, the cell edges may be clear or unclear.
- Fig. 5D shows the phase feature corresponding to the cell in Fig. 5A. It can be seen from Fig. 5D that in comparison, the intensity of illumination has no bearing on whether the cell edges are clear. Therefore the precision of matching can be increased effectively if phase features are matched to obtain mutual information.
- a phase feature pf can be further extracted from the result obtained in steps S203 - S206.
- texture information based on gray-level co-occurrence matrix properties mainly comprising the following parameters is used to further distinguish between round crystals and other cells of similar size (mainly red blood cells and white blood cells).
- the gray-level co-occurrence matrix contrast is calculated by the following formula:
- $\mathrm{Contrast}=\sum_{i,j}|i-j|^{2}\,P(i,j)$
- P(i,j) is the probability of the gray-level co-occurrence matrix.
- the gray-level co-occurrence matrix homogeneity is calculated by the following formula:
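- the formula is not reproduced in this text; the common gray-level co-occurrence matrix homogeneity definition, offered as a sketch of the intended property, is $\mathrm{Homogeneity}=\sum_{i,j}\dfrac{P(i,j)}{1+|i-j|}$.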
- the gray-level co-occurrence matrix energy property is calculated by the following formula:
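- likewise not reproduced; the standard gray-level co-occurrence matrix energy (angular second moment) is $\mathrm{Energy}=\sum_{i,j}P(i,j)^{2}$, given here as the conventional definition rather than as the patent's own statement.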
- Fig. 6 is a block diagram showing the configuration of the cell identification device according to the embodiments of the present invention.
- the cell identification device 600 comprises an image acquisition unit 601, a defocusing interference removal unit 602, a segmentation unit 603 and a classification unit 604.
- the image acquisition unit 601 acquires an original image, which it then supplies to the defocusing interference removal unit 602.
- the defocusing interference removal unit 602 subjects the original image to denoising processing, to obtain a high-frequency edge image.
- the defocusing interference removal unit 602 further comprises the following components: a Fourier transform component 6021, for transforming the original image to the frequency domain; a first filter 6022, for acquiring image high-frequency information; a second filter 6023, for acquiring image edge information; a Fourier inverse transform component 6024, for inverse-transforming the filtered image to the time domain; an image energy extraction component 6025, for extracting image energy in order to obtain a denoised, high-frequency edge image comprising only high-frequency edges.
- the segmentation unit 603 subjects the high-frequency edge image to Gaussian blur processing, then chooses a suitable threshold to perform binarization, marks out a cell area bounded by edges, and retrieves cell detail information covered by the marked area from the original image, so as to remove interference from noise points outside the focal plane.
- the classification unit 604 calculates multiple features for each cell, and classifies each target on the basis of the multiple features.
- the multiple features comprise at least one of the following features: circularity, rectangularity, gray-level co-occurrence matrix contrast property, gray-level co-occurrence matrix homogeneity property, gray-level co-occurrence matrix energy property, and mutual information between a target object image and an average template.
- the first filter is a logGabor filter
- the second filter is a complex-valued monogenic filter.
- the mutual information between the target object image and the average template is obtained by matching of phase features.
- a urine analyzer is provided, comprising any one of the above cell identification devices.
- the cell identification method comprises the following steps: an image acquisition step, for acquiring an original image; a defocusing interference removal step, for transforming the original image to the frequency domain, acquiring image high-frequency information by means of a first filter, acquiring image edge information by means of a second filter, and performing an inverse transformation to the time domain and extracting image energy, so as to obtain a denoised image comprising only high-frequency edges; a segmentation step, for subjecting the high-frequency edge image to Gaussian blur processing, then choosing a suitable threshold to perform binarization, marking out a cell area bounded by edges, and retrieving cell detail information covered by the marked area from the original image, so as to remove interference from noise points outside the focal plane; and a classification step, for calculating multiple features for each cell, and classifying each target on the basis of the multiple features.
- denoising can be performed before segmentation.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to a cell identification method and device, and to a urine analyzer. The cell identification method comprises the following steps: an image acquisition step for acquiring an original image; a defocusing interference removal step for transforming the original image to the frequency domain, acquiring image high-frequency information by means of a first filter, acquiring image edge information by means of a second filter, and performing an inverse transformation to the time domain and extracting image energy, so as to obtain a denoised image comprising only high-frequency edges.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201210418733.8 | 2012-10-26 | ||
| CN201210418733.8A CN103793709A (zh) | 2012-10-26 | 2012-10-26 | 细胞识别方法和装置、以及尿液分析仪 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014066231A1 true WO2014066231A1 (fr) | 2014-05-01 |
Family
ID=50545147
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2013/065879 Ceased WO2014066231A1 (fr) | 2012-10-26 | 2013-10-21 | Procédé et dispositif d'identification de cellule, et analyseur d'urine |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN103793709A (fr) |
| WO (1) | WO2014066231A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113379671A (zh) * | 2021-02-23 | 2021-09-10 | 华北电力大学 | 一种开关类设备局部放电诊断系统及诊断方法 |
| CN114913374A (zh) * | 2022-05-16 | 2022-08-16 | 浙江中烟工业有限责任公司 | 一种卷烟烟包的识别方法及装置 |
| CN117593746A (zh) * | 2024-01-18 | 2024-02-23 | 武汉互创联合科技有限公司 | 基于目标检测的细胞分裂均衡度评估系统及装置 |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105760878A (zh) * | 2014-12-19 | 2016-07-13 | 西门子医疗保健诊断公司 | 选择聚焦最佳的尿沉渣显微镜图像的方法及装置 |
| KR20170128454A (ko) * | 2015-03-11 | 2017-11-22 | 지멘스 악티엔게젤샤프트 | 세포 이미지들 및 비디오들의 디컨볼루셔널 네트워크 기반 분류를 위한 시스템들 및 방법들 |
| CN110472472B (zh) * | 2019-05-30 | 2022-04-19 | 北京市遥感信息研究所 | 基于sar遥感图像的机场检测方法与装置 |
| CN110415212A (zh) * | 2019-06-18 | 2019-11-05 | 平安科技(深圳)有限公司 | 异常细胞检测方法、装置及计算机可读存储介质 |
| CN112634338A (zh) * | 2020-12-30 | 2021-04-09 | 东北大学 | 基于灰度共生矩阵的脑脊液细胞特征提取方法 |
| CN115797253A (zh) * | 2022-09-08 | 2023-03-14 | 厦门博视源机器视觉技术有限公司 | 一种表面喷砂检测方法、终端设备及存储介质 |
| CN115688028B (zh) * | 2023-01-05 | 2023-08-01 | 杭州华得森生物技术有限公司 | 肿瘤细胞生长状态检测设备 |
| CN116524496B (zh) * | 2023-03-22 | 2025-12-23 | 汕头大学 | 一种基于深度学习的寄生虫辅助检测系统 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5978498A (en) * | 1994-09-20 | 1999-11-02 | Neopath, Inc. | Apparatus for automated identification of cell groupings on a biological specimen |
| US20050240106A1 (en) * | 2000-11-13 | 2005-10-27 | Oravecz Michael G | Frequency domain processing of scanning acoustic imaging signals |
| WO2012055543A1 (fr) * | 2010-10-26 | 2012-05-03 | Technische Universität München | Utilisation d'un signal analytique bidimensionnel en échographie |
| US20120213418A1 (en) * | 2006-09-15 | 2012-08-23 | Identix Incorporated | Multimodal ocular biometric system and methods |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102175625A (zh) * | 2010-11-29 | 2011-09-07 | 樊潮 | 一种癌细胞识别方法 |
-
2012
- 2012-10-26 CN CN201210418733.8A patent/CN103793709A/zh active Pending
-
2013
- 2013-10-21 WO PCT/US2013/065879 patent/WO2014066231A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5978498A (en) * | 1994-09-20 | 1999-11-02 | Neopath, Inc. | Apparatus for automated identification of cell groupings on a biological specimen |
| US20050240106A1 (en) * | 2000-11-13 | 2005-10-27 | Oravecz Michael G | Frequency domain processing of scanning acoustic imaging signals |
| US20120213418A1 (en) * | 2006-09-15 | 2012-08-23 | Identix Incorporated | Multimodal ocular biometric system and methods |
| WO2012055543A1 (fr) * | 2010-10-26 | 2012-05-03 | Technische Universität München | Utilisation d'un signal analytique bidimensionnel en échographie |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113379671A (zh) * | 2021-02-23 | 2021-09-10 | 华北电力大学 | 一种开关类设备局部放电诊断系统及诊断方法 |
| CN114913374A (zh) * | 2022-05-16 | 2022-08-16 | 浙江中烟工业有限责任公司 | 一种卷烟烟包的识别方法及装置 |
| CN117593746A (zh) * | 2024-01-18 | 2024-02-23 | 武汉互创联合科技有限公司 | 基于目标检测的细胞分裂均衡度评估系统及装置 |
| CN117593746B (zh) * | 2024-01-18 | 2024-04-19 | 武汉互创联合科技有限公司 | 基于目标检测的细胞分裂均衡度评估系统及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103793709A (zh) | 2014-05-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014066231A1 (fr) | Procédé et dispositif d'identification de cellule, et analyseur d'urine | |
| CN108961208B (zh) | 一种聚集白细胞分割计数系统及方法 | |
| Cosatto et al. | Grading nuclear pleomorphism on histological micrographs | |
| US8077958B2 (en) | Computer-aided pathological diagnosis system | |
| JP7197584B2 (ja) | デジタル病理学分析結果の格納および読み出し方法 | |
| Nguyen et al. | Prostate cancer detection: Fusion of cytological and textural features | |
| US20240201063A1 (en) | Method of storing and retrieving digital pathology analysis results | |
| JP2013524361A (ja) | 画像内のオブジェクトをセグメンテーションする方法 | |
| CN110969204B (zh) | 基于磁共振图像与数字病理图像融合的样本分类系统 | |
| CN112703531A (zh) | 生成组织图像的注释数据 | |
| Wan et al. | Wavelet-based statistical features for distinguishing mitotic and non-mitotic cells in breast cancer histopathology | |
| Chen et al. | Feasibility study on automated recognition of allergenic pollen: grass, birch and mugwort | |
| CN109033936A (zh) | 一种宫颈脱落细胞核图像识别方法 | |
| Kowal et al. | The feature selection problem in computer-assisted cytology | |
| Akakin et al. | Automated detection of cells from immunohistochemically-stained tissues: Application to Ki-67 nuclei staining | |
| Akbar et al. | Tumor localization in tissue microarrays using rotation invariant superpixel pyramids | |
| Rege et al. | Automatic leukemia identification system using otsu image segmentation and mser approach for microscopic smear image database | |
| Mani et al. | Design of a novel shape signature by farthest point angle for object recognition | |
| Zhang et al. | Cascaded-automatic segmentation for schistosoma japonicum eggs in images of fecal samples | |
| Rezaeilouyeh et al. | Prostate cancer detection and gleason grading of histological images using shearlet transform | |
| CN109697450B (zh) | 细胞分类方法 | |
| Veillard et al. | SVM-based framework for the robust extraction of objects from histopathological images using color, texture, scale and geometry | |
| Safa'a et al. | Histopathological prostate tissue glands segmentation for automated diagnosis | |
| Qian et al. | Coarse-to-fine particle segmentation in microscopic urinary images | |
| Amitha et al. | A survey on automatic breast cancer grading of histopathological images |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13848938 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 13848938 Country of ref document: EP Kind code of ref document: A1 |