
WO2007068056A1 - Stain assessment for cereal grains - Google Patents


Info

Publication number
WO2007068056A1
WO2007068056A1 (PCT/AU2006/001902; application AU2006001902W)
Authority
WO
WIPO (PCT)
Prior art keywords
grain
image
colour
pixel
images
Prior art date
Legal status
Ceased
Application number
PCT/AU2006/001902
Other languages
French (fr)
Inventor
Leanne Bischof
Mark Berman
Richard Beare
Carolyn Joy Evans
Current Assignee
Grains Research and Development Corp
BRI Australia Ltd
Original Assignee
Grains Research and Development Corp
BRI Australia Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2005907031A0
Application filed by Grains Research and Development Corp, BRI Australia Ltd filed Critical Grains Research and Development Corp
Publication of WO2007068056A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 Food
    • G01N33/10 Starch-containing substances, e.g. dough
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/85 Investigating moving fluids or granular solids
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/251 Colorimeters; Construction thereof

Definitions

  • the present invention relates to surface blemish assessment for cereal grains.
  • the invention will be exemplified in the specification particularly with reference to cereal grains such as wheat grains which in general have a variable overall size, a complex detailed shape and the natural colour of which varies from year to year, variety to variety and even from batch to batch.
  • wheat is used to refer to a single grain kernel from a cereal crop.
  • a conventional approach to assessment of wheat comprises taking a sample (typically 300 grains) from a bulk supply and effecting a grain-by-grain visual inspection by a trained operator for various types of visual blemish. The type and severity of defects are judged from the size, colour and shape of the stain, as well as its position on the grain.
  • Blackpoint is caused by enzyme activity at or near the germ end. It causes downgrade of grain quality due to discoloration of products such as bread and noodles, a perceived risk of mycotoxins and association of enzymes which adversely affect bread baking.
  • the 3 texture features are marginatum (a single number quantifying the distribution of intensity between the centre and edge of the grain), heterogeneity (the fraction of pixels that vary by more than a set percentage, say 10%, from the average intensity of the grain) and dumpiness (somehow derived from the heterogeneity measure).
  • marginatum a single number quantifying the distribution of intensity between the centre and edge of the grain
  • heterogeneity the fraction of pixels that vary by more than a set percentage, say 10%, from the average intensity of the grain
  • dumpiness somehow derived from the heterogeneity measure.
  • These 3 texture features do incorporate some spatial context but it is decoupled from the colour information.
  • using a neural network in the present context involves capturing an image and identifying boundaries of the object such as a grain typically using a segmentation technique. Standard texture and colour measurements are applied with respect to the object and the information is fed into a neural network. Many measurements are made and a "black box" makes up applicable rules for analysing the object.
  • a given standard for wheat may be that in any sample of say 300 individual wheat grains no more than 1% are found to fail the relevant test for defects which can be determined visually. Otherwise the entire consignment is downgraded to a lesser quality.
  • the present invention broadly in one aspect is found in a method of assessing cereal grain for staining caused by fungus or enzyme action comprising imaging the grain to show colour and variations in colour across the grain, and introducing corrections into a stored image data file for each grain in a sample, so that the image data file is suitable for analysis for grain quality assessment in a system established with reference criteria correlated with required grain standards.
  • the corrections and the analysis are to ensure high correlation with grain assessment by a careful human expert appraising a large number of grains from a representative sample of a consignment so that its quality may be assessed and a grading according to established standards given.
  • the method comprises using diffuse lighting over a sample of individual grains so as to provide evenness over the curved surface of a grain presented for an inspection and substantially to avoid specular reflection i.e. avoiding glints.
  • the present invention manifests itself also in an apparatus and a data processing system adapted to provide processing of data judging grain by its appearance. Furthermore, manifestation can be in terms of a system to produce data files related to grain image and systems to interpret such data files.
  • the grain may be illuminated with a diffuse light which transmits a broad range of wavelengths such as an incandescent halogen bulb in conjunction with a camera which includes optical filters to separate the various spectral components into three superimposed monochrome images termed red, green and blue (R, G & B).
  • a light source such as a light emitting diode (LED), laser, grating filter or monochromator which transmits or allows to pass only a limited range of wavelengths in conjunction with a camera which is sensitive to a broad range of wavelengths so that a number of different light sources can be used to make images at a multiplicity of wavelengths.
  • the system may use an embodiment being a combination of the above lighting approaches.
  • the grain is illuminated at short (eg ultraviolet) wavelengths, and a colour image is taken of fluoresced radiation.
  • colour includes using red, green and blue (R, G, B) portions of the visible spectrum as well as wavelengths of light beyond the normal range of the human eye (approx 400nm to 700nm).
  • the range 700 to 1000nm contains useful information for the purposes of embodiments of the invention.
  • colour may include detailed spectral information well beyond the three variables normally used in electronic imaging (eg RGB).
  • a hyperspectral camera may be used to record a detailed spectrum for each pixel.
  • Technical limitations of this type of hyperspectral camera may necessitate a small number of pixels per image, but the spectral information is richer.
  • the outline of each grain may be first identified in order to avoid processing of the non-grain "background".
  • a region near the boundary may be also excluded from further processing as the acute angle of the surface of the grain is such as to give a poor representation of its colour.
  • the resulting images are captured for data processing.
  • the colour measurements at each pixel are calibrated against a reference standard at all wavelengths.
  • the measurements may be further converted to a standardised colour system such as the CIE L-a-b colour space.
  • the resulting images may be processed using rules described to enhance accuracy of stain detection.
  • Methods and apparatus according to the invention can be extended to image individual grains from different views such as the crease side, non crease side, brush end and germ end. However, useful results have been obtained using an image of only the non-crease side.
  • a high resolution colour CCD camera with an appropriate lens can be used so that the resulting image is captured using a frame-grabber module or equivalent in conjunction with a computer. Alternatively, a digital colour camera may be used.
  • a statistical classification tool such as Linear Discriminant Analysis (LDA) classifier may then be trained using a set of grains containing all relevant stain types as well as sound grain kernels, classified by an expert into the various stain types.
  • LDA Linear Discriminant Analysis
  • the statistical classification tool is then used to classify each pixel from its "colour" into categories (sound and bright; sound but dark and mottled; blackpoint; pink fungal stain; field fungal stain; uncertain).
  • for classifying a whole grain: IF PINK stain identified in the body, germ and brush region is more than (XA)% THEN classify as PINK; ELSE IF FIELD FUNGI in the body and germ region only is more than (XB)% THEN classify as FIELD FUNGI; ELSE IF BLACKPOINT in the germ region only is more than (XC)% THEN classify as BLACKPOINT; ELSE IF SOUND GRAIN in the body region only is more than (XD)% THEN classify as SOUND; ELSE classify as UNKNOWN.
  • the values XA..XD are found by experience with the training set.
  • image processing is used to identify type and extent of stains, using a camera system where each pixel of the image comprises a detailed spectrum of wavelengths; a statistical tool such as Penalised Discriminant Analysis (PDA) can then be used to classify the different parts of each grain as sound or as one of the stain types.
  • PDA Penalised Discriminant Analysis
  • training pixels can be found by choosing that linear combination of the different components of the spectrum which minimises the variance of this aggregate spectrum (AS) for pixels in sound grains, while maximising the variance of AS amongst pixels on stained grains.
  • a camera system where each pixel of the image comprises a detailed spectrum of wavelengths and where the pixels containing spectral data are small enough to allow identification as belonging to different parts of the grain (germ, crease, brush, shadow etc.).
  • This spatial data can be used in conjunction with the spectral data to classify stains.
  • Embodiments use at least one and preferably most or all of the following corrections, when applied to an imaging system using images from the red, green and blue portions of the visible spectrum, suitably corrected for lighting and colour.
  • This image is adapted to be a first image to be used in a classifier, such as a Linear Discriminant Analysis (L.D.A.) classifier.
  • the first image is suitable for processing to provide six further input images for the classifier and which are derived as follows:
  • Pixels of the first image are examined in relation to colour and texture and the relevant data is corrected to allow for apparent variation in colour or texture due to the effects of curvature of the grain, which, despite diffuse lighting, produces false apparent variation in the colour or texture of the grain.
  • the image produced is monochromatic and adapted to be supplied as a second image to the classifier.
  • a second correction recognises that sound grain may either be uniform in appearance across the surface or may be mottled and thus there are false apparent texture differences which need to be recognised to avoid false downgrading of the particular grain. Therefore a preferred embodiment uses linear smoothing so the blurring effect avoids apparent doubtful surface structure due to mottling.
  • This technique involves replacing each pixel by the mean of it and its neighbours. For example an array of 21 by 21 pixels may be averaged into a single "value" to provide a smoothed image being a third image for the classifier for subsequent processing.
  • a further desirable correction is to introduce processing of the data relating to each pixel to take account of relative assessment of colour across a grain.
  • a sound grain of one consignment may be generally yellowish yet in another consignment the grain may have a different hue namely pinkish. It is the degree of departure of a colour in each pixel from the overall colour of the grain which is to be used to assess for defects.
  • This correction is conveniently effected by first taking a linear transformation along the first canonical axis which separates sound from stained grain. This is a monochrome image where degree of brightness is indicative of degree of soundness.
  • the most sound part of the grain is taken to be the part of the image with a brightness in the top 10% (i.e. the canonical variate (CV) value lies above the 90th percentile of the cumulative histogram).
  • CV canonical variate value
  • a mask of this area is produced and the mean value within the mask of this sound part is derived.
  • the variance from this mean value for each pixel across the whole grain is calculated and a corresponding image is produced for use as a fourth image in the classifier.
  • a further derived image is produced namely a global average of the value of the pixels in the body portion of the fourth image. This further image is used as a fifth image in the classifier.
  • a sixth image for the classifier is derived as the mean of the red, green and blue values in each pixel and is thus a "lightness" image. This permits the L.D.A. classifier to remove "lightness" effects when classifying colour.
  • a seventh image for the classifier is derived by firstly replacing a small region of extreme brightness in the "lightness" image with the value in the neighbourhood, secondly by median filtering with a relatively small and a relatively large window and determining the difference between these median filtered images.
  • Various spatial regions of the grain namely the germ portion at one end, the central body and the opposite end brush region are preferably segmented and used in deriving the fourth and fifth images.
  • a preferred embodiment uses the L.D.A. classifier receiving the seven image inputs pixel by pixel.
  • the L.D.A. is trained by manually selecting pixels from known stained grains as rated by a human expert. From the training data, the L.D.A. determines the weighting to be given to each of the seven input images pixel by pixel.
  • the L.D.A. classifies pixel by pixel to determine the probability that the pixel is indicated to belong to one of the allowable classes.
  • the L.D.A. is such that the pixel classification is only accepted if the posterior probability is greater than 90%.
  • the L.D.A. produces an image showing the most likely class for each pixel except for those pixels which are below the probability threshold.
  • a set of features summarising the spatial context of the colour images is used. Then a tree-based classifier, suitably trained by the training data, is used to classify the grain (based on summary features) into one of the allowable classes.
  • the system preferably recognises any such shell protrusions and excludes the grain from assessment in the final analyses as the defect is not linked to the stained defect for which the system is testing.
  • cracked grain can have its image assessed by establishing sharpness of edges in the image and such grain needs to be excluded from the assessment as it is not in itself a defect for which investigation is being made.
  • the system preferably has the ability to choose between two sets of rules for classification depending on whether the assessed grain is plain grain or alternatively whether it has a mottled appearance but is still sound.
  • the rules to be applied in grain assessment may need to vary from grain variety to grain variety in a particular type as well as between grains of different type.
  • barley will generally produce a different set of images from wheat and needs to be processed according to its own rules.
  • One system uses images from red, green and blue portions of the visible spectrum (RGB images) of each pixel with appropriate mapping and data is collected at a high spatial resolution with a multiplicity of values determined and then applied.
  • the classifier preferably uses the following values derived from the input images:-
  • the classifier based on data from a training set of grain images, applies weighting to the various values to derive, pixel by pixel, a classification.
  • a tree classifier may be used to catalogue data from individual grains to allocate the grain into one of four classes:- sound, blackpoint affected, field fungi affected, and pink (toxic) stained.
  • Embodiments include assessing grain from views of each grain captured from different angles, each view being sufficiently detailed so that the grain occupies at least many hundred pixels of the stored image. A significant number (typically several hundred) individual grains are captured and analysed for a sample. Preferred processing is with images from the opposite side of the grain having the crease.
  • Fig 1 is a schematic representation of a system comprising a camera, image capture and lighting,
  • Fig 2 is a schematic representation of a wheat kernel profile with key features
  • Fig 3 is a representation of a pixellated kernel image with some processing; in the representation various shading patterns are used to represent the different colours of the image
  • Fig 4 is an example of a tree classifier for processing grain against a set of rules derived from a training set of grains, and
  • Fig 5 is a table of results comparing automated classification with expert human appraisal.
  • Fig 1 shows an apparatus for use in assessing grain.
  • An array of grain kernels (21) are suitably supported and illuminated by an array of lights (22) with diffusers (23) to evenly illuminate the grain kernels.
  • Image capture of each grain is effected by an electronic camera (24) using a very large number of pixels for each grain typically 200,000 pixels.
  • the data from the image of each grain is transferred to a computer (26) via an appropriate interface for image processing using suitable rules to establish if each grain meets a prescribed standard.
  • Fig 2 shows schematically a wheat kernel (31) presented with crease side down, as viewed by the camera.
  • the germ end (32) and brush end (36) are shown.
  • Dark areas apparent are blackpoint stain (33), mottle (34) over the body portion, and perimeter edges with shadow (35).
  • Fig 3 shows the wheat grain of Fig 2 as captured and pixellated (41) by the image capture system. For clarity extremely large pixels are shown. The pixels representing the germ end and brush end are segregated from the body area by (possibly curved) lines (43, 44). The outline shadow (42) outside the perimeter line (45) has been eliminated to avoid distorting data being used.
  • the system is designed to handle a large number of samples individually in rapid succession and a feed system (not shown) is provided for sequentially supplying the individual kernels using a vibratory or pneumatic supply arrangement. After measurement, the grain or group of grains is withdrawn and the next batch is supplied.
  • a refinement of the image is desirable for discriminating between the edge of the grain to be imaged and the background such as a black tray.
  • a black tray has red and green components in reflected light with similar values and very low brightness.
  • An image mask is determined and used to define the boundary of the grain for imaging purposes. This masked image is then used in further processing.
  • the masked image is subjected to normalisation for lighting correction and calibration and this primary or first image is then used in a statistical classifier and also used to derive some six adaptations, also to be used in the classifier.
  • Embodiments of the present invention recognised that assessment of the grain by colour alone would result in misclassification. Due to local high curvature in the germ, colour detected would not be true colour and in the brush region, due to the mixing of image of the brush structure and background, colour would also be misclassified. Thirdly at the edge of the grain, due to high global curvature, it is not possible to have the same reflectance of light from the same colour as in the central region of the body and this portion of the image if used would cause misclassification.
  • blackpoint stain which is in the germ region and extends often into the body region although usually brown may be a black colour and therefore colour alone could cause misclassification.
  • This embodiment for each pixel transforms the primary or first image into six image data files.
  • the system recognises that despite having uniform lighting there is considerable roll-off in grain reflectance at the edge of the grain resulting from curvature of the grain surface.
  • a second image is provided based on determining a surface reflectance model by plotting the brightness profiles across the girth of uniformly coloured grains.
  • the third image is a smoothed image wherein averaging across the local neighbourhood of pixels was effected to reduce the resolution and thereby provide an overall colouration which helps to reduce "false positives" with respect to mottled but sound grain.
  • a further step is to implement an approximation of human grading techniques wherein for grains having a faint indication of stain, the human compares the colour of the suspect region with that of the rest of the grain, i.e. a relative colour comparison is made rather than monitoring with regard to an absolute standard.
  • linear discriminant analysis is used to determine the linear combination of the R, G and B images which best separates the colour of sound grain from stained grains.
  • the transformed image is called the first Canonical Variate (CV) image.
  • the resulting image is best processed by linear smoothing.
  • the classifier also needs to use an image which is the monochromatic mean of the R, G and B channels of the first image.
  • This mean represents the "lightness" of the first image and the mean becomes a sixth image for the classifier.
  • a further issue addressed is reducing confusion between the colour of small dark blotches in mottled but sound grains (which typically are shades of grey or dark brown) and either field fungi staining or blackpoint staining.
  • the technique adopted involves determining a local texture variable which represents the degree of mottle.
  • This texture measure is derived from the raw colour image.
  • a median filter with a large window e.g. 81 by 81 pixels can be used to remove gross brightness variations over the surface of the grain.
  • a further refinement before applying the median filter is to recognise that bright flecks of dust would disrupt local statistics. Distinct bright regions in the image were replaced by values of their surrounding regions.
  • a second median filter is applied using a smaller window of the average size of the mottled spots, e.g. 21 by 21 pixels. The mottled texture measure was then obtained by taking the difference between these two median filtered images. The resultant of this is a seventh image for the classifier.
  • a further requirement is to determine the spatial regions of the grain and in particular to identify the germ portion and the opposite brush end portion.
  • the germ portion may have significant local curvature and thus dark shadows can appear in sound grain as distinct from darkness caused by blackpoint stain.
  • the brush portion ambiguity can arise because of a mixture of brush elements and background in a pixel resulting in a grey colour suggestive of field fungi stain, even though the grain is sound. Therefore an automated procedure is used to find both the brush and germ regions of the grain, the body of the grain being defined as the zone between these regions.
  • the first step is to determine the grain orientation i.e. which end is germ and which is brush. Linear discriminant analysis of shape and texture features extracted from the end regions of the grain is used.
  • the germ region of a grain can be found by locating the strong edges which are typically created by a high local curvature of the boundary between the germ and the body of the grain.
  • the inverse of an edge strength image is taken (known as a cost image) and the shortest path through the cost image can be determined to act as a deemed boundary of the germ portion (a code sketch of this step is given at the end of this section).
  • the brush end can then be determined by either locating fine horizontal hairs or finding the fine protrusions of the hairs into the background and offsetting by the average length of the hairs. This latter approach is preferred. This can best be implemented by first creating a more sensitive mask of the grain to ensure that even the finest hairs projecting from the grain can be detected.
  • the computer system is then used to make a per pixel classification.
  • a preferred embodiment uses some 11 colour and texture measures for each pixel, namely the logarithms of the red, green, blue, smoothed red, smoothed green, smoothed blue and lightness values.
  • the eighth value is the logarithm of the surface reflectance profile across the grain.
  • the ninth and tenth values are respectively the logarithms of the local and global relative brightness (CVA), and the eleventh value is the texture measure designed to detect the dark motley texture of sound grains.
  • LDA linear discriminant analysis
  • the process also uses spatial region information applied to the per pixel stain classification to give an output of per-grain numerical features so that a tree classifier such as a Recursive Partitioning and Regression Tree (RPART) classifier can be used to derive the rules to produce the grain classification as a whole.
  • RPART Recursive Partitioning and Regression Tree
  • the procedure is repeated, and thirdly the procedure is repeated again for the mask of the brush region.
  • the brush region is often of little diagnostic value and one option is to combine the germ and body results into a "no-brush" result to derive the total number of pixels in the no-brush region with sure classification in terms of the four classes.
  • stain colour does not conform to simple colour rules e.g. blackpoint stain is usually brown but can also be grey-black in colour. Therefore stain colour classes that can be confused are best combined. Combining brown and grey-black colours is useful because such colours can be each indicative of either Blackpoint stain or field fungi stain. Similarly combining pink and brown colours is useful because a dark pink stain is indistinguishable from dark brown.
  • the degree of spatial continuity of patches of stain is also a relevant question.
  • An example of a relevant question is whether a stain of brown colour is present in a continuous region or in isolated colonies; the answer determines the classification.
  • Such spatial continuity information is derived by obtaining the area and the proportion of area that is connected, together with the number and average size of blobs of apparent staining responsible for most of the total area. Accordingly detection and computation based on labelling each individual Blackpoint object in the no-brush region and sorting by size is an appropriate technique.
  • the area of the first object in the sorted list is the largest connected region and determination is made of the proportion of the total area within the largest connected region.
  • the cumulative sum of the areas of the blobs can be considered to find the number of blobs that make up the majority say 75% of the area.
  • a morphological reconstruction of the connected components of the field fungi-blackpoint mask from the germ region is then effected.
  • a similar procedure is then followed to calculate the area and the proportion of area that is connected to blackpoint in the germ.
  • a set of rules to produce the per-grain classification is derived using an RPART classifier. It has been recognised in embodiments of the present invention that the false positive rate for sound grains is a more important performance criterion for an automated system than the true positive rates. For example it has been assessed that false positive rates for detection of fungal contamination had to be substantially better than 0.3%, since otherwise nearly every batch of sound grain tested would be incorrectly rejected.
  • Figure 4 is an example of RPART classifier rules determined for a particular wheat variety, geographical location and season. Seasonal, geographical and varietal variations may produce different rules once a careful calibration with a training set has been achieved. As shown in Figure 4, the decision tree firstly assesses whether the area of pink in the body plus germ region (nobsh.pink) is less than a determined critical value. If the area is greater than the critical value, then the next decision is whether the grain is deemed to suffer from blackpoint or pink stain.
  • the alternative branch of a tree is where the (nobsh.pink) value is less than the critical value and then the decision is whether the connected area of grey plus brown in the body plus germ region (nobsh.FFB.con) is greater or less than a critical value. If it is less than a critical value then it is determined to be sound grain. If it is greater than the threshold value (432.5 in the example) then the next decision in the tree is considered namely the ratio of grey to grey plus brown in the body plus germ region ("nobsh.FF2FFB.r"). In the example given the value for the threshold was 0.1512 and the final decision in the tree occurs depending on whether the threshold is exceeded or not.
  • the final test to determine between blackpoint and sound is to consider the connected area of brown in the body plus germ region ("nobsh.BB.con"). If this value (4117 in the example) is exceeded then the decision is that blackpoint subsists in the grain kernel, but if the value is less than this threshold the grain is deemed sound.
  • the alternative element of the tree is to consider the proportion of brown in the germ region ("germ.blackpoint.p"). If the value (0.2642 in the example) is exceeded then blackpoint is deemed to subsist, but otherwise a field fungi contamination is deemed to apply.
  • the table of Figure 5 shows results for classification in three hyperspectral examples and a fourth example being the RGB colour plus spatial combination. Thus results of the order of 98% correct classification are obtainable.
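As flagged in the germ-region notes above, the cost-image/shortest-path step might be sketched as follows. This is a minimal illustration, not the patent's exact procedure: the Sobel edge measure and the choice of the two path end points flanking the germ end are assumptions.

```python
# Sketch of the germ-boundary step: invert an edge-strength image to form a cost image
# and trace a least-cost path across the grain. Edge detector and end points are
# illustrative assumptions.
import numpy as np
from skimage.filters import sobel
from skimage.graph import route_through_array

def germ_boundary(lightness, start, end):
    """lightness: 2-D grain image; start, end: (row, col) outline points either side
    of the germ end."""
    cost = 1.0 / (sobel(lightness) + 1e-6)         # strong edges become cheap to cross
    path, _ = route_through_array(cost, start, end, fully_connected=True)
    boundary = np.zeros_like(lightness, dtype=bool)
    rows, cols = zip(*path)
    boundary[list(rows), list(cols)] = True        # deemed germ/body boundary
    return boundary
```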

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Food Science & Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Medicinal Chemistry (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

A method of assessing cereal grain for staining comprising subjecting a multiplicity of grain kernels to diffuse lighting, capturing images of each grain kernel and processing each image with masking to exclude the edge portion of the grain and the brush region to determine the indicative portion of the grain and analysing the image thereof to determine whether the extent of colouration and distribution subsists above a threshold value for the grain to be classified as defective in terms of pink stain or blackpoint or field fungi or otherwise whether the grain is sound, taking into account that sound grain may be relatively uniform in colour but may also be mottled, the image data being processed to discriminate against false positives having regard to the spatial distribution of apparent staining and its variability in colour across the germ and body portions of the grain kernel.

Description

STAIN ASSESSMENT FOR CEREAL GRAINS
Field of the Invention
The present invention relates to surface blemish assessment for cereal grains. The invention will be exemplified in the specification particularly with reference to cereal grains such as wheat grains which in general have a variable overall size, a complex detailed shape and the natural colour of which varies from year to year, variety to variety and even from batch to batch. In this specification the term "grain" is used to refer to a single grain kernel from a cereal crop.
Background to the Invention
There is a need to have an accurate, reliable and reproducible assessment methodology so that a consignment of grain can be sampled and an assessment for quality can be made, particularly so that the grain can be bulk stored in an appropriate grade as well as ensuring payment to the grower is fair and appropriate.
A conventional approach to assessment of wheat comprises taking a sample (typically 300 grains) from a bulk supply and effecting a grain-by-grain visual inspection by a trained operator for various types of visual blemish. The type and severity of defects are judged from the size, colour and shape of the stain, as well as its position on the grain.
Blackpoint is caused by enzyme activity at or near the germ end. It causes downgrade of grain quality due to discoloration of products such as bread and noodles, a perceived risk of mycotoxins and association of enzymes which adversely affect bread baking.
Some other discoloration such as pink stain are caused by fungi. Some of the byproducts of these fungi are toxic.
It will be appreciated that assessments must be effected quickly before bulk grain delivered from a grower is discharged at a receival station. There is also a need, when applying official standards, that measurements be traceable as well as accurate. The currently-used methods do not meet this requirement. Human grain experts base their judgement of the soundness of a grain not only on their knowledge of the colour of stains but also on their spatial patterns. These spatial patterns are known from the chemistry and biology of the grain and how stains develop. This spatial pattern information often alters the interpretation of grain colour. For example, field fungi is normally grey-black in colour and Blackpoint is brown. However, if brown colour is present in isolated colonies across the body of a grain, human experts know that the stain must be field fungi even though the stain is the "wrong" colour. This is because they know that Blackpoint stain starts in the germ and grows outwards in a continuous sheath, whereas field fungi grow in isolated colonies. If an image analysis system is to replicate the judgement of the human expert, it must also have a means of modifying the interpretation of colour in the light of such application specific knowledge.
Published approaches include neural net approaches. Such approaches extract a considerable number of generic numerical features from the colour image of a grain and then train a neural network to combine those features (by addition and multiplication) to produce a classification. For this approach to be able to deal with unusual stain colour and spatial patterns, two things must occur. One is that the features that are extracted must quantify in some way the departures from the normal stain colour and spatial patterns i.e. the features must have discriminating power for the full range of stain appearances. The other is that there must be sufficient grains in the training set with those unusual appearances for the neural network classifier to develop rules appropriate for these exceptions from the norm. The classification accuracy achievable by the neural net approach is limited by the significance of the features that are extracted.
One such neural net approach is disclosed in US patent application 2003/0072484 A1 (Kokko and Hill). In table 2, the specification lists some 34 features of which 24 characterise size and shape, with 3 for texture and 7 for colour intensity/density. The 7 colour features are mean density/intensity in the red (R), green (G) and blue (B) bands, and the minimum, maximum, standard deviation and mean of the grain intensity, where intensity is presumably the lightness, L = (R+G+B)/3. All these colour features combine the colour of the entire grain into a single number without any spatial context. The 3 texture features are marginatum (a single number quantifying the distribution of intensity between the centre and edge of the grain), heterogeneity (the fraction of pixels that vary by more than a set percentage, say 10%, from the average intensity of the grain) and dumpiness (somehow derived from the heterogeneity measure). These 3 texture features do incorporate some spatial context but it is decoupled from the colour information. In summary, using a neural network in the present context involves capturing an image and identifying boundaries of the object such as a grain typically using a segmentation technique. Standard texture and colour measurements are applied with respect to the object and the information is fed into a neural network. Many measurements are made and a "black box" makes up applicable rules for analysing the object. In the present field, a given standard for wheat may be that in any sample of say 300 individual wheat grains no more than 1% are found to fail the relevant test for defects which can be determined visually. Otherwise the entire consignment is downgraded to a lesser quality. However, it appears that there is an absence of any proposal which might be able to achieve a high classification accuracy when the stains are subtle or contravene the normal colour and spatial pattern rules. No amount of training could correct for this shortcoming because the extracted features, a crude summary of the grain's colour and texture, have discarded the discriminating information within the image.
Summary of the Invention
The present invention broadly in one aspect is found in a method of assessing cereal grain for staining caused by fungus or enzyme action comprising imaging the grain to show colour and variations in colour across the grain, and introducing corrections into a stored image data file for each grain in a sample, so that the image data file is suitable for analysis for grain quality assessment in a system established with reference criteria correlated with required grain standards. The corrections and the analysis are to ensure high correlation with grain assessment by a careful human expert appraising a large number of grains from a representative sample of a consignment so that its quality may be assessed and a grading according to established standards given. The method comprises using diffuse lighting over a sample of individual grains so as to provide evenness over the curved surface of a grain presented for an inspection and substantially to avoid specular reflection i.e. avoiding glints. The present invention manifests itself also in an apparatus and a data processing system adapted to provide processing of data judging grain by its appearance. Furthermore, manifestation can be in terms of a system to produce data files related to grain image and systems to interpret such data files.
In order to obtain the colour images, the grain may be illuminated with a diffuse light which transmits a broad range of wavelengths such as an incandescent halogen bulb in conjunction with a camera which includes optical filters to separate the various spectral components into three superimposed monochrome images termed red, green and blue (R, G & B). Alternatively the system may use a light source such as a light emitting diode (LED), laser, grating filter or monochromator which transmits or allows to pass only a limited range of wavelengths in conjunction with a camera which is sensitive to a broad range of wavelengths so that a number of different light sources can be used to make images at a multiplicity of wavelengths. Furthermore, the system may use an embodiment being a combination of the above lighting approaches. Yet a further possible approach is where the grain is illuminated at short (eg ultraviolet) wavelengths, and a colour image is taken of fluoresced radiation.
In this context the term "colour" includes using red, green and blue (R, G, B) portions of the visible spectrum as well as wavelengths of light beyond the normal range of the human eye (approx 400nm to 700nm). The range 700 to 1000nm contains useful information for the purposes of embodiments of the invention. In addition the term "colour" may include detailed spectral information well beyond the three variables normally used in electronic imaging (eg RGB).
An alternative approach is to use a hyperspectral analysis with linear discriminant analysis in a system that is essentially a low resolution system. In embodiments of the present invention, a hyperspectral camera may be used to record a detailed spectrum for each pixel. Technical limitations of this type of hyperspectral camera may necessitate a small number of pixels per image, but the spectral information is richer. In embodiments of the invention, the outline of each grain may be first identified in order to avoid processing of the non-grain "background". A region near the boundary may be also excluded from further processing as the acute angle of the surface of the grain is such as to give a poor representation of its colour.
According to a first specific embodiment of the present invention, there is provided a method of presenting a representative sample of grains to an imaging location supplied with diffuse light and taking images of the individual grains using a suitable digital camera. The resulting images are captured for data processing. The colour measurements at each pixel are calibrated against a reference standard at all wavelengths. In the special case of RGB colour, the measurements may be further converted to a standardised colour system such as the CIE L-a-b colour space. The resulting images may be processed using rules described to enhance accuracy of stain detection.
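As an illustration of this calibration and conversion step, the following is a minimal sketch assuming an 8-bit RGB capture, a flat-field image of a white reference standard and an sRGB-like response; the gain model and function names are illustrative rather than the patent's exact procedure.

```python
# Minimal sketch: flat-field calibration against a white standard, then CIE L*a*b*.
# Assumptions (not from the patent): 8-bit RGB input and sRGB primaries as expected
# by skimage.color.rgb2lab.
import numpy as np
from skimage import color

def calibrate_and_convert(rgb_raw, white_reference):
    """Flat-field correct an RGB image against a white standard and map it to CIE L*a*b*."""
    rgb = rgb_raw.astype(np.float64) / 255.0
    white = np.clip(white_reference.astype(np.float64) / 255.0, 1e-6, None)
    rgb_cal = np.clip(rgb / white, 0.0, 1.0)   # per-pixel, per-channel gain correction
    return color.rgb2lab(rgb_cal)              # (H, W, 3): L*, a*, b* per pixel
```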
Methods and apparatus according to the invention can be extended to image individual grains from different views such as the crease side, non crease side, brush end and germ end. However, useful results have been obtained using an image of only the non-crease side. A high resolution colour CCD camera with an appropriate lens can be used so that the resulting image is captured using a frame-grabber module or equivalent in conjunction with a computer. Alternatively, a digital colour camera may be used.
A statistical classification tool such as Linear Discriminant Analysis (LDA) classifier may then be trained using a set of grains containing all relevant stain types as well as sound grain kernels, classified by an expert into the various stain types. The statistical classification tool is then used to classify each pixel from its "colour" into categories as follows:
• Sound and bright
• Sound but dark and mottled
• Blackpoint
• Pink fungal stain
• Field fungal stain
• Uncertain
For classifying a whole grain, the following process can be used:
IF PINK stain identified in the body, germ and brush region is more than (XA)% THEN classify as PINK
ELSE IF FIELD FUNGI in the body and germ region only is more than (XB)% THEN classify as FIELD FUNGI
ELSE IF BLACKPOINT in the germ region only is more than (XC)% THEN classify as BLACKPOINT
ELSE IF SOUND GRAIN in the body region only is more than (XD)% THEN classify as SOUND
ELSE classify as UNKNOWN
The values XA..XD are found by experience with the training set.
With appropriate values, the system is found to return correct identifications for over 90% of cases tested. The above sequence of decision making is indicative only and may be modified and added to within the scope of this invention.
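The decision sequence above can be sketched in code as follows; the default percentage thresholds are placeholders standing in for the tuned values XA..XD, and the inputs are assumed to be percentages of classified pixels in the named regions.

```python
# Sketch of the whole-grain decision sequence. Thresholds are placeholders for XA..XD.
def classify_whole_grain(pink_all_pct, field_fungi_body_germ_pct,
                         blackpoint_germ_pct, sound_body_pct,
                         xa=1.0, xb=1.0, xc=1.0, xd=50.0):
    if pink_all_pct > xa:
        return "PINK"
    if field_fungi_body_germ_pct > xb:
        return "FIELD FUNGI"
    if blackpoint_germ_pct > xc:
        return "BLACKPOINT"
    if sound_body_pct > xd:
        return "SOUND"
    return "UNKNOWN"
```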
According to a specific embodiment of the invention, image processing is used to identify type and extent of stains, using a camera system where each pixel of the image comprises a detailed spectrum of wavelengths; a statistical tool such as Penalised Discriminant Analysis (PDA) can then be used to classify the different parts of each grain as sound or as one of the stain types. Where the pixel size is not small enough to be able to reliably identify individual pixels as belonging to a particular class by visual means (for training purposes), training pixels can be found by choosing that linear combination of the different components of the spectrum which minimises the variance of this aggregate spectrum (AS) for pixels in sound grains, while maximising the variance of AS amongst pixels on stained grains. According to a specific embodiment of the invention, a camera system is used where each pixel of the image comprises a detailed spectrum of wavelengths and where the pixels containing spectral data are small enough to allow identification as belonging to different parts of the grain (germ, crease, brush, shadow etc.). This spatial data can be used in conjunction with the spectral data to classify stains.
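One way to read the aggregate-spectrum construction above is as a generalised eigenvalue problem: find the band weighting whose projected variance is small over sound-grain pixels and large over stained-grain pixels. The sketch below follows that reading; it is an interpretation of the text, not a quotation of the patent's algorithm.

```python
# Sketch: band weighting w minimising variance of w'x over sound-grain pixels while
# maximising it over stained-grain pixels, solved as a generalised eigenproblem.
import numpy as np
from scipy.linalg import eigh

def aggregate_spectrum_weights(sound_pixels, stained_pixels):
    """Both inputs: arrays of shape (n_pixels, n_bands) of calibrated spectra."""
    cov_sound = np.cov(sound_pixels, rowvar=False) + 1e-6 * np.eye(sound_pixels.shape[1])
    cov_stained = np.cov(stained_pixels, rowvar=False)
    # Solve cov_stained w = lambda * cov_sound w; the largest eigenvalue maximises the
    # ratio of stained-pixel variance to sound-pixel variance of the projection w'x.
    _, eigvecs = eigh(cov_stained, cov_sound)
    w = eigvecs[:, -1]                      # eigenvalues are returned in ascending order
    return w / np.linalg.norm(w)

# Usage: aggregate_spectrum = hyperspectral_pixels @ aggregate_spectrum_weights(sound, stained)
```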
Embodiments use at least one and preferably most or all of the following corrections, when applied to an imaging system using images from the red, green and blue portions of the visible spectrum, suitably corrected for lighting and colour. This image is adapted to be a first image to be used in a classifier, such as a Linear Discriminant Analysis (L.D.A.) classifier. Hereinafter description will be given with reference to use of an L.D.A. but any suitable alternative may be used. The first image is suitable for processing to provide six further input images for the classifier, which are derived as follows:
a) Pixels of the first image are examined in relation to colour and texture and the relevant data is corrected to allow for apparent variation in colour or texture due to the effects of curvature of the grain, which, despite diffuse lighting, produces false apparent variation in the colour or texture of the grain. The image produced is monochromatic and adapted to be supplied as a second image to the classifier.
b) A second correction recognises that sound grain may either be uniform in appearance across the surface or may be mottled and thus there are false apparent texture differences which need to be recognised to avoid false downgrading of the particular grain. Therefore a preferred embodiment uses linear smoothing so the blurring effect avoids apparent doubtful surface structure due to mottling. This technique involves replacing each pixel by the mean of it and its neighbours. For example an array of 21 by 21 pixels may be averaged into a single "value" to provide a smoothed image being a third image for the classifier for subsequent processing.
c) A further desirable correction is to introduce processing of the data relating to each pixel to take account of relative assessment of colour across a grain. For example a sound grain of one consignment may be generally yellowish yet in another consignment the grain may have a different hue namely pinkish. It is the degree of departure of a colour in each pixel from the overall colour of the grain which is to be used to assess for defects. This correction is conveniently effected by first taking a linear transformation along the first canonical axis which separates sound from stained grain. This is a monochrome image where degree of brightness is indicative of degree of soundness.
The most sound part of the grain is taken to be the part of the image with a brightness in the top 10% (i.e. the canonical variate (CV) value lies above the 90th percentile of the cumulative histogram). Thus a mask of this area is produced and the mean value within the mask of this sound part is derived. The variance from this mean value for each pixel across the whole grain is calculated and a corresponding image is produced for use as a fourth image in the classifier. Furthermore, an additional derived image is produced, namely a global average of the value of the pixels in the body portion of the fourth image. This further image is used as a fifth image in the classifier.
d) A sixth image for the classifier is derived as the mean of the red, green and blue values in each pixel and is thus a "lightness" image. This permits the L.D.A. classifier to remove "lightness" effects when classifying colour.
e) A seventh image for the classifier is derived by firstly replacing a small region of extreme brightness in the "lightness" image with the value in the neighbourhood, secondly by median filtering with a relatively small and a relatively large window and determining the difference between these median filtered images.
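The corrections a) to e) can be pulled together in a single sketch deriving the second to seventh classifier images from the calibrated first image. The distance-based roll-off model, the use of the whole grain mask in place of the segmented body region and the fleck-suppression threshold are assumptions for illustration; the 21 by 21 mean window and the 81 by 81 / 21 by 21 median windows are the example sizes quoted in this document.

```python
# Consolidated sketch of deriving classifier images 2 to 7 from the calibrated image.
# Assumptions: rolloff_profile is a pre-fitted 1-D profile of expected brightness vs
# normalised distance from the grain edge; cv_weights are canonical-variate weights
# separating sound from stained colour; the whole grain mask stands in for the body.
import numpy as np
from scipy import ndimage

def derive_classifier_images(rgb, grain_mask, cv_weights, rolloff_profile):
    lightness = rgb.mean(axis=2)                                   # image 6: "lightness"

    # Image 2: modelled reflectance roll-off, from normalised distance to the boundary.
    dist = ndimage.distance_transform_edt(grain_mask)
    dist_norm = dist / max(dist.max(), 1e-6)
    xp = np.linspace(0.0, 1.0, len(rolloff_profile))
    reflectance_model = np.interp(dist_norm, xp, rolloff_profile) * grain_mask

    # Image 3: 21x21 local mean smoothing of each channel (blurs sound-grain mottle).
    smoothed = np.stack([ndimage.uniform_filter(rgb[..., c], size=21) for c in range(3)],
                        axis=2)

    # Images 4 and 5: departure of the canonical-variate (CV) value from the "most
    # sound" part of the grain (brightest 10% of CV values inside the mask).
    cv = rgb @ np.asarray(cv_weights)
    sound_mask = grain_mask & (cv >= np.percentile(cv[grain_mask], 90))
    deviation = (cv - cv[sound_mask].mean()) * grain_mask              # image 4
    global_deviation = np.full_like(cv, deviation[grain_mask].mean())  # image 5

    # Image 7: mottle texture. Suppress bright flecks, then take the difference of a
    # large-window and a small-window median filter of the lightness image.
    defleck = lightness.copy()
    flecks = lightness > np.percentile(lightness[grain_mask], 99.5)
    defleck[flecks] = ndimage.median_filter(lightness, size=5)[flecks]
    texture = (ndimage.median_filter(defleck, size=81)
               - ndimage.median_filter(defleck, size=21))

    return reflectance_model, smoothed, deviation, global_deviation, lightness, texture
```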
Various spatial regions of the grain, namely the germ portion at one end, the central body and the opposite end brush region are preferably segmented and used in deriving the fourth and fifth images.
A preferred embodiment uses the L.D.A. classifier receiving the seven image inputs pixel by pixel. The L.D.A. is trained by manually selecting pixels from known stained grains as rated by a human expert. From the training data, the L.D.A. determines the weighting to be given to each of the seven input images pixel by pixel. The L.D.A. classifies pixel by pixel to determine the probability that the pixel is indicated to belong to one of the allowable classes. Preferably, the L.D.A. is such that the pixel classification is only accepted if the posterior probability is greater than 90%. The L.D.A. produces an image showing the most likely class for each pixel except for those pixels which are below the probability threshold.
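A minimal sketch of this per-pixel stage, using scikit-learn's Linear Discriminant Analysis with the 90% posterior acceptance rule, might look as follows; the feature stacking and the expert-labelled training pixels are assumed to be prepared elsewhere.

```python
# Sketch of per-pixel classification with a 90% posterior acceptance rule.
# Assumptions: per-pixel feature rows sampled from the seven input images, and
# expert class labels supplied as strings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def classify_pixels(train_features, train_labels, pixel_features, threshold=0.9):
    lda = LinearDiscriminantAnalysis().fit(train_features, train_labels)
    posterior = lda.predict_proba(pixel_features)          # one column per class
    labels = lda.classes_[posterior.argmax(axis=1)]
    # Pixels whose best posterior does not exceed the threshold are left unclassified.
    return np.where(posterior.max(axis=1) > threshold, labels, "UNCERTAIN")
```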
Having obtained a classification of the colour of each pixel of the grain image, a set of features summarising the spatial context of the colour images is used. Then a tree-based classifier, suitably trained by the training data, is used to classify the grain (based on summary features) into one of the allowable classes.
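The kind of spatial-context summary intended here, described earlier in the extracted notes in terms of blob areas and connectivity, could be computed along the following lines; the feature names and the function interface are illustrative, while the 75% "majority" figure follows the worked description given earlier.

```python
# Sketch of spatial-context features for one stain class: total area, share of the
# largest connected blob, and the number of blobs covering ~75% of the stained area.
import numpy as np
from scipy import ndimage

def stain_spatial_features(stain_mask, majority=0.75):
    labelled, n_blobs = ndimage.label(stain_mask)
    if n_blobs == 0:
        return {"area": 0, "largest_fraction": 0.0, "n_blobs_majority": 0}
    areas = ndimage.sum(stain_mask, labelled, index=range(1, n_blobs + 1))
    areas = np.sort(areas)[::-1]                       # largest blob first
    total = areas.sum()
    cumulative = np.cumsum(areas)
    return {
        "area": int(total),                                            # pixels of stain
        "largest_fraction": float(areas[0] / total),                   # connectivity
        "n_blobs_majority": int(np.searchsorted(cumulative, majority * total) + 1),
    }
```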
In general there is a possibility with cereal grains that some individual grains may have shell protrusions due to the bran lifting. While such a grain may have some defect in appearance, the system preferably recognises any such shell protrusions and excludes the grain from assessment in the final analyses as the defect is not linked to the stained defect for which the system is testing.
Furthermore, cracked grain can have its image assessed by establishing sharpness of edges in the image and such grain needs to be excluded from the assessment as it is not in itself a defect for which investigation is being made.
The system preferably has the ability to choose between two sets of rules for classification depending on whether the assessed grain is plain grain or alternatively whether it has a mottled appearance but is still sound.
It is recognised that the rules to be applied in grain assessment may need to vary from grain variety to grain variety in a particular type as well as between grains of different type. For example barley will generally produce a different set of images from wheat and needs to be processed according to its own rules.
Within the broad concept of the present invention two main approaches are available. One system uses images from red, green and blue portions of the visible spectrum (RGB images) of each pixel with appropriate mapping and data is collected at a high spatial resolution with a multiplicity of values determined and then applied. The classifier preferably uses the following values derived from the input images:-
• the logarithms of the values per pixel in the red, green and blue portions of the spectrum;
• logarithms of the smoothed red, smoothed green and smoothed blue portions of the visible spectrum, to discriminate, by colour, mottled sound grains from those having sparse colonies of field fungi;
• the logarithm of the value of lightness from the combination of the red, green and blue spectra;
• the logarithm of the surface reflectance profile across the girth of the grain;
• logarithms of the local and global relative brightness variance;
• the texture measure designed to detect dark motley texture of sound grains.
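Assembling the values just listed into per-pixel rows for the classifier might look like the sketch below; the dictionary keys, the small epsilon guarding the logarithms and the stacking order are assumptions for illustration.

```python
# Sketch of assembling the 11 per-pixel values listed above into a feature matrix.
import numpy as np

def stack_pixel_features(images, grain_mask, eps=1e-6):
    """images: dict of 2-D arrays keyed e.g. 'red', 'green', 'blue', 'smoothed_red',
    'smoothed_green', 'smoothed_blue', 'lightness', 'reflectance_profile',
    'local_brightness_variance', 'global_brightness_variance', 'texture'."""
    log_keys = ["red", "green", "blue", "smoothed_red", "smoothed_green",
                "smoothed_blue", "lightness", "reflectance_profile",
                "local_brightness_variance", "global_brightness_variance"]
    columns = [np.log(np.clip(images[k], eps, None))[grain_mask] for k in log_keys]
    columns.append(images["texture"][grain_mask])   # texture measure is used as-is
    return np.column_stack(columns)                 # shape: (n_grain_pixels, 11)
```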
The classifier, based on data from a training set of grain images, applies weighting to the various values to derive, pixel by pixel, a classification.
A tree classifier may be used to catalogue data from individual grains to allocate the grain into one of four classes:- sound, blackpoint affected, field fungi affected, and pink (toxic) stained.
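For the RGB-plus-spatial route, the per-grain tree can be sketched directly from the Fig 4 walk-through given earlier in this document. The numeric thresholds (432.5, 0.1512, 4117, 0.2642) are the example values quoted for one variety, location and season; the handling of the pink branch and the orientation of the grey-ratio branch are assumptions, since the excerpt does not spell them out.

```python
# Sketch of the per-grain decision tree from the Fig 4 walk-through.
def classify_grain_fig4(nobsh_pink, nobsh_ffb_con, nobsh_ff2ffb_r,
                        nobsh_bb_con, germ_blackpoint_p, pink_critical=100.0):
    if nobsh_pink >= pink_critical:                  # pink_critical is a placeholder
        # The excerpt only says a further decision separates PINK from BLACKPOINT here.
        return "PINK or BLACKPOINT (further test not given in this excerpt)"
    if nobsh_ffb_con <= 432.5:                       # little connected grey+brown area
        return "SOUND"
    if nobsh_ff2ffb_r < 0.1512:                      # mostly brown (assumed branch order)
        return "BLACKPOINT" if nobsh_bb_con > 4117 else "SOUND"
    return "BLACKPOINT" if germ_blackpoint_p > 0.2642 else "FIELD FUNGI"
```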
Embodiments include assessing grain from views of each grain captured from different angles, each view being sufficiently detailed so that the grain occupies at least many hundred pixels of the stored image. A significant number (typically several hundred) individual grains are captured and analysed for a sample. Preferred processing is with images from the opposite side of the grain having the crease.
Summary of the Drawings
For illustrative purposes an embodiment of the invention and an example will now be given with reference to the accompanying representations of which:-
Fig 1 is a schematic representation of a system comprising a camera, image capture and lighting,
Fig 2 is a schematic representation of a wheat kernel profile with key features;
Fig 3 is a representation of a pixellated kernel image with some processing; in the representation various shading patterns are used to represent the different colours of the image;
Fig 4 is an example of a tree classifier for processing grain against a set of rules derived from a training set of grains, and
Fig 5 is a table of results comparing automated classification with expert human appraisal.
Detailed description of the Drawings
Fig 1 shows an apparatus for use in assessing grain. An array of grain kernels (21) is suitably supported and illuminated by an array of lights (22) with diffusers (23) to evenly illuminate the grain kernels. Image capture of each grain is effected by an electronic camera (24) using a very large number of pixels for each grain, typically 200,000 pixels. The data from the image of each grain is transferred to a computer (26) via an appropriate interface for image processing, using suitable rules to establish whether each grain meets a prescribed standard.
Fig 2 shows schematically a wheat kernel (31) presented with crease side down, as viewed by the camera. The germ end (32) and brush end (36) are shown. Dark areas apparent are blackpoint stain (33), mottle (34) over the body portion, and perimeter edges with shadow (35).
Fig 3 shows the wheat grain of Fig 2 as captured and pixellated (41) by the image capture system. For clarity extremely large pixels are shown. The pixels representing the germ end and brush end are segregated from the body area by (possibly curved) lines (43, 44). The outline shadow (42) outside the perimeter line (45) has been eliminated to avoid distorting data being used. The system is designed to handle a large number of samples individually in rapid succession and a feed system (not shown) is provided for sequentially supplying the individual kernels using a vibratory or pneumatic supply arrangement. After measurement, the grain or group of grains is withdrawn and the next batch is supplied.
A preferred process will now be described wherein the camera has channels for each pixel responsive to the red, green and blue portions of the visible spectrum. To establish the appropriate discrimination rules, a training set of expertly classified grains is used. Specific grains with pink stain, blackpoint and field fungi defects were used, along with sound grains, which could be either light and uniform in colour (except for the germ) or mottled and darker but still sound. Thus 5 classes of grain are in the training set, corresponding to the 5 classes requiring discrimination in field use. Standards for grain may vary from market to market but, for example, there may be a tolerance of x% grains affected by "pink" stain, y% by "blackpoint" and z% by "field fungi".
A refinement of the image is desirable for discriminating between the edge of the grain to be imaged and the background, such as a black tray. A black tray has red and green components in reflected light with similar values and very low brightness. An image mask is determined and used to define the boundary of the grain for imaging purposes. This masked image is then used in further processing.
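A minimal sketch of such a background mask, exploiting the stated properties of the black tray (similar red and green reflectance, very low brightness), follows. The numeric thresholds and the function name are illustrative assumptions rather than values from the description.

    import numpy as np

    def grain_mask(rgb, rg_tolerance=10.0, dark_threshold=40.0):
        # Black-tray background pixels have similar red and green values and very
        # low brightness; both thresholds here are assumptions for illustration.
        rgb = np.asarray(rgb, dtype=float)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        brightness = (r + g + b) / 3.0
        background = (np.abs(r - g) < rg_tolerance) & (brightness < dark_threshold)
        return ~background          # True where the grain (foreground) is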
The masked image is subjected to normalisation for lighting correction and calibration and this primary or first image is then used in a statistical classifier and also used to derive some six adaptations, also to be used in the classifier.
Embodiments of the present invention recognised that assessment of the grain by colour alone would result in misclassification. Due to the local high curvature in the germ, the colour detected would not be the true colour; in the brush region, due to the mixing of the brush structure and the background within a pixel, colour would also be misclassified. Thirdly, at the edge of the grain, due to high global curvature, the same colour cannot produce the same reflectance of light as in the central region of the body, and this portion of the image, if used, would cause misclassification.
Furthermore, blackpoint stain, which is in the germ region and often extends into the body region, is usually brown but may be black in colour, and therefore colour alone could cause misclassification. Similarly, field fungi stain, generally in the body of the grain and normally grey or black, can be brown. Even after applying lighting correction factors and colour calibration to the image of a kernel, these problems would still persist.
This embodiment transforms the primary or first image, pixel by pixel, into six further image data files.
The system recognises that, despite uniform lighting, there is considerable roll-off in grain reflectance at the edge of the grain resulting from curvature of the grain surface. A second image is provided, based on determining a surface reflectance model by plotting the brightness profiles across the girth of uniformly coloured grains.
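One possible sketch of fitting such a surface reflectance model is given below, pooling brightness against normalised distance from the grain centreline and fitting a low-order polynomial. The polynomial form, its degree, the orientation assumption (girth along image rows) and the function name are assumptions for illustration, not the specific model used in the description.

    import numpy as np

    def girth_reflectance_profile(lightness, grain_mask, degree=4):
        # Pool brightness of a uniformly coloured grain against the normalised
        # distance from the grain centreline (i.e. across the girth) and fit a
        # low-order polynomial as the surface reflectance model.
        rows, cols = np.nonzero(grain_mask)
        centre = rows.mean()
        half_width = max((rows.max() - rows.min()) / 2.0, 1.0)
        d = (rows - centre) / half_width              # roughly -1 .. +1 across the girth
        coeffs = np.polyfit(d, lightness[rows, cols], degree)
        profile = np.zeros_like(lightness, dtype=float)
        profile[rows, cols] = np.polyval(coeffs, d)   # second image: modelled reflectance roll-off
        return profile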
The third image is a smoothed image, wherein averaging across the local neighbourhood of pixels is effected to reduce the resolution and thereby provide an overall colouration, which helps to reduce "false positives" with respect to mottled but sound grain.
A further step is to implement an approximation of human grading technique: for grains having a faint indication of stain, the human compares the colour of the suspect region with that of the rest of the grain, i.e. a relative colour comparison is made rather than a comparison against an absolute standard. To establish the correct approach, linear discriminant analysis is used to determine the linear combination of the R, G and B images that best separates the colour of sound grain from stained grain. The transformed image is called the first Canonical Variate (CV) image. The resulting image is best processed by linear smoothing.
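A minimal sketch of deriving such a CV image follows, using a generic two-class linear discriminant fit on training pixel colours and applying the resulting direction to every pixel. The function and argument names are assumptions for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def first_cv_image(rgb, train_rgb, train_sound):
        # train_rgb: N x 3 pixel colours from the training set; train_sound: 1 for
        # sound pixels, 0 for stained pixels. The single discriminant direction
        # plays the role of the first canonical variate.
        lda = LinearDiscriminantAnalysis(n_components=1)
        lda.fit(train_rgb, train_sound)
        h, w, _ = rgb.shape
        cv = lda.transform(np.asarray(rgb, dtype=float).reshape(-1, 3))
        return cv.reshape(h, w)                       # first CV image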
It has then been found best also to determine the mean of the CV image within a mask identified as applying to the most sound part of the grain, and then to calculate the pixel-wise variance of the rest of the image relative to this mean. This is the fourth image for the classifier. Taking the average of the local variance measure within the body of the grain gives a global measure of variance for the grain. To allow a pixel-wise comparison of the local variance with the global variance, an artificial image is constructed with the whole grain set to the global variance value. This is the fifth image for the classifier.
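A minimal sketch of these two variance images is given below. The local averaging window and the function name are assumptions for illustration; the masks are the sound-part and body masks described above.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def variance_images(cv_image, sound_mask, body_mask, window=15):
        # Fourth image: squared deviation of the CV image from the mean CV value
        # inside the mask of the most sound part of the grain, averaged locally.
        sound_mean = cv_image[sound_mask].mean()
        local_var = uniform_filter((cv_image - sound_mean) ** 2, size=window)
        # Fifth image: an artificial image with the whole grain set to the global
        # variance, i.e. the average local variance over the body of the grain.
        global_var = np.full_like(cv_image, local_var[body_mask].mean(), dtype=float)
        return local_var, global_var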
The classifier also needs to use an image which is the monochromatic mean of the R, G and B channels of the first image. This mean represents the "lightness" of the first image and becomes a sixth image for the classifier.
A further issue addressed is reducing confusion between the colour of the small dark blotches in mottled but sound grains (which typically are shades of grey or dark brown) and either field fungi staining or blackpoint staining. The technique adopted involves determining a local texture variable which represents the degree of mottle.
This texture measure is derived from the raw colour image. A median filter with a large window, e.g. 81 by 81 pixels, can be used to remove gross brightness variations over the surface of the grain. A further refinement before applying the median filter recognises that bright flecks of dust would disrupt the local statistics, so distinct bright regions in the image are replaced by the values of their surrounding regions. After applying the first median filter, a second median filter is applied using a smaller window matched to the average size of the mottled spots, e.g. 21 by 21 pixels. The mottled texture measure is then obtained by taking the difference between these two median filtered images. The result is a seventh image for the classifier.
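A minimal sketch of this mottle texture measure follows. The grey-scale opening used to suppress bright flecks, the sign convention of the difference and the function name are assumptions for illustration; the 81-by-81 and 21-by-21 windows are the example sizes quoted above.

    import numpy as np
    from scipy.ndimage import grey_opening, median_filter

    def mottle_texture(lightness):
        # Suppress distinct bright flecks (dust) by replacing them with values from
        # their surroundings; a grey-scale opening is a simple stand-in here.
        cleaned = grey_opening(np.asarray(lightness, dtype=float), size=(5, 5))
        coarse = median_filter(cleaned, size=81)   # removes gross brightness variation
        fine = median_filter(cleaned, size=21)     # window matched to the mottled spots
        return fine - coarse                       # seventh image: mottle texture measure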
A further requirement is to determine the spatial regions of the grain and in particular to identify the germ portion and the opposite brush end portion. The germ portion may have significant local curvature and thus dark shadows can appear in sound grain as distinct from darkness caused by blackpoint stain. In the brush portion ambiguity can arise because of a mixture of brush elements and background in a pixel resulting in a grey colour suggestive of field fungi stain, even though the grain is sound. Therefore an automated procedure is used to find both the brush and germ regions of the grain, the body of the grain being defined as the zone between these regions. The first step is to determine the grain orientation, i.e. which end is germ and which is brush. Linear discriminant analysis of shape and texture features extracted from the end regions of the grain is used.
The germ region of a grain can be found by locating the strong edges which are typically created by the high local curvature of the boundary between the germ and the body of the grain. The inverse of an edge strength image is taken (known as a cost image), and the shortest path through the cost image is determined to act as a deemed boundary of the germ portion.
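One possible sketch of this shortest-path step is given below, using a standard minimum-cost path routine over the inverted edge strength image. The choice of edge operator, the start and end points (assumed to be points on the grain outline either side of the germ) and the function name are assumptions for illustration.

    import numpy as np
    from skimage.filters import sobel
    from skimage.graph import route_through_array

    def germ_boundary(lightness, start, end):
        # Inverse of the edge strength image acts as the cost image, so the
        # cheapest path between the two outline points follows the strong
        # germ/body boundary.
        edge_strength = sobel(np.asarray(lightness, dtype=float))
        cost = 1.0 / (edge_strength + 1e-6)
        path, _ = route_through_array(cost, start, end, fully_connected=True)
        return np.array(path)          # pixel coordinates of the deemed germ boundary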
The brush end can then be determined by either locating fine horizontal hairs or finding the fine protrusions of the hairs into the background and offsetting by the average length of the hairs. This latter approach is preferred. This can best be implemented by first creating a more sensitive mask of the grain to ensure that even the finest hairs projecting from the grain can be detected.
The computer system is then used to make a per-pixel classification. A preferred embodiment uses some 11 colour and texture measures for each pixel, namely the logarithms of the red, green, blue, smoothed red, smoothed green, smoothed blue and lightness values. The eighth value is the logarithm of the surface reflectance profile across the grain. The ninth and tenth values are respectively the logarithms of the local and global relative brightness variance (CVA), and the eleventh value is the texture measure designed to detect the dark mottled texture of sound grains.
As an initial pixel-based classification, a linear discriminant analyser (LDA) was used on the eleven measurements. Logarithms were used to allow the LDA classifier the option of ratioing the variables: the surface reflectance correction is achieved by dividing each of the colour images by the surface profile, and the relative soundness of the grain colour can be found by ratioing the local and global variance measures. Applying the LDA classifier weights to the eleven measurements at each pixel created a per-pixel classification, and pixels with low posterior probabilities were left unclassified.

The process also applies spatial region information to the per-pixel stain classification to give an output of per-grain numerical features, so that a tree classifier such as a Recursive Partitioning and Regression Tree (RPART) classifier can be used to derive the rules producing the classification of the grain as a whole. For this process it is first necessary to convert the per-pixel classification maps into a simple numerical summary of the colour of each grain region, i.e. the germ region, the brush region or the body region. This is accomplished by taking a mask of the body of the grain and, for the pixels classified with a certainty of greater than 90% posterior probability, recording the number of pixels in each class, the classes being:
a) the combination of mottled and sound classes;
b) blackpoint;
c) field fungi; and
d) pink.
Using a mask for the identified germ region, the procedure is repeated, and the procedure is repeated a third time for the mask of the brush region. However, it has been found that the brush region is often of little diagnostic value, and one option is to combine the germ and body results into a "no-brush" result to derive the total number of pixels in the no-brush region with a sure classification in terms of the four classes.
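A minimal sketch of this per-region counting, applicable to the body, germ, brush or combined "no-brush" masks, is given below. The function name and the integer class coding are assumptions for illustration; the 90% certainty requirement is taken from the description.

    import numpy as np

    def region_class_counts(pixel_labels, pixel_posterior, region_mask,
                            n_classes=4, threshold=0.9):
        # pixel_labels: H x W integer class map (0 = mottled/sound combined,
        # 1 = blackpoint, 2 = field fungi, 3 = pink; -1 = unclassified);
        # pixel_posterior: H x W maximum posterior probability per pixel.
        sure = region_mask & (pixel_labels >= 0) & (pixel_posterior > threshold)
        return np.bincount(pixel_labels[sure], minlength=n_classes)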
It may also be useful to record the proportions of each of the four classes in each grain region.
It has already been noted that stain colour sometimes does not conform to simple colour rules, e.g. blackpoint stain is usually brown but can also be grey-black in colour. Therefore stain colour classes that can be confused are best combined. Combining the brown and grey-black colours is useful because such colours can each be indicative of either blackpoint stain or field fungi stain. Similarly, combining the pink and brown colours is useful because a dark pink stain is indistinguishable from dark brown.
Therefore, outside the brush regions ("no-brush"), it can be useful to determine and use the ratio of pink to (pink plus blackpoint) areas and the ratio of field fungi to (field fungi plus blackpoint) areas. In order to optimise results, the degree of spatial continuity of patches of stain is also relevant: for example, is a stain of brown colour present in one continuous region or in isolated colonies? The answer affects the classification. Such spatial continuity information is derived by obtaining the area and the proportion of area that is connected, together with the number and average size of the blobs of apparent staining responsible for most of the total area. Accordingly, an appropriate technique is to label each individual blackpoint object in the no-brush region and to sort the objects by size. The area of the first object in the sorted list is the largest connected region, and the proportion of the total area within the largest connected region is determined. The cumulative sum of the areas of the blobs can be considered to find the number of blobs that make up the majority, say 75%, of the area.
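A minimal sketch of this blob analysis, applicable to any of the stain masks (it is reused for the field fungi mask below), follows. The dictionary keys and function name are assumptions for illustration; the 75% majority figure is the example quoted above.

    import numpy as np
    from scipy.ndimage import label

    def blob_summary(stain_mask, majority=0.75):
        # Label connected patches of apparent stain, sort them by size, and report
        # the total area, the fraction in the largest patch, and the number and
        # average size of the patches covering the stated majority of the area.
        labelled, n_blobs = label(stain_mask)
        if n_blobs == 0:
            return {'area': 0, 'largest_fraction': 0.0,
                    'n_majority': 0, 'mean_majority_area': 0.0}
        areas = np.sort(np.bincount(labelled.ravel())[1:])[::-1]   # drop background, largest first
        total = areas.sum()
        n_majority = int(np.searchsorted(np.cumsum(areas), majority * total) + 1)
        return {'area': int(total),
                'largest_fraction': float(areas[0] / total),
                'n_majority': n_majority,
                'mean_majority_area': float(areas[:n_majority].mean())}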
The same procedure is applied to the field fungi mask to obtain the area, the proportion of area that is connected, and the number and average size of the blobs responsible for the majority of the area.
In assessing grain for blackpoint, it is important to determine whether the apparent blackpoint stain in the body region is connected to the blackpoint stain in the germ region. If it is unconnected, then it is not blackpoint, which develops from the germ region. To deal with this question it is appropriate to combine the field fungi and blackpoint masks, because the blackpoint stain, although usually brown, can also be grey-black in colour. The continuity of the combined mask is assessed, together with the proportion of area that is connected and the number and average size of the blobs responsible for the majority of the total field fungi plus blackpoint apparent area. A further question to be answered is "how much of the combined mask is connected to the germ?". This can be determined by finding the region of the field fungi-blackpoint mask within the germ. A morphological reconstruction of the connected components of the field fungi-blackpoint mask from the germ region is then effected, and the area and the proportion of area connected to blackpoint in the germ are calculated.

Having obtained a set of features summarising the colour and spatial attributes of the grain, a set of rules to produce the per-grain classification is derived using an RPART classifier. It has been recognised in embodiments of the present invention that the false positive rate for sound grains is a more important performance criterion for an automated system than the true positive rate. For example, it has been assessed that the false positive rate for detection of fungal contamination had to be substantially better than 0.3%, since otherwise nearly every batch of sound grain tested would be incorrectly rejected.
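A minimal sketch of the germ-connectivity test by morphological reconstruction is given below. The function name and return values are assumptions for illustration; the reconstruction-from-the-germ idea is as described above.

    import numpy as np
    from skimage.morphology import reconstruction

    def stain_connected_to_germ(ffb_mask, germ_mask):
        # Grow, by morphological reconstruction, the part of the combined field
        # fungi / blackpoint mask that touches the germ region; patches in the
        # body that are not connected to the germ are discarded.
        seed = (ffb_mask & germ_mask).astype(np.uint8)
        connected = reconstruction(seed, ffb_mask.astype(np.uint8), method='dilation') > 0
        total = int(ffb_mask.sum())
        return connected, (int(connected.sum()) / total if total else 0.0)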
Figure 4 is an example of RPART classifier rules determined for a particular wheat variety, a particular geographical location and a particular season. Seasonal characteristics as well as geographical and varietal variations may produce different sets of rules, each determined after careful calibration with a training set. As shown in Figure 4, the first decision in the tree is whether the area of pink in the body plus germ region (nobsh.pink) is less than a determined critical value. If the area is greater than the critical value, then the next decision is whether the grain is deemed to suffer from blackpoint or pink stain. This decision is made by taking the ratio of pink to (pink plus brown) in the body plus germ region (nobsh.p2pb.r); if the value exceeds a threshold value then pink is the classification, and below that value blackpoint is the classification.
The alternative branch of the tree is taken where the (nobsh.pink) value is less than the critical value; the decision is then whether the connected area of grey plus brown in the body plus germ region (nobsh.FFB.con) is greater or less than a critical value. If it is less than the critical value the grain is determined to be sound. If it is greater than the threshold value (432.5 in the example) then the next decision in the tree is considered, namely the ratio of grey to grey plus brown in the body plus germ region ("nobsh.FF2FFB.r"). In the example given the threshold value was 0.1512, and the final decisions in the tree depend on whether this threshold is exceeded or not.
One final test, to decide between blackpoint and sound, considers the connected area of brown in the body plus germ region ("nobsh.BB.con"). If the value, 4117 in this example, is exceeded then the decision is that blackpoint subsists in the grain kernel, but if the value is less than this threshold the grain is deemed sound.
The alternative element of the tree considers the proportion of brown in the germ region ("germ.blackpoint.p"). If the value, 0.2642 in the example, is exceeded then blackpoint is deemed to subsist, but otherwise field fungi contamination is deemed to apply.
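By way of illustration only, the decision rules just described can be written out as the following minimal Python sketch. The feature dictionary keys follow the names used above; the first two thresholds are not given numerically in the description and are therefore left as parameters, and the assignment of the two final tests to the two sides of the 0.1512 split is an assumption made for the sketch.

    def classify_grain(f, pink_area_crit, pink_ratio_crit):
        # f: dict of per-grain summary features. The numeric thresholds below
        # (432.5, 0.1512, 4117, 0.2642) are taken from the worked example and
        # would differ for other variety/location/season calibrations.
        if f['nobsh.pink'] >= pink_area_crit:
            return 'pink' if f['nobsh.p2pb.r'] > pink_ratio_crit else 'blackpoint'
        if f['nobsh.FFB.con'] < 432.5:
            return 'sound'
        if f['nobsh.FF2FFB.r'] < 0.1512:
            # Assumed branch: brown-dominated staining, test for blackpoint vs sound.
            return 'blackpoint' if f['nobsh.BB.con'] > 4117 else 'sound'
        # Assumed branch: grey-dominated staining, test for blackpoint vs field fungi.
        return 'blackpoint' if f['germ.blackpoint.p'] > 0.2642 else 'field fungi'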
The result of the decisions in the particular tree of Figure 4 is that a grain will be classified as one of sound, blackpoint, field fungi or pink. By classifying an entire sample of grains (typically several hundred), the proportion in each class can be determined, and this has been found to correlate accurately with expert human classification.
The table of Figure 5 shows classification results for three hyperspectral examples and a fourth example using the RGB colour plus spatial combination. Results of the order of 98% correct classification are obtainable.
For higher accuracy it is considered that grain should be orientated and its image only relied upon if it is crease side down and the germ is fully visible.
It is also considered that even higher accuracy may be achievable by adding stereo pairs of images to map the depth of the grain surface, so as to discriminate between a darker grain colour caused by shadowing of shot grains and darkness due to staining.

Claims

1. A method of assessing cereal grain for staining comprising subjecting a multiplicity of grain kernels to diffuse lighting, capturing images of each grain kernel and processing each image with masking to exclude the edge portion of the grain and the brush region to determine the indicative portion of the grain and analysing the image thereof to determine whether the extent of colouration and distribution subsists above a threshold value for the grain to be classified as defective in terms of pink stain or blackpoint or field fungi or otherwise whether the grain is sound, taking into account that sound grain may be relatively uniform in colour but may also be mottled, the image data being processed to discriminate against false positives having regard to the spatial distribution of apparent staining and its variability in colour across the germ and body portions of the grain kernel.
2. A method as claimed in Claim 1, wherein a statistical classifier is used on a digital image of the grain, pixel by pixel, to provide a stain classification.
3. A method as claimed in Claim 2, wherein the method operates on the red, blue and green portions of the spectrum and uses a lighting-corrected and colour-calibrated primary image masked to the indicative body and germ end portions of the grain.
4. A method as claimed in Claim 3, wherein further images derived from the primary image are formed and applied to a statistical classifier.
5. A method as claimed in Claim 4, wherein the primary image is processed to provide one or more of the following images for the statistical classifier, the images being selected from or consisting of:-
a second image being a monochromatic image establishing a correction for false apparent variations in the first image due to curvature of the grains;
a third image being derived by replacing the value for each pixel with a mean value for that pixel and its group of neighbours, whereby a smoothed image is provided;

a fourth image being derived by first effecting a linear transformation along the first canonical axis to separate sound grain from that having various degrees of staining, secondly preparing a mask of the part of the image in the most bright portion and deriving the mean value within the mask, and thirdly establishing the fourth image as the variance of each pixel from the mean value;
a fifth image based on the global average of the value of the pixels in the body portion of the grain;
a sixth image derived from the mean of the red, green and blue values in each pixel as a "lightness" indicator; and
a seventh image derived by firstly replacing any small region of extreme brightness in the "lightness" image with the value of its neighbourhood, and secondly median filtering with a relatively small and a relatively large window and determining the difference between these median filtered images to produce the seventh image.
6. A method as claimed in Claim 5, wherein values supplied per pixel to the classifier comprise:
• the logarithms of the values per pixel in the red, green and blue portions of the spectrum;
• the logarithms of the smoothed red, smoothed green and smoothed blue portions of the visible spectrum, to discriminate in colour mottled but sound grains from those having sparse colonies of field fungi;
• the logarithm of the value of lightness derived from the combination of the red, green and blue spectra;
• the logarithm of the surface reflectance profile across the girth of the grain;
• the logarithms of the local and global relative brightness variance; and
• the texture measure designed to detect the dark mottled texture of sound grains.
7. A method as claimed in Claim 5, wherein the classifier is a linear discriminant analysis classifier.
8. A method as claimed in Claim 5, wherein the output image from the classifier is correlated with spatial information on the brush, body and germ portions of the grain to provide an indication of grain class with numerical features, and a trained tree classifier is used to classify the grain provided a high level of probability of correctness is determined to subsist in respect of the grain class indicated.
9. A method as claimed in any one of the preceding claims, wherein the image is examined to detect if a grain is cracked or has any shell protrusions, and in either case the grain is rejected from classification.
10. A method as claimed in Claim 1, wherein a hyperspectral camera is used to record grain images and a statistical analysis is applied to the images for classification purposes.
11. An apparatus for grain assessment and for the method of any one of the preceding claims.
12. A digital image processing system for receiving a primary image of a grain and providing a classification of the grain with respect to sound, pink, blackpoint and field fungi, wherein the primary image is processed to provide one or more of the following images for the statistical classifier, the images being selected from or consisting of:-
a second image being a monochromatic image establishing a correction for false apparent variations in the first image due to curvature of the grains;
a third image being derived by replacing the value for each pixel with a mean value for that pixel and its group of neighbours, whereby a smoothed image is provided;
a fourth image being derived by first effecting a linear transformation along the first canonical axis to separate sound grain from that having various degrees of staining, secondly preparing a mask of the part of the image in the most bright portion and deriving the mean value within the mask, and thirdly establishing the fourth image as the variance of each pixel from the mean value;
a fifth image based on the global average of the value of the pixels in the body portion of the grain;
a sixth image derived from the mean of the red, green and blue values in each pixel as a "lightness" indication; and
a seventh image derived by firstly replacing any small region of extreme brightness in the "lightness" image with the value of its neighbourhood, and secondly median filtering with a relatively small and a relatively large window and determining the difference between these median filtered images to produce the seventh image.
13. A method as claimed in any one of claims 1-9, and further comprising feeding into a tree classifier a series of summary features characterising the spatial relationship of the colour classes.
14. A method as claimed in claim 13, wherein the per-pixel classification images are converted into a simple numerical summary of the colour and spatial relationships of the colour for each grain by a process substantially as herein described.
PCT/AU2006/001902 2005-12-14 2006-12-14 Stain assessment for cereal grains Ceased WO2007068056A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2005907031 2005-12-14
AU2005907031A AU2005907031A0 (en) 2005-12-14 Stain assessment for cereal grains

Publications (1)

Publication Number Publication Date
WO2007068056A1 true WO2007068056A1 (en) 2007-06-21

Family

ID=38162485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2006/001902 Ceased WO2007068056A1 (en) 2005-12-14 2006-12-14 Stain assessment for cereal grains

Country Status (1)

Country Link
WO (1) WO2007068056A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1989011644A1 (en) * 1988-05-20 1989-11-30 Tepral Societe Anonyme Process and apparatus for automatic determination of physical-chemical parameters of a batch of grains
GB2333628A (en) * 1997-11-07 1999-07-28 New Royal Holloway & Bedford Detection of contaminants in granular material
WO2002048687A2 (en) * 2000-10-30 2002-06-20 Monsanto Technology Llc Methods and devices for analyzing agricultural products
JP2002139443A (en) * 2000-10-31 2002-05-17 Kett Electric Laboratory Quality discrimination apparatus for grain, etc.
US20040141641A1 (en) * 2003-01-21 2004-07-22 Mcdonald Miller Baird Seed image analyzer
JP2005055245A (en) * 2003-08-01 2005-03-03 Seirei Ind Co Ltd Apparatus and method for sorting grain
JP2005083776A (en) * 2003-09-05 2005-03-31 Seirei Ind Co Ltd Grain classifier
JP2005091159A (en) * 2003-09-17 2005-04-07 Seirei Ind Co Ltd Grain separator
US20050074146A1 (en) * 2003-09-17 2005-04-07 Advanta Technology, Ltd. Method and apparatus for analyzing quality traits of grain or seed

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DATABASE WPI Week 200248, Derwent World Patents Index; Class S03, AN 2002-451770, XP003014367 *
DATABASE WPI Week 200519, Derwent World Patents Index; Class P43, AN 2005-177411, XP003014369 *
DATABASE WPI Week 200526, Derwent World Patents Index; Class P43, AN 2005-247780, XP003014368 *
DATABASE WPI Week 200531, Derwent World Patents Index; Class P43, AN 2005-299239, XP003014366 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011155888A1 (en) * 2010-06-09 2011-12-15 Umbio Ab Procedure for hyperspectral image analysis in real time
EP2674746A1 (en) * 2012-06-13 2013-12-18 Bayer CropScience AG Device and method for optical quality control of the coating and staining of a granular substrate
WO2013186144A1 (en) * 2012-06-13 2013-12-19 Bayer Cropscience Ag Device and method for optical quality control of the coating or staining of a kernel-type substrate
AU2013276693B2 (en) * 2012-06-13 2017-05-11 Bayer Cropscience Ag Device and method for optical quality control of the coating or staining of a kernel-type substrate
US10013771B2 (en) 2012-06-13 2018-07-03 Bayer Cropscience Ag Device and method for optical quality control of the coating or staining of a kernel-type substrate
WO2015004672A1 (en) * 2013-07-11 2015-01-15 Igal Loevsky A method and apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment
RU2720867C2 (en) * 2014-12-26 2020-05-13 Дир Энд Компани Monitoring grain quality
US10664726B2 (en) 2014-12-26 2020-05-26 Deere & Company Grain quality monitoring
JP2017203652A (en) * 2016-05-10 2017-11-16 株式会社Msテクノロジー Color measuring apparatus and color measuring method
CN109843034A (en) * 2016-10-19 2019-06-04 巴斯夫农化商标有限公司 Production forecast for wheatland
WO2018236844A1 (en) * 2017-06-19 2018-12-27 ImpactVision, Inc. SYSTEM AND METHOD FOR HYPERSPECTRAL IMAGE PROCESSING TO IDENTIFY AN OBJECT
US11410295B2 (en) 2017-06-19 2022-08-09 Apeel Technology, Inc. System and method for hyperspectral image processing to identify foreign object
US11443417B2 (en) 2017-06-19 2022-09-13 Apeel Technology, Inc. System and method for hyperspectral image processing to identify object
US10902581B2 (en) 2017-06-19 2021-01-26 Apeel Technology, Inc. System and method for hyperspectral image processing to identify foreign object
WO2018236842A1 (en) * 2017-06-19 2018-12-27 ImpactVision, Inc. SYSTEM AND METHOD FOR HYPERSPECTRAL IMAGE PROCESSING FOR IDENTIFYING FOREIGN OBJECTS
US10902577B2 (en) 2017-06-19 2021-01-26 Apeel Technology, Inc. System and method for hyperspectral image processing to identify object
CN110538811A (en) * 2018-05-29 2019-12-06 惠特科技股份有限公司 Automatic edge-emitting laser dual-temperature synchronous detection and classification equipment
US11376636B2 (en) 2018-08-20 2022-07-05 General Mills, Inc. Method of producing gluten free oats through hyperspectral imaging
WO2020040817A1 (en) * 2018-08-20 2020-02-27 General Mills, Inc. Method of producing gluten free oats through hyperspectral imaging
CN110376340A (en) * 2019-07-16 2019-10-25 江苏华升面粉有限公司 A kind of raw material detection device suitable for flour processing
CN113109240A (en) * 2021-04-08 2021-07-13 国家粮食和物资储备局标准质量中心 Method and system for determining imperfect grains of grains implemented by computer
CN113109240B (en) * 2021-04-08 2022-09-09 国家粮食和物资储备局标准质量中心 Method and system for determining imperfect grains of grains implemented by computer
JP2022189233A (en) * 2021-06-11 2022-12-22 東京技研工業株式会社 Visual appearance inspection method and visual appearance inspection device
WO2023200006A1 (en) * 2022-04-15 2023-10-19 国立研究開発法人農業・食品産業技術総合研究機構 Determination device, determination method, and control program
CN116343199A (en) * 2023-05-25 2023-06-27 安徽高哲信息技术有限公司 Wheat black embryo grain identification method, device, electronic equipment and readable storage medium
CN116343199B (en) * 2023-05-25 2023-09-19 安徽高哲信息技术有限公司 Wheat black embryo grain identification method, device, electronic equipment and readable storage medium
CN116993527A (en) * 2023-09-26 2023-11-03 深圳市金新农科技股份有限公司 An optimized collection and monitoring method for pig feed production data
CN116993527B (en) * 2023-09-26 2024-01-23 深圳市金新农科技股份有限公司 Live pig feed production data optimization acquisition monitoring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06840377

Country of ref document: EP

Kind code of ref document: A1