
WO2019229767A1 - A food grain profiling system - Google Patents

A food grain profiling system

Info

Publication number
WO2019229767A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
grain
grains
food grains
profile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IN2019/050413
Other languages
French (fr)
Inventor
Ashiesh SHUKLA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of WO2019229767A1
Legal status: Ceased

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 - Agriculture; Fishing; Forestry; Mining
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00 - Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/02 - Investigating particle size or size distribution
    • G01N15/0205 - Investigating particle size or size distribution by optical means
    • G01N15/0227 - Investigating particle size or size distribution by optical means using imaging; using holography
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects
    • G06V20/68 - Food, e.g. fruit or vegetables

Definitions

  • This invention relates to identification of parameters such as length of food grains through a hand held device.
  • Measuring grain size has always been an essential step in assessing grain quality, yet measuring it across a population of grains is a tedious task. A sample of grains can be measured to ascertain quality, which may further assist in benchmarking and pricing of grains. Determining the dimensions of food grains used to be a manual exercise, a time-consuming and impractical method that involved many man-hours.
  • a method for determining a grain profile comprises: receiving a plurality of grains on a flat plate; scanning the plurality of food grains received on the flat plate with a hand held device to obtain an image; sending, by the hand held device, the obtained image to a server; processing, by the server, the obtained image based on the reference points on the reference lines to generate a profile for the plurality of food grains; and generating a grain profile for display on a device.
  • the method further comprises the grain profile including a texture of the food grains.
  • the plurality of food grains are received on the identified area of the apparatus.
  • the obtained image comprises texture values of the food grains and reference lines in the image.
  • the processing of the obtained image comprises, determining the dimension profile of the food grains by comparing the texture values of food grains with texture values of reference lines provided on the plate.
  • sending the obtained image comprises sending the image to the server via an application.
  • a system for determining a grain profile is provided.
  • the system is configured to: receive a plurality of grains on a flat plate; scan the plurality of food grains received on the flat plate with a hand held device to obtain an image; send, by the hand held device, the obtained image to a server; process, by the server, the obtained image to generate a profile for the plurality of food grains; and generate a grain profile for display on a device.
  • the flat plate of the system has a plurality of reference lines. The food grains are received on the identified area of the flat plate.
  • the hand held device is configured to send the obtained image file to the server.
  • the server is configured to process the received image file and generate a grain profile.
  • the server is configured to send the generated grain profile to a display device.
  • An object of the present invention is to provide a method and apparatus for automatically detecting grains and determining if individual grains are broken.
  • Another object of the invention is to provide a method and apparatus for automatically detecting the food grain and ascertaining a profile of the food grain.
  • Another object of the invention is to provide a method and an apparatus for automatically detecting the food grains, ascertaining the profile and displaying the same at a remote location.
  • Another object of the invention is to provide a method and apparatus for automatically inspecting rice and provide information that can be used to adjust and/or optimize the rice-milling process and processing plant operations.
  • Fig.1 describes the overview of the complete process for determining a grain profile.
  • Fig. 2A and Fig. 2B describe the embodiments of the image capture by the hand held device according to an embodiment of the invention.
  • Fig. 4 describes a cloud computing system.
  • Fig. 5 describes a plate.
  • Fig. 6 illustrates the method capturing an image according to an embodiment of the invention.
  • FIGS. 1 through 5 discussed below, and the various embodiments used to describe the principles of the present invention in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the invention. Those skilled in the art will understand that the principles of the invention may be implemented in any type of suitably arranged device or system.
  • Figure 1A describes an apparatus 100, the apparatus includes a plate 101 , a hand held device 103, food grains 102, a cloud 104 and a device 105.
  • the plate 101 comprises the platform over which the plurality of food grains are placed.
  • the plate comprises of a plurality of reference lines which provide a plurality of reference points.
  • the user may place the food grains in any manner such that they do not overlap with each other.
  • the food grains can comprise grains such as rice, wheat, barley, pulses etc.
  • the food grains may be placed on the plate in a spatial manner.
  • the user may then use the hand held device to capture an image of the plate.
  • the captured image consists of the entire plate and the dispersed food grains, as shown in figures 1, 2A and 2B.
  • the hand held device 103 used for capturing an image can be a cellular mobile device.
  • the captured image of the food grain is in the visible spectrum, which encompasses the wavelength range of 400 nm to 700 nm, captured by on-line vision equipment such as a device having a camera.
  • the user may then send the captured image through the hand held device to the cloud 104.
  • the hand held device may be connected to the cloud in a manner known to a person skilled in the art.
  • the cloud analyses the received image with respect to the reference points provided on the reference lines of the plate.
  • the analyzed output from the cloud may further be sent to a remote display to a second user 106.
  • An image analysis processing unit in the cloud is used for analysing the parameters of the food grains.
  • the cloud may establish a connection with a plurality of display devices including the hand held device so as to display the analysed parameters.
  • the analysed data is shared with any of the display devices to enable a user or users to check the parameters of the food grains.
  • Figure 1A describes a user capturing an image according to an embodiment of the invention.
  • the hand held device can be placed at any distance as may be desirable from the plate.
  • the captured image comprises the entire plate along with the grains.
  • the user, through the hand held device 103, may check the image before opting to send it to the cloud 104 for analysis.
  • the user may capture a plurality of images and may ascertain an image before opting to send the image for analysis.
  • the hand held device 103 displays at least one of the plurality of captured images.
  • the hand held device may further comprise receiving a user selection to select at least one displayed image of the plurality of captured images.
  • the hand held device may store the at least one displayed captured image selected by the user.
  • the image capture may further comprise capturing an image based on a capture criteria set by a user.
  • the at least one image capture criteria may be set by the hand held device.
  • the at least one image capture criteria may comprise at least one of the criteria to use a flash, an image magnification, an image resolution, a display screen brightness, a color effect among other criteria well known to a person skilled in the art.
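The capture criteria listed above could be represented in software as a simple settings record. The following is an illustrative sketch only; the field names, types and default values are assumptions, not part of the disclosure.

```python
# Hypothetical capture-criteria record covering the criteria named in the
# text: flash, magnification, resolution, screen brightness, color effect.
from dataclasses import dataclass


@dataclass
class CaptureCriteria:
    use_flash: bool = False
    magnification: float = 1.0        # 1.0 means no zoom (assumed scale)
    resolution: tuple = (1920, 1080)  # width x height in pixels (assumed)
    screen_brightness: float = 0.8    # normalized 0.0 .. 1.0 (assumed)
    color_effect: str = "none"        # e.g. "none", "mono", "sepia"


# The user (or the device itself) sets criteria before image capture.
criteria = CaptureCriteria(use_flash=True, resolution=(3264, 2448))
print(criteria.use_flash)
```

The record would typically be stored in the storage unit and read by the controller before it commands the image capture unit, mirroring the flow described later in the document.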
  • Figures 2A and 2B describes one of the ways to capture the image.
  • the hand held device covers the entire body of the plate, such that the plate is visible in its entirety on the display device.
  • Figure 2A describes placing the hand held device over the plate in the correct position, wherein the plate is visible in its entirety.
  • the figure illustrates screen area as 201.
  • the screen area comprises the area within which the camera captures the image.
  • the reference lines are indicated as 202 and 203. In this embodiment the reference lines 202 and 203 are framed inside the screen area. The screen completely covers the plate along with the grains placed thereon.
  • Figure 2B describes an embodiment illustrating the image captured as per the invention.
  • the captured image comprises the grains, the reference lines and the plate in its entirety.
  • the image describes the reference lines 202 and 203 completely captured within the image.
  • the image may be taken at any angle or distance from the hand held device to the plate.
  • Figure 3 describes a hand held device according to an embodiment of the disclosure.
  • the hand held device is configured for capturing an image.
  • the hand held device comprises a transceiver 301, an image capture unit 302, a display unit 303, an input unit 304 and a storage unit 305.
  • the transceiver is configured to transmit and receive signals to and from, the cloud.
  • the image capture unit of the hand held device is configured to capture a plurality of images according to an embodiment of the disclosure.
  • the plurality of images may be captured according to an image capture criteria set by the user.
  • the display unit of the hand held device is operatively coupled to the image capture unit, configured to display at least one image of the plurality of captured images.
  • the input unit of the hand held device is communicatively coupled to the display unit, configured to receive a user selection to select at least one displayed image.
  • a storage unit is communicatively coupled to the display unit and the input unit, and is configured to store the at least one displayed image selected by a user.
  • the storage unit is further configured to store the plurality of images captured by the user.
  • the hand held device also comprises a controller 306 for controlling the functions of the hand held device.
  • the controller may function as a processor such as a mobile station modem (MSM) or a digital signal processor (DSP).
  • the image capture unit may comprise a lens such as an optical lens used to capture image in a visible spectrum.
  • the display unit may comprise a liquid crystal display (LCD) for display of the captured image.
  • the input unit may preferably be constructed and configured to receive a control command including a user command pertaining to image capture.
  • the input unit may be a key pad or a touch screen, for example.
  • the storage unit may store software related to image capture and may be constructed as a volatile or non-volatile memory.
  • the controller may preferably perform image capture by using the software related to image capture.
  • the storage unit may store an image captured by the image capture unit and may comprise a buffer for temporarily storing the captured image and/or an image storage unit for storing an image selected by the user.
  • the controller of the hand held device receives the image capture commencement command from the user and thereafter directs the image capture unit to perform an operation of image capture.
  • the controller also commands the image capture unit to begin the image capture according to one or more image capture criteria.
  • the image capture criteria may preferably be set prior to the capture of the image.
  • the at least one of the image capture criteria is set by the user.
  • the image capture criteria may preferably be stored in the storage unit.
  • the image capture criteria may include whether or not to use a flash, an image magnification, a resolution, display screen brightness, and a color effect or any such criteria as may be known to a person skilled in the art.
  • the controller 306 commands the image capture unit to sequentially capture images and to optionally vary a value and/or a type of at least one image capture criteria.
  • the controller also commands the image capture unit to output a sequence of captured images to the display unit.
  • the user may then view the plurality of captured images displayed on the display unit, select a desired image and command the controller to send the same to the cloud, for analysis.
  • the user may also provide the command to the controller to store the selected image in the storage unit.
  • the mobile terminal may also include an RF module, a power management module, an antenna, a battery, a subscriber identity module (SIM) card, a speaker, and a microphone, or some combination thereof.
  • the transceiver of the hand held device is configured to be communicatively coupled to a computing cloud.
  • Figure 4 discloses a cloud computing system, the cloud computing system provides computational capacity, data access, networking/routing and storage services via a large pool of shared resources operated by a cloud computing system.
  • cloud computing services are delivered over a network; cloud computing is location-independent computing, with all resources being provided to end-users on demand and control of the physical resources separated from control of the computing resources.
  • FIG. 4 illustrates an example system 400 according to this disclosure.
  • FIG. 4 shows a hand held device connected to a computing cloud 401.
  • Computing cloud 401 comprises processing unit 411 and data storage unit 412, both of which are accessible to clients.
  • the computing cloud is capable of both storing information and performing data analysis on information.
  • a computing cloud comprises at least one computer that is accessible from a remote location.
  • the computing cloud may comprise a plurality of storage devices referred to as storage unit, as well as a plurality of processing units referred to collectively as the processing unit.
  • the computing cloud may comprise hardware that is cost prohibitive to deploy and maintain at individual clients.
  • the computing cloud may comprise software that is cost prohibitive to install, deploy, and maintain at individual clients.
  • the computing cloud provides hardware and software remotely through secure connections to hand held devices. While one computing cloud is shown in the figures, it is explicitly understood that a plurality of clouds may be consistent with the disclosures.
  • Hand held devices are devices in communication with the computing cloud. A plurality of hand held devices may access both the processing unit and the storage unit located in the computing cloud simultaneously.
  • the hand held device communicates with the computing cloud through any secured or unsecured method, including Hypertext Transfer Protocol Secure (HTTPS), secure telnet, or file transfer protocol secure (FTPS). It is understood that secure methods may be preferred over unsecure methods, and that the particular method chosen will depend upon the requirements of the function being accessed. This disclosure should not be interpreted as being limited to any particular protocol or method of transferring data.
  • the communication between the handheld devices and the computing cloud may be unidirectional or bidirectional. In many of the systems and methods disclosed herein, bidirectional communication is preferred.
  • the phrase “unidirectional communication” refers to communication in which data is sent from one communications device to a second communications device.
  • the term “bidirectional communication” refers to communication where data is sent and received by two or more communication devices.
  • Figure 5 illustrates the plate 500.
  • the plate comprises a defined boundary 501.
  • the plate may be rectangular, square or of any predefined geometrical shape.
  • the surface 502 of the plate is horizontal to the ground and houses the grains which are to be analyzed.
  • the surface 502 may be surrounded in proximity with a secure frame.
  • the surface 502 comprises a set of reference lines 504.
  • the set of reference lines 504 comprises at least two parallel lines.
  • the set of reference lines is oriented parallel to the outer boundary of the plate. The distance of the reference lines from the outer boundary is pre-defined.
  • Each of the reference lines constitutes a plurality of reference points. These reference points can be in the form of a shape, a color, or both.
  • the surface 502 may preferably be of dark colour.
  • the reference lines also may preferably be of dark color.
  • the reference points may be configurable with respect to size and color.
  • the captured image is analysed by the processing unit of the cloud.
  • the analysis by the processing unit comprises extracting the resolution of the received image and analyzing the texture values to ascertain the grain quality.
  • the texture values comprise reference points, grain image, boundary of grains and the color space factor. These reference points provided on the reference lines can be in form of shape, color or both.
  • the processing unit further extracts the reference points provided on the reference lines of the plate 101.
  • the processing unit identifies the reference points available on the plate.
  • the reference points are extracted from the reference lines, by the processing unit.
  • the processing unit is configured to extract the grain images separately.
  • the processing unit also identifies the grains in the image which are inside the object area, as separate objects.
  • the identification of grains as separate objects comprises identifying the boundary of the grains.
  • the processing unit analyses the image to find out the number of pixels within each reference point of the reference lines.
  • the processing unit adjusts the pixel length based on the resolution of the image and obtains the final pixel length.
  • the processing unit identifies a mean of all identified reference points as l.
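The two steps above (a resolution adjustment followed by a mean over the reference points) can be sketched as follows. The patent gives no formulas, so the resolution adjustment shown here, scaling pixel counts to a base image width, is an assumption for illustration; the function name and `base_width` parameter are hypothetical.

```python
# Sketch of deriving the factor l: the mean adjusted pixel count measured
# across the reference points on the plate's reference lines.
def mean_reference_pixel_length(pixel_counts, image_width, base_width=1920):
    """pixel_counts: pixels measured within each reference point.
    Returns the mean adjusted pixel length, used below as the factor l."""
    scale = base_width / image_width          # resolution adjustment (assumed)
    adjusted = [c * scale for c in pixel_counts]
    return sum(adjusted) / len(adjusted)


# Four reference points measured at the base resolution:
l = mean_reference_pixel_length([40, 42, 38, 40], image_width=1920)
print(l)  # 40.0
```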
  • the processing unit calculates the color space factor using the reference points on the plate. Each reference point for color is also associated with the location of the grain on the reference plate.
  • the processing unit is configured to detect the colour of the pixels of each reference point as the image reference point pixel color.
  • the processing unit calculates the b (color space) value based on the expected reference point color and the image reference point pixel color. A reference value of each pixel in each reference point is provided to the processing unit as the expected reference point pixel color.
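The patent does not define how b is computed from the expected and observed reference-point colors, so the sketch below assumes b is a per-channel mean ratio of observed to expected values; the function name and the ratio formulation are illustrative assumptions only.

```python
# Hypothetical b (color space) factor: per-channel mean ratio between the
# image reference point pixel colors and the expected reference colors.
def color_space_factor(expected_colors, observed_colors):
    """Each color is an (R, G, B) tuple; returns a per-channel factor b."""
    n = len(expected_colors)
    b = [0.0, 0.0, 0.0]
    for exp, obs in zip(expected_colors, observed_colors):
        for ch in range(3):
            b[ch] += obs[ch] / exp[ch] / n   # accumulate the mean ratio
    return tuple(b)


# One light-grey reference point, observed slightly darker in the image:
b = color_space_factor([(200, 200, 200)], [(180, 190, 200)])
print(b)
```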
  • the processing unit is further configured to identify an object area.
  • the object area essentially comprises the complete image.
  • the processing unit first identifies the top left corner pixel of the object area from the image.
  • the processing unit calculates the number of pixels vertically and horizontally present in the object area. Subsequent to acquiring the number of vertical and horizontal pixels present in the object area, the processing unit uses the factor l to calculate vertical pixels and horizontal pixels.
  • Vertical pixels are calculated as: vertical length of the actual object area / l.
  • Horizontal pixels are calculated as: horizontal length of the actual object area / l. Based on this analysis, the total numbers of horizontal and vertical pixels are calculated for the entire object area.
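The two divisions above translate directly into code. This is a minimal sketch of the stated formulas; the function name and the example lengths are illustrative.

```python
# Object-area pixel counts per the formulas above: length divided by the
# reference factor l in each direction.
def object_area_pixels(vertical_length, horizontal_length, l):
    vertical_pixels = vertical_length / l
    horizontal_pixels = horizontal_length / l
    return vertical_pixels, horizontal_pixels


v, h = object_area_pixels(200.0, 300.0, l=0.5)
print(v, h)  # 400.0 600.0
```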
  • the processing unit thereafter identifies the background pixels present on the object area and converts them to pure black colour space pixels.
  • the process of identifying the background pixels comprises the processor scanning and comparing each pixel of the image with b (color space) to determine if the pixel is a background pixel.
  • the processing unit then converts all background images into pure black colour space.
  • the processing unit starts identifying the grains in the image which are inside the object area and extracts them from the image as separate objects. This is straightforward because all background pixels have been converted to pure black color space, so anything non-black remaining in the object area is 'grain'.
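The background conversion and grain extraction described above can be sketched as a two-pass procedure: force every pixel close to the background color to pure black, then count the remaining non-black connected regions as separate grain objects. The tolerance test against b and the 4-connectivity flood fill are simplifying assumptions, not the patent's stated algorithm.

```python
# Sketch: blacken background pixels, then count non-black connected
# components (each component is treated as one grain object).
def segment_grains(image, background, tol=30):
    """image: 2-D list of (R, G, B) tuples, modified in place.
    Returns the image with a black background and the grain count."""
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            if all(abs(c - b) <= tol for c, b in zip(image[y][x], background)):
                image[y][x] = (0, 0, 0)           # pure black color space
    seen = [[False] * w for _ in range(h)]
    grains = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] != (0, 0, 0) and not seen[y][x]:
                grains += 1                        # a new grain object
                stack = [(y, x)]
                while stack:                       # flood fill the object
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and not seen[cy][cx] \
                            and image[cy][cx] != (0, 0, 0):
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return image, grains


dark, white = (10, 10, 10), (240, 240, 240)
img = [[dark, white, dark],
       [dark, dark, dark],
       [white, dark, white]]
_, count = segment_grains(img, background=dark)
print(count)  # 3
```

A production system would more likely use a vision library's connected-component labelling, but the control flow is the same.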
  • the processing unit performs the straightening of the grains.
  • the captured image is further scanned with the help of identified factors l and b (Color Space).
  • Straightening the grains comprises the individually extracted grains being further scanned to make them vertically straight.
  • the processing unit is configured to convert the grain image into a vector image and identifies a vertical edge of the grain.
  • the identification of the vertical edge includes identifying the number of pixels occupied by the grain vertically.
  • the identified number of pixels occupied by the grain vertically is known as the grain vertical pixels.
  • the identification of the horizontal edge includes identifying the number of pixels occupied by the grain horizontally.
  • the identified number of pixels occupied by the grain horizontally is known as the grain horizontal pixels.
  • the processing unit creates a vector image from the digital image through a sequence of commands or mathematical statements.
  • the vector image comprises sequence of commands or mathematical statements that place lines and shapes in a given two-dimensional or three-dimensional space.
  • the vector is a representation of both a quantity and a direction at the same time.
  • the processing unit calculates the angle of the grain's vertical edge to the actual vertical straight line of the extracted grain.
  • the entire image is then rotated by the calculated angle for measuring the parameters of each of the grains.
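The angle computation described above can be sketched from the endpoints of the detected vertical edge: the angle between the edge vector and a true vertical line gives the rotation needed to straighten the grain. The endpoint-based formulation and the function name are illustrative assumptions; only the angle math is shown, not the rotation itself.

```python
# Sketch of the straightening angle: the signed angle between the grain's
# detected vertical edge and the vertical (y) axis, in degrees.
import math


def straightening_angle(edge_top, edge_bottom):
    """edge_top / edge_bottom: (x, y) pixel coordinates of the endpoints
    of the grain's vertical edge. Returns degrees to rotate to vertical."""
    dx = edge_bottom[0] - edge_top[0]
    dy = edge_bottom[1] - edge_top[1]
    return math.degrees(math.atan2(dx, dy))  # angle relative to vertical


# A grain edge tilted 45 degrees from vertical:
angle = straightening_angle((0, 0), (10, 10))
print(angle)  # 45.0
```

The grain image would then be rotated by `-angle` (for example with a vision library's rotate call) before measuring length and width.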
  • Figure 6 describes a method for determining a grain profile.
  • receiving a plurality of grains on a flat plate: the grains are received from the user on the plate.
  • the grains may be placed in any manner such that they do not overlap each other.
  • scanning the plurality of food grains received on the flat plate by a hand held device to capture an image of the plate, in accordance with the embodiments disclosed in figures 2A and 2B.
  • the user may capture multiple images and may select from a plurality of the images for sending the image from the hand held device to the cloud.
  • step 603 sending by the hand held device, the captured image to a cloud.
  • the image may be sent from the hand held device to the cloud by any means known to a person skilled in the art.
  • the sending of the image may include sending via the internet through an application.
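As one way an application might package the captured image for upload, the sketch below base64-encodes the image bytes into a JSON payload. The payload shape, field names and device identifier are assumptions for illustration; the patent does not specify a transfer format.

```python
# Hypothetical upload payload: image bytes wrapped in JSON for an HTTPS
# POST from the handheld application to the cloud server application.
import base64
import json


def build_upload_payload(image_bytes, device_id):
    return json.dumps({
        "device_id": device_id,
        "image": base64.b64encode(image_bytes).decode("ascii"),
    })


payload = build_upload_payload(b"\x89PNG...", "device-001")
print(json.loads(payload)["device_id"])  # device-001
```

The server side would base64-decode the `image` field back to raw bytes before running the image analysis.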
  • processing by the processing unit the obtained image to generate a profile for the plurality of food grains.
  • the processing unit obtains a grain profile for each of the food grains comprising the grain length, grain width and grain colour.
  • the grain profile may further comprise other conventionally known parameters which may be ascertained by the processing unit based on the grain pixel analysis.
  • the analysis of the food grains is performed by the processing unit as disclosed in the specification.
  • step 605 generating a grain profile for display on a device.
  • the generated grain profile comprises the grain length, grain width and grain colour.
  • the grain profile generated may be in the form of a diagram depicting each of the above identified parameters.
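Putting the pieces together, the generated profile can be sketched as a record built from the grain's pixel counts and the scale factor l. Mapping pixel counts to length and width by multiplying with l, and the field names, are assumptions consistent with the analysis described earlier, not the patent's literal method.

```python
# Sketch of a per-grain profile: dimensions from pixel counts scaled by
# the reference factor l, plus the grain's mean colour.
def grain_profile(vertical_pixels, horizontal_pixels, l, mean_color):
    return {
        "length": vertical_pixels * l,   # along the straightened axis
        "width": horizontal_pixels * l,
        "colour": mean_color,            # (R, G, B) mean over grain pixels
    }


profile = grain_profile(120, 30, l=0.05, mean_color=(230, 225, 210))
print(profile["length"], profile["width"])
```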
  • the generated grain profile may be displayed to a user via a specially designed application. Based on the grain profile the user or any other person may make a decision as to the quality of the food grains.
  • the information about each of the parameters of the food grains may be associated with the cost of the food grains.
  • the method further comprises providing a grain profile of the food grains including a texture of the food grains.
  • the plurality of food grains are received on the identified area of the apparatus.
  • the obtained image comprises texture values of the food grains and reference lines in the image.
  • the processing of the obtained image comprises, determining the dimension profile of the food grains by comparing the texture values of food grains with texture values of reference lines provided on the plate.
  • Finally sending the obtained image comprises sending the image to the server via an application.
  • the handheld device software suite is an intuitive application capable of working as a grain scanner along with other features such as dashboards, reports and alarms.
  • the handheld device will be connected to the cloud server application over an available internet network, such as mobile internet using a GSM module or Wi-Fi connectivity.
  • the handheld device and the cloud server application will communicate over a secured protocol, enabling the customer to pass user information freely and securely over the internet without the risk of data theft, and maintaining data privacy.
  • to reduce the possibility of technology theft, the handheld device may be designed to keep the minimum possible components in the handheld device software suite, keeping the mission-critical technology components on the cloud server application and the user-interaction part, which uses the device camera and device sensors such as GPS and motion sensors, on the handheld device.
  • various functions described in this patent document are implemented or supported by a computer program that is formed from computer readable program code and that is embodied in a computer readable medium.
  • the term “communicate” as well as derivatives thereof, encompasses both direct and indirect communication.
  • the term “or” is inclusive, meaning and/or.
  • the phrase “associated with,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like.


Abstract

A method for determining a grain profile comprising: receiving a plurality of grains on a flat plate, the plate including a plurality of reference lines; scanning the plurality of food grains received on the flat plate by a hand held device to obtain an image; sending, by the hand held device, the obtained image to a cloud; processing, by the cloud, the obtained image to generate a profile for the plurality of food grains, wherein the processing of the obtained image comprises analysing the parameters based on a plurality of reference points extracted from the reference lines; and generating a grain profile for display on a device, to enable a user to determine a cost associated with the food grains.

Description

Title of Invention: A FOOD GRAIN PROFILING SYSTEM
Technical Field
[0001] This invention relates to identification of parameters such as length of food grains through a hand held device.
Background Art
[0002] Measuring grain size has always been an essential step in assessing grain quality, yet measuring it across a population of grains is a tedious task. A sample of grains can be measured to ascertain quality, which may further assist in benchmarking and pricing of grains. Determining the dimensions of food grains used to be a manual exercise, a time-consuming and impractical method that involved many man-hours.
[0003] With the advent of technology, image analytics has played a pivotal role in ascertaining physical dimensions and parameters of food grains. Several devices and methods in use nowadays apply image analytics to ascertain the quality of food grain. However, the hardware currently present in this field of technology is not easily transportable. The processes and the hardware involved in this field have to be set up each time the grain size is to be measured.
[0004] Accordingly, there has arisen a need for a simpler process of measuring grain size, where the hardware is easily transportable and the process can be set up and performed even by an unskilled person.
Summary of Invention
[0005] A method for determining a grain profile comprises: receiving a plurality of grains on a flat plate; scanning the plurality of food grains received on the flat plate with a hand held device to obtain an image; sending, by the hand held device, the obtained image to a server; processing, by the server, the obtained image based on the reference points on the reference lines to generate a profile for the plurality of food grains; and generating a grain profile for display on a device. The method further comprises the grain profile including a texture of the food grains. The plurality of food grains are received on the identified area of the apparatus. The obtained image comprises texture values of the food grains and reference lines in the image. The processing of the obtained image comprises determining the dimension profile of the food grains by comparing the texture values of the food grains with the texture values of the reference lines provided on the plate. Finally, sending the obtained image comprises sending the image to the server via an application.
[0006] In another embodiment, a system for determining a grain profile is provided. The system is configured to: receive a plurality of grains on a flat plate; scan the plurality of food grains received on the flat plate with a hand held device to obtain an image; send, by the hand held device, the obtained image to a server; process, by the server, the obtained image to generate a profile for the plurality of food grains; and generate a grain profile for display on a device. The flat plate has a plurality of reference lines, and the food grains are received on an identified area of the flat plate. The hand held device is configured to send the obtained image file to the server. The server is configured to process the received image file, generate a grain profile and send the generated grain profile to a display device.
Advantageous Effects of Invention
[0007] An object of the present invention is to provide a method and apparatus for automatically detecting grains and determining if individual grains are broken.
[0008] Another object of the invention is to provide a method and apparatus for automatically detecting the food grain and ascertaining a profile of the food grain.
[0009] Another object of the invention is to provide a method and an apparatus for automatically detecting the food grains, ascertaining their profile and displaying the same at a remote location. [0010] Another object of the invention is to provide a method and apparatus for automatically inspecting rice and providing information that can be used to adjust and/or optimize the rice-milling process and processing plant operations.
Brief Description of Drawings
[0011 ] Fig.1 describes the overview of the complete process for determining a grain profile.
[0012] Fig. 2A and Fig. 2B describe embodiments of the image capture by the hand held device according to an embodiment of the invention.
[0013] Fig. 3 describes a hand held device. Fig. 4 describes a cloud computing system.
[0014] Fig. 5 describes a plate.
[0015] Fig. 6 illustrates the method of determining a grain profile according to an embodiment of the invention.
Description of Embodiments
[0016] FIGS. 1 through 6, discussed below, and the various embodiments used to describe the principles of the present invention in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the invention. Those skilled in the art will understand that the principles of the invention may be implemented in any type of suitably arranged device or system.
[0017] Figure 1A describes an apparatus 100; the apparatus includes a plate 101, a hand held device 103, food grains 102, a cloud 104 and a device 105. The plate 101 comprises the platform over which the plurality of food grains are placed. The plate comprises a plurality of reference lines which provide a plurality of reference points. The user may place the food grains in any manner such that they do not overlap each other. The food grains can comprise grains such as rice, wheat, barley, pulses, etc. The food grains may be placed on the plate in a spatially dispersed manner. The user may then use the hand held device to capture an image of the plate. The captured image covers the entire plate and the dispersed food grains, as shown in figure 1 and figures 2A and 2B. The hand held device 103 used for capturing the image can be a cellular mobile device. The image of the food grain is captured in the visible spectrum, which encompasses the wavelength range of 400 nm to 700 nm, by vision equipment such as a camera. The user may then send the captured image through the hand held device to the cloud 104.
[0018] The hand held device may be connected to the cloud in a manner known to a person skilled in the art. The cloud analyses the received image with respect to the reference points provided on the reference lines of the plate. The analyzed output from the cloud may further be sent to a remote display for a second user 106.
[0019] An image analysis processing unit in the cloud is used for analysing the parameters of the food grains. The cloud may establish a connection with a plurality of display devices including the hand held device so as to display the analysed parameters. The analysed data is shared with any of the display devices to enable a user or users to check the parameters of the food grains.
[0020] Figure 1A describes a user capturing an image according to an embodiment of the invention. The hand held device can be placed at any desirable distance from the plate. The captured image comprises the entire plate along with the grains. The user may check the image on the hand held device 103 before opting to send it to the cloud 104 for analysis. The user may capture a plurality of images and select one before opting to send it for analysis.
[0021] In a preferred embodiment the hand held device 103 displays at least one of the plurality of captured images. The hand held device may further receive a user selection to select at least one displayed image of the plurality of captured images, and may store the at least one displayed captured image selected by the user. The image capture may further comprise capturing an image based on a capture criterion set by a user, or the at least one image capture criterion may be set by the hand held device. The at least one image capture criterion may comprise at least one of whether to use a flash, an image magnification, an image resolution, a display screen brightness and a color effect, among other criteria well known to a person skilled in the art.
[0022] Figures 2A and 2B describe one of the ways to capture the image. The hand held device covers the entire body of the plate, such that the plate is visible in its entirety on the display device. [0023] Figure 2A describes placing the hand held device over the plate in the correct position, wherein the plate is visible in its entirety. The figure illustrates the screen area as 201. The screen area comprises the area within which the camera captures the image. The reference lines are indicated as 202 and 203. In this embodiment the reference lines 202 and 203 are framed inside the screen area. The screen completely covers the plate along with the grains placed thereon.
[0024] Figure 2B describes an embodiment illustrating the image captured as per the invention. The captured image comprises the grains, the reference lines and the plate in its entirety. The image shows the reference lines 202 and 203 completely captured within the image. The image may be taken at any angle or distance from the hand held device to the plate.
[0025] Figure 3 describes a hand held device according to an embodiment of the disclosure. The hand held device is configured for capturing an image and comprises a transceiver 301, an image capture unit 302, a display unit 303, an input unit 304 and a storage unit 305.
[0026] The transceiver is configured to transmit signals to, and receive signals from, the cloud. The image capture unit of the hand held device is configured to capture a plurality of images according to an embodiment of the disclosure. The plurality of images may be captured according to an image capture criterion set by the user.
[0027] The display unit of the hand held device is operatively coupled to the image capture unit, configured to display at least one image of the plurality of captured images. The input unit of the hand held device is communicatively coupled to the display unit, configured to receive a user selection to select at least one displayed image. A storage unit is communicatively coupled to the display unit and the input unit, and is configured to store the at least one displayed image selected by a user. The storage unit is further configured to store the plurality of images captured by the user.
[0028] The hand held device also comprises a controller 306 for controlling the functions of the hand held device. The controller may function as a processor such as a mobile station modem (MSM) or a digital signal processor (DSP). The image capture unit may comprise a lens such as an optical lens used to capture image in a visible spectrum. The display unit may comprise a liquid crystal display (LCD) for display of the captured image. The input unit may preferably be constructed and configured to receive a control command including a user command pertaining to image capture. The input unit may be a key pad or a touch screen, for example. The storage unit may store software related to image capture and may be constructed as a volatile or non-volatile memory. The controller may preferably perform image capture by using the software related to image capture. The storage unit may store an image captured by the image capture unit and may comprise a buffer for temporarily storing the captured image and/or an image storage unit for storing an image selected by the user.
[0029] The controller of the hand held device receives the image capture commencement command from the user and thereafter directs the image capture unit to perform an operation of image capture. The controller also commands the image capture unit to begin the image capture according to one or more image capture criteria. The image capture criteria may preferably be set prior to the capture of the image. The at least one of the image capture criteria is set by the user. The image capture criteria may preferably be stored in the storage unit. The image capture criteria may include whether or not to use a flash, an image magnification, a resolution, display screen brightness, and a color effect or any such criteria as may be known to a person skilled in the art.
[0030] To commence image capture, the controller 306 commands the image capture unit to sequentially capture images and to optionally vary a value and/or a type of at least one image capture criteria. The controller also commands the image capture unit to output a sequence of captured images to the display unit. The user may then view the plurality of captured images displayed on the display unit, select a desired image and command the controller to send the same to the cloud, for analysis. The user may also provide the command to the controller to store the selected image in the storage unit. In another embodiment, the mobile terminal may also include an RF module, a power management module, an antenna, a battery, a subscriber identity module (SIM) card, a speaker, and a microphone, or some combination thereof. The transceiver of the hand held device is configured to be communicatively coupled to a computing cloud.
[0031] Figure 4 discloses a cloud computing system. The cloud computing system provides computational capacity, data access, networking/routing and storage services via a large pool of shared resources. Such cloud computing services are delivered over a network; cloud computing is location-independent computing, with all resources provided to end-users on demand and control of the physical resources separated from control of the computing resources.
[0032] FIG. 4 illustrates an example system 400 according to this disclosure. FIG. 4 shows a hand held device connected to a computing cloud 401. Computing cloud 401 comprises a processing unit 411 and a data storage unit 412, both of which are accessible to clients. The computing cloud is capable of both storing information and performing data analysis on that information. A computing cloud comprises at least one computer that is accessible from a remote location. The computing cloud may comprise a plurality of storage devices referred to collectively as the storage unit, as well as a plurality of processing units referred to collectively as the processing unit. The computing cloud may comprise hardware that is cost prohibitive to deploy and maintain at individual clients. In addition, the computing cloud may comprise software that is cost prohibitive to install, deploy, and maintain at individual clients.
[0033] The computing cloud provides hardware and software remotely, through secure connections, to hand held devices. While one computing cloud is shown in the figures, it is explicitly understood that a plurality of clouds may be consistent with this disclosure. Hand held devices are devices in communication with the computing cloud. A plurality of hand held devices may simultaneously access both the processing unit and the storage unit located in the computing cloud. The hand held device communicates with the computing cloud through any secured or unsecured method, including Hypertext Transfer Protocol Secure (HTTPS), secure telnet, or File Transfer Protocol Secure (FTPS). It is understood that secure methods may be preferred over unsecure methods, and that the particular method chosen will depend upon the requirements of the function being accessed. This disclosure should not be interpreted as being limited to any particular protocol or method of transferring data. It is understood that the communication between the hand held devices and the computing cloud may be unidirectional or bidirectional. In many of the systems and methods disclosed herein, bidirectional communication is preferred. The phrase "unidirectional communication" refers to communication in which data is sent from one communications device to a second communications device. The term "bidirectional communication" refers to communication where data is sent and received by two or more communication devices.
[0034] Figure 5 illustrates the plate 500. The plate comprises a defined boundary 501 and may be rectangular, square or of any predefined geometrical shape. The surface 502 of the plate is horizontal to the ground and houses the grains which are to be analyzed. The surface 502 may be surrounded in close proximity by a secure frame. The surface 502 comprises a set of reference lines 504, which comprise at least two parallel lines oriented parallel to the outer boundary of the plate. The distance of the reference lines from the outer boundary is pre-defined. Each of the reference lines carries a plurality of reference points. These reference points can be in the form of shape, color or both. The surface 502 and the reference lines may preferably be of a dark color. The reference points may be configurable with respect to size and color.
[0035] The following description now illustrates in detail the steps and sub-steps required to carry out the invention. The captured image is analysed by the processing unit of the cloud. The analysis comprises extracting the resolution of the received image and analysing the texture values to ascertain the grain quality. The texture values comprise the reference points, the grain image, the boundary of the grains and the color space factor. The reference points provided on the reference lines can be in the form of shape, color or both. The processing unit further extracts the reference points provided on the reference lines of the plate 101 and identifies the reference points available on the plate.
[0036] The reference points are extracted from the reference lines by the processing unit. The processing unit is configured to extract the grain images separately. The processing unit also identifies the grains in the image which are inside the object area as separate objects; this comprises identifying the boundary of the grains. The processing unit analyses the image to find the number of pixels within each reference point of the reference lines. The processing unit then computes the length of each pixel using the formula: pixel length = reference length / number of pixels. The processing unit then adjusts the pixel length based on the resolution of the image to obtain the final pixel length, and identifies the mean over all identified reference points as λ.
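By way of illustration only, the pixel-length computation described above may be sketched as follows. The function names, numeric values and the optional resolution scale factor are assumptions for this sketch (λ is written as lambda_); the specification does not prescribe a concrete implementation.

```python
def pixel_length(reference_length_mm, pixel_count, resolution_scale=1.0):
    """Pixel length = reference length / number of pixels,
    optionally adjusted by an image-resolution scale factor (assumed)."""
    return (reference_length_mm / pixel_count) * resolution_scale


def mean_pixel_length(reference_points):
    """lambda_: the mean pixel length over all identified reference points.
    reference_points is a list of (known length in mm, pixels spanned)."""
    lengths = [pixel_length(ref_mm, n_px) for ref_mm, n_px in reference_points]
    return sum(lengths) / len(lengths)


# Hypothetical reference points on the plate: each is 10 mm long and spans
# roughly 200 pixels in the captured image.
points = [(10.0, 200), (10.0, 198), (10.0, 202)]
lambda_ = mean_pixel_length(points)
print(round(lambda_, 4))  # mean mm-per-pixel factor
```

With λ in hand, any pixel count measured in the image can be converted back to a physical length on the plate.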
[0037] The processing unit calculates the color space factor using the reference points on the plate. Each color reference point is also associated with its location on the reference plate. The processing unit is configured to detect the color of the pixels of each reference point as the image reference point pixel color. The processing unit calculates the β (color space) value based on the expected reference point color and the image reference point pixel color. A reference value for each pixel in each reference point is provided to the processing unit as the expected reference point pixel color. The color space factor β (color space) may be calculated as: β (color space) = expected reference point pixel color (color space) − image reference point pixel color (color space).
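The color space factor can be sketched as a per-channel subtraction. This is an illustrative sketch under the assumption of an RGB color space with tuple-valued pixels; the channel values and function name are invented for illustration.

```python
def color_space_factor(expected_color, observed_color):
    """beta: per-channel difference between the known (expected) color of a
    reference point and the color the camera actually recorded for it."""
    return tuple(e - o for e, o in zip(expected_color, observed_color))


expected = (255, 0, 0)   # reference point assumed printed in pure red
observed = (248, 4, 6)   # hypothetical value recorded in the captured image
beta = color_space_factor(expected, observed)
print(beta)  # (7, -4, -6)
```

Adding β back to any grain pixel then compensates for the lighting and camera conditions captured by the reference points.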
[0038] The processing unit is further configured to identify an object area. The object area essentially comprises the complete image. The processing unit first identifies the top left corner pixel of the object area in the image and calculates the number of pixels present vertically and horizontally in the object area. After acquiring the number of vertical and horizontal pixels present in the object area, the processing unit uses the factor λ to calculate the vertical pixels and horizontal pixels. Vertical pixels are calculated as: vertical length of the actual object area / λ. Horizontal pixels are calculated as: horizontal length of the actual object area / λ. Based on this analysis, the total number of horizontal and vertical pixels is calculated for the entire object area.
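The object-area arithmetic above is a direct division by λ; a minimal sketch follows, with hypothetical plate dimensions (the rounding to whole pixels is an assumption of this sketch).

```python
def object_area_pixels(vertical_length_mm, horizontal_length_mm, lambda_mm_per_px):
    """Convert the physical object-area dimensions into pixel counts
    using the mean pixel length lambda_."""
    vertical_pixels = vertical_length_mm / lambda_mm_per_px
    horizontal_pixels = horizontal_length_mm / lambda_mm_per_px
    return round(vertical_pixels), round(horizontal_pixels)


# Hypothetical plate: 150 mm x 200 mm imaged at lambda_ = 0.05 mm per pixel.
v_px, h_px = object_area_pixels(150.0, 200.0, 0.05)
print(v_px, h_px)  # 3000 4000
```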
[0039] The processing unit thereafter identifies the background pixels present in the object area and converts them to pure black color space pixels. Identifying the background pixels comprises the processor scanning and comparing each pixel of the image with β (color space) to determine whether the pixel is a background pixel.
[0040] The processing unit then converts all background pixels into pure black color space. The processing unit then identifies the grains in the image which are inside the object area and extracts them from the image as separate objects. This is straightforward because all background pixels have been converted to pure black color space, so anything non-pure-black remaining in the object area is 'grain'.
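A minimal sketch of this background-removal step follows. The tolerance threshold, the list-of-tuples image representation and the function names are assumptions made for illustration; the specification only requires that background-like pixels become pure black so that everything non-black is grain.

```python
def remove_background(image, background_color, tolerance=30):
    """image: 2-D list of (r, g, b) tuples. Returns a copy in which any
    pixel within `tolerance` of the background color on every channel
    is replaced by pure black (0, 0, 0)."""
    def is_background(px):
        return all(abs(c - b) <= tolerance for c, b in zip(px, background_color))
    return [[(0, 0, 0) if is_background(px) else px for px in row]
            for row in image]


def grain_pixels(image):
    """After background removal, anything non-black counts as grain."""
    return [(y, x) for y, row in enumerate(image)
            for x, px in enumerate(row) if px != (0, 0, 0)]


# Hypothetical 2x2 image: dark plate background plus one light grain pixel.
img = [[(18, 20, 19), (200, 190, 150)],
       [(22, 18, 21), (15, 19, 20)]]
cleaned = remove_background(img, background_color=(20, 20, 20))
print(grain_pixels(cleaned))  # [(0, 1)]
```

In practice the extraction of each grain as a separate object would additionally group adjacent non-black pixels (for example with a connected-components pass), which this sketch omits.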
[0041] In a preferred embodiment the processing unit performs straightening of the grains. The captured image is further scanned with the help of the identified factors λ and β (color space). Straightening the grains comprises further scanning each individually extracted grain to make it vertically straight. The processing unit is configured to convert the grain image into a vector image and identify a vertical edge of the grain. Identifying the vertical edge includes identifying the number of pixels occupied by the grain vertically, known as the grain vertical pixels. Identifying the horizontal edge includes identifying the number of pixels occupied by the grain horizontally, known as the grain horizontal pixels. The processing unit creates the vector image from the digital image through a sequence of commands or mathematical statements that place lines and shapes in a given two-dimensional or three-dimensional space. A vector is a representation of both a quantity and a direction at the same time. The processing unit calculates the angle between the grain's vertical edge and the actual vertical straight line of the extracted grain. The entire image is then rotated by the calculated angle for measuring the parameters of each grain. Once the reference points are extracted and the grain objects are created, the processing unit measures the size of the grain, wherein:

i. Grain length = λ × (grain vertical pixels)

ii. Grain width = λ × (grain horizontal pixels)

[0042] Similarly, corresponding to each pixel, the color of the grain is calculated with the following formula:

iii. Grain color (color space) = β (color space) + pixel color (color space)

[0043] The various parameters identified by the processing unit in accordance with the invention, including grain color, grain width, grain length and pixel color, are all based on the reference points present on the reference lines of the plate.
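The three formulas above can be combined into one short, illustrative sketch. All numeric inputs are hypothetical (λ and β as computed earlier are written as lambda_ and beta), and the helper names are invented for this sketch.

```python
def grain_length(lambda_mm_per_px, vertical_pixels):
    """i. Grain length = lambda x (grain vertical pixels)."""
    return lambda_mm_per_px * vertical_pixels


def grain_width(lambda_mm_per_px, horizontal_pixels):
    """ii. Grain width = lambda x (grain horizontal pixels)."""
    return lambda_mm_per_px * horizontal_pixels


def grain_color(beta, pixel_color):
    """iii. Grain color = beta + pixel color, applied channel-wise."""
    return tuple(b + c for b, c in zip(beta, pixel_color))


lambda_ = 0.05                     # mm per pixel, from the reference points
print(grain_length(lambda_, 130))  # ~6.5 mm, a plausible long-grain rice length
print(grain_width(lambda_, 40))    # ~2.0 mm
print(grain_color((7, -4, -6), (240, 235, 220)))  # (247, 231, 214)
```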
[0044] Figure 6 describes a method for determining a grain profile. At step 601, a plurality of grains are received from the user on a flat plate. The grains may be placed in any manner such that they do not overlap each other. At step 602, the plurality of food grains received on the flat plate are scanned by a hand held device to capture an image of the plate, in accordance with the embodiments disclosed at figures 2A and 2B. The user may capture multiple images and may select which of the plurality of images to send from the hand held device to the cloud.
[0045] At step 603, the captured image is sent by the hand held device to a cloud. The image may be sent from the hand held device to the cloud by any means known to a person skilled in the art, including via the internet through an application. At step 604, the obtained image is processed by the processing unit to generate a profile for the plurality of food grains. The processing unit obtains a grain profile of each of the food grains comprising the grain length, grain width and grain color. The grain profile may further comprise other conventionally known parameters which may be ascertained by the processing unit based on the grain pixel analysis. The analysis of the food grains is performed by the processing unit as disclosed in the specification.
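For illustration only, one way an application could package the captured image for transmission to the cloud over HTTPS is sketched below using the Python standard library. The endpoint URL, field names and content type are invented for this sketch; the specification does not define a concrete API.

```python
import urllib.request


def build_upload_request(image_bytes, url="https://example.com/grain/upload"):
    """Wrap raw image bytes in an HTTPS POST request to a (hypothetical)
    cloud endpoint."""
    return urllib.request.Request(
        url,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )


req = build_upload_request(b"\x89PNG...")  # truncated placeholder image bytes
print(req.full_url, req.get_method())
# Actually sending would be: urllib.request.urlopen(req)  (needs connectivity)
```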
[0046] At step 605, a grain profile is generated for display on a device. The generated grain profile comprises the grain length, grain width and grain color, and may be in the form of a diagram depicting each of the above-identified parameters. At step 606, the grain profile is sent to a user, either on the same hand held device or to any display device via the internet. The generated grain profile may be displayed to the user via a specially designed application. Based on the grain profile, the user or any other person may make a decision as to the quality of the food grains. The information about each of the parameters of the food grains may be associated with the cost of the food grains.
[0047] The method further comprises providing a grain profile of the food grains including a texture of the food grains. The plurality of food grains are received on an identified area of the apparatus. The obtained image comprises texture values of the food grains and reference lines in the image. Processing the obtained image comprises determining the dimension profile of the food grains by comparing the texture values of the food grains with texture values of the reference lines provided on the plate. Finally, sending the obtained image comprises sending the image to the server via an application.
[0048] The handheld device software suite is an intuitive application capable of working as a grain scanner along with other features like a dashboard, reports, alarms, etc. The handheld device is connected to the cloud server application over an available internet network, such as mobile internet using a GSM module or Wi-Fi connectivity. The handheld device and the cloud server application communicate over a secured protocol, which enables the customer to pass user information freely and securely over the internet without the risk of data theft, maintaining data privacy. To reduce the risk of technology theft, the handheld device may be designed to keep the minimum possible components in the handheld device software suite, keeping the mission-critical technology components in the cloud server application and the user-interaction part on the handheld device, which uses the device camera and device sensors like GPS, motion sensor, etc.
[0049] In some embodiments, various functions described in this patent document are implemented or supported by a computer program that is formed from computer readable program code and that is embodied in a computer readable medium.
[0050] It may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The term "communicate," as well as derivatives thereof, encompasses both direct and indirect communication. The terms "include" and "comprise," as well as derivatives thereof, mean inclusion without limitation. The term "or" is inclusive, meaning and/or. The phrase "associated with," as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. [0051] While this disclosure has described certain embodiments and generally associated methods, alterations and permutations of these embodiments and methods will be apparent to those skilled in the art. Accordingly, the above description of example embodiments does not define or constrain this disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of this disclosure, as defined by the following claims.

Claims

We Claim :
1. A method for determining a grain profile comprising:
receiving a plurality of grains on a flat plate, the plate includes a plurality of reference lines;
scanning the plurality of food grains received on the flat plate by a hand held device to obtain an image;
sending by the hand held device, the obtained image to a cloud;
processing by the cloud, the obtained image to generate a profile for the plurality of food grains, wherein the processing of the obtained image comprises analysing the parameters based on a plurality of reference points extracted from the reference lines; and
generating a grain profile for display on a device, to enable a user to determine a cost associated with the food grains.
2. The method of claim 1, wherein the grain profile includes parameters of the food grains and the plate, including the length, width and color of the food grain.
3. The method of claim 1, wherein the plurality of food grains are received on the identified area of the apparatus.
4. The method of claim 1, wherein processing includes determining the number of pixels associated with each of the reference points and the food grains in the image.
SUBSTITUTE SHEETS (RULE 26)
5. The method of claim 1, wherein processing the obtained image comprises:
determining the dimension and color profile of the food grains by comparing the texture values of food grains with reference points provided on the reference lines of the plate.
6. A system for determining a grain profile configured to: receive a plurality of grains on a flat plate, the plate includes a plurality of reference lines;
scan the plurality of food grains received on the flat plate by a hand held device to obtain an image;
send, the obtained image to a cloud;
process, the obtained image to generate a profile for the plurality of food grains, wherein the processing of the obtained image comprises analysing the parameters based on a plurality of reference points extracted from the reference lines; and
generate a grain profile for display on a device, to enable a user to determine a cost associated with the food grains.
7. The system of claim 6, wherein the plurality of reference lines on the plate may be parallel to a vertical or horizontal edge of the reference plate.
8. The system of claim 6, wherein the food grains are received on the identified area of the flat plate.
9. The system of claim 6, wherein the system is further configured to process the grain profile by determining the number of pixels associated with each of the reference points and the food grains in the image.
10. The system of claim 6, wherein the server is configured to send the generated grain profile to a display device.
PCT/IN2019/050413 2018-05-28 2019-05-27 A food grain profiling system Ceased WO2019229767A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ININ201811019972 2018-05-28
IN201811019972 2018-05-28

Publications (1)

Publication Number Publication Date
WO2019229767A1 true WO2019229767A1 (en) 2019-12-05

Family

ID=68696842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2019/050413 Ceased WO2019229767A1 (en) 2018-05-28 2019-05-27 A food grain profiling system

Country Status (1)

Country Link
WO (1) WO2019229767A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134552A1 (en) * 2013-02-28 2014-09-04 Day Neil M Method and apparatus for particle size determination
US9165187B2 (en) * 2012-01-12 2015-10-20 Kofax, Inc. Systems and methods for mobile image capture and processing



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19810070

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19810070

Country of ref document: EP

Kind code of ref document: A1