
WO2011048575A2 - Optical processing to control a washing apparatus - Google Patents

Optical processing to control a washing apparatus

Info

Publication number
WO2011048575A2
Authority
WO
WIPO (PCT)
Prior art keywords
wash cycle
ware
time period
washing apparatus
cleanliness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2010/054787
Other languages
English (en)
Other versions
WO2011048575A3 (fr)
Inventor
Christopher C. Wagner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecolab Inc
Original Assignee
Ecolab Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/604,893 (US8509473B2)
Priority claimed from US12/628,478 (US8229204B2)
Application filed by Ecolab Inc
Publication of WO2011048575A2
Publication of WO2011048575A3

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L15/00: Washing or rinsing machines for crockery or tableware
    • A47L15/42: Details
    • A47L15/4295: Arrangements for detecting or measuring the condition of the crockery or tableware, e.g. nature or quantity
    • A47L15/0018: Controlling processes, i.e. processes to control the operation of the machine characterised by the purpose or target of the control
    • A47L15/0021: Regulation of operational steps within the washing processes, e.g. optimisation or improvement of operational steps depending from the detergent nature or from the condition of the crockery
    • A47L15/46: Devices for the automatic control of the different phases of cleaning; Controlling devices
    • A47L2401/00: Automatic detection in controlling methods of washing or rinsing machines for crockery or tableware, e.g. information provided by sensors entered into controlling devices
    • A47L2401/04: Crockery or tableware details, e.g. material, quantity, condition
    • A47L2401/20: Time, e.g. elapsed operating time
    • A47L2401/34: Other automatic detections
    • A47L2501/00: Output in controlling method of washing or rinsing machines for crockery or tableware, i.e. quantities or components controlled, or actions performed by the controlling device executing the controlling method
    • A47L2501/07: Consumable products, e.g. detergent, rinse aids or salt
    • A47L2501/24: Conveyor belts, e.g. conveyor belts motors
    • A47L2501/30: Regulation of machine operational steps within the washing process, e.g. performing an additional rinsing phase, shortening or stopping of the drying phase, washing at decreased noise operation conditions
    • A47L2501/32: Stopping or disabling machine operation, including disconnecting the machine from a network, e.g. from an electrical power supply

Definitions

  • this disclosure describes techniques for automatically determining cleanliness of surfaces, such as surfaces of wares.
  • the wares may include, for example, dishes, glasses, stainless steel coupons, fabric swatches, Tosi plates, ceramic tiles, or other materials.
  • the techniques include capturing a digital image of the surface being inspected, or a portion thereof, and analyzing the image to determine an average luminosity value for the surface. For certain types of wares, a higher luminosity value may be indicative of cleanliness, thus higher luminosity values indicate that the surface of the ware is relatively clean, whereas lower luminosity values indicate that the surface is relatively soiled. For other types of wares, a lower luminosity value may be indicative of cleanliness.
  • An inspection system for performing the techniques of this disclosure may include a light-tight housing, a camera within the housing to capture a digital image of the ware being inspected, and a processor or other control unit to process the captured digital image.
  • the inspection system may provide an environment particularly tailored for the particular ware being inspected, which may include providing suitable lighting, camera settings (e.g., aperture and exposure time), and camera position relative to the ware. In this manner, the inspection system may produce a luminosity value according to controlled environmental settings. The inspection system may therefore avoid producing inaccurate luminosity values based on a change in environment, rather than based on cleanliness of the ware being inspected.
  • the inspection system may receive an identification of the ware to be inspected and the control unit may send one or more signals that automatically adjust the environment according to the received identification.
  • a method includes receiving, with a computing device, a digital image of a ware, calculating a luminosity value for the ware from the digital image, and determining, with the computing device, a cleanliness value for the ware from the calculated luminosity value.
  • a system includes a camera to capture a digital image of a ware, a light source to illuminate the ware, a housing to enclose the ware, the camera, and the light source in a light-tight environment, and an analysis computer to receive the digital image, calculate a luminosity value for the ware from the digital image, and determine a cleanliness value for the ware from the calculated luminosity value.
  • a computer-readable medium, such as a computer-readable storage medium, contains, e.g., is encoded with, instructions that cause a programmable processor to receive a digital image of a ware, calculate a luminosity value for the ware from the digital image, and determine a cleanliness value for the ware from the calculated luminosity value.
  • a computer-readable storage medium is encoded with instructions for causing a programmable processor to detect a visual signature of a ware to be washed by a washing apparatus coupled to the programmable processor in an image captured by a camera coupled to the programmable processor, select an item profile corresponding to the detected visual signature, enable the washing apparatus, and automatically disable the washing apparatus after the ware is determined to be clean in accordance with the selected item profile.
  • FIG. 1 is a block diagram illustrating an example system for determining cleanliness of various wares.
  • FIG. 2 is a block diagram illustrating components of an example analysis computer.
  • FIG. 5 is a flowchart illustrating an example method for calculating a cleanliness value of a particular type of ware.
  • FIG. 8 is a block diagram illustrating an example system in which an analysis computer automatically disables a washing apparatus.
  • FIG. 10 is a perspective view of a system for capturing a digital image of a ware in an environment suited for capturing the digital image based on the type of ware to be analyzed.
  • FIG. 11 is a flowchart illustrating an example method for calculating a rating value based on a combination of a luminosity value and a count of pixels within a particular range corresponding to known soiled/stained regions of a surface.
  • FIG. 12 is a screenshot illustrating an example graphical user interface for entering threshold luminosity values and calculating a cleanliness value.
  • FIG. 14 is a flowchart illustrating an example method for selectively disabling a washing apparatus based on a wash cycle associated with a particular ware being washed.
  • FIGS. 16A and 16B are conceptual diagrams illustrating example visual signatures for detecting a glass and a plate using an optical recognition system.
  • FIG. 1 is a block diagram illustrating an example system 2 that determines cleanliness or staining of various surfaces.
  • system 2 includes camera 16, light emitting diode (LED) arrays 20, 24A, 24B, housing 10, and ware 30.
  • LED arrays 20, 24A, 24B may comprise other light sources, such as light bulbs, lasers, or other suitable devices for providing illumination.
  • Housing 10 may fully enclose camera 16, LED arrays 20, 24A, 24B, and ware 30 to provide a light-tight environment. That is, housing 10 may be configured to prevent external light sources from penetrating housing 10 and to prevent light emitted by LED arrays 20, 24A, 24B from escaping housing 10. Because system 2 may analyze surfaces of many different types of wares of widely varying materials, shapes and sizes, the lighting conditions, camera angles and other factors that result in an image from which meaningful cleanliness data may be obtained may be quite different for each type of ware. Therefore, the physical configuration of system 2 may be customized for each type of ware or other surface that may be analyzed.
  • interior surface of housing 10 may be configured according to the specific ware 30 selected for analysis.
  • the interior surface of housing 10 may be colored black when ware 30 comprises a glass or dishware.
  • the interior surface of housing 10 may be colored white when ware 30 comprises a stainless steel coupon.
  • Housing 10 may comprise a plurality of interchangeable liners or sheets of various colors for modifying the internal color of housing 10. In this manner, the environment provided by system 2 may be configured according to the ware 30 being analyzed.
  • Each of LED arrays 20, 24A, 24B may comprise arrays of one or more LEDs. LEDs in an array may be wired in parallel. In other examples, other suitable light sources may be used in addition to or instead of the LED arrays.
  • LED array 20 generally provides focused lighting to illuminate ware 30, while LED arrays 24A, 24B provide diffused lighting that illuminate the interior of housing 10.
  • LED array 20 is connected to LED actuator 22. LED actuator 22 may change the angle of LED array 20 with respect to ware 30.
  • LED array 20 need not be positioned directly over ware 30, but may be placed behind or in front of ware 30 and LED array 20 may still illuminate ware 30.
  • LED actuator 22 may also be configured to move LED array 20 up and down arm 14.
  • Arm 14 is also connected to arm actuator 28, which may be configured to move arm 14 forward and back along track 26.
  • LED actuator 22 and arm actuator 28 may comprise mechanical actuators or electro-mechanical actuators controlled by, e.g., analysis computer 40, as described in greater detail below.
  • LED arrays 20, 24A, 24B may be used to illuminate the interior of housing 10.
  • LED arrays 20, 24A, 24B are selectively activated according to the type of ware 30 being analyzed.
  • LED array 20 may be positioned according to the type of ware 30 being analyzed.
  • LED array 20 may be positioned directly above ware 30, with LED arrays 24A and 24B deactivated.
  • LED array 20 may be positioned behind ware 30 (relative to camera 16), and LED actuator 22 may configure the angle of LED array 20 to be aimed at ware 30, with LED arrays 24A and 24B deactivated.
  • LED array 20 may be deactivated and LED arrays 24A, 24B may be activated.
  • the position of camera 16 relative to ware 30 may be determined by the type of ware 30 being analyzed. In general, the position of camera 16 should be such that the field of view of camera 16 is filled by the portion of ware 30 to be analyzed. For example, different varieties of glasses may be analyzed by system 2. Types of glasses include stemless glasses, stemmed glasses, and marked glasses (e.g., etched glasses). When ware 30 comprises a stemmed glass, camera 16 may be positioned such that the field of view includes as much of the bowl of the glass as possible and excludes the stem. When ware 30 comprises a stemless glass, the field of view may include the whole glass. When ware 30 comprises a marked glass, the field of view may include an unmarked portion of the glass and exclude the marked portion of the glass.
  • camera actuator 18 changes the height of camera 16.
  • Camera 16 may also include a zoom function to zoom in on ware 30.
  • ware 30 is positioned on a ware actuator 32.
  • a bottom portion of housing 10 may include markings that indicate a location for ware 30.
  • rail 26 may include markings that indicate a location for ware 30.
  • rail 26 may include a plurality of numbered positions that indicate a position for ware 30, and a user may be provided with an indication of which of the numbered positions corresponds to each type of ware.
  • Analysis computer 40 generally analyzes a digital image of a ware, e.g., ware 30, to determine a luminosity value for the surface of the ware. In other examples, analysis computer 40 may instead determine an intensity value or a brightness value for the ware. Analysis computer 40 may further correlate the luminosity value for the ware to a cleanliness value, a stain value, a fade value, or other value.
  • Conventional methods for analyzing cleanliness of a glass include assigning a numeric value to the glass as a cleanliness value on a scale of one to five, with one being clean and five being soiled.
  • Analysis computer 40 may therefore automatically assign a cleanliness value on a scale of one to five as a correlated value with the luminosity value.
  • the range of the luminosity value may, in some examples, exceed the range of the cleanliness value.
  • analysis computer 40 may calculate a luminosity value on a scale from 0 to 255, where 0 represents "black" and 255 represents "white."
  • the cleanliness value may generally correspond to spotting, filming, soiling, and/or staining on a variety of wares, such as glasses, dishes, utensils, instruments, or other wares.
  • Stain values may be calculated for wares such as fabric swatches, ceramic tiles, or stainless steel coupons. Fade values may also be calculated for colored fabric swatches that indicate how much of the color of the fabric swatches has faded or how much color is remaining.
  • the luminosity value is a measure of brightness.
  • a relatively less expensive monochromatic camera may be used to perform the techniques of this disclosure, without requiring the use of a color camera or a colorimeter.
  • a color camera may be incorporated into system 2, rather than a monochromatic camera. Additional measurements and calculations may also be made using the color camera and/or a colorimeter, e.g., for particular types of wares, to distinguish between soiling types on the ware, or for other purposes.
  • analysis computer 40 may be configured to analyze a previously captured image of ware 30. In other examples, analysis computer 40 may control camera 16 to capture and retrieve a digital image of ware 30, and then automatically calculate a luminosity value and produce the cleanliness value from the calculated luminosity value. Additionally, in some examples, analysis computer 40 may control camera actuator 18, LED actuator 22, arm actuator 28, ware actuator 32, and/or LED arrays 20, 24A, 24B to provide an environment for capturing a digital image of ware 30. Analysis computer 40 may be configured with a plurality of different types of wares and corresponding environmental settings for each type of ware. Therefore, analysis computer 40 may receive an identification of a type of ware to analyze and automatically configure camera actuator 18, LED actuator 22, arm actuator 28, ware actuator 32, and/or LED arrays 20, 24A, 24B according to the environment settings corresponding to the received ware type identification.
  • Analysis computer 40 may be configured with environment settings for each type of ware 30 to be analyzed. Analysis computer 40 may be pre-configured with these environment settings or a user may customize the environment settings of analysis computer 40. In general, the environment settings are determined such that analysis computer 40 calculates a proper luminosity value for control wares of a particular type. For example, for a particular type of glass, a user may determine environment settings that cause analysis computer 40 to calculate a luminosity value of 0 for a control glass of that type that is known to be clean and that cause analysis computer 40 to calculate a luminosity value of 255 for a control glass of that type that is known to be soiled.
  • the control clean glass may comprise a glass that one or more experts judge to have a cleanliness value of 1 (on a scale of 1 to 5) and the control soiled glass may comprise a glass that the one or more experts judge to have a cleanliness value of 5 (on a scale of 1 to 5).
  • the determined environment settings may be used for analysis of all wares of that particular type.
  • FIG. 2 is a block diagram illustrating components of an example analysis computer 40.
  • analysis computer 40 includes user interface 42, system interface 44, control unit 46, and environment settings database 60.
  • Control unit 46 may comprise a processor or other hardware, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components.
  • Control unit 46 also includes analysis module 48 and environment control module 50 in the example of FIG. 2.
  • Analysis module 48 and environment control module 50 may each comprise hardware units, software modules, or combinations thereof to perform the functions described with respect to these modules.
  • instructions for these modules may be encoded in a computer- readable medium (not shown) of analysis computer 40.
  • control unit 46 may execute the instructions to perform the functions described with respect to these modules.
  • User interface 42 comprises one or more interface devices, such as a display, keyboard, mouse, graphical user interface (GUI), command line interface, light pen, touch screen, stylus, or other devices for presenting and receiving information to and from a user.
  • Control unit 46 may cause a display of user interface 42 to present information regarding an analysis of ware 30, such as a calculated luminosity value or a cleanliness value.
  • Control unit 46 may also cause the display to present the captured digital image of ware 30.
  • control unit 46 may cause the display to simultaneously present representations of a plurality of wares, such as a graphical representation of each of the wares that indicates the cleanliness value for each of the plurality of wares.
  • System interface 44 may also include interfaces for electronically manipulating one or more of camera actuator 18, ware actuator 32, arm actuator 28, LED actuator 22, camera 16, and/or LED arrays 20, 24A, 24B.
  • System interface 44 may receive signals from environment control module 50 and/or control unit 46 to manipulate one or more of these components of system 2 and transmit an electrical signal to the corresponding component. For example, system interface 44 may send signals to move camera actuator 18, ware actuator 32, arm actuator 28, and/or LED actuator 22.
  • System interface 44 may also send signals to camera 16 to change settings of the camera (e.g., aperture and/or shutter speed settings), signals to cause camera 16 to capture a digital image, or signals to retrieve one or more digital images from camera 16.
  • System interface 44 may further send signals to enable or disable LEDs of LED arrays 20, 24A, 24B.
  • Analysis computer 40 may store environment settings for one or more types of wares in environment settings database 60. For each type of ware, analysis computer 40 may store an entry in environment settings database 60. Analysis computer 40 may store new entries in environment settings database 60 for new types of wares. A user may interact with analysis computer 40 via user interface 42 to add, view, modify, or delete entries of analysis computer 40. Each entry of environment settings database 60 may include settings for a position of camera 16, a position and angle of LED array 20, whether LED array 20 is enabled or disabled, whether LED arrays 24A, 24B are enabled or disabled, a position for ware actuator 32, and a position for arm 14.
  • each entry of environment settings database 60 may also include instructions to display to a user via user interface 42, such as a color to use on the internal walls of housing 10.
  • entries of environment settings database 60 may include instructions to display to the user as to a position or location of ware 30. The user may therefore read the instructions and place ware 30 in the proper position.
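  • As an illustration only, one entry of environment settings database 60 could be represented as a simple record such as the following sketch; the field names and types are assumptions, not part of the disclosure.

    // Illustrative sketch of one entry of environment settings database 60 (field names are assumptions).
    public class EnvironmentSettings {
        public String wareTypeName;             // e.g., "Restaurant X Glass"
        public double cameraHeightInches;       // position of camera 16
        public double ledArrayHeightInches;     // position of LED array 20 along arm 14
        public double ledArrayAngleDegrees;     // angle of LED array 20 with respect to ware 30
        public boolean directLightingEnabled;   // whether LED array 20 is enabled
        public boolean diffusedLightingEnabled; // whether LED arrays 24A, 24B are enabled
        public double wareActuatorPosition;     // position for ware actuator 32
        public double armPosition;              // position for arm 14 along track 26
        public String operatorInstructions;     // e.g., interior color of housing 10, placement of ware 30
    }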
  • Analysis module 48 analyzes digital images retrieved from camera 16 by environment control module 50.
  • analysis module 48 may also analyze digital images stored on a computer-readable medium (not shown) of analysis computer 40, for example, a hard disk, an optical disk, a floppy disk, a flash drive, a Secure Digital card, or other computer-readable medium.
  • analysis module 48 analyzes the digital image to calculate a luminosity value for the digital image.
  • analysis module 48 calculates the luminosity value from all of the digital image, while in other examples, analysis module 48 calculates the luminosity value from only a portion of the digital image, i.e., a region of interest of the digital image.
  • Calculating the luminosity value from a region of interest may exclude non-ware portions of the digital image, glare from the light source(s), or other portions of the digital image that should not be factored into the calculation of the luminosity value.
  • analysis module 48 may calculate a plurality of luminosity values each corresponding to different regions of the digital image, e.g., to calculate luminosity values for each swatch of a fabric swatch array or each tile of a group of ceramic tiles.
  • Analysis module 48 may calculate a region of interest of a digital image according to particular instructions. Analysis module 48 may also receive a definition of a region of interest from a user. In some examples, the region of interest may correspond to the entire digital image. In any case, analysis module 48 calculates a luminosity value based on an average of values of pixels in the region of interest. When the digital image comprises a black-and-white image, each pixel may have a value in a numeric range, e.g., 0 to 255, where 0 represents black and 255 represents white. Analysis module 48 may calculate a luminosity value by calculating the average pixel value of the pixels in the region of interest.
  • Analysis module 48 may also calculate a histogram with bins ranging from 0 to 255, where each bin stores a value representative of the number of pixels in the region of interest with a pixel value of the corresponding bin. For example, if there are 8 pixels with a value of "18," bin 18 of the histogram would store a value of "8." In some examples, analysis module 48 may cause user interface 42 to display a graphical representation of the histogram. Analysis module 48 may calculate a luminosity value using the average of the pixel values or the histogram. Analysis module 48 may also calculate other statistics, such as a standard deviation of the pixel luminosity values.
  • Analysis module 48 may also calculate luminosity values for color digital images.
  • each pixel of a color digital image may comprise values for red, green, and blue (RGB).
  • Analysis module 48 may convert color digital images in the RGB domain to the YCbCr (or YUV) domain, where Y represents luminosity and Cb and Cr represent color information for the pixel.
  • Analysis module 48 may then use the Y-values of pixels as pixel luminosity values and calculate an average of the pixel luminosity values in the region of interest using the methods described above, e.g., an average over the pixels or by calculating a histogram.
  • Analysis module 48 may also calculate luminosity values for each pixel in the RGB domain directly.
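  • As a sketch of one common way to derive a per-pixel luminosity value directly from RGB data, the BT.601 luma weighting shown below could be used; the disclosure does not state which conversion is applied, so the coefficients and the method name are assumptions.

    // Approximate luma (Y) of an RGB pixel using BT.601 weights; the specific weights are an assumption.
    static int lumaFromRgb(int r, int g, int b) {
        double y = 0.299 * r + 0.587 * g + 0.114 * b;                 // Y component of YCbCr
        return (int) Math.round(Math.max(0.0, Math.min(255.0, y)));   // clamp to the 0-255 luminosity scale
    }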
  • Analysis module 48 may further determine a cleanliness value for ware 30 based on the calculated luminosity value. For certain types of wares, a higher luminosity value indicates that the ware is clean, whereas for other types of wares, a lower luminosity value indicates that the ware is clean.
  • Analysis computer 40 may store correlations for the type of ware being analyzed in environment settings database 60. Each entry of environment settings database 60 may further include one or more thresholds for luminosity values that define cleanliness of the ware for luminosity values. For example, for a particular type of glass, the thresholds may correspond to the conventional ratings of 1 to 5, e.g., a first threshold at 50, a second threshold at 101, a third threshold at 152, and a fourth threshold at 204.
  • analysis module 48 may determine that a glass with a luminosity value between 0 and 50 has a cleanliness value of 1, a glass with a luminosity value between 51 and 101 has a cleanliness value of 2, a glass with a luminosity value between 102 and 152 has a cleanliness value of 3, a glass with a luminosity value between 153 and 203 has a cleanliness value of 4, and a glass with a luminosity value between 204 and 255 has a cleanliness value of 5.
  • the functional relationship between luminosity values and cleanliness values is linear. In other examples, the functional relationship between luminosity values and cleanliness values may be, for example, quadratic, exponential, logarithmic, or another mathematical relationship.
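  • A minimal sketch of this threshold lookup, using the example thresholds of 50, 101, 152, and 204 for a glass; the method name and array representation are assumptions. With those thresholds, a luminosity value of 120 would map to a cleanliness value of 3.

    // Map a luminosity value (0-255) to a cleanliness value (1-5) using ascending thresholds,
    // e.g., {50, 101, 152, 204} for the example glass type described above.
    static int cleanlinessFromLuminosity(int luminosity, int[] thresholds) {
        int cleanliness = 1;                 // 1 = clean for this ware type
        for (int t : thresholds) {
            if (luminosity > t) {
                cleanliness++;               // each exceeded threshold moves one step toward 5 = soiled
            }
        }
        return cleanliness;
    }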
  • Analysis module 48 may cause user interface 42 to display or otherwise present the determined cleanliness value for ware 30 to a user.
  • User interface 42 may also present other data, such as the luminosity value, calculated statistics for the region of interest, a representation of a histogram, a comparison between two or more wares, or other information.
  • FIG. 3 is a block diagram illustrating components of an example environmental control module 50.
  • environmental control module 50 includes a plurality of actuator control modules 52, camera control module 54, and LED control module 56.
  • each of actuator control modules 52 corresponds to a respective actuator, e.g., camera actuator 18, ware actuator 32, LED actuator 22, and arm actuator 28 of FIG. 1.
  • Other examples with more or fewer actuators may include correspondingly more or fewer actuator control modules 52.
  • each of actuator control modules 52 sends electrical signals to a corresponding actuator to control the actuator.
  • Each of actuator control modules 52 may also include logic in hardware and/or software to determine when to stop sending an electrical signal to the corresponding actuator in order to properly position the actuator. For example, one of actuator control modules 52 may apply a specific voltage or current to the corresponding actuator for a specific amount of time to move the actuator a desired amount. As another example, one of actuator control modules 52 may send different signals to the corresponding actuator based on the desired amount of movement, e.g., when the actuator is capable of calculating distance of movement and/or current position.
  • Camera control module 54 sends one or more electrical signals to camera 16 (FIG. 1) to control camera 16. For example, camera control module 54 may send various electrical signals to camera 16 to adjust aperture settings, shutter speed settings, and zoom settings of camera 16. Camera control module 54 may also send an electrical signal to camera 16 to cause camera 16 to capture a digital image. Camera control module 54 may further send an electrical signal to camera 16 to retrieve a captured digital image, and may receive an electrical signal from camera 16 representative of the captured digital image. Upon receiving a digital image from camera 16, camera control module 54 may store the image in a computer-readable medium (not shown) of analysis computer 40 and send a signal to analysis module 48 indicating that an image is available for analysis.
  • LED control module 56 may send electrical signals to LED arrays 20, 24A, 24B to enable and/or disable LED arrays 20, 24A, 24B. In some examples, LED control module 56 may send an electrical signal to toggle one or more of LED arrays 20, 24A, 24B on or off. In other examples, LED control module 56 continuously sends an electrical signal to the one or more of LED arrays 20, 24A, 24B that are to be turned on and does not send an electrical signal to the LED arrays that are to be turned off. LED control module 56 may also provide a specific voltage and/or current to LED arrays 20, 24A, 24B. For example, LED control module 56 may enable LED array 20 by providing 0.23 amps and 2.2 volts to LED array 20.
  • FIG. 4 is a flowchart illustrating an example method for configuring an environment for a particular type of ware.
  • Although the example method of FIG. 4 is discussed with respect to a glass, it should be understood that similar methods may be used to configure an environment for any type of surface or ware that can be analyzed by system 2 (FIG. 1) or similar systems.
  • the method of FIG. 4 comprises creating an environment for capturing digital images of wares of the type to be analyzed such that an analysis system determines appropriate cleanliness values for captured digital images of control wares in the environment, where the control wares have known cleanliness values.
  • analysis computer 40 may analyze captured digital images of wares with unknown cleanliness values in the same environment and automatically adjust system 2 to conform to the determined environment (in some examples). Therefore, the method of FIG. 4 comprises one example of a method for calibrating an environment in which to capture digital images of a particular type of ware.
  • control wares of the type being analyzed are obtained (82).
  • two control wares are obtained: a control clean ware and a control soiled ware.
  • the control clean ware may comprise a glass that has never been used or a glass that has been washed a certain number of times.
  • the control clean ware may also be inspected by an expert, who determines that the control clean ware is clean, e.g., has a cleanliness value of 1 on a scale of 1 to 5.
  • control soiled ware may comprise a glass that has been sufficiently soiled, and an expert may determine that the control soiled ware has a cleanliness value of 5 on a scale of 1 to 5.
  • additional control wares may be used, e.g., glasses for which an expert has determined a cleanliness value of 2, 3, or 4.
  • the operator may then capture a first image of the control clean ware (86) and a second image of the control soiled ware (88).
  • the operator may then use analysis computer 40 to calculate a first luminosity value for the first image and a second luminosity value for the second image (90).
  • the method of FIG. 4 includes adjusting the environment settings and capturing new images of the control wares until analysis computer 40 produces appropriate luminosity values for the images of the control wares.
  • a threshold for the image of the control clean glass comprises a clean threshold luminosity value
  • a threshold for the image of the control soiled glass comprises a soiled threshold luminosity value.
  • analysis computer 40 stores the environment settings, e.g., in environment settings database 60 (96).
  • other threshold luminosity values may be used, and additional threshold values may be used for additional control wares.
  • an operator may manually record the environment settings in a notebook or text or other file stored on analysis computer 40.
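  • A small sketch of the acceptance test implied by the calibration loop above; the comparison directions (clean control at or below the clean threshold, soiled control at or above the soiled threshold) are assumptions consistent with the 0-for-clean and 255-for-soiled control-glass example given earlier.

    // Returns true when the current environment produces acceptable luminosities for both control wares,
    // at which point the environment settings would be stored; otherwise the operator adjusts and retries.
    static boolean calibrationAcceptable(int cleanControlLuminosity, int soiledControlLuminosity,
                                         int cleanThreshold, int soiledThreshold) {
        return cleanControlLuminosity <= cleanThreshold
            && soiledControlLuminosity >= soiledThreshold;
    }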
  • analysis computer 40 may automatically configure the environment of system 2 to capture an image of ware 30 according to the identified type of ware to be analyzed (112). For example, environment control module 50 of analysis computer 40 may query environment settings database 60 with the identification of the ware type to retrieve environment settings from environment settings database 60. Environment control module 50 may then automatically adjust elements of system 2 according to the retrieved environment settings.
  • environment control module 50 may send one or more electrical signals via system interface 44 to camera actuator 18 to position camera 16 at an appropriate height, to ware actuator 32 to position ware 30 an appropriate distance from camera 16, to LED actuator 22 and arm actuator 28 to position LED array 20 at an appropriate position, to one or more of LED arrays 20, 24A, 24B to enable and/or disable LED arrays 20, 24A, 24B, and to camera 16 to adjust aperture settings, shutter speed settings, and zoom settings.
  • Analysis computer 40 may further display instructions via user interface 42 to the user to adjust certain elements of the environment.
  • the display may include instructions to configure an internal color of housing 10 and a position for ware 30 in examples that do not include ware actuator 32.
  • analysis computer 40 may send an electrical signal to camera 16 to capture a digital image of ware 30 (114).
  • Analysis computer 40 may then retrieve the captured digital image from camera 16. In other examples, an operator or other user may manually capture a digital image and transfer the image to analysis computer 40.
  • analysis computer 40 may determine a region of interest of the digital image to analyze (116).
  • the region of interest may comprise the entire digital image.
  • a user may configure the region of interest.
  • analysis computer 40 may display the captured digital image and receive a selection of the image from the user as the region of interest.
  • analysis computer 40 may be configured to automatically select the region of interest.
  • analysis computer 40 may be configured to detect boundaries of ware 30 in the digital image.
  • Analysis module 48 of analysis computer 40 may then calculate a luminosity value for the region of interest (118).
  • analysis module 48 may determine luminosity values for pixels in the region of interest, e.g., on a scale from 0 to 255, where 0 represents black and 255 represents white. Analysis module 48 may then calculate an average luminosity value for the region of interest as the luminosity value, e.g., according to the method of FIG. 6.
  • Analysis module 48 may then correlate the calculated luminosity value with a cleanliness value for the type of ware 30 (120). For example, for a certain type of glass, luminosity values less than 20 may correlate to a cleanliness value of 1, less than 50 may correlate to a cleanliness value of 2, less than 100 may correlate to a cleanliness value of 3, less than 200 may correlate to a cleanliness value of 4, and 200 or greater may correlate to a cleanliness value of 5. Analysis module 48 may then output the cleanliness value correlated to the luminosity value (122), e.g., by displaying the cleanliness value via user interface 42.
  • FIG. 6 is a flowchart illustrating an example method for calculating an average luminosity value for a region of interest of a digital image of a ware to be analyzed. Although the method of FIG. 6 is described as performed by analysis module 48 (FIG. 2), it should be understood that other hardware or software modules, or combinations thereof, may perform the method of FIG. 6.
  • the method of FIG. 6 includes constructing a 256-bin histogram (e.g., an integer array bin[256]) from the region of interest, then, for each pixel, identifying the bin corresponding to the luminosity value of the pixel and adding one to the current value of that bin, and then calculating the average luminosity according to the following formula: averageLuminosity = (bin[0]*0 + bin[1]*1 + ... + bin[255]*255) / numPixels, where numPixels is the number of pixels in the region of interest.
  • analysis module 48 constructs a histogram with 256 bins labeled 0 to 255, where the bins each correspond to a numeric value and initializes each of the 256 bins to a value of zero (130).
  • Although the example method of FIG. 6 uses an array of 256 bins, it should be understood that other sizes may be used. In general, the size of the array corresponds to the range of luminosity values of pixel values in the digital image.
  • the histogram may comprise an integer array with 256 zero-indexed elements, i.e., an array with elements numbered 0 to 255.
  • Analysis module 48 may then determine whether all of the pixels in the region of interest have been analyzed (136). If not ("NO" branch of 136), analysis module 48 may analyze a next pixel of the region of interest and modify the histogram using the luminosity value of that next pixel. When all of the pixels in the region of interest have been analyzed (“YES" branch of 136), analysis module 48 calculates a luminosity value for the region of interest.
  • analysis module 48 first calculates a bin value for each bin of the histogram (138).
  • the bin value of bin i is the number of pixels having luminosity value i, multiplied by i.
  • that is, the bin value of bin i is bin[i]*i.
  • analysis module 48 accumulates the bin values (140) and divides by the number of pixels in the region of interest (142) to calculate the luminosity value for the region of interest. Analysis module 48 may then return the luminosity value or otherwise output the luminosity value (144).
  • averageLuminosity presents one example implementation of a method for calculating a luminosity value for a region of interest of a digital image of a ware being analyzed.
  • averageLuminosity returns an integer value representative of the calculated luminosity value.
  • picture[][] is a two-dimensional integer array that stores pixels of the digital image of a ware for analysis.
  • a region of interest (ROI) is defined in terms of boundaries within the array picture[][].
  • int averageLuminosity(int picture[][], int ROITop, int ROIBottom, int ROILeft, int ROIRight) { /* receives a two-dimensional array picture with defined boundaries ROITop, ROIBottom, ROILeft, and ROIRight */
  • int pel_luminosity = luminosity(picture[i][j]); bin[pel_luminosity]++;
  • } int luminosity = accumulatedLuminosity / numPixels;
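  • Pieced together from the fragments above, a compilable sketch of averageLuminosity might look like the following; it assumes picture[][] already holds per-pixel luminosity values from 0 (black) to 255 (white), that the region-of-interest boundaries are inclusive, and that for color images a conversion such as the luma calculation sketched earlier is applied first.

    // Sketch reconstructed from the fragments above: average luminosity of a region of interest via a histogram.
    static int averageLuminosity(int[][] picture, int ROITop, int ROIBottom, int ROILeft, int ROIRight) {
        int[] bin = new int[256];                    // 256-bin histogram, bins initialized to zero
        int numPixels = 0;
        for (int i = ROITop; i <= ROIBottom; i++) {
            for (int j = ROILeft; j <= ROIRight; j++) {
                int pelLuminosity = picture[i][j];   // per-pixel luminosity, 0 (black) to 255 (white)
                bin[pelLuminosity]++;
                numPixels++;
            }
        }
        long accumulatedLuminosity = 0;
        for (int i = 0; i < 256; i++) {
            accumulatedLuminosity += (long) bin[i] * i;    // bin value: count of pixels with luminosity i, times i
        }
        return (int) (accumulatedLuminosity / numPixels);  // average luminosity of the region of interest
    }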
  • user interface 150 generally allows a user to modify environment settings relating to lighting, camera settings for camera 16, a type and position for ware 30, and an internal color of housing 10.
  • User interface 150 presents a name for the environment settings in name text box 152.
  • the name presented in name text box 152 may generally correspond to a name of a ware type.
  • a user has entered a name of "Restaurant X Glass" in name text box 152.
  • User interface 150 includes check boxes 154 that allow a user to enable or disable direct lighting and diffused lighting.
  • Direct lighting may correspond to LED array 20 (FIG. 1) and diffused lighting may correspond to LED arrays 24A, 24B (FIG. 1). Therefore, a user may selectively enable and disable LED arrays 20, 24A, 24B by checking or unchecking check boxes 154.
  • user interface 150 may present height text box 156 and angle text box 158 as grayed out and not allow a user to enter data in either height text box 156 or angle text box 158.
  • User interface 150 may also gray out labels "Height" and "Angle” associated with height text box 156 and angle text box 158 when the check box for direct lighting is unchecked.
  • user interface 150 presents height text box 156 and angle text box 158. In this manner, a user may configure a height of LED array 20 and an angle of LED array 20 with respect to ware 30.
  • Analysis computer 40 may therefore cause LED actuator 22 to position LED array 20 at a height of 8.0 inches (20.32 cm) with an angle of 0.0 degrees when capturing an image of a ware of type "Restaurant X Glass.” Analysis computer 40 may also cause arm actuator 28 to position arm 14 such that LED array 20 is positioned directly above ware 30 for a ware of type "Restaurant X Glass.”
  • User interface 150 also presents save button 174, load button 176, and cancel button 178.
  • when a user selects save button 174, user interface 150 sends data from the text boxes to control unit 46, which in turn stores associated data to environment settings database 60.
  • control unit 46 creates a new entry in environment settings database 60 for the data from user interface 150.
  • control unit 46 updates the entry according to the data received from user interface 150.
  • System 200 includes washing apparatus 202, window 206, housing 208, lights 210A, 210B, camera 212, and analysis computer 214. Some examples may include only one of lights 210A, 210B, rather than both lights 210A, 210B as shown in FIG. 8. In the example of FIG. 8, light 210A is positioned within washing apparatus 202 and light 210B is positioned outside of washing apparatus 202.
  • Lights 210A, 210B may comprise LED arrays or other light sources to illuminate wares 204 via direct or indirect lighting.
  • Washing apparatus 202 applies a washing procedure to wares 204, which may include applying one or more chemicals (such as detergent, rinse agent, disinfectant, sanitizer, etc.) to wares 204.
  • Camera 212 captures digital images of wares 204 during the washing procedure, and analysis computer 214 analyzes the digital images to calculate a cleanliness value for wares 204.
  • Analysis computer 214 may include components similar to those of analysis computer 40 as described with respect to FIGS. 1 and 2.
  • Fabric swatch array 230 generally includes background fabric 238 and a plurality of fabric swatches 232 stitched or otherwise fixed to background fabric 238.
  • background fabric 238 of fabric swatch array 230 may comprise a different color than fabric swatches 232.
  • fabric swatches 232 may each comprise white cloth and background fabric 238 may comprise green or black cloth. In other examples, other colors of fabric swatches 232 and background fabric 238 may be used.
  • Each of fabric swatches 232 may comprise substantially identical pieces of fabric.
  • Fabric swatch arrays such as fabric swatch array 230 may be used to test the efficacy of various cleaning agents, which may include testing removal of soiling, removal of stains, or whether the cleaning agents cause fading in the coloring of fabric swatches 232.
  • fabric swatch array 230 may be used to test the efficacy of a particular type of fabric, e.g., resistance to soiling and staining and resistance to fading.
  • each of fabric swatches 232 may comprise different pieces of fabric to test the effects of a particular type of soiling on each of the fabrics or to test the effects of a particular chemical agent applied to each of the fabrics.
  • each of fabric swatches 232 may be soiled using a different soiling or staining agent, such as topsoil, grass, lipstick, wine, foodstuffs, or other such soiling agents.
  • a different soiling or staining agent such as topsoil, grass, lipstick, wine, foodstuffs, or other such soiling agents.
  • One of fabric swatches 232 e.g., fabric swatch 232A, may be left unsoiled as a control swatch.
  • Fabric swatch array 230 may then be washed in a fabric washing machine using particular chemical cleaning product(s) for which analysis is desired.
  • fabric swatch array 230 may be used as part of system 2 (FIG. 1).
  • analysis computer 40 may instead determine individual cleanliness values for each of fabric swatches 232. That is, camera 16 may capture a digital image of all or a subset of fabric swatches 232, and analysis computer 40 may determine cleanliness values for each of the fabric swatches 232 in the image. In some examples, camera 16 may capture a single image of the entire fabric swatch array 230, while in other examples, camera 16 may capture images of various subsections of fabric swatch array 230.
  • a user may highlight each of fabric swatches 232, and select a region of interest thereof. In some examples, a user may highlight each of fabric swatches 232, and analysis computer 40 may automatically calculate a region of interest. In other examples, analysis computer 40 may be configured to automatically discriminate between fabric swatches 232 and background fabric 238 for fabric swatch array 230 and to automatically calculate a region of interest.
  • analysis computer 40 may be configured to identify the boundaries of each of fabric swatches 232. After identifying the boundaries of one of fabric swatches 232, e.g., fabric swatch 232A, analysis computer 40 may calculate a coordinate system respective to that swatch. For fabric swatch 232A, for example, analysis computer 40 calculates x-axis 234A and y-axis 236A. Analysis computer 40 may be configured such that the intersection of the x-axis and the y-axis is located at the center of the fabric swatch.
  • in this example, analysis computer 40 calculates x-axis 234A and y-axis 236A such that their intersection (the origin of the coordinate system formed by x-axis 234A and y-axis 236A) is located at the center of fabric swatch 232A.
  • analysis computer 40 may calculate a region of interest centered at the intersection of the x-axis and the y-axis.
  • the region of interest may comprise the entire image of the respective one of fabric swatches 232, or only a portion thereof.
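  • As one possible way to form such a region of interest, the sketch below centers a rectangular region on the intersection of a swatch's axes and covers a configurable fraction of the swatch; the helper name and the fractional-coverage approach are assumptions, not part of the disclosure.

    // Rectangular region of interest centered on a swatch's center, covering a fraction of its bounding box.
    static int[] centeredRegionOfInterest(int top, int bottom, int left, int right, double fraction) {
        int centerY = (top + bottom) / 2;                         // intersection of the swatch's axes
        int centerX = (left + right) / 2;
        int halfHeight = (int) ((bottom - top) * fraction / 2.0);
        int halfWidth  = (int) ((right - left) * fraction / 2.0);
        return new int[] { centerY - halfHeight, centerY + halfHeight,   // ROITop, ROIBottom
                           centerX - halfWidth,  centerX + halfWidth };  // ROILeft, ROIRight
    }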
  • analysis computer 40 calculates a luminosity value for the region of interest, and correlates the luminosity value with a cleanliness value for the corresponding one of fabric swatches 232.
  • Analysis computer 40 may apply these techniques to each of fabric swatches 232 to calculate individual cleanliness values for each of fabric swatches 232.
  • analysis computer 40 may calculate an average cleanliness value for fabric swatch array 230 by calculating an average of the cleanliness values of each of fabric swatches 232.
  • Analysis computer 40 next determines a weighting value to apply to each of the calculated luminosity value and the number of pixels that exceed (or fall between) the threshold value(s) (260).
  • the weighting value may comprise a percentage value w, e.g., a value between 0 and 1 inclusive, where the value w is applied to one of the two determined values (e.g., the luminosity value), and the difference between 1 and w is applied to the other (e.g., the number of pixels above/within the threshold value(s)).
  • analysis computer 40 may calculate a rating value by applying the weighting value to the luminosity value and the number of pixels above/within the threshold value(s) (262). For example, assuming that the luminosity value is L and the number of pixels above a threshold is N, and that the weighting value is w, analysis computer 40 may calculate the rating value R as: R = w*L + (1 - w)*N.
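  • A minimal sketch of this weighted combination; whether the pixel count N is used raw or normalized is not specified above, so the direct weighting below is an assumption.

    // Weighted rating combining the average luminosity L and the threshold pixel count N with weight w in [0, 1].
    static double ratingValue(double luminosityL, double pixelCountN, double w) {
        return w * luminosityL + (1.0 - w) * pixelCountN;
    }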
  • the rating value R may further be correlated with a cleanliness value.
  • various rating values may be determined for control clean wares and control stained wares to determine a correlation between rating values and cleanliness values. Then for an experimental surface of a ware corresponding to the same ware type as one of the control wares (that is, a surface with an unknown cleanliness value), analysis computer 40 may calculate a rating value and correlate the rating value with the cleanliness values to determine a cleanliness value for the experimental surface.
  • analysis computer 40 may determine a percentage of pixels in the region of interest for which luminosity values exceed or are within a threshold. Analysis computer 40 may then determine that the percentage is representative of a percentage of the surface that is stained or soiled.
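  • A short sketch of that percentage computation using the histogram described earlier; the method name is an assumption.

    // Percentage of region-of-interest pixels whose luminosity falls within [low, high].
    static double percentWithinThresholds(int[] histogram, int low, int high, int numPixels) {
        int count = 0;
        for (int v = low; v <= high; v++) {
            count += histogram[v];          // histogram[v] = number of ROI pixels with luminosity v
        }
        return 100.0 * count / numPixels;   // interpreted as the percentage of the surface that is stained or soiled
    }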
  • FIG. 12 is a screenshot illustrating an example graphical user interface (GUI) 280 for entering threshold luminosity values and calculating a cleanliness value.
  • GUI 280 includes low text field 284 and high text field 286 for setting a minimum luminosity threshold value and a maximum luminosity threshold value, respectively, corresponding to the ware identified by name text field 282.
  • Analysis computer 40 may present GUI 280 to retrieve the minimum luminosity threshold value from the value entered in low text field 284 and the maximum luminosity threshold value from the value entered in high text field 286.
  • a user may enter these values into the corresponding text fields, and analysis computer 40 may retrieve the values when the user selects "Run Test" button 298.
  • Analysis computer 40 may count a pixel (e.g., corresponding to step 258 in FIG. 11) when a luminosity value for the pixel is between the minimum luminosity threshold value and the maximum luminosity threshold value.
  • analysis computer 40 retrieves the weighting values from pixel count weight text box 288 and luminosity weight text box 290. As discussed with respect to FIG. 11, analysis computer 40 may apply the pixel count weight to the pixel count value and the luminosity weight to the luminosity value to calculate a result value (not shown). Analysis computer 40 may then correlate the result value with known cleanliness values to determine a cleanliness value for the surface being analyzed. In the example of FIG. 12, analysis computer 40 has determined that the cleanliness value is a "4," as shown in cleanliness value text box 296.
  • analysis computer 40 may retrieve values from low text box 284, high text box 286, pixel count weight text box 288, and luminosity weight text box 290 when a user selects "Run Test” button 298. Analysis computer 40 may also, in accordance with the method of FIG. 11 , evaluate the pixel count and luminosity for a surface (in particular, a region of interest of the surface) according to the retrieved values. The user may also select "Cancel" button 300 to close GUI 280.
  • FIG. 13 is a block diagram illustrating an example washing apparatus 350 that may be configured to be selectively enabled and/or disabled.
  • washing apparatus 350 comprises analysis unit 354 and wares 352.
  • Washing apparatus 350 washes wares 352, which may comprise, for example, dishes, flatware, glasses, Tosi plates, fabric, stainless steel coupons, or other items or types of items. Washing apparatus 350 applies a washing process to wares 352 based on a configuration from analysis unit 354.
  • control unit 356 and the example modules thereof may each correspond to one or more other hardware units, such as DSPs, FPGAs, ASICs, one or more microprocessors, or any suitable arrangement or combination thereof.
  • Each of user interface module 358, cleanliness evaluation module 360, apparatus interface module 362, item recognition module 364, camera interface module 366, and light interface module 368 may be implemented as one or more hardware, software, and/or firmware units, or any combination thereof.
  • Item profiles 370 may comprise profiles for various wares that may be washed by washing apparatus 350.
  • Each of the profiles may comprise, for example, a visual signature representative of the item, characteristics of a wash cycle to apply to the item, e.g., length of wash cycle, amount of detergent to apply, amount of water to apply, concentration of detergent to apply, a rinse time, a fresh water flush indicator, a conveyor speed (for dish machines having a conveyor), a wash water temperature (or a wash solution temperature), a rinse water temperature (or rinse solution temperature, e.g., when rinse aid is applied to the rinse water), an amount of rinse aid to apply to the rinse water, a wash water or wash solution volume, a rinse water or rinse solution volume, or other such characteristics.
  • a profile comprises a fresh water flush indicator that indicates how often rinse water stored in a sump should be flushed and replaced with fresh water. This value may be expressed as a period of time, e.g., a number of hours, a number of wash cycles, or a combination thereof.
  • a fresh water flush indicator stored in a profile for a pot may indicate that rinse water used for rinsing pots (which tend to be relatively more soiled) should be changed relatively more frequently than a fresh water flush indicator stored in a profile for a drinking glass.
  • control unit 356 may be configured with a timer or other mechanism for determining when to flush and refill a fresh water sump.
  • the fresh water flush indicator of a profile may express a modification to the timer.
  • a profile for a glass may comprise a fresh water flush indicator having a value of zero, indicating that the timer should not be modified when a glass having the profile is washed.
  • a profile for a pot may comprise a fresh water flush indicator having a value of three, indicating that the timer should be adjusted such that the timer is three units (e.g., minutes, hours, wash cycles, or the like) closer to flushing and refilling the fresh water sump.
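  • As an illustration only, an item profile and its fresh water flush indicator could be represented as in the sketch below; the field names, units, and the countdown-style timer adjustment are assumptions consistent with the description above.

    // Illustrative item profile; field names and units are assumptions.
    public class ItemProfile {
        public String itemName;              // e.g., "glass", "plate", "pot"
        public int washCycleSeconds;         // length of wash cycle
        public double detergentConcentration;
        public double washTemperature;       // wash water or wash solution temperature
        public double rinseTemperature;      // rinse water or rinse solution temperature
        public int rinseSeconds;             // rinse time
        public double rinseAidAmount;        // amount of rinse aid to apply to the rinse water
        public double conveyorSpeed;         // for dish machines having a conveyor
        public int freshWaterFlushIndicator; // units by which washing this item advances the sump-flush timer
        public double cleanlinessThreshold;  // rating value at which the item is considered clean

        // Advance a sump-flush countdown: 0 for a glass leaves the timer unchanged, 3 for a pot brings the
        // flush three units (e.g., minutes, hours, or wash cycles) closer.
        public int adjustFlushTimer(int unitsRemainingUntilFlush) {
            return Math.max(0, unitsRemainingUntilFlush - freshWaterFlushIndicator);
        }
    }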
  • Item recognition module 364 may apply one or more of the profiles from item profiles 370 to determine whether wares 352 comprises any of the items corresponding to item profiles 370. For example, item recognition module 364 may iterate through each item of item profiles 370 to retrieve a visual signature associated with the item and analyze the image of wares 352 to determine whether the visual signature occurs in the image. When item recognition module 364 identifies the visual signature of the item in the image, item recognition module 364 determines that the item is present in wares 352.
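  • A sketch of that iteration is shown below; the actual signature-matching logic (e.g., searching for the reflection patterns of FIGS. 16A and 16B) is not specified here, so it is left behind a small interface, and all names are assumptions.

    // The matching logic itself (e.g., detecting glass or plate reflection patterns) is intentionally abstracted.
    interface VisualSignature {
        boolean occursIn(int[][] image);
    }

    // Iterate through the stored profiles and collect the names of items whose visual signatures occur in the image.
    static java.util.List<String> detectItems(int[][] image, java.util.Map<String, VisualSignature> profiles) {
        java.util.List<String> detected = new java.util.ArrayList<>();
        for (java.util.Map.Entry<String, VisualSignature> entry : profiles.entrySet()) {
            if (entry.getValue().occursIn(image)) {
                detected.add(entry.getKey());    // this item is determined to be present in wares 352
            }
        }
        return detected;
    }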
  • FIG. 16B is a conceptual diagram illustrating an example visual signature for detecting a plate.
  • In the example of FIG. 16B, plate 454 is illustrated as a top-down view, with an edge of the plate at the top.
  • The visual signature for detecting a plate includes plate reflection pattern 456. That is, light from light source 374 may typically reflect off of an edge of plate 454 such that an image of plate 454 includes plate reflection pattern 456.
  • Item recognition module 364 may be configured to inspect an image to search for values indicative of plate reflection pattern 456.
  • When analysis unit 354 finds a pattern of illumination values indicative of plate reflection pattern 456, analysis unit 354 determines that a plate is present in the image at the location of plate reflection pattern 456. In some examples, analysis unit 354 may be further configured to determine that a plurality of plates is present in the image upon detecting a plurality of plate reflection patterns similar to plate reflection pattern 456. Moreover, analysis unit 354 may be configured to determine that one or more glasses and one or more plates are present in an image upon detecting one or more glass reflection patterns such as glass reflection pattern 452 and one or more plate reflection patterns such as plate reflection pattern 456.
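  • As a rough sketch (with assumed function and variable names), the decision described above could be reduced to counting how many glass and plate reflection patterns the matcher reported in the image:

```python
def classify_load(glass_pattern_count: int, plate_pattern_count: int) -> str:
    """Map counts of detected reflection patterns to a coarse description of the load."""
    if glass_pattern_count and not plate_pattern_count:
        return "glasses only"
    if plate_pattern_count and not glass_pattern_count:
        return "plates only"
    if glass_pattern_count and plate_pattern_count:
        return "glasses and plates"
    return "no recognized wares"

# Example: one glass pattern and two plate patterns -> both types are present.
print(classify_load(glass_pattern_count=1, plate_pattern_count=2))  # "glasses and plates"
```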
  • In some examples, item recognition module 364 is configured to determine that wares 352 consist of glasses when camera 372 captures an image of wares 352 and item recognition module 364 identifies only one or more glass reflection patterns 452.
  • In such a case, item recognition module 364 may inform apparatus interface module 362 that wares 352 consist only of glasses.
  • Apparatus interface module 362 may accordingly cause washing apparatus 350 to perform a wash cycle that is optimized for glasses.
  • For example, apparatus interface module 362 may retrieve wash cycle characteristics associated with the profile for glasses from item profiles 370 and apply the wash cycle.
  • In one example, an optimal wash cycle for glasses may comprise a relatively short, hot rinse with minimal or no detergent.
  • Apparatus interface module 362 may inform cleanliness evaluation module 360 that wares 352 consist of glasses.
  • Cleanliness evaluation module 360 may retrieve a glass cleanliness threshold from the glass profile of item profiles 370. Cleanliness evaluation module 360 may periodically cause camera 372 to capture an image of wares 352 and evaluate the cleanliness of wares 352, e.g., as described above with respect to the method of FIG. 11. That is, upon determining a rating value for the glass, cleanliness evaluation module 360 may compare the rating value to the cleanliness threshold value associated with the particular item profile retrieved from item profiles 370. When the rating value exceeds the cleanliness threshold value, cleanliness evaluation module 360 may inform apparatus interface module 362 that the item is clean. When all of wares 352 are determined to be clean, apparatus interface module 362 may automatically cause washing apparatus 350 to stop the wash process, or to apply a rinse cycle and then stop the wash process.
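  • A minimal sketch of that comparison is shown below; the helper name and the assumption that one rating value and one threshold are available per detected ware are illustrative only:

```python
def all_wares_clean(rating_values: list, cleanliness_thresholds: list) -> bool:
    """Return True only when every detected ware's rating value exceeds the
    cleanliness threshold taken from that ware's item profile."""
    return all(rating > threshold
               for rating, threshold in zip(rating_values, cleanliness_thresholds))

# Example: one glass rates 0.92 against a 0.90 threshold, another rates 0.85 -> not yet clean.
print(all_wares_clean([0.92, 0.85], [0.90, 0.90]))  # False
```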
  • Similarly, item recognition module 364 may be configured to determine that wares 352 consist of plates when camera 372 captures an image of wares 352 and item recognition module 364 identifies only one or more plate reflection patterns 456. When item recognition module 364 determines that wares 352 consist of plates only, item recognition module 364 may inform apparatus interface module 362 that wares 352 consist of only plates. Accordingly, apparatus interface module 362 causes washing apparatus 350 to deliver a wash cycle that is optimized for plates. In one example, an optimal wash cycle for plates may comprise a relatively longer washing period comprising a first portion during which detergent is applied and a second portion comprising a hot rinse. Alternatively, a process similar to that described above may be performed, by which cleanliness evaluation module 360 determines when the plates are clean and causes apparatus interface module 362 to automatically disable washing apparatus 350 when wares 352 are determined to be clean.
  • In general, apparatus interface module 362 may be configured to control washing apparatus 350 by selecting a predetermined wash cycle based on the identified wares of wares 352 and causing washing apparatus 350 to apply the wash cycle to wares 352.
  • Apparatus interface module 362 may also interact with cleanliness evaluation module 360 to monitor progress of a washing procedure by washing apparatus 350 and to automatically disable washing apparatus 350 after cleanliness evaluation module 360 has determined that each of wares 352 is clean.
  • To do so, cleanliness evaluation module 360 may periodically cause camera 372 to capture an image of wares 352 via camera interface module 366. Cleanliness evaluation module 360 may then calculate a cleanliness value for each detected one of wares 352.
  • User interface module 358 may receive input via user interface 376 from a user.
  • user interface 376 may comprise one or more buttons, displays, touch-screens, computer interfaces (e.g., universal serial bus (USB) or serial interfaces), knobs, levers, or other means for receiving and/or providing information from/to a user.
  • For example, user interface 376 may receive an indication from a user that a wash cycle should begin.
  • In some examples, control unit 356 is configured to automatically determine a wash cycle based on visual recognition of wares 352.
  • A user may also provide one or more item profiles to be stored as item profiles 370 via user interface 376.
  • User interface module 358 may be configured to cause user interface 376 to retrieve a representation of a visual signature for a particular item and/or wash characteristics for the item.
  • User interface module 358 may also enable a user to review, modify, or delete existing item profiles of item profiles 370.
  • In some examples, control unit 356 may be configured such that a user may place a control ware in washing apparatus 350 and item recognition module 364 may automatically determine a visual signature for the item.
  • Alternatively, a user may upload a visual signature for the item via user interface 376.
  • User interface module 358 may retrieve the uploaded visual signature and store the visual signature, along with other item profile information, in item profiles 370.
  • Analysis unit 354 may comprise an independent, stand-alone hardware unit that can be attached, affixed, coupled to, or integrated with washing apparatus 350.
  • For example, analysis computer 214 of FIG. 8 may be configured to perform the tasks discussed above with respect to control unit 356.
  • In such an example, analysis computer 214 may additionally comprise a user interface, such as user interface 376.
  • Analysis computer 214 may store item profiles 370 in a computer-readable storage medium, such as an internal hard drive, magnetic recording media via a floppy drive, optical media via an optical drive, internal flash memory, an external flash drive via a USB or other interface, or any other suitable computer-readable storage medium.
  • Camera 372 of FIG. 13 may correspond to camera 212 of FIG. 8, while light source 374 may comprise either or both of lights 210 of FIG. 8.
  • FIG. 14 is a flowchart illustrating an example method for selectively disabling a washing apparatus based on a wash cycle associated with a particular ware being washed.
  • For purposes of example, the method of FIG. 14 is described with respect to washing apparatus 350, although it should be understood that other washing apparatuses and/or other control units may perform the method of FIG. 14.
  • Initially, control unit 356 receives an indication that a washing cycle should begin (400).
  • For example, user interface module 358 of control unit 356 may receive the indication to begin the washing cycle from a user via user interface 376.
  • As another example, upon receiving an indication that a door or other enclosure of washing apparatus 350 has been closed, control unit 356 causes camera 372 to capture an image and, upon detection of wares in the image, control unit 356 interprets the presence of wares 352 in washing apparatus 350 as an indication to begin a wash cycle.
  • Upon receiving the indication to begin the wash cycle, camera interface module 366 causes camera 372 to capture an image of wares 352 in washing apparatus 350 (402). Camera interface module 366 may also communicate with light interface module 368 to coordinate illumination via light source 374 and image capture via camera 372. Light source 374 may illuminate wares 352 and camera 372 may capture an image of wares 352. Camera interface module 366 may then provide the image to item recognition module 364.
  • Item recognition module 364 may continue to iterate through item profiles 370 until all visual signatures have been attempted.
  • Control unit 356 may then select a wash cycle to apply to wares 352 based on a maximum wash cycle of the selected ones of item profiles 370 (406).
  • Each item profile may correspond to a hierarchical ordering. For example, glasses may correspond to a low level on the hierarchy, plates may correspond to a middle level on the hierarchy, and pots and pans may correspond to a high level on the hierarchy.
  • Control unit 356 may select a wash cycle corresponding to the highest level of the hierarchy represented by the selected ones of the item profiles.
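  • The hierarchical selection can be sketched as below, assuming each selected item profile carries an illustrative hierarchy_level field (glasses low, plates middle, pots and pans high):

```python
def select_wash_cycle(selected_profiles: list) -> dict:
    """Return the wash cycle of the highest-level profile among the identified wares,
    i.e. the 'maximum' wash cycle for the load."""
    hardest = max(selected_profiles, key=lambda profile: profile["hierarchy_level"])
    return hardest["wash_cycle"]

# Example: a rack holding glasses and a pot receives the pot's (harder) wash cycle.
selected = [
    {"name": "glass", "hierarchy_level": 0, "wash_cycle": {"length_s": 90, "detergent_ml": 0}},
    {"name": "pot",   "hierarchy_level": 2, "wash_cycle": {"length_s": 300, "detergent_ml": 25}},
]
print(select_wash_cycle(selected))  # the pot's wash cycle
```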
  • After selecting a wash cycle, apparatus interface module 362 enables washing apparatus 350 according to the selected wash cycle (408). Apparatus interface module 362 may also control particular elements of the wash cycle, such as water temperature, water spraying patterns, amount of water to spray during the wash cycle, amount of detergent to apply during the wash cycle, length of the wash cycle, rinsing patterns to apply during the wash cycle, or other elements. Control unit 356 may be configured to determine that wares 352 are clean after the selected wash cycle expires. Accordingly, apparatus interface module 362 may automatically disable washing apparatus 350 after the selected wash cycle expires (410). In this manner, washing apparatus 350 may reduce wasted resources, such as detergent, water, and/or electricity, for items that do not require as much of those resources to become clean. Moreover, analysis unit 354 may automatically select the appropriate wash cycle based on visual signatures of wares 352, which may avoid user error in selecting an inappropriate wash cycle, making the reduction in resource consumption more likely.
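  • Pulling the steps of FIG. 14 together, the flow might be sketched as follows; the washing_apparatus and camera objects, their method names, and the reuse of the recognize_items and select_wash_cycle helpers sketched earlier are all assumptions for illustration, not a definitive implementation:

```python
import time

def run_timed_wash(washing_apparatus, camera, item_profiles):
    """Capture an image, pick a wash cycle from the recognized wares, run it for its
    configured length, and then automatically disable the machine (steps 402-410)."""
    image = camera.capture_image()                        # capture image of the wares (402)
    detected = recognize_items(image, item_profiles)      # match stored visual signatures
    cycle = select_wash_cycle(detected)                   # maximum wash cycle of the load (406)
    washing_apparatus.enable(cycle)                       # start washing per the cycle (408)
    time.sleep(cycle["length_s"])                         # wares treated as clean once the cycle expires
    washing_apparatus.disable()                           # automatically disable (410)
```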
  • FIG. 15 is a flowchart illustrating an example method for selectively disabling a washing apparatus based on a cleanliness value determined for wares being washed by the washing apparatus.
  • Initially, control unit 356 receives an indication that a washing cycle should begin (420).
  • For example, user interface module 358 of control unit 356 may receive the indication to begin the washing cycle from a user via user interface 376.
  • As another example, upon receiving an indication that a door or other enclosure of washing apparatus 350 has been closed, control unit 356 causes camera 372 to capture an image and, upon detection of wares in the image, control unit 356 interprets the presence of wares 352 in washing apparatus 350 as an indication to begin a wash cycle.
  • Upon receiving the indication to begin the wash cycle, camera interface module 366 causes camera 372 to capture an image of wares 352 in washing apparatus 350 (422). Camera interface module 366 may also communicate with light interface module 368 to coordinate illumination via light source 374 and image capture via camera 372. Light source 374 may illuminate wares 352 and camera 372 may capture an image of wares 352. Camera interface module 366 may then provide the image to item recognition module 364.
  • Apparatus interface module 362 may then enable washing apparatus 350 (426).
  • Cleanliness evaluation module 360 may cause camera interface module 366 to retrieve an image of wares 352 using camera 372.
  • Cleanliness evaluation module 360 may evaluate cleanliness values for each of the items in the image (428).
  • Each item profile may include a method for determining cleanliness, such as a threshold cleanliness value that must be exceeded for an item of that type to be considered clean. Accordingly, cleanliness evaluation module 360 may determine whether the cleanliness value for each item of wares 352 exceeds the corresponding cleanliness threshold value (430).
  • While at least one of wares 352 is not determined to be clean ("NO" branch of 430), that is, while at least one cleanliness value is determined not to exceed the corresponding cleanliness threshold, washing apparatus 350 continues to wash wares 352. Cleanliness evaluation module 360 may then periodically reevaluate the cleanliness of each of wares 352.
  • After cleanliness evaluation module 360 determines that the cleanliness value for each of wares 352 exceeds the corresponding cleanliness threshold value, cleanliness evaluation module 360 informs apparatus interface module 362 that wares 352 are clean.
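  • A corresponding sketch of the FIG. 15 loop is shown below; evaluate_cleanliness stands in for whatever rating method is used, and the polling interval, object names, and method names are assumptions made only for illustration:

```python
import time

def run_until_clean(washing_apparatus, camera, evaluate_cleanliness,
                    cleanliness_thresholds, poll_interval_s: float = 30.0):
    """Enable the machine, then periodically re-image the wares and keep washing
    until every cleanliness value exceeds its threshold (steps 426-430)."""
    washing_apparatus.enable()                                   # start washing (426)
    while True:
        image = camera.capture_image()
        ratings = evaluate_cleanliness(image)                    # one rating per detected ware (428)
        if all(r > t for r, t in zip(ratings, cleanliness_thresholds)):
            break                                                # every ware exceeds its threshold (430)
        time.sleep(poll_interval_s)                              # not yet clean: wait, then re-evaluate
    washing_apparatus.disable()                                  # wares are clean; disable the machine
```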
  • Instructions to perform certain techniques described in this disclosure may also be embodied or encoded in a computer-readable medium, such as a computer-readable storage medium. Instructions embedded or encoded in a computer-readable medium may cause a programmable processor, or other processor, to perform the method, e.g., when the instructions are executed.
  • Computer-readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a CD-ROM, a floppy disk, a cassette, magnetic media, optical media, or other computer-readable media.

Landscapes

  • Cleaning By Liquid Or Steam (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to a control unit that can automatically enable and/or disable a washing apparatus based on processing of an image of the wares to be washed by the washing apparatus. In one example, a system comprises a camera configured to capture an image of one or more wares to be washed by a washing apparatus, a computer-readable medium comprising a plurality of item profiles, each of the item profiles comprising a visual signature and one or more wash cycle characteristics, and a control unit configured to retrieve the image, detect one of the visual signatures of the plurality of item profiles in the image, select the item profile corresponding to the detected visual signature, enable the washing apparatus, and automatically disable the washing apparatus after the ware has been determined to be clean in accordance with the selected item profile.
PCT/IB2010/054787 2009-10-23 2010-10-21 Traitement optique pour commander un appareil de lavage Ceased WO2011048575A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US12/604,893 2009-10-23
US12/604,893 US8509473B2 (en) 2009-06-29 2009-10-23 Optical processing to control a washing apparatus
US12/628,478 2009-12-01
US12/628,478 US8229204B2 (en) 2009-06-29 2009-12-01 Optical processing of surfaces to determine cleanliness

Publications (2)

Publication Number Publication Date
WO2011048575A2 true WO2011048575A2 (fr) 2011-04-28
WO2011048575A3 WO2011048575A3 (fr) 2011-08-04

Family

ID=43900751

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/054787 Ceased WO2011048575A2 (fr) 2009-10-23 2010-10-21 Traitement optique pour commander un appareil de lavage

Country Status (1)

Country Link
WO (1) WO2011048575A2 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013037723A1 (fr) 2011-09-14 2013-03-21 Meiko Maschinenbau Gmbh & Co. Kg Appareil de nettoyage et de désinfection pour le traitement de récipients pour excréments humains
DE102013202540A1 (de) 2013-02-18 2014-08-21 Olympus Winter & Ibe Gmbh Verfahren zum Betrieb einer Aufbereitungsvorrichtung und Aufbereitungsvorrichtung für chirurgische Instrumente
EP3205764A1 (fr) * 2016-02-15 2017-08-16 E.G.O. ELEKTRO-GERÄTEBAU GmbH Procédé et dispositif de nettoyage
DE102017126856A1 (de) * 2017-11-15 2019-05-16 Illinois Tool Works Inc. Spülmaschine sowie Verfahren zum Reinigen von Spülgut in einer Spülmaschine
WO2021113258A1 (fr) * 2019-12-03 2021-06-10 Ecolab Usa Inc. Vérification de l'efficacité d'un procédé de nettoyage
RU2760379C2 (ru) * 2017-07-13 2021-11-24 Канди С.П.А. Способ автоматического определения неправильного положения объекта в рабочей зоне посудомоечной машины
DE102020128333A1 (de) 2020-10-28 2022-04-28 Illinois Tool Works Inc. Transportspülmaschine sowie verfahren zum betreiben einer transportspülmaschine
US11393083B2 (en) 2017-10-03 2022-07-19 Ecolab Usa Inc. Methods and system for performance assessment of cleaning operations
US11666198B2 (en) 2020-10-02 2023-06-06 Ecolab Usa Inc. Monitoring and control of thermal sanitization in automated cleaning machines
US11889963B2 (en) 2020-05-29 2024-02-06 Ecolab Usa Inc. Automated cleaning machine processing using shortened cycle times
US12207776B2 (en) 2020-09-25 2025-01-28 Ecolab Usa Inc. Machine learning classification or scoring of cleaning outcomes in cleaning machines

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001292949A (ja) * 2000-04-13 2001-10-23 Hanshin Electric Co Ltd 食器洗浄乾燥器の予想運転所要時間演算方法,及び予想運転所要時間の演算機能を有する食器洗浄乾燥器
JP2004261295A (ja) * 2003-02-28 2004-09-24 Mitsubishi Electric Corp 食器洗浄機
JP4158666B2 (ja) * 2003-09-26 2008-10-01 松下電器産業株式会社 食器洗浄装置

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9468956B2 (en) 2011-09-14 2016-10-18 Meiko Maschinenbau Gmbh & Co. Kg Cleaning and disinfecting apparatus for treating containers for human excretions
WO2013037723A1 (fr) 2011-09-14 2013-03-21 Meiko Maschinenbau Gmbh & Co. Kg Appareil de nettoyage et de désinfection pour le traitement de récipients pour excréments humains
DE102013202540A1 (de) 2013-02-18 2014-08-21 Olympus Winter & Ibe Gmbh Verfahren zum Betrieb einer Aufbereitungsvorrichtung und Aufbereitungsvorrichtung für chirurgische Instrumente
EP3205764A1 (fr) * 2016-02-15 2017-08-16 E.G.O. ELEKTRO-GERÄTEBAU GmbH Procédé et dispositif de nettoyage
CN107080509A (zh) * 2016-02-15 2017-08-22 E.G.O.电气设备制造股份有限公司 清洗工艺和清洗装置
KR20170095750A (ko) * 2016-02-15 2017-08-23 에.게.오. 에렉트로-게래테바우 게엠베하 클리닝 방법 및 클리닝 디바이스
RU2760379C2 (ru) * 2017-07-13 2021-11-24 Канди С.П.А. Способ автоматического определения неправильного положения объекта в рабочей зоне посудомоечной машины
US12141957B2 (en) 2017-10-03 2024-11-12 Ecolab Usa Inc. Methods and system for performance assessment of cleaning operations
US11803957B2 (en) 2017-10-03 2023-10-31 Ecolab Usa Inc. Methods and system for performance assessment of cleaning operations
US11393083B2 (en) 2017-10-03 2022-07-19 Ecolab Usa Inc. Methods and system for performance assessment of cleaning operations
US11950743B2 (en) 2017-11-15 2024-04-09 Illinois Tool Works Inc. Dishwasher and method for cleaning washware in a dishwasher
CN111343894A (zh) * 2017-11-15 2020-06-26 伊利诺斯工具制品有限公司 洗碗机及用于清洁洗碗机中的洗涤器皿的方法
DE102017126856A1 (de) * 2017-11-15 2019-05-16 Illinois Tool Works Inc. Spülmaschine sowie Verfahren zum Reinigen von Spülgut in einer Spülmaschine
WO2019099224A1 (fr) * 2017-11-15 2019-05-23 Illinois Tool Works Inc. Lave-vaisselle et procédé pour laver de la vaisselle dans un lave-vaisselle
CN111343894B (zh) * 2017-11-15 2024-03-22 伊利诺斯工具制品有限公司 洗碗机及用于清洁洗碗机中的洗涤器皿的方法
US12133619B2 (en) 2019-12-03 2024-11-05 Ecolab Usa Inc. Verification of cleaning process efficacy
WO2021113258A1 (fr) * 2019-12-03 2021-06-10 Ecolab Usa Inc. Vérification de l'efficacité d'un procédé de nettoyage
CN114650760A (zh) * 2019-12-03 2022-06-21 埃科莱布美国股份有限公司 清洁过程效果的验证
US11889963B2 (en) 2020-05-29 2024-02-06 Ecolab Usa Inc. Automated cleaning machine processing using shortened cycle times
US12419485B2 (en) 2020-05-29 2025-09-23 Ecolab Usa Inc. Automated cleaning machine processing using shortened cycle times
US12207776B2 (en) 2020-09-25 2025-01-28 Ecolab Usa Inc. Machine learning classification or scoring of cleaning outcomes in cleaning machines
US11666198B2 (en) 2020-10-02 2023-06-06 Ecolab Usa Inc. Monitoring and control of thermal sanitization in automated cleaning machines
US12465190B2 (en) 2020-10-02 2025-11-11 Ecolab Usa Inc. Monitoring and control of thermal sanitization in automated cleaning machines
DE102020128333B4 (de) 2020-10-28 2024-07-25 Illinois Tool Works Inc. Transportspülmaschine sowie verfahren zum betreiben einer transportspülmaschine
DE102020128333A1 (de) 2020-10-28 2022-04-28 Illinois Tool Works Inc. Transportspülmaschine sowie verfahren zum betreiben einer transportspülmaschine

Also Published As

Publication number Publication date
WO2011048575A3 (fr) 2011-08-04

Similar Documents

Publication Publication Date Title
US8509473B2 (en) Optical processing to control a washing apparatus
US8229204B2 (en) Optical processing of surfaces to determine cleanliness
WO2011048575A2 (fr) Traitement optique pour commander un appareil de lavage
US12133619B2 (en) Verification of cleaning process efficacy
US12207776B2 (en) Machine learning classification or scoring of cleaning outcomes in cleaning machines
US12495947B2 (en) Control of cleaning machine cycles using machine vision
US9468956B2 (en) Cleaning and disinfecting apparatus for treating containers for human excretions
DE10048081A1 (de) Verfahren zur Erkennung der Spülgutbeladung und/oder des Verschmutzungsgrades von Spülgut in einer programmgesteuerten Geschirrspülmaschine und Geschirrspülmaschine dafür
CA3102183A1 (fr) Procede d'evaluation de l'adequation de conditions d'eclairage a une detection d'analyte dans un echantillon a l'aide d'une camera d'un dispositif mobile
US20230036605A1 (en) Dishwasher, arrangement having a dishwasher, and method for operating a dishwasher
WO2015197109A1 (fr) Procédé de fonctionnement d'appareil de lavage, et appareil de lavage
US20220065536A1 (en) Cooking appliance and method for operating a cooking appliance
CN115349778B (zh) 扫地机器人的控制方法、装置、扫地机器人及存储介质
US12196734B2 (en) Cooking oil degradation degree determining device, cooking oil degradation degree determination processing device, cooking oil degradation degree determination method, and fryer
US11957292B2 (en) Dishwasher coverage alert system and method
CN119214537A (zh) 清洁设备的清洁方法、自清洁方法、装置、设备及系统
CN108956637B (zh) 脏污程度检测方法、装置、电子设备和智能家电
CN115486784A (zh) 洗碗机及其控制方法、装置和可读存储介质
JP2023125307A (ja) 便座装置及び便器装置
US20230069659A1 (en) Dishwashing appliance and methods for improved calibration using image recognition
TR201706549A2 (tr) Çalişma performansi i̇yi̇leşti̇ri̇len bi̇r akilli kesme tahtasi
CN121433009A (zh) 烤箱控制方法、介质及电子装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10824557

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10824557

Country of ref document: EP

Kind code of ref document: A2