
WO2014181323A1 - System and method of retail image analysis - Google Patents


Info

Publication number
WO2014181323A1
Authority
WO
WIPO (PCT)
Prior art keywords
retail
module
image
images
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IL2014/050377
Other languages
French (fr)
Inventor
Guillaume DE LAZZER
Benoit VALIN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TRAX SOLUTIONS RETAIL Ltd
Trax Technology Solutions Pte Ltd
Original Assignee
TRAX SOLUTIONS RETAIL Ltd
Trax Technology Solutions Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TRAX SOLUTIONS RETAIL Ltd, Trax Technology Solutions Pte Ltd filed Critical TRAX SOLUTIONS RETAIL Ltd
Publication of WO2014181323A1 publication Critical patent/WO2014181323A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087: Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30168: Image quality inspection

Definitions

  • in a fourth step S130, the subset of eligible images is communicated to a remote control center, for example using a 3G telecommunication network.
  • the ineligible images may not be communicated to the remote control center.
  • Figs. 7A and 7B illustrate a retail unit 1 according to some embodiments of the present disclosure.
  • the retail unit 1 comprises a shelving module 20, a door module 30, an imaging module 40 and a processor module 50.
  • Fig. 7A shows the retail unit with the door module 30 open while Fig. 7B shows the retail unit with the door module 30 closed. It is noted that, even though the illustrated embodiment refers to a retail unit 1 in the form of a refrigerator, the features described herein below can be extended to different types of retail units such as all kinds of cabinets.
  • the shelving module 20 is capable of accommodating one or more retail items 10.
  • the shelving module 20 may comprise one or more shelves 21-25.
  • the retail unit may further comprise a housing 27 partially enclosing the shelving module 20.
  • the housing 27 may comprise an access opening enabling a user to access the one or more shelves 21-25 so as to allow seizing of a retail item from the one or more shelves 21-25.
  • the housing 27 and the shelving module 20 may define a structure in which the retail items are intended to be accommodated and the access opening may be defined as an aperture (or an entrance surface) of the housing enabling to access the retail items.
  • the one or more shelves 21-25 may be disposed in parallel within the housing 27 and may be capable of storing and/or displaying the one or more retail items 10.
  • the one or more retail items may for example comprise drinks, food products, eyewear, medicine, etc.
  • the door module 30 is configured to close at least a part of the access opening.
  • the door module 30 may comprise a door panel 31.
  • the door panel 31 may be transparent so as to allow people standing in front of the retail unit 1 to see the retail items 10 intended to be accommodated in the retail unit 1.
  • the door module 30 may comprise a handle (not shown) and may be opened and closed upon operation of the handle by a user.
  • the door panel 31 may be configured to pivot away from the access opening of the shelving module 20.
  • the door panel 31 may further be configured to close the access opening when the door module 30 is operated.
  • a hinge mechanism (not shown) may be positioned on a peripheral portion of the access opening so as to define a pivot axis of the door panel 31.
  • the imaging module 40 may be coupled to (or arranged on) the door module 30 and configured to acquire images of the shelving module 20. More precisely, the imaging module 40 may be configured for imaging the one or more retail items 10 when the one or more retail items 10 are accommodated in the shelving module 20. The imaging module 40 may be arranged on the side of the door panel 31 facing the shelving module 20. Arranging the imaging module 40 as described enables monitoring enclosed shelving modules, particularly for brand management purposes. In fact, monitoring enclosed cabinets is generally difficult because either the door is closed and it is not possible to image through the door (for example because the door is opaque or because frost deposited on the door surface causes image distortion), or the door is open by a user but the user stands in front of the shelving module and obstructs (obscures) the imaging. Therefore, arranging the imaging module as previously described notably enables reducing image obstruction by users removing/inserting items from/into the shelving module 20.
  • the imaging module 40 may comprise one or more imaging sensors 41, 42.
  • the imaging sensors 41, 42 may for example be digital cameras.
  • the FOV of the imaging sensors may enable imaging of the access opening.
  • when the imaging module comprises several cameras, the FOVs of the cameras may overlap so as to enable stitching of images to increase the FOV and image the whole access opening.
  • the cameras may have an FOV of 75×60 degrees.
  • the one or more imaging sensors 41, 42 may each be mounted on the door panel 31 so that a line of sight 415 of said imaging sensors is perpendicular to the access opening when the door panel 31 is open at a predetermined opening angle.
  • the opening angle may be set between 45° and 75°, preferably around 60°.
  • this angle range corresponds to the position people usually assume when holding the door open for removing or inserting items in a cabinet. Therefore, by configuring the imaging sensors so as to face the access opening at a predetermined angle within the aforementioned angle range, distortions due to door motion are advantageously reduced and imaging is improved.
  • the one or more imaging sensors 41, 42 may be configured so that the fields of view 411, 412 of the one or more imaging sensors 41, 42 enable imaging of the whole access opening when the door panel 31 is open at the predetermined opening angle.
  • the one or more imaging sensors 41, 42 may be arranged in a peripheral portion of the door panel 31.
  • at least some of the imaging sensors may be arranged on a side of the door panel 31 opposite to the side of the pivot axis.
  • the Applicant has found that, in some embodiments in which two imaging sensors 41, 42 are placed on the side of the door panel 31 opposite to the pivot axis, it is advantageous to position the imaging sensors respectively in the first third and in the last third of the door panel length, wherein the length of the door panel 31 is defined along a direction parallel to the pivot axis direction. This notably enables further reducing imaging obstruction.
  • the Applicant has also observed that in some embodiments in which at least one imaging sensor is positioned on the side of the door panel 31 opposite to the pivot axis, it is advantageous to position the imaging sensor either in the first or in the last third of the door panel length. Therefore, at least one imaging sensor may advantageously be positioned either in the first or in the last third of the door panel length.
  • the one or more imaging sensors 41, 42 may be mounted in one or more corresponding sensor housings.
  • the sensor housings may be mounted on the inner side of the door panel 31.
  • the sensor housings may allow for adjustment and calibration of the angle of the imaging sensor in two perpendicular planes.
  • the sensor housing may allow for an angle adjustment of about 60° in either of the two perpendicular planes.
  • the imaging sensors 41, 42 may be configured to be activated only upon detection of the opening of the door module 30. This enables reducing the amount of images to acquire and/or to communicate and/or to store. Alternatively, the imaging sensors 41, 42 may be configured for continuously acquiring images at a predetermined frequency and be further configured to increase the image acquisition frequency upon detection of the opening of the door module. The increased image acquisition frequency may be around 15-17 images per second whereas the continuous acquisition frequency may be around 1-2 images per second. The series of images acquired between an opening and a subsequent closing of the door module may be recorded as an event and the images within each event may be numbered in sequence. All imaging sensors on the door module may take corresponding images that are identified in sequence within each event (an illustrative acquisition-loop sketch is given after this list).
  • the retail unit further comprises a switch that connects when the door module is open and disconnects when the door module is closed so as to detect door module opening.
  • motion detection may be performed by analysis of successive images.
  • the processor module 50 is connected to the one or more imaging sensors 41, 42 and configured to receive the images acquired by said imaging sensors 41, 42.
  • the one or more imaging sensors 41, 42 are connected to the processor module 50 via a mini Universal Serial Bus (USB) connector and one or more USB cables that may additionally power the imaging sensors 41, 42.
  • the one or more USB cables may be tucked into a gasket around the door panel in order to conceal them and to avoid interfering with the movement of the door panel.
  • the processor module 50 may comprise a mini Personal Computer (PC) with one or more USB inputs on the motherboard (for example 6 to 8 inputs).
  • the processor module 50 may use Solid State Drive (SSD) memory and may run a specialized Linux™-based operating system.
  • the processor module 50 may be powered by an external source and may supply power to the imaging sensors 41, 42 via the USB cables.
  • the processor module 50 may be housed in a compact casing that can be easily mounted on the top or side of the refrigerator.
  • the processor module may be concealed behind a commercial banner.
  • the processor module 50 may be configured for performing image processing.
  • the processor module 50 may be configured to detect a change between two successive images received from the same imaging sensor so as to detect opening and closing of the door module.
  • the processor module 50 may be configured for stitching images received from different imaging sensors at the same instant so as to increase the FOV and allow reconstructing complete images of the access opening (see the stitching sketch after this list).
  • the processor module may be configured to perform the methods of analyzing retail images and of inspecting a retail unit previously described.
  • the processor module 50 may connect to the Internet via a wired or wireless connection such as 3G/4G/WiFi/WiMAX, etc.
  • when a wireless connection is used, a wireless module may be connected directly to the motherboard or via a USB port.
  • the 3G connection may use a local 3G SIM card in order to connect to the network and allow the processor module 50 to upload either the acquired images or the mixed images to the remote control center (an illustrative upload sketch follows this list).
  • the images may be processed and items accommodated on the shelving module 20 can be analyzed to check whether predefined assortment rules are respected by the retail store where the retail unit is provided.
  • the Internet connection may also allow for remote software update of the processor module 50.
  • a power adapter for 110/220V may power the processor module 50 and the imaging sensors 41, 42.
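The bullet above on dual acquisition frequencies can be illustrated with a short sketch. This is a hypothetical illustration only: door_is_open(), capture_frame() and store_image() stand in for the switch, camera and storage interfaces, which are not specified in the disclosure, and the frame rates simply mirror the 1-2 and 15-17 images per second mentioned above.

```python
import time

def acquisition_loop(door_is_open, capture_frame, store_image,
                     idle_fps: float = 1.5, event_fps: float = 16.0) -> None:
    """Capture at a low continuous rate, and at a higher rate while the door is open."""
    event_id, index = 0, 0
    while True:
        if door_is_open():
            if index == 0:          # door just opened: a new event starts
                event_id += 1
            index += 1
            store_image(capture_frame(), event_id, index)  # image numbered within its event
            time.sleep(1.0 / event_fps)
        else:
            index = 0               # closing the door ends the current event
            store_image(capture_frame(), None, None)       # continuous background acquisition
            time.sleep(1.0 / idle_fps)
```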
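For the stitching mentioned in the bullets above, one possible implementation sketch uses OpenCV's built-in Stitcher; whether the actual processor module 50 relies on this class is not stated in the disclosure, so the call below is only an assumption.

```python
import cv2

def stitch_views(images):
    """Stitch simultaneous views from different imaging sensors into one wider image."""
    stitcher = cv2.Stitcher_create()        # high-level panorama stitcher shipped with OpenCV
    status, panorama = stitcher.stitch(images)
    if status != 0:                         # 0 corresponds to cv2.Stitcher_OK
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```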
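Finally, the upload of eligible images to the remote control center could look like the sketch below, written with the common Python requests library; the endpoint URL, form field name and lack of authentication are placeholders, not details taken from the disclosure.

```python
import requests

def upload_images(image_paths, url: str = "https://example.com/api/retail-images") -> None:
    """Upload a batch of eligible retail images to a (placeholder) remote control center."""
    for path in image_paths:
        with open(path, "rb") as fh:
            response = requests.post(url, files={"image": fh}, timeout=30)
            response.raise_for_status()     # surface transmission errors immediately
```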


Abstract

The present disclosure provides a method of analyzing a retail image, the method comprising: obtaining a retail image representative of a flank side of a shelving module including at least one shelf; detecting in the retail image at least one line segment corresponding to a flank edge of the at least one shelf; determining a dimension of the line segment in the retail image; and rating the retail image based on the dimension of the line segment.

Description

SYSTEM AND METHOD OF RETAIL IMAGE ANALYSIS
TECHNOLOGICAL FIELD
The present disclosure relates generally to the field of image processing. More particularly, the present disclosure relates to a system and a method of analyzing a retail image of a shelving module.
BACKGROUND
Imaging sensors (also referred to generally as cameras or imaging modules in the following) are increasingly used for monitoring scenes, for example in surveillance contexts. Recently, the Applicant of the present application has proposed to use cameras in retail stores for brand management purposes. More particularly, the Applicant has described in a pending patent application a retail unit including a shelving module (for example a refrigerator in a supermarket) and a door module embedding one or more cameras so as to be capable of identifying how retail items contained in the retail unit are displayed. The image acquisition is advantageously triggered by motion detection of the door module and the images captured by the cameras are forwarded to a processor module which can transmit said images to a remote control center using, for example, wireless telecommunication networks.
GENERAL DESCRIPTION
In view of limited available bandwidth, the Applicant has identified a need for limiting the amount of data to be transmitted to the remote control center.
The present disclosure therefore provides a method of analyzing a retail image, the method comprising: obtaining a retail image representative of a flank side of a shelving module including at least one shelf; detecting in the retail image at least one line segment corresponding to a flank edge of the at least one shelf; determining a dimension of the line segment in the retail image; and rating the retail image based on the dimension of the line segment. This may enable easy detection of obstruction in the retail image.
In some embodiments, detecting the line segment comprises performing edge detection on the retail image.
In some embodiments, detecting the line segment comprises performing shape recognition on the retail image.
In some embodiments, the method further comprises determining an orientation of the line segment with respect to a reference orientation and wherein rating the image is further based on the orientation of the line segment. This may enable assessing the relative positioning of the imaging sensor and the shelving module and thereby enables detecting images suitable for brand management, i.e. images with a large field of view.
The present disclosure further provides a method of inspecting a shelving module including at least one shelf, the method comprising: imaging the shelving module so as to obtain a set of retail images representative of a flank of the shelving module; analyzing each retail image according to the method of analyzing previously described; discarding retail images whose rating does not meet a predetermined eligibility criterion thereby obtaining a subset of eligible retail images; and transmitting the subset of eligible retail images to a remote control center. This may enable saving transmission costs.
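For illustration, the four steps of this inspection method could be chained as in the following sketch; the callables capture, rate_image, is_eligible and transmit are hypothetical placeholders for components described elsewhere in the disclosure, not an actual API.

```python
from typing import Any, Callable, List

def inspect_shelving_module(capture: Callable[[], List[Any]],
                            rate_image: Callable[[Any], float],
                            is_eligible: Callable[[float], bool],
                            transmit: Callable[[List[Any]], None]) -> List[Any]:
    """Hypothetical orchestration of the imaging, analyzing, discarding and transmitting steps."""
    images = capture()                                        # imaging the shelving module
    rated = [(image, rate_image(image)) for image in images]  # analyzing each retail image
    eligible = [image for image, rating in rated if is_eligible(rating)]  # discarding low ratings
    transmit(eligible)                                        # transmitting the eligible subset
    return eligible
```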
In some embodiments, imaging the shelving module is performed upon detection of motion near the shelving module.
In some embodiments, a pivotable door module is arranged on the shelving module and the imaging is performed upon opening of the door module.
The present disclosure further provides a processor module for analyzing a retail image, the processor module comprising a processor coupled with a memory storing computer readable instructions that, when executed, cause the processor module to perform the steps of: obtaining a retail image representative of a flank side of a shelving module including at least one shelf; detecting in the retail image at least one line segment corresponding to a flank edge of the at least one shelf; determining a dimension of the line segment in the retail image; and rating the retail image based on the dimension of the line segment. In some embodiments, the memory further stores computer readable instructions that, when executed by the processor, cause the processor module to perform the step of determining an orientation of the line segment and wherein rating the image is further based on the orientation of the line segment.
In some embodiments, the memory further stores computer readable instructions that, when executed by the processor and upon receipt of a set of retail images, cause the processor module to perform the steps of: analyzing the set of retail images by successively performing the steps previously described on each retail image; and discarding retail images whose rating does not meet a predetermined eligibility criterion thereby obtaining a subset of eligible retail images.
The present disclosure further provides a retail inspection system comprising: an imaging module configured to acquire a set of retail images upon activation; and the processor module as previously described, the processor module being further configured to: receive the set of images from the imaging module; and transmit the subset of eligible retail images to a remote control center. The imaging module may acquire relevant retail images when the imaging module is positioned so as to capture a flank side of a shelving module including at least a shelf.
The present disclosure also provides a retail unit comprising: a shelving module including at least one shelf; a door module configured to pivot away from the shelving module; a retail inspection system as previously described, wherein the imaging module is arranged on an inner side of the door module, thereby allowing imaging of a flank side of the shelving module.
In some embodiments, the imaging module is arranged so that an optical axis of the imaging module is substantially parallel to a plane of extension of the at least one shelf.
The present disclosure also provides a computer program adapted to perform the methods previously described.
The present disclosure also provides a computer readable storage medium comprising the computer program previously described.
The term "corresponding" and its derivatives may be used in the present application to associate an object in the physical world to its representation in an image. For the sake of clarity, in the following, unless otherwise suggested, the term "element" refers to an object in the physical world whereas the term "feature" refers to a component in the image. In other words, a feature and a corresponding element are in fact conjugates through the imaging module.
In an aspect, the present disclosure also provides a method of analyzing a retail image comprising: obtaining a retail image representative of a display module, wherein the display module comprises at least one predefined reference element; detecting on the retail image a reference feature corresponding to the at least one reference element of the display module; determining a dimensional parameter of the reference feature in the retail image; and rating the retail image based on the dimensional parameter of the reference feature.
It is further noted that generally a reference feature may correspond to a reference element which has multiple occurrences in the surface of interest of the retail shelving module. In some embodiments, the multiple occurrences may be scattered over the surface of interest of the shelving module. In some embodiments, a ratio between a dimensional parameter of the reference element and a dimensional parameter of the surface of interest of the shelving module may be above a predefined threshold. For example, the dimensional parameter may be any of a width, a height, etc. In some embodiments, the reference element may be selected so as to be visible on at least some of the acquired images. The display module may in some embodiments be a shelving module and the reference element may be a shelf corresponding in an image to a reference feature in the form of a line segment.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Figs. 1A and 1B illustrate a shelving module according to some embodiments of the present disclosure.
Fig. 2 is a flow diagram illustrating steps of a method of inspecting a shelving module according to some embodiments of the present disclosure. Fig. 3 is a flow diagram illustrating steps of a method of analyzing a retail image according to some embodiments of the present disclosure.
Figs. 4A and 4B respectively illustrate schematically a retail image without and with obstruction in some embodiments of the present disclosure.
Fig. 5 illustrates schematically a retail image acquired with a non-optimal line of sight orientation in some embodiments of the present disclosure.
Figs. 6A and 6B illustrate respectively a retail image acquired with a non-optimal line of sight orientation and an eligible retail image in some embodiments of the present disclosure.
Figs. 7A and 7B illustrate schematically a retail unit according to some embodiments of the present disclosure. Fig. 7A shows the retail unit with a door module open while Fig. 7B shows the retail unit with the door module closed.
Fig. 8 illustrates schematically an upper view of the retail unit according to some embodiments of the present disclosure.
The same references on the figures may refer to analogous elements unless otherwise specified.
DETAILED DESCRIPTION OF EMBODIMENTS
In the following detailed description, specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known features, structures, characteristics, stages, methods, procedures, modules, components and systems, have not been described in detail so as not to obscure the present disclosure.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "calculating", "computing", "determining", "stitching", "configuring", "selecting", "defining", or the like, include action and/or processes of a computer that manipulate and/or transform data into other data, said data represented as physical quantities, e.g. such as electronic quantities, and/or said data representing the physical objects. The terms "computer", "processor", and "controller" should be expansively construed to cover any kind of electronic device with data processing capabilities. The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer readable storage medium. The term "non-transitory" is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable for the application.
As used herein, the phrase "for example," "such as" , "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one case", "some cases", "other cases" or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter. Thus the appearance of the phrase "one case", "some cases", "other cases" or variants thereof does not necessarily refer to the same embodiment(s).
It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
Any trademark occurring in the text or drawings is the property of its owner and occurs herein merely to explain or illustrate one example of how the presently discussed subject matter may be implemented.
Figs. 1A and 1B illustrate a shelving module 20 according to some embodiments of the present disclosure. The shelving module 20 of Fig. 1B is similar to the shelving module 20 of Fig. 1A except that it is further accommodated within a housing structure 27. Therefore, references to similar elements have not been repeated for the sake of clarity.
The shelving module 20 may comprise one or more shelves 21, 22, 23. The shelves 21, 22, 23 may have a plate shape and may comprise an upper extension surface 212, 222, 232, a lower extension surface (not shown) parallel to the upper extension surface 212, 222, 232 and a border frame 211, 221, 231 wherein each border frame 211, 221, 231 is formed by flank edges of the shelves 21, 22, 23. In the embodiment illustrated with rectangular shelves, the border frame of each shelf is formed by four flank edges. The flank edges can be understood as the surfaces of the shelf which form the thickness e of the shelf. The shelves 21, 22, 23 may be positioned perpendicular to a vertical axis X. The upper extension surface 212, 222, 232 is configured for accommodating retail items. The shelving module 20 may be positioned so that the lower extension surfaces of the shelves 21, 22, 23 face the ground. In the following, a flank side of the shelving module 20 refers to a plane surface comprising a flank edge of the shelves 21, 22, 23, i.e. a surface including at least one flank edge of one shelf.
With reference to Fig. 1B, the shelving module 20 may be accommodated in the housing structure 27 (for example in a refrigerator or in a closet). The housing structure 27 may comprise fixed upright walls 271, 272, 273. At least a part of one flank side of the shelving module 20 may not be enveloped by the housing structure 27 thereby defining an access opening of the shelving module 20 enabling inserting and removing retail items in the shelving module 20. In other words, the housing 27 may comprise an inner volume for accommodating the shelving unit 20 and may further define an access opening for enabling removing and inserting retail items to and from the shelving module 20. The housing structure 27 may further comprise an upper wall 274 and a lower wall 275.
Fig. 2 generally illustrates a method of inspecting a retail unit in some embodiments of the present disclosure. The retail unit may comprise a shelving module as previously described.
In a first imaging step S100, imaging of the shelving module may be performed so as to capture at least a flank side of the shelving module and allow imaging of a flank edge of a shelf. The imaging may be performed by an imaging module comprising one or more imaging sensors (also referred to as cameras). The optical axis of these imaging sensors may preferably be substantially parallel (for example ± 10 degrees) to the upper extension surface (also referred to as a shelf plane) of the shelves of the shelving module. The imaging may further be performed so that the acquired images allow viewing the retail items intended to be accommodated in the shelving module. For example, the one or more cameras may be mounted on a platform facing the shelving module.
It is understood that the acquired images of the shelving module refer to images of at least part of a surface of interest of the shelving module wherein imaging the surface of interest allows viewing the retail items when the retail items are accommodated in the shelving module. For example, as described hereinafter, the shelving module may be accommodated in the housing structure and the surface of interest may be the access opening of the shelving module.
In some embodiments in which the shelving module is accommodated into a housing structure, the retail unit may further comprise a door module configured for pivoting away from the shelving module. The door module may be operated (positioned) in an open state and in a closed state. In the closed state, the door module may face the access opening of the shelving module and in the open state, at least a part of the door module may be distant from the access opening of the shelving module. The access opening may form a doorway, i.e. a surface on which the door module lies when the door module is closed. The access opening may be considered as the surface of interest in these embodiments. The shelving module may be accessible through the access opening so as to enable removing and/or inserting retail items. The door module may for example comprise a door panel hinged to a peripheral portion of the housing structure. Furthermore, an imaging module may be arranged on the inner side of the door module thereby allowing imaging of the retail items intended to be disposed in the shelving module. For example, the imaging module may comprise one or more cameras. The inner side of the door module may be understood as the side of the door module facing the access opening when the door module is in the closed state. In some embodiments, each imaging sensor is configured so that the line of sight is perpendicular to a plane of extension of the access opening when the door module is open at a predetermined opening angle. The cameras may further be configured so that, at the predetermined opening angle, a lateral field of view (in a direction perpendicular to the vertical axis X) of the cameras allows viewing the width W (or at least 90% of the width) of the access opening and/or of the shelves. Further, when the imaging module comprises one camera, the camera may be configured so that at the predetermined opening angle, a vertical field of view (in the direction of the vertical axis X) of the camera allows viewing a height H along which the one or more shelves are positioned. When the imaging module comprises a plurality of cameras, the cameras may be configured so that stitching of several images acquired simultaneously by different cameras may allow reconstructing an image capturing the one or more shelves. In some embodiments, the imaging may be triggered by motion detection near the shelving module. For example, in some embodiments including a door module, the imaging may be triggered by detection of an opening of the door module and may be stopped by detection of a closing of the door module. In some embodiments, the imaging module may image continuously and the imaging step S100 may comprise increasing the image acquisition frequency. In some embodiments including continuous image acquisition, motion detection may be performed by image stream analysis. Step S100 may lead to a set of retail images of a flank side of the shelving module.
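As one possible reading of the motion-triggered acquisition described above, the sketch below flags motion by differencing consecutive frames with OpenCV; the pixel and area thresholds are illustrative assumptions and not values taken from the disclosure.

```python
import cv2
import numpy as np

def motion_detected(prev_frame: np.ndarray, frame: np.ndarray,
                    pixel_thresh: int = 25, area_ratio: float = 0.02) -> bool:
    """Return True when a significant fraction of pixels changed between consecutive frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, gray)                         # per-pixel absolute difference
    _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
    changed_ratio = cv2.countNonZero(mask) / mask.size          # fraction of changed pixels
    return changed_ratio > area_ratio
```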
The retail unit may further comprise a processor module configured for receiving the acquired images, for analyzing the images so as to define an eligible subset of images and for transmitting the eligible subset to a remote control center, as described hereinafter. The Applicant has found that it is desirable to reduce the amount of images to transmit to the remote control center, which verifies an assortment of the retail unit. In order to reduce the amount of data to transmit, the Applicant has found that a computationally economical pre-processing may be undertaken to identify a subset of eligible images from the set of acquired retail images.
In a second analyzing step S110, the acquired retail images may be analyzed. With reference to Fig. 3, the analyzing step S110 may comprise, for at least some of the acquired images and preferably for each acquired image, an obtaining step S111, a detecting step S112, a determining step S113 and a rating step S114.
In the obtaining step S111, the retail image representative of the shelving module may be obtained (i.e. received) by the processor module. The processor module may be connected to the imaging module.
In the detecting step S112, a reference feature may be tracked in the retail image. The Applicant has noticed that when the imaging of the shelving module is performed with a line of sight coplanar with the at least one shelf plane, the at least one shelf will correspond in the retail image to a straight line, i.e. a narrow strip whose thickness in the retail image depends on the actual thickness of the shelf, on the position of the imaging sensor along the vertical axis and on optical parameters of the imaging module. In any case, however, the Applicant has advantageously noticed that the flank edge of the at least one shelf will give rise in the retail image to a straight line which is easily detectable, especially when the shelf accommodates retail items. As described herein, the color gradient between the flank edge and its environment is generally high because of the retail items accommodated on the shelf and because of shading effects. Therefore, the detection of the straight line corresponding to the flank edge is facilitated. As shown in Fig. 4A, the flank edges of the three shelves 21, 22, 23 of the shelving module may give rise to three straight lines 210, 220, 230 on the retail image capturing the corresponding flank side of the shelving module. Fig. 6B provides a similar illustration with a real picture captured by a prototype system built by the Applicant.
The tracked reference feature may be a line segment corresponding in the retail image to the flank edge of the at least one shelf of the shelving module. It is appreciated that the term "correspond" is used to associate an object in the physical world with its representation in the acquired image. It is noted that step S112 may lead to detecting a plurality of line segments, for example when the shelving module comprises several shelves. Step S112 may also lead to detecting a plurality of line segments if an object is positioned between the imaging module and the flank edge of the at least one shelf, because the line corresponding to the flank edge would be interrupted by the image of the object. Step S112 may comprise a first step of edge detection involving detecting whether a difference of intensity (or of color intensity) between adjacent pixels is above a predetermined threshold. The step of edge detection may lead to obtaining an edge pixel map in which each pixel may be associated with a binary value depending on the intensity gradient being above or below a predefined threshold. Thereafter, a shape recognition step may be performed. The shape recognition may involve determining how the edge pixels (i.e. pixels for which the intensity gradient is above the predefined threshold) are distributed so as to detect line segments. For example, the first step of edge detection may involve a Canny edge detection algorithm and the second step of shape recognition may involve a Hough transform algorithm.
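By way of illustration only, a minimal sketch of such a two-step detection (Canny edge detection followed by a probabilistic Hough transform) could look as follows, assuming the OpenCV library; the function name and all numeric thresholds are illustrative assumptions and are not taken from the present disclosure.

```python
# Illustrative sketch of step S112 (edge detection followed by shape recognition).
# Assumptions: OpenCV is available; all numeric thresholds are illustrative.
import cv2
import numpy as np

def detect_line_segments(retail_image_bgr,
                         canny_low=50, canny_high=150,
                         hough_threshold=60,
                         min_line_length=80, max_line_gap=10):
    """Detect line segments that may correspond to shelf flank edges."""
    gray = cv2.cvtColor(retail_image_bgr, cv2.COLOR_BGR2GRAY)
    # Edge detection: binary edge pixel map where the intensity gradient is high enough.
    edges = cv2.Canny(gray, canny_low, canny_high)
    # Shape recognition: probabilistic Hough transform returning segments as (x1, y1, x2, y2).
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=hough_threshold,
                               minLineLength=min_line_length,
                               maxLineGap=max_line_gap)
    return [] if segments is None else [tuple(int(v) for v in s[0]) for s in segments]
```

The probabilistic Hough transform is convenient in such a sketch because it returns segments with explicit end points, which the determining step S113 can measure directly.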
In the determining step S113, a dimension (i.e. a pixel size) of each of the one or more line segments may be computed. In some embodiments, line segments having a dimension below a predetermined threshold may be discarded. In fact, line segments with a dimension below the predetermined threshold may not correspond to the flank edge of the shelves. A sum of the dimensions of the line segments detected (and not discarded) may further be computed. As illustrated in Fig. 4B, when a user passes a hand in front of the imaging module and a shelf is at least partially obscured, the line segment corresponding to the shelf will have a correspondingly smaller dimension. Therefore, measuring the size of the line segments in the retail images provides an indication of a probable obstruction of the imaging module and thereby makes it possible to discard such images and avoid unnecessary transmission to the remote control center.
In the rating step S114, a rating is associated with each retail image. The rating may increase with the sum of the pixel sizes of the line segments. In some embodiments, the rating is proportional to the sum of the pixel sizes of the line segments.
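Continuing the illustrative sketch above, the determining and rating steps S113 and S114 could be approximated as follows; the minimum segment length and the proportionality constant are assumptions, not values specified in the disclosure.

```python
# Illustrative sketch of steps S113-S114; thresholds are assumptions.
import math

def rate_retail_image(segments, min_segment_length=40.0, scale=1.0):
    """Rate a retail image from the pixel lengths of its detected line segments."""
    total_length = 0.0
    for x1, y1, x2, y2 in segments:
        length = math.hypot(x2 - x1, y2 - y1)  # dimension (pixel size) of the segment
        if length < min_segment_length:
            continue  # likely not a shelf flank edge; discard
        total_length += length
    # Rating increasing with (here: proportional to) the sum of segment lengths.
    return scale * total_length
```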
Optionally, in an additional step (not shown in Fig. 3), an orientation of the line segments may also be determined. The orientation of a line segment may be defined as an angle α formed by the line segment with a horizontal line in the retail image. As shown with reference to Figs. 5 and 6A, this makes it possible to detect that the line of sight of the imaging module is not perpendicular to the flank edge. The rating may decrease with the orientation of the line segments. In some embodiments, the rating may be set to zero when the orientation is above a predetermined threshold angle.
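A minimal sketch of this optional orientation check is given below; the threshold angle and the linear decrease of the weight are illustrative assumptions.

```python
# Illustrative sketch of the optional orientation check; values are assumptions.
import math

def orientation_weight(segment, max_angle_deg=15.0):
    """Return a weight in [0, 1] decreasing with the segment's angle to the horizontal."""
    x1, y1, x2, y2 = segment
    angle = abs(math.degrees(math.atan2(y2 - y1, x2 - x1)))
    angle = min(angle, 180.0 - angle)  # angle to the horizontal, in [0, 90] degrees
    if angle > max_angle_deg:
        return 0.0  # rating forced to zero above the threshold angle
    return 1.0 - angle / max_angle_deg
```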
Optionally, the rating may also take into account other parameters such as an indicator of motion blur and/or of the FOV.
In a third discarding step S120, the rating of the retail images may be compared to an eligibility criterion and the images which do not satisfy the eligibility criterion may be discarded so as to obtain a subset of eligible images. The criterion may be defined by a criterion threshold above which an image is considered to be eligible.
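The discarding step S120 could then be sketched as follows; the eligibility threshold is an illustrative assumption.

```python
# Illustrative sketch of the discarding step S120; the threshold is an assumption.
def select_eligible_images(rated_images, eligibility_threshold=200.0):
    """Keep only the (image, rating) pairs whose rating meets the eligibility criterion."""
    return [image for image, rating in rated_images if rating >= eligibility_threshold]
```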
In a fourth step S130, the subset of eligible images is communicated to a remote control center, for example using a 3G telecommunication network. In some embodiments, the ineligible images may not be communicated to the remote control center.
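For illustration, the sketches above could be chained as follows for the images of one event; the transmission itself is only indicated by a comment, since the disclosure merely specifies that the eligible subset is uploaded, for example over a 3G link.

```python
# Illustrative chaining of the sketches above (steps S110-S130) for one event.
def process_event(frames_bgr, eligibility_threshold=200.0):
    rated = [(frame, rate_retail_image(detect_line_segments(frame)))
             for frame in frames_bgr]
    eligible = select_eligible_images(rated, eligibility_threshold)
    # The eligible subset would then be uploaded to the remote control center,
    # for example over a 3G connection; ineligible images are simply dropped.
    return eligible
```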
Figs. 7A and 7B illustrate a retail unit 1 according to some embodiments of the present disclosure. The retail unit 1 comprises a shelving module 20, a door module 30, an imaging module 40 and a processor module 50. Fig. 7A shows the retail unit with the door module 30 open while Fig. 7B shows the retail unit with the door module 30 closed. It is noted that, even though the illustrated embodiment refers to a retail unit 1 in the form of a refrigerator, the features described herein below can be extended to different types of retail units such as all kinds of cabinets.
The shelving module 20 is capable of accommodating one or more retail items 10. In some embodiments, the shelving module 20 may comprise one or more shelves 21-25. The retail unit may further comprise a housing 27 partially enclosing the shelving module 20. The housing 27 may comprise an access opening enabling a user to access the one or more shelves 21-25 so as to allow seizing of a retail item from the one or more shelves 21-25. In other words, the housing 27 and the shelving module 20 may define a structure in which the retail items are intended to be accommodated, and the access opening may be defined as an aperture (or an entrance surface) of the housing enabling access to the retail items. The one or more shelves 21-25 may be disposed in parallel within the housing 27 and may be capable of storing and/or displaying the one or more retail items 10. The one or more retail items may for example comprise drinks, food products, eyewear, medicine, etc.
The door module 30 is configured to close at least a part of the access opening. The door module 30 may comprise a door panel 31. In some embodiments, the door panel 31 may be transparent so as to allow people standing in front of the retail unit 1 to see the retail items 10 intended to be accommodated in the retail unit 1. The door module 30 may comprise a handle (not shown) and may be opened and closed upon operation of the handle by a user. The door panel 31 may be configured to pivot away from the access opening of the shelving module 20. The door panel 31 may further be configured to close the access opening when the door module 30 is operated. In some embodiments, a hinge mechanism (not shown) may be positioned on a peripheral portion of the access opening so as to define a pivot axis Δ of the door panel 31.
The imaging module 40 may be coupled to (or arranged on) the door module 30 and configured to acquire images of the shelving module 20. More precisely, the imaging module 40 may be configured for imaging the one or more retail items 10 when the one or more retail items 10 are accommodated in the shelving module 20. The imaging module 40 may be arranged on the side of the door panel 31 facing the shelving module 20. Arranging the imaging module 40 as described enables monitoring of enclosed shelving modules, particularly for brand management purposes. In fact, monitoring enclosed cabinets is generally difficult because either the door is closed and it is not possible to image through the door (for example because the door is opaque or because frost deposited on the door surface causes image distortion), or the door is opened by a user but the user stands in front of the shelving module and obstructs (obscures) the imaging. Therefore, arranging the imaging module as previously described notably enables reducing image obstruction by users removing/inserting items from/into the shelving module 20.
The imaging module 40 may comprise one or more imaging sensors 41, 42. The imaging sensors 41, 42 may for example be digital cameras. The FOV of the imaging sensors may enable imaging of the access opening. Preferably, when the imaging module comprises several cameras, the FOVs of the cameras may overlap so as to enable stitching of images to increase the FOV and image the whole access opening. The cameras may have an FOV of 75x60 degrees. With reference to Fig. 8, and generally in some embodiments in which the door module 30 comprises a door panel 31 pivoting away from the access opening of the shelving module 20, the one or more imaging sensors 41, 42 may each be mounted on the door panel 31 so that a line of sight 415 of said imaging sensors is perpendicular to the access opening when the door panel 31 is open at a predetermined opening angle Θ. This enables configuring the imaging sensors 41, 42 so as to "face" the access opening when the imaging sensors 41, 42 are at a predetermined distance from the access opening and thereby increases the field of view. In some embodiments, the opening angle Θ may be set between 45° and 75°, preferably around 60°. The Applicant has found that this angle range corresponds to the position people usually assume when holding the door open for removing or inserting items in a cabinet. Therefore, by configuring the imaging sensors so as to face the access opening at a predetermined angle within the aforementioned angle range, distortions due to door motion are advantageously reduced and imaging is improved. Furthermore, the one or more imaging sensors 41, 42 may be configured so that the fields of view 411, 412 of the one or more imaging sensors 41, 42 enable imaging of the whole access opening when the door panel 31 is open at the predetermined opening angle Θ. The one or more imaging sensors 41, 42 may be arranged in a peripheral portion of the door panel 31. Advantageously, at least some of the imaging sensors may be arranged on a side of the door panel 31 opposite to the side of the pivot axis Δ. The Applicant has found that, in some embodiments in which two imaging sensors 41, 42 are placed on the side of the door panel 31 opposite to the pivot axis Δ, it is advantageous to position the imaging sensors respectively in the first third and in the last third of the door panel length, wherein the length of the door panel 31 is defined along a direction parallel to the pivot axis Δ. This notably enables further reducing imaging obstruction. The Applicant has also observed that, in some embodiments in which at least one imaging sensor is positioned on the side of the door panel 31 opposite to the pivot axis Δ, it is advantageous to position the imaging sensor either in the first or in the last third of the door panel length. Therefore, at least one imaging sensor may advantageously be positioned either in the first or in the last third of the door panel length.
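As a purely illustrative geometric check, not taken from the disclosure, the width covered by a lateral FOV when a sensor faces a plane at distance d is 2 * d * tan(FOV/2); the distance used in the example below is an assumption.

```python
# Purely illustrative geometry (not from the disclosure): coverage of a lateral FOV.
import math

def lateral_coverage(distance_m, lateral_fov_deg=75.0):
    """Width (in metres) seen by a sensor facing a plane at the given distance."""
    return 2.0 * distance_m * math.tan(math.radians(lateral_fov_deg) / 2.0)

# Example: at an assumed 0.45 m from the opening, a 75-degree lateral FOV
# covers about 0.69 m, which can be compared with the opening width W.
```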
Furthermore, the one or more imaging sensors 41, 42 may be mounted in one or more corresponding sensor housings. The sensor housings may be mounted on the inner side of the door panel 31. The sensor housings may allow for adjustment and calibration of the angle of the imaging sensor in two perpendicular planes. For example, a sensor housing may allow for an angle adjustment of about 60° in either of the two perpendicular planes.
The imaging sensors 41, 42 may be configured to be activated only upon detection of the opening of the door module 30. This enables reducing the amount of images to acquire and/or to communicate and/or to store. Alternatively, the imaging sensors 41, 42 may be configured for continuously acquiring images at a predetermined frequency and be further configured to increase the image acquisition frequency upon detection of the opening of the door module. The increased image acquisition frequency may be around 15-17 images per second, whereas the continuous acquisition frequency may be around 1-2 images per second. The series of images acquired between an opening and a subsequent closing of the door module may be recorded as an event and the images within each event may be numbered in sequence. All imaging sensors on the door module may take corresponding images that are identified in sequence within each event, i.e. images acquired simultaneously by different imaging sensors may be identified according to their sequence number. Further, the number of images acquired in an event is correlated to the average velocity of manual door opening, which is estimated at 0.5 m/sec. In some embodiments, the retail unit further comprises a switch that connects when the door module is open and disconnects when the door module is closed so as to detect door module opening. In some embodiments in which the imaging sensors are configured for continuous acquisition, motion detection may be performed by analysis of successive images.
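A minimal sketch of motion detection by analysis of successive images is shown below; the pixel threshold and the changed-area fraction are illustrative assumptions.

```python
# Illustrative sketch of motion detection by successive-image analysis; values are assumptions.
import cv2
import numpy as np

def motion_detected(previous_frame_bgr, current_frame_bgr,
                    pixel_threshold=25, changed_fraction=0.02):
    """Return True when enough pixels changed between two successive frames."""
    prev_gray = cv2.cvtColor(previous_frame_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(current_frame_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)
    changed_pixels = np.count_nonzero(diff > pixel_threshold)
    return changed_pixels > changed_fraction * diff.size
```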
The processor module 50 is connected to the one or more imaging sensors 41, 42 and configured to receive the images acquired by said imaging sensors 41, 42. In some embodiments, the one or more imaging sensors 41, 42 are connected to the processor module 50 via a mini Universal Serial Bus (USB) connector and one or more USB cables that may additionally power the imaging sensors 41, 42. The one or more USB cables may be tucked into a gasket around the door panel in order to conceal them and to avoid interfering with the movement of the door panel. The processor module 50 may comprise a mini Personal Computer (PC) with one or more USB inputs on the motherboard (for example 6 to 8 inputs). The processor module 50 may use Solid State Drive (SSD) memory and may run a specially adapted Linux™-based operating system. The processor module 50 may be powered by an external source and may supply power to the imaging sensors 41, 42 via the USB cables. The processor module 50 may be housed in a compact casing that can be easily mounted on the top or side of the refrigerator. Advantageously, the processor module may be concealed behind a commercial banner. Further, the processor module 50 may be configured for performing image processing. For example, the processor module 50 may be configured to detect a change between two successive images received from the same imaging sensor so as to detect opening and closing of the door module. Further, in some embodiments in which several imaging sensors are used to image the access opening, the processor module 50 may be configured for stitching images received from different imaging sensors at the same instant so as to increase the FOV and allow reconstructing complete images of the access opening. Additionally, the processor module may be configured to perform the methods of analyzing retail images and of inspecting a retail unit previously described. The processor module 50 may connect to the Internet via a wired or wireless connection such as 3G/4G/WiFi/WiMAX, etc. In some embodiments in which a wireless connection is used, a wireless module may be connected directly to the motherboard or via a USB port. In some embodiments in which a 3G connection is used, the 3G module may use a local 3G SIM card in order to connect to the network and allow the processor module 50 to upload either the acquired images or the mixed images to the remote control center. Once the images are uploaded to the remote control center, the images may be processed and items accommodated on the shelving module 20 can be analyzed to check whether predefined assortment rules are respected by the retail store where the retail unit is provided. The Internet connection may also allow for remote software updates of the processor module 50. A power adapter for 110/220V may power the processor module 50 and the imaging sensors 41, 42.
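As an illustration of the stitching functionality, and not of the Applicant's actual implementation, simultaneous frames from several imaging sensors could be combined with OpenCV's high-level stitcher; the use of SCANS mode is an assumption suited to roughly planar shelf scenes.

```python
# Illustrative sketch of stitching simultaneous frames from several imaging sensors.
import cv2

def stitch_simultaneous_frames(frames_bgr):
    """Stitch frames acquired at the same instant by different imaging sensors."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, panorama = stitcher.stitch(frames_bgr)
    return panorama if status == cv2.Stitcher_OK else None
```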
While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those of ordinary skill in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention. It will be appreciated that the embodiments described above are cited by way of example, and various features thereof and combinations of these features can be varied and modified.
While various embodiments have been shown and described, it will be understood that there is no intent to limit the invention by such disclosure, but rather, it is intended to cover all modifications and alternate constructions falling within the scope of the invention, as defined in the appended claims.

CLAIMS:
1. A method of analyzing a retail image, the method comprising:
- obtaining a retail image representative of a flank side of a shelving module including at least one shelf;
- detecting in the retail image at least one line segment corresponding to a flank edge of the at least one shelf;
- determining a dimension of the line segment in the retail image; and
- rating the retail image based on the dimension of the line segment.
2. The method according to any one of the preceding claims, wherein detecting the line segment comprises performing edge detection on the retail image.
3. The method according to any one of the preceding claims, wherein detecting the line segment comprises performing shape recognition on the retail image.
4. The method according to any one of the preceding claims, further comprising determining an orientation of the line segment with respect to a reference orientation and wherein rating the image is further based on the orientation of the line segment.
5. A method of inspecting a shelving module including at least one shelf, the method comprising:
- imaging the shelving module so as to obtain a set of retail images representative of a flank of the shelving module;
- analyzing each retail image according to the method of any of claims 1-4;
- discarding retail images whose rating does not meet a predetermined eligibility criterion thereby obtaining a subset of eligible retail images; and
- transmitting the subset of eligible retail images to a remote control center.
6. The method according to claim 5, wherein imaging the shelving module is performed upon detection of motion near the shelving module.
7. The method according to claim 5 or 6, wherein a pivotable door module is arranged on the shelving module and wherein the imaging is performed upon opening of the door module.
8. A processor module for analyzing a retail image, the processor module comprising a processor coupled with a memory storing computer readable instructions that, when executed, cause the processor module to perform the steps of:
- obtaining a retail image representative of a flank side of a shelving module including at least one shelf;
- detecting in the retail image at least one line segment corresponding to a flank edge of the at least one shelf;
- determining a dimension of the line segment in the retail image;
- rating the retail image based on the dimension of the line segment.
9. The processor module according to claim 8, wherein the memory further stores computer readable instructions that, when executed by the processor, cause the processor module to perform the step of determining an orientation of the line segment, and wherein rating the image is further based on the orientation of the line segment.
10. The processor module according to any one of claims 8 and 9, wherein the memory further stores computer readable instructions that, when executed by the processor and upon reception of a set of retail images, cause the processor module to perform the steps of:
- analyzing the set of retail images by successively performing the steps of claim 8 or 9 on each retail image;
- discarding retail images whose rating does not meet a predetermined eligibility criterion thereby obtaining a subset of eligible retail images.
11. A retail inspection system comprising:
an imaging module configured to acquire a set of retail images upon activation; and
the processor module according to claim 10, the processor module being further configured to:
o receive the set of images from the imaging module; and
o transmit the subset of eligible retail images to a remote control center.
12. A retail unit comprising:
a shelving module including at least one shelf;
a door module configured to pivot away from the shelving module;
a retail inspection system according to claim 11, wherein the imaging module is arranged on an inner side of the door module, thereby allowing imaging of a flank side of the shelving module.
13. The retail unit according to claim 12, wherein the imaging module is arranged so that an optical axis of the imaging module is substantially parallel to a plane of extension of the at least one shelf.
14. A computer program adapted to perform the method of any one of claims 1 to 7.
15. A computer readable storage medium comprising the program of claim 14.
16. A method of analyzing a retail image, the method comprising:
obtaining a retail image representative of a retail display module, wherein the retail display module comprises at least one predefined reference element;
detecting on the retail image a reference feature corresponding to the at least one reference element of the display module;
determining a dimensional parameter of the reference feature in the retail image;
rating the retail image based on the dimensional parameter of the reference feature.
PCT/IL2014/050377 2013-05-05 2014-04-24 System and method of retail image analysis Ceased WO2014181323A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL22614713 2013-05-05
IL226147 2013-05-05

Publications (1)

Publication Number Publication Date
WO2014181323A1 true WO2014181323A1 (en) 2014-11-13

Family

ID=50842303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2014/050377 Ceased WO2014181323A1 (en) 2013-05-05 2014-04-24 System and method of retail image analysis

Country Status (1)

Country Link
WO (1) WO2014181323A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2851833A1 (en) * 2003-02-27 2004-09-03 Alcon Diaz Consulting METHOD FOR MEASURING THE LINEAR OF A PRODUCT ON A SHELF
WO2008107150A1 (en) * 2007-03-02 2008-09-12 Baumer Electric Ag Monitoring system, in particular for analyzing the fill level of shelves
WO2009027836A2 (en) * 2007-08-31 2009-03-05 Accenture Global Services Gmbh Determination of inventory conditions based on image processing

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10368662B2 (en) 2013-05-05 2019-08-06 Trax Technology Solutions Pte Ltd. System and method of monitoring retail units
US9424482B2 (en) 2013-06-12 2016-08-23 Symbol Technologies, Llc Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
US9697429B2 (en) 2013-06-12 2017-07-04 Symbol Technologies, Llc Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
GB2520409B (en) * 2013-10-31 2016-06-08 Symbol Technologies Llc Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
GB2520409A (en) * 2013-10-31 2015-05-20 Symbol Technologies Inc Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
US10387996B2 (en) 2014-02-02 2019-08-20 Trax Technology Solutions Pte Ltd. System and method for panoramic image processing
US10402777B2 (en) 2014-06-18 2019-09-03 Trax Technology Solutions Pte Ltd. Method and a system for object recognition
US10352689B2 (en) 2016-01-28 2019-07-16 Symbol Technologies, Llc Methods and systems for high precision locationing with depth values
US9911033B1 (en) 2016-09-05 2018-03-06 International Business Machines Corporation Semi-supervised price tag detection
US11042161B2 (en) 2016-11-16 2021-06-22 Symbol Technologies, Llc Navigation control method and apparatus in a mobile automation system
US10726273B2 (en) 2017-05-01 2020-07-28 Symbol Technologies, Llc Method and apparatus for shelf feature and object placement detection from shelf images
US10591918B2 (en) 2017-05-01 2020-03-17 Symbol Technologies, Llc Fixed segmented lattice planning for a mobile automation apparatus
US10663590B2 (en) 2017-05-01 2020-05-26 Symbol Technologies, Llc Device and method for merging lidar data
US10505057B2 (en) 2017-05-01 2019-12-10 Symbol Technologies, Llc Device and method for operating cameras and light sources wherein parasitic reflections from a paired light source are not reflected into the paired camera
US11978011B2 (en) 2017-05-01 2024-05-07 Symbol Technologies, Llc Method and apparatus for object status detection
US11367092B2 (en) 2017-05-01 2022-06-21 Symbol Technologies, Llc Method and apparatus for extracting and processing price text from an image set
US11449059B2 (en) 2017-05-01 2022-09-20 Symbol Technologies, Llc Obstacle detection for a mobile automation apparatus
US11093896B2 (en) 2017-05-01 2021-08-17 Symbol Technologies, Llc Product status detection system
US10949798B2 (en) 2017-05-01 2021-03-16 Symbol Technologies, Llc Multimodal localization and mapping for a mobile automation apparatus
US11600084B2 (en) 2017-05-05 2023-03-07 Symbol Technologies, Llc Method and apparatus for detecting and interpreting price label text
US10521914B2 (en) 2017-09-07 2019-12-31 Symbol Technologies, Llc Multi-sensor object recognition system and method
US10572763B2 (en) 2017-09-07 2020-02-25 Symbol Technologies, Llc Method and apparatus for support surface edge detection
US10489677B2 (en) 2017-09-07 2019-11-26 Symbol Technologies, Llc Method and apparatus for shelf edge detection
US11327504B2 (en) 2018-04-05 2022-05-10 Symbol Technologies, Llc Method, system and apparatus for mobile automation apparatus localization
US10832436B2 (en) 2018-04-05 2020-11-10 Symbol Technologies, Llc Method, system and apparatus for recovering label positions
US10823572B2 (en) 2018-04-05 2020-11-03 Symbol Technologies, Llc Method, system and apparatus for generating navigational data
US10809078B2 (en) 2018-04-05 2020-10-20 Symbol Technologies, Llc Method, system and apparatus for dynamic path generation
US10740911B2 (en) 2018-04-05 2020-08-11 Symbol Technologies, Llc Method, system and apparatus for correcting translucency artifacts in data representing a support structure
CN109035278A (en) * 2018-07-25 2018-12-18 深圳市荣盛智能装备有限公司 The detection method of fire exit door and its switch state based on image
CN109035278B (en) * 2018-07-25 2021-09-17 深圳市荣盛智能装备有限公司 Image-based detection method of fire door and its switch state
US11010920B2 (en) 2018-10-05 2021-05-18 Zebra Technologies Corporation Method, system and apparatus for object detection in point clouds
US11506483B2 (en) 2018-10-05 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for support structure depth determination
US11003188B2 (en) 2018-11-13 2021-05-11 Zebra Technologies Corporation Method, system and apparatus for obstacle handling in navigational path generation
US11090811B2 (en) 2018-11-13 2021-08-17 Zebra Technologies Corporation Method and apparatus for labeling of support structures
US11079240B2 (en) 2018-12-07 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for adaptive particle filter localization
US11416000B2 (en) 2018-12-07 2022-08-16 Zebra Technologies Corporation Method and apparatus for navigational ray tracing
US11100303B2 (en) 2018-12-10 2021-08-24 Zebra Technologies Corporation Method, system and apparatus for auxiliary label detection and association
US11015938B2 (en) 2018-12-12 2021-05-25 Zebra Technologies Corporation Method, system and apparatus for navigational assistance
US10731970B2 (en) 2018-12-13 2020-08-04 Zebra Technologies Corporation Method, system and apparatus for support structure detection
US11592826B2 (en) 2018-12-28 2023-02-28 Zebra Technologies Corporation Method, system and apparatus for dynamic loop closure in mapping trajectories
US11080566B2 (en) 2019-06-03 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for gap detection in support structures with peg regions
US11662739B2 (en) 2019-06-03 2023-05-30 Zebra Technologies Corporation Method, system and apparatus for adaptive ceiling-based localization
US11151743B2 (en) 2019-06-03 2021-10-19 Zebra Technologies Corporation Method, system and apparatus for end of aisle detection
US11960286B2 (en) 2019-06-03 2024-04-16 Zebra Technologies Corporation Method, system and apparatus for dynamic task sequencing
US11200677B2 (en) 2019-06-03 2021-12-14 Zebra Technologies Corporation Method, system and apparatus for shelf edge detection
US11402846B2 (en) 2019-06-03 2022-08-02 Zebra Technologies Corporation Method, system and apparatus for mitigating data capture light leakage
US11341663B2 (en) 2019-06-03 2022-05-24 Zebra Technologies Corporation Method, system and apparatus for detecting support structure obstructions
US11507103B2 (en) 2019-12-04 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for localization-based historical obstacle handling
US11107238B2 (en) 2019-12-13 2021-08-31 Zebra Technologies Corporation Method, system and apparatus for detecting item facings
US11822333B2 (en) 2020-03-30 2023-11-21 Zebra Technologies Corporation Method, system and apparatus for data capture illumination control
US11450024B2 (en) 2020-07-17 2022-09-20 Zebra Technologies Corporation Mixed depth object detection
US11593915B2 (en) 2020-10-21 2023-02-28 Zebra Technologies Corporation Parallax-tolerant panoramic image generation
US11392891B2 (en) 2020-11-03 2022-07-19 Zebra Technologies Corporation Item placement detection and optimization in material handling systems
US11847832B2 (en) 2020-11-11 2023-12-19 Zebra Technologies Corporation Object classification for autonomous navigation systems
US11954882B2 (en) 2021-06-17 2024-04-09 Zebra Technologies Corporation Feature-based georegistration for mobile computing devices
DE102024115850A1 (en) * 2024-06-06 2025-12-11 SSI Schäfer AG Automated shelf inspection in high-bay warehouses

Similar Documents

Publication Publication Date Title
WO2014181323A1 (en) System and method of retail image analysis
EP2994025B1 (en) System and method of monitoring retail units
US9124778B1 (en) Apparatuses and methods for disparity-based tracking and analysis of objects in a region of interest
TWI618916B (en) Method and system for estimating stock on shelf
US9635220B2 (en) Methods and systems for suppressing noise in images
CN114202537B (en) Camera imaging defect detection method, showcase and storage medium
US20210183077A1 (en) Detecting Motion in Images
US12303045B1 (en) Display case door with interior facing camera
US11832025B2 (en) System and methods for computerized health and safety assessments
CN105358924A (en) Refrigeration appliance comprising a camera module
WO2012012555A1 (en) Methods and systems for audience digital monitoring
US20140023243A1 (en) Kernel counter
US11861900B2 (en) Multi-view visual data damage detection
CN111310733A (en) Method, device and equipment for detecting personnel entering and exiting based on monitoring video
EP2826010A1 (en) Method and arrangement for analysing the behaviour of a moving object
EP2257924B1 (en) Method for generating a density image of an observation zone
DE102013211095A1 (en) Refrigeration device with a door
CN113723384A (en) Intelligent order generation method based on fusion after multi-view image acquisition and intelligent vending machine
CN111666792A (en) Image recognition method, image acquisition and recognition method and commodity recognition method
US10726581B2 (en) System and method for scene-space video processing
CN107527363B (en) Refrigerating device storage management system and refrigerating device
CN107527060B (en) Refrigerating device storage management system and refrigerating device
US20240070880A1 (en) 3d virtual construct and uses thereof
CN117795274A (en) Refrigeration device with parallel camera
US20230308611A1 (en) Multi-camera vision system in a refrigerator appliance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14727259

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14727259

Country of ref document: EP

Kind code of ref document: A1