US20230055289A1 - Systems and methods for guided item delivery operations - Google Patents
Systems and methods for guided item delivery operations
- Publication number
- US20230055289A1 (United States application US 17/888,308)
- Authority
- US
- United States
- Prior art keywords
- item
- storage location
- container
- computing device
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/067—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
- G06K19/07—Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components with integrated circuit chips
- G06K19/077—Constructional details, e.g. mounting of circuits in the carrier
- G06K19/07749—Constructional details, e.g. mounting of circuits in the carrier the record carrier being capable of non-contact communication, e.g. constructional details of the antenna of a non-contact smart card
- G06K19/07758—Constructional details, e.g. mounting of circuits in the carrier the record carrier being capable of non-contact communication, e.g. constructional details of the antenna of a non-contact smart card arrangements for adhering the record carrier to further objects or living beings, functioning as an identification tag
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10009—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
- G06K7/10366—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves the interrogation device being adapted for miscellaneous applications
- G06K7/10415—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves the interrogation device being adapted for miscellaneous applications the interrogation device being fixed in its position, such as an access control device for reading wireless access cards, or a wireless ATM
- G06K7/10425—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation sensing by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves the interrogation device being adapted for miscellaneous applications the interrogation device being fixed in its position, such as an access control device for reading wireless access cards, or a wireless ATM the interrogation device being arranged for interrogation of record carriers passing by the interrogation device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/225—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on a marking or identifier characterising the area
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30124—Fabrics; Textile; Paper
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- Transportation and delivery of items typically involves loading the items into a container for transportation, such as a vehicle (e.g., a delivery van or the like). Items may then be retrieved from the container and deposited at respective delivery locations, such as residences or businesses. The process by which the items are loaded into the container, however, may impede rapid retrieval of items from the container for delivery.
- FIG. 1 is a diagram of a system for guided item delivery.
- FIG. 2 is a diagram illustrating certain components of the computing device of FIG. 1 .
- FIG. 3 is a diagram illustrating an arrangement of sensors and output devices in the vehicle of FIG. 1 .
- FIG. 4 is a flowchart of a method for guided item delivery.
- FIG. 5 is a diagram illustrating an example performance of blocks 405 and 410 of the method of FIG. 4 .
- FIG. 6 is a diagram illustrating an example performance of block 420 of the method of FIG. 4 .
- FIG. 7 is a diagram illustrating an example performance of block 435 of the method of FIG. 4 .
- FIG. 8 is a diagram illustrating another example performance of block 435 of the method of FIG. 4 .
- Examples disclosed herein are directed to a method, comprising: obtaining an item identifier corresponding to an item for placement in a container, the item being associated with a delivery destination; in response to placement of the item at a storage position within the container, recording a storage location including coordinates of the storage position; monitoring transit information of the container during transport of the container; when the transit information of the container corresponds to the delivery destination, retrieving the recorded storage location of the item within the container; and controlling an output assembly to generate item retrieval guidance based on the recorded storage location.
- Additional examples disclosed herein are directed to a computing device, comprising: a memory; and a processor configured to: obtain an item identifier corresponding to an item for placement in a container, the item being associated with a delivery destination; in response to placement of the item at a storage position within the container, record a storage location including coordinates of the storage position; monitor transit information of the container during transport of the container; when the transit information of the container corresponds to the delivery destination, retrieve the recorded storage location of the item within the container; and control an output assembly to generate item retrieval guidance based on the recorded storage location.
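- The claimed flow above amounts to a record-then-guide loop: record coordinates at load time, then surface them when the container reaches the matching destination. The Python sketch below is an illustrative reading of those steps only; the class name, method names, and the use of a simple destination key are assumptions for demonstration, not details from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class GuidedDelivery:
    """Skeleton of the claimed flow: record at load time, guide at delivery time."""
    locations: dict = field(default_factory=dict)     # item_id -> (x, y, z) coords
    destinations: dict = field(default_factory=dict)  # item_id -> destination key

    def record_placement(self, item_id, coords, destination):
        # "Obtain an item identifier" and "record a storage location".
        self.locations[item_id] = coords
        self.destinations[item_id] = destination

    def on_arrival(self, destination):
        # "Retrieve the recorded storage location" for the current destination,
        # ready to drive "item retrieval guidance" via an output assembly.
        for item_id, dest in self.destinations.items():
            if dest == destination:
                yield item_id, self.locations[item_id]

flow = GuidedDelivery()
flow.record_placement("PKG-001", (1.2, 0.4, 0.9), "112-1")
for item_id, coords in flow.on_arrival("112-1"):
    print(f"Retrieval guidance: item {item_id} at {coords}")
```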
- FIG. 1 illustrates a system 100 for guided item delivery operations.
- a wide variety of items, such as packages and other freight, are transported from origin locations to destination locations, often via a variety of intermediate locations. In the illustrated example, items 104-1, 104-2, and 104-3 may be transported from a facility 108 such as a warehouse to destination locations, such as residences 112-1, 112-2, and 112-3.
- the number of items 104 , and the number of destination locations 112 can vary, and need not be equal in other examples (e.g., more than one item 104 can be delivered to a single destination location 112 ).
- the items 104 can be delivered from the facility 108 to the residences 112 in a mobile container, which in the illustrated example is integrated with a vehicle 116 , such as a delivery van.
- vehicle 116 can be implemented as a van, a box truck, a tractor-trailer, or the like and can be controlled by an operator 120 (e.g., a human) or can be autonomous.
- a plurality of items 104 are placed in the storage container of the vehicle 116 , e.g., by one or more loading staff at the facility 108 .
- the vehicle 116 then travels (e.g., under the control of an operator 120 or autonomously) to the residences 112 .
- at each residence, the operator 120 or an autonomous apparatus such as a drone retrieves the relevant item(s) 104 destined for that residence 112, removes the relevant item(s) 104 from the vehicle 116, and delivers the relevant item(s) 104 to the relevant residence 112 before proceeding to the next residence 112.
- Associations between items 104 and residences 112 can be stored in a central repository 124 , which can also contain data defining a delivery route that specifies a sequence in which the vehicle 116 is to travel to the residences 112 .
- the repository 124 can also contain a variety of other data defining the items 104 , such as sender identities and locations, item identifiers (e.g., uniquely distinguishing each item 104 from other items 104 ), item dimensions (e.g., one or more of width, length, and height), item weights, and the like.
- the number of items 104 in the vehicle 116 can impede the speed with which the operator 120 or drone, or recipient (e.g., in implementations in which the vehicle 116 travels to destinations autonomously, and recipients retrieve items from the vehicle 116 ) can locate and retrieve the items 104 from the vehicle 116 for delivery at each residence 112 .
- the vehicle 116 may contain tens or hundreds of items 104 . Locating specific items 104 among the total load of the vehicle 116 may be time-consuming, and certain items 104 may therefore consume suboptimal periods of time to be delivered.
- delivery of an item 104 may be abandoned by the operator 120 or drone, e.g., if locating the item 104 within the vehicle 116 consumes more than a threshold time period (e.g., thirty seconds, although a wide variety of other thresholds are also contemplated).
- the system 100 includes certain components and functionality to track the storage locations of items 104 within the vehicle 116 , e.g., according to a coordinate system established within the vehicle 116 . Tracking the storage locations of the items 104 within the vehicle 116 allows the system 100 to generate retrieval guidance, e.g., via visual and/or audible output(s) perceptible by the operator 120 or drone.
- the retrieval guidance facilitates searching and retrieval of the items 104 from the vehicle 116 by the operator 120 or drone, and may therefore reduce the time consumed by each retrieval and delivery operation (e.g., at a given residence 112 ).
- the system 100 includes a computing device 128 associated with the vehicle 116 and/or the operator 120 .
- the computing device 128 includes, or is communicatively coupled with, sensors disposed within the vehicle 116 as well as an output assembly controllable to generate the retrieval guidance.
- the computing device 128 can also exchange data with the central repository 124 , e.g., via a network 132 implemented as any suitable combination of local and wide-area networks.
- the sensors disposed within the vehicle 116 enable the computing device 128 to track each item 104 as each item 104 is placed within the vehicle 116 , e.g., on a support structure such as a shelf.
- the computing device 128 can therefore record the storage locations of each item 104 .
- the computing device 128 can further obtain storage locations and use the obtained storage locations to generate retrieval guidance upon determining that the vehicle 116 has arrived at a delivery destination (e.g., a residence 112 ).
- the computing device 128 includes a processor 200 , such as a central processing unit (CPU), a graphics processing unit (GPU), or a combination thereof.
- the processor 200 is communicatively coupled with a non-transitory computer-readable storage medium such as a memory 204 , implemented as a suitable combination of volatile and non-volatile memory elements.
- the memory 204 can store a plurality of computer-readable instructions, e.g., in the form of a delivery guidance application 208 executable by the processor 200 to perform functionality discussed in greater detail below.
- the application 208 in other examples, can be implemented as a suite of distinct applications, or as a dedicated hardware element (e.g., an application-specific integrated circuit (ASIC)).
- the computing device 128 also includes a communications interface 212 enabling communication between the device 128 and other computing devices (e.g., a server hosting the central repository 124 ), via suitable short-range links, networks such as the network 132 , and the like.
- the interface 212 therefore includes suitable hardware elements, executing suitable software and/or firmware, to communicate over the network 132 and/or other communication links.
- the computing device 128 includes, or is otherwise communicatively coupled with, a trigger sensor, such as a radio frequency identification (RFID) reader 224 , barcode scanner, or the like.
- the RFID reader 224 can be disposed at a doorway to the storage container of the vehicle 116.
- the RFID reader 224 can include, for example, a directional reader configured to detect RFID tags affixed to items 104 , or affixed to bins or containers carrying the items 104 , as the items 104 pass through the doorway, as well as to detect the direction in which the RFID tags are traveling (i.e., whether an RFID tag is entering or exiting the vehicle 116 ).
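- One plausible realization of the directional read described above is a two-antenna portal at the doorway, with entry or exit inferred from the order in which each tag is seen. The antenna names, the event format, and the inference rule below are illustrative assumptions, not details from the publication.

```python
def classify_directions(reads):
    """Infer per-tag direction from time-ordered (antenna, epc) portal reads.

    Assumes a two-antenna portal: "outer" faces the loading dock, "inner"
    faces the cargo area. A tag seen outer-then-inner entered the vehicle;
    inner-then-outer means it exited.
    """
    first_seen, last_seen = {}, {}
    for antenna, epc in reads:
        first_seen.setdefault(epc, antenna)
        last_seen[epc] = antenna
    events = {}
    for epc in first_seen:
        if first_seen[epc] == "outer" and last_seen[epc] == "inner":
            events[epc] = "entered"
        elif first_seen[epc] == "inner" and last_seen[epc] == "outer":
            events[epc] = "exited"
        else:
            events[epc] = "unknown"
    return events

reads = [("outer", "EPC-104-1"), ("inner", "EPC-104-1")]
print(classify_directions(reads))  # {'EPC-104-1': 'entered'}
```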
- the computing device 128 also includes, or is otherwise communicatively coupled with, a sensor assembly 216 , and an output assembly 220 .
- the sensor assembly 216 includes any one of, or any suitable combination of, sensors configured to track items 104 within the vehicle 116.
- the sensor assembly 216 includes one or more load sensors 228 , e.g., an array of pressure sensors or optical sensors disposed on a support structure (e.g., a shelf) within the vehicle 116 .
- the load sensor 228 can generate data indicating the presence of an item 104 thereon, and optionally one or more of a weight of the item 104 , and dimensions of the item 104 (e.g., a width and length of the item 104 resting on the load sensor 228 ).
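- As a rough sketch of how such load sensor data could be reduced to presence, footprint dimensions, and weight, the following treats the sensor as a grid of per-cell load readings (in kg) and extracts connected regions with a flood fill. The grid representation and 5 cm cell size are assumptions, not disclosed parameters.

```python
def detect_footprints(grid, cell_size_m=0.05):
    """Extract item footprints from a pressure-sensor grid.

    `grid` holds per-cell load in kg (0 where nothing rests). Returns one
    (row, col, width_m, length_m, weight_kg) tuple per connected region.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    items = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > 0 and not seen[r][c]:
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:  # flood fill one contiguous loaded region
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] > 0 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = [y for y, _ in cells]
                xs = [x for _, x in cells]
                items.append((min(ys), min(xs),
                              (max(xs) - min(xs) + 1) * cell_size_m,
                              (max(ys) - min(ys) + 1) * cell_size_m,
                              sum(grid[y][x] for y, x in cells)))
    return items

shelf = [[0, 0,   0,   0],
         [0, 2.1, 2.0, 0],
         [0, 1.9, 2.2, 0]]
print(detect_footprints(shelf))  # one item: ~0.10 m x 0.10 m, ~8.2 kg
```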
- the sensor assembly 216 can further include one or more cameras 232 , such as color and/or depth cameras disposed within the vehicle 116 to observe at least a portion of the storage container of the vehicle 116 . Via the camera(s) 232 , the computing device 128 can obtain sequences of images in which the movement and placement of items 104 within the vehicle 116 can be tracked.
- the sensor assembly 216 can further include one or more mobile cameras 232 a , e.g., implemented as components of a wearable computing device such as a pair of smart glasses or the like.
- the sensor assembly 216 can include a microphone configured to capture voice or other audible signals describing a position of an item 104 within the vehicle 116 .
- the output assembly 220 can include any one of, or any combination of, indicator lights such as one or more laser pointers 236 , e.g., controllable to direct a beam of light towards various positions within the vehicle to provide a visible indication of the location.
- the indicator light(s) can also include one or more electronic labels 240 , e.g., disposed on an edge of a shelf within the vehicle 116 .
- the label(s) 240 can include a controllable display, an array of addressable light emitting diodes (LEDs), a reflecting or fluorescing surface, or the like.
- the label(s) 240 can be affixed to items 104 , e.g., as distinct controllable electronic labels affixed to respective items 104 prior to loading into the vehicle 116 , and connectable with the computing device 128 via short-range radio technologies (e.g., RFID, Bluetooth, or the like).
- the output assembly 220 can further include one or more speakers 244 , e.g., mounted within the vehicle 116 , controllable to generate output audible to the operator 120 .
- the output assembly 220 can include one or more displays 248 , e.g., flat panel or other suitable displays, disposed within the vehicle 116 and controllable by the processor 200 to present various information to the operator 120 .
- certain components of the sensor assembly 216 and/or the output assembly 220 can be integrated with a further computing device, distinct from and in communication with the computing device 128 .
- the computing device 128 can be deployed in the vehicle 116 as a tablet computer affixed to the vehicle 116 .
- the operator 120, meanwhile, can carry a mobile computing device such as a wrist-mounted computer and/or a head-mounted device 308 (e.g., smart glasses, or the like).
- the device 128 carried by the operator 120 can include one or more sensors and/or output devices, such as a camera, a display, and a speaker.
- the computing device 128 can therefore obtain sensor data directly from some sensors, and control some output devices directly, while obtaining sensor data from other sensors via the computing device 128 carried by the operator 120 .
- the computing device 128 can also control some output devices directly, while controlling other output devices by sending instructions to the computing device 128 carried by the operator 120 .
- turning to FIG. 3, an example arrangement of certain elements of the sensor assembly 216 and the output assembly 220 within the vehicle 116 is illustrated. The sensors 216 (e.g., the cameras 232) and output devices 220 (e.g., the laser pointer 236) can be disposed at various positions within the vehicle 116.
- the RFID sensor(s) 224 can be disposed at a doorway 300 into the cargo area. In vehicles 116 with more than one doorway to the cargo area, additional RFID sensors may be provided.
- the vehicle 116 can include at least one support structure such as a shelf 304 (two shelves 304 at approximately the same height are shown in FIG. 3 ), onto which items 104 can be placed. At least one of the shelves 304 (and in the illustrated example, both of the shelves 304 ) may carry a load sensor 228 , enabling the locations of items 104 on the shelves 304 to be detected and reported to the computing device 128 .
- the vehicle 116 can also, in some examples, include load sensors 228 on other support structures, such as a floor of the vehicle 116 .
- the vehicle 116 can further include at least one camera 232 , e.g., mounted to a ceiling of the cargo area, with a field of view (FOV) that encompasses at least a portion of the cargo area.
- more than one camera 232 can be deployed to provide greater coverage of the cargo area.
- the cameras 232 can be mounted to walls instead of, or in addition to, the ceiling.
- the cameras 232 may also have movable lens assemblies, to redirect their FOVs.
- the laser pointer 236 is also shown as being ceiling-mounted in the present example, and can also have a movable emitter, enabling the computing device 128 to control the direction in which the laser pointer 236 emits a beam of light.
- the electronic label 240 is disposed on an edge of a shelf 304 (each shelf edge in the vehicle can carry an electronic label 240 , in some examples), such that the label 240 faces into the aisle of the cargo area, and is thus visible to the operator 120 when the operator 120 is in the cargo area.
- some sensors and/or output devices can be integrated with a mobile computing device associated with the vehicle 116 .
- the operator 120 can carry a wearable computing device 128 such as a wrist-mounted device (not shown) or a pair of smart glasses 308 .
- the glasses 308 can include the display 248 , e.g., to implement a heads-up display mechanism.
- the glasses 308 can also include the mobile camera 232 a , and/or in some implementations can include a mobile laser pointer 236 . Because the glasses 308 are mobile, the camera 232 a is not at a fixed, predetermined position within the vehicle 116 , as is the case with the cameras 232 , the laser pointer 236 , and the like.
- the glasses 308 can therefore also include a motion sensor such as an inertial measurement unit (IMU).
- the glasses 308 can, via control of the IMU and the camera 232 a affixed to the glasses 308 , track the location of the glasses 308 within the cargo area of the vehicle 116 , and thereby register images captured by the glasses 308 to a predetermined coordinate system of the cargo area.
- turning to FIG. 4, a method 400 for guided item delivery operations is shown.
- the method 400 will be described in conjunction with its performance in the system 100 (e.g., by the computing device 128 in conjunction with the sensor assembly 216 and the output assembly 220 ).
- the computing device 128 determines and records a location of an item 104 within the vehicle 116 (e.g., according to a predetermined coordinate system representing the space within the cargo area of the vehicle 116 ), and later generates perceptible output indicating that location, to facilitate retrieval of the item by the operator 120 , e.g., reducing time spent searching for the item 104 by the operator 120 .
- the computing device 128 is configured to obtain an identifier of an item 104 to be placed within the cargo area of the vehicle 116 .
- a plurality of items 104 may be placed in a staging area of the facility 108 , for example, in preparation for loading the items 104 into the vehicle 116 .
- Each item 104 is previously assigned an identifier (e.g., an alphanumeric string) that uniquely distinguishes the item 104 from other items 104 in transit.
- the computing device 128 can be configured to obtain the item identifier via the RFID reader 224 , e.g., when the item 104 crosses the doorway 300 into the vehicle 116 .
- the item 104 includes an RFID tag storing the item identifier (e.g. embedded in a label affixed to the item 104 ), and the RFID tag is configured to transmit the item identifier to the reader 224 upon interrogation by the reader 224 .
- the computing device 128 can obtain the item identifier from a barcode scanner, e.g., implemented by the glasses 308 , incorporated into the computing device 128 , or another imaging device associated with the vehicle 116 or the facility 108 .
- upon obtaining the item identifier, the computing device 128 is configured to initiate tracking of the item 104, via at least one sensor of the sensor assembly 216.
- the computing device 128 can control the cameras 232 to begin capturing respective sequences of images depicting the interior of the vehicle 116 .
- the images from the cameras 232 may be combined, using stored calibration data defining the location of each camera 232 within the vehicle 116 , to generate a composite image.
- the location of the camera 232 a mounted on the glasses 308 is generally not stored in predetermined calibration data, as the glasses 308 are mobile relative to the vehicle 116 .
- Images captured by the glasses 308 can instead be registered to a coordinate system corresponding to the glasses themselves, and the glasses 308 can include processing hardware configured to track the location of the glasses 308 (and therefore the camera 232 a included thereon) within the vehicle 116 , e.g., via use of images from the camera 232 a and the IMU or other motion sensor.
- Various mechanisms will occur to those skilled in the art to combine image and/or motion sensor data to generate pose estimations. Examples of such mechanisms include those implemented by the ARCore software development kit provided by Google LLC, and the ARKit software development kit provided by Apple Inc.
- Various other mechanisms can also be implemented by the glasses 308 to implement simultaneous localization and mapping (SLAM) functionality, e.g., to generate a three-dimensional representation of the cargo area of the vehicle, using images from the camera 232 a and the motion sensor, substantially in real time.
- the vehicle 116 can include markers or other visual anchors with predetermined coordinates in the vehicle-specific coordinate system, enabling the glasses 308 to register such tracked poses with the vehicle's coordinate system, and thereby register images captured by the glasses with the vehicle coordinate system.
- Example locationing mechanisms for mobile devices are set out in U.S. Patent Publication No. US 2021/0233256, the contents of which is incorporated herein by reference.
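- A minimal sketch of that registration, assuming a planar, yaw-only alignment between the device's own frame and the vehicle's coordinate system via a single marker with known vehicle coordinates (a real system would solve a full six-degree-of-freedom pose, e.g., via the SLAM toolkits mentioned above):

```python
import math

def register_to_vehicle(anchor_vehicle, anchor_device, device_yaw):
    """Return a function mapping device-frame (x, y) points to the vehicle frame.

    `anchor_vehicle`: known (x, y) of a visual marker in the vehicle's
    coordinate system; `anchor_device`: the same marker as observed in the
    device's SLAM frame; `device_yaw`: heading offset (radians) between the
    two frames. The planar, yaw-only model is an illustrative simplification.
    """
    cos_t, sin_t = math.cos(device_yaw), math.sin(device_yaw)

    def to_vehicle(p):
        # Rotate the device-frame offset from the anchor, then translate.
        dx, dy = p[0] - anchor_device[0], p[1] - anchor_device[1]
        return (anchor_vehicle[0] + cos_t * dx - sin_t * dy,
                anchor_vehicle[1] + sin_t * dx + cos_t * dy)

    return to_vehicle

# Marker sits at (0.5, 2.0) in the vehicle frame; the glasses saw it at (0, 0)
# in their own frame, with the two frames rotated 90 degrees apart.
to_vehicle = register_to_vehicle((0.5, 2.0), (0.0, 0.0), math.pi / 2)
print(to_vehicle((1.0, 0.0)))  # approximately (0.5, 3.0)
```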
- only a subset of the cameras 232 may be activated at block 405 .
- the ceiling-mounted cameras 232 may be omitted, or their activation may be omitted from block 405 .
- the computing device 128 may transmit a tracking request or other suitable instruction to a mobile computing device such as the glasses 308 .
- Tracking of the item 104 as initiated at block 405 can include processing the image(s) captured by the camera(s) 232 to detect features such as edges, colors, or the like, and to determine the location of the item 104 in each image based on such features.
- a ceiling-mounted camera 232 can be configured to capture a sequence of images depicting a shelf 304 , and to compare each image with the preceding image to identify edges, blocks of color, or the like, that are different from the preceding image.
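- In that spirit, the toy frame-differencing pass below compares two grayscale frames and reports the bounding box of changed pixels. Plain Python lists stand in for image buffers; a production system would use OpenCV or a similar vision library rather than this loop.

```python
def changed_region(prev, curr, threshold=30):
    """Bounding box of pixels that changed between two grayscale frames.

    Frames are 2D lists of 0-255 intensities. Returns (top, left, bottom,
    right) or None when nothing changed.
    """
    box = None
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(p - c) > threshold:
                if box is None:
                    box = [y, x, y, x]
                else:
                    box[0] = min(box[0], y); box[1] = min(box[1], x)
                    box[2] = max(box[2], y); box[3] = max(box[3], x)
    return tuple(box) if box else None

empty = [[10] * 6 for _ in range(4)]
with_item = [row[:] for row in empty]
for y in (1, 2):
    for x in (2, 3):
        with_item[y][x] = 200  # a box appears on the shelf
print(changed_region(empty, with_item))  # (1, 2, 2, 3)
```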
- the computing device 128 is configured to determine whether a tracked item 104 has been placed, e.g., on a shelf 304 .
- the determination at block 410 can include, for example, determining whether an item 104 was detected in the above-mentioned sequence of images, and has not moved in a threshold period of time and/or number of images in the sequence.
- a sequence of images 500 , 504 , and 508 is shown, e.g., captured by a ceiling-mounted camera 232 .
- in the image 500 (the earliest in the sequence), a shelf 304 is visible, and no items 104 are detected thereon.
- in the image 504, the item 104-1 is detected adjacent to the shelf 304 (the images 504 and 508 omit a hand of the operator 120 for simplicity), but the determination at block 410 is negative, as the detected item has not remained in a detected location for a threshold period of time (e.g., five image frames in the sequence captured by the camera 232, or any other suitable threshold).
- the computing device 128 therefore continues tracking the item 104 - 1 at block 405 .
- the image 508, captured after the image 504, also depicts the item 104-1, having been positioned on the shelf 304. If the item 104-1 remains in the illustrated position for a threshold period of time as noted above, the determination at block 410 is affirmative.
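- The "unmoved for a threshold number of frames" rule for block 410 can be captured with a small stateful helper. The five-frame threshold and the pixel tolerance below mirror the example in the text but are otherwise illustrative.

```python
def placement_detector(threshold_frames=5, tolerance_px=4):
    """Stateful check: has the tracked item stayed put for N consecutive frames?

    Call the returned function once per frame with the item's detected (x, y)
    centroid; it returns True once the centroid has moved less than
    tolerance_px for threshold_frames frames in a row.
    """
    state = {"anchor": None, "count": 0}

    def update(centroid):
        anchor = state["anchor"]
        if anchor is not None and \
           abs(centroid[0] - anchor[0]) <= tolerance_px and \
           abs(centroid[1] - anchor[1]) <= tolerance_px:
            state["count"] += 1
        else:
            # The item moved: restart the stationary count at this position.
            state["anchor"], state["count"] = centroid, 1
        return state["count"] >= threshold_frames

    return update

is_placed = placement_detector()
track = [(40, 90), (120, 95), (121, 96), (121, 96), (122, 96), (121, 97)]
print([is_placed(c) for c in track])  # True only on the final frame
```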
- the computing device 128 can optionally (as indicated by dashed lines defining block 415 ) validate the location of the item detected via the tracking process initiated at block 405 .
- the computing device 128 can monitor sensor data from the load sensor 228 , and determine whether the load sensor data indicates a matching location for the item 104 as determined from the image sequence.
- a first storage location 600 is indicated in an overhead view of the cargo area of the vehicle 116 , as determined from the images 500 , 504 , and 508 .
- a second location 604 is also illustrated, as determined from the load sensor 228 .
- the computing device 128 can be configured, at block 415 , to determine whether the locations 600 and 604 are separated by less than a threshold distance.
- the computing device 128 can also be configured to retrieve item 104 dimensions (e.g., height and width) and/or an item 104 weight from the repository 124 , and compare the retrieved data to dimensions and/or weight from the load sensor 228 .
- when the locations 600 and 604 are separated by more than the threshold distance, or the retrieved dimensions and/or weight do not match the load sensor data, the determination at block 415 is negative.
- following an affirmative determination at block 415 (or when the validation is omitted), the computing device 128 proceeds to block 420.
- the load sensor 228 can be employed instead of the cameras 232 to track item 104 locations and determine whether an item 104 has been placed at block 410 . For example, if the locations, dimensions, and/or weight of an item 104 reported by the load sensor 228 remains stable for a threshold time period, the determination at block 410 is affirmative.
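- As a sketch of the optional validation at block 415, the check below compares the camera-derived location against the load-sensor-derived one, and optionally the measured weight against the weight retrieved from the repository 124. The 0.15 m offset and 10% weight tolerance are assumed values; the publication specifies only that thresholds are used.

```python
import math

def validate_placement(camera_loc, sensor_loc, max_offset_m=0.15,
                       expected_weight_kg=None, measured_weight_kg=None,
                       weight_tolerance=0.10):
    """Cross-check a camera-derived location against the load sensor.

    Locations are (x, y) in the cargo coordinate system; weights in kg.
    Returns True when both sources agree within the assumed tolerances.
    """
    offset = math.hypot(camera_loc[0] - sensor_loc[0],
                        camera_loc[1] - sensor_loc[1])
    if offset > max_offset_m:
        return False  # the two sources disagree on where the item landed
    if expected_weight_kg is not None and measured_weight_kg is not None:
        if abs(measured_weight_kg - expected_weight_kg) \
                > weight_tolerance * expected_weight_kg:
            return False  # the shelf is carrying something else
    return True

# Camera saw the item at (1.20, 0.40); the load sensor reports (1.28, 0.44).
print(validate_placement((1.20, 0.40), (1.28, 0.44),
                         expected_weight_kg=8.0, measured_weight_kg=8.3))  # True
```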
- the computing device 128 is configured to record the storage location.
- the computing device 128 can, for example, store coordinates in a coordinate system 608 (as shown in FIG. 6 ) corresponding to the item 104 .
- the coordinates can be stored in the memory 204 , and/or can be uploaded to the central repository 124 for storage.
- FIG. 6 illustrates an example data record 612 containing the storage location.
- the storage location (e.g., expressed in coordinates in the coordinate system 608) is stored in association with the item identifier obtained at block 405, and can also be stored in association with an identifier of the vehicle 116.
- Various other data can be stored in association with the item identifier, including item dimensions, weight, and a delivery destination (e.g., a mailing address, global positioning system (GPS) coordinates, or the like).
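- A plain data structure suffices to illustrate a record like the record 612. The field names below are assumptions: the publication describes the record's contents (identifier, coordinates, vehicle, dimensions, weight, destination), not a schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class ItemRecord:
    """One storage record, mirroring the fields described for record 612."""
    item_id: str
    storage_xyz: tuple   # coordinates in the vehicle's coordinate system 608
    vehicle_id: str
    dimensions_m: tuple  # (width, length, height)
    weight_kg: float
    destination: str     # mailing address or GPS coordinates

record_612 = ItemRecord("104-1", (1.2, 0.4, 0.9), "VEH-116",
                        (0.3, 0.4, 0.25), 8.2, "112-1 Example St.")
print(asdict(record_612))  # ready to store locally or upload to the repository
```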
- the computing device 128 is configured to repeat the performances of blocks 405 to 420 for each item placed in the vehicle 116 .
- the computing device 128 can be configured to return from block 420 to block 405 until an instruction is received indicating that loading of the vehicle 116 is complete.
- the computing device 128 proceeds to block 425 .
- the computing device 128 is configured to determine whether a current place (e.g., a latitude/longitude, mailing address, or GPS data) of the vehicle 116 matches a delivery destination associated with any of the items 104 for which storage locations were recorded via block 420 .
- the current place of the vehicle 116 is also referred to as current transit information.
- the computing device 128 stores route data defining an order in which the items 104 are to be delivered.
- the computing device 128 can therefore, in some examples, determine a distance between the current transit information of the vehicle 116 , and the delivery destination associated with the next item 104 in the route data. When the distance is below a predetermined threshold, the computing device 128 can proceed to block 430 . Otherwise, the computing device 128 can continue monitoring the current transit information of the vehicle 116 .
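- The proximity test at block 425 can be illustrated with a great-circle distance against an assumed threshold; the 50 m value below is invented for the example, as the publication says only "a predetermined threshold".

```python
import math

def distance_m(a, b):
    """Great-circle (haversine) distance in metres between (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))

def arrived(vehicle_pos, next_destination, threshold_m=50.0):
    """Is the vehicle close enough to the next stop on the route?"""
    return distance_m(vehicle_pos, next_destination) < threshold_m

print(arrived((43.6500, -79.3800), (43.6503, -79.3801)))  # ~35 m away -> True
```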
- at block 430, the computing device 128 is configured to retrieve the storage locations previously recorded for any items 104 associated with the current delivery destination.
- the computing device 128 is configured to control the output assembly 220 to generate retrieval guidance to facilitate retrieval of the item(s) 104 from the vehicle 116 by the operator 120 for delivery.
- the retrieval guidance generated via the output assembly 220 can take various forms.
- the computing device 128 can retrieve the storage location 700 of the item 104 - 1 , and control the laser pointer 236 to emit a beam 704 of light towards the storage location 700 , e.g., to illuminate the item 104 - 1 .
- the computing device 128 may control the laser pointer 236 to generate a flashing beam rather than a solid beam, e.g., to indicate that the relevant item 104 is behind other items 104 .
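- For a gimbal-mounted emitter, steering the beam toward recorded coordinates reduces to computing pan and tilt angles from the emitter's position to the storage location. The two-axis gimbal model and angle conventions below are assumptions about the hardware, not disclosed details.

```python
import math

def aim_angles(emitter_xyz, target_xyz):
    """Pan/tilt angles (degrees) to steer a ceiling emitter at a storage location.

    Both points are in the cargo coordinate system, with z as height.
    """
    dx = target_xyz[0] - emitter_xyz[0]
    dy = target_xyz[1] - emitter_xyz[1]
    dz = target_xyz[2] - emitter_xyz[2]  # negative when the target is below
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

pan, tilt = aim_angles(emitter_xyz=(2.0, 0.8, 2.1), target_xyz=(1.2, 0.4, 0.9))
print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg")  # then drive the gimbal motors
```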
- the computing device 128 can be configured to control an electronic label 240 to indicate a portion thereof adjacent to the recorded storage location of the relevant item 104 .
- the computing device 128 can control a portion 712 of the label 240 adjacent to the item 104 - 1 to illuminate, flash, modify a graphic, or the like.
- Control of the portion 712 can include storing, e.g., in the memory 204 , a mapping between the coordinate system of the cargo area in the vehicle 116 and the electronic labels 240 .
- the mapping can include, for example, an indication of the coordinates of each of a plurality of segments of an electronic label 240 .
- the computing device 128 can then select the closest segment (or set of segments) to the recorded storage location, and transmit a command to the selected segment(s) of the e-label 240 to illuminate or generate other suitable output signals.
- the computing device 128 can transmit a command to illuminate or generate other perceptible output to the relevant controllable label.
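- One way to realize the coordinate-to-segment mapping described above is a lookup table keyed by (label, segment) with selection by distance, as sketched below. The mapping format and the 0.25 m adjacency radius are assumptions.

```python
def adjacent_segments(label_map, storage_xy, radius_m=0.25):
    """Select electronic-label segments near a recorded storage location.

    `label_map` maps (label_id, segment_index) to that segment's (x, y)
    position in the cargo coordinate system.
    """
    return [key for key, (sx, sy) in label_map.items()
            if ((sx - storage_xy[0]) ** 2
                + (sy - storage_xy[1]) ** 2) ** 0.5 <= radius_m]

# Eight segments along one shelf edge, spaced 0.25 m apart.
label_map = {("shelf-A", i): (0.25 * i, 0.0) for i in range(8)}
print(adjacent_segments(label_map, storage_xy=(1.2, 0.1)))
# [('shelf-A', 4), ('shelf-A', 5)] -> send these segments an illuminate command
```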
- the computing device 128 can generate retrieval guidance by transmitting guidance data to a mobile device worn by the operator 120 , such as a wrist-mounted computer and/or the glasses 308 .
- the retrieval guidance can include an indication 800 of the relevant item identifier (and may also include, in some examples, other information corresponding to the item 104 , such as a product name, a product image, a bin identifier, an indication of how many items 104 are to be retrieved for the current destination, and the like).
- the retrieval guidance data can also include an overlay 804 or other visual indicator, rendered on the display 248 at a location corresponding to the actual location of the item 104 - 1 .
- the computing device 128 can be configured to await receipt of a confirmation command, e.g., from the operator 120 , that a first item 104 has been retrieved before generating further retrieval guidance for the next item 104 for the same delivery destination.
- the retrieval guidance can include audible guidance, such as text-to-speech playback of a zone identifier.
- the computing device can maintain a set of zone definitions, and can determine which zone contains the storage location for an item 104 to be delivered.
- the computing device 128 can retrieve a name of that zone (e.g., “back-right, top shelf”), and generate audible output naming the relevant zone, via the speaker 244 .
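- The zone lookup can be as simple as a set of named bounding boxes in the cargo coordinate system, as in this sketch; the zone names and extents are invented for illustration, and the returned name would feed the text-to-speech playback described above.

```python
def zone_for(storage_xyz, zones):
    """Return the name of the first zone whose bounding box contains the point.

    `zones` maps names to ((xmin, ymin, zmin), (xmax, ymax, zmax)) boxes in
    the cargo coordinate system.
    """
    x, y, z = storage_xyz
    for name, (lo, hi) in zones.items():
        if lo[0] <= x <= hi[0] and lo[1] <= y <= hi[1] and lo[2] <= z <= hi[2]:
            return name
    return "unknown zone"

zones = {
    "front-left, top shelf": ((0.0, 0.0, 1.0), (2.0, 0.8, 1.8)),
    "back-right, top shelf": ((2.0, 0.8, 1.0), (4.0, 1.6, 1.8)),
}
name = zone_for((3.1, 1.2, 1.4), zones)
print(f"Item is in the {name}")  # feed this string to text-to-speech playback
```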
- the computing device 128 can continue to generate retrieval guidance until, for example, detecting that the item(s) 104 corresponding to the current delivery destination have been removed from the vehicle 116 .
- the computing device 128 can detect such removal, for example via the RFID reader 224 , and/or via updated sensor data from the load sensor 228 indicating removal of the relevant item(s) 104 from the shelf 304 .
- at block 440, the computing device 128 is configured to determine whether items 104 remain to be delivered. When the determination at block 440 is affirmative, the computing device 128 returns to block 425, awaiting arrival of the vehicle 116 at the next delivery destination. When the determination is negative, performance of the method 400 ends.
- an element preceded by "comprises . . . a", "has . . . a", "includes . . . a", or "contains . . . a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element.
- the terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein.
- the terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%.
- the term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically.
- a device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
- it will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein.
- alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic.
- an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein.
- Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.
Description
- This application claims priority from U.S. Provisional Application No. 63/234,146, filed Aug. 17, 2021, the contents of which is incorporated herein by reference.
- Transportation and delivery of items, e.g., the transportation and delivery of packages to specified destinations, typically involves loading the items into a container for transportation, such as a vehicle (e.g., a delivery van or the like). Items may then be retrieved from the container and deposited at respective delivery locations, such as residences or businesses. The process by which the items are loaded into the container, however, may impede rapid retrieval of items from the container for delivery.
- The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
- FIG. 1 is a diagram of a system for guided item delivery.
- FIG. 2 is a diagram illustrating certain components of the computing device of FIG. 1.
- FIG. 3 is a diagram illustrating an arrangement of sensors and output devices in the vehicle of FIG. 1.
- FIG. 4 is a flowchart of a method for guided item delivery.
- FIG. 5 is a diagram illustrating an example performance of blocks 405 and 410 of the method of FIG. 4.
- FIG. 6 is a diagram illustrating an example performance of block 420 of the method of FIG. 4.
- FIG. 7 is a diagram illustrating an example performance of block 435 of the method of FIG. 4.
- FIG. 8 is a diagram illustrating another example performance of block 435 of the method of FIG. 4.
- Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
- The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
- Examples disclosed herein are directed to a method, comprising: obtaining an item identifier corresponding to an item for placement in a container, the item being associated with a delivery destination; in response to placement of the item at a storage position within the container, recording a storage location including coordinates of the storage position; monitoring transit information of the container during transport of the container; when the transit information of the container corresponds to the delivery destination, retrieving the recorded storage location of the item within the container; and controlling an output assembly to generate item retrieval guidance based on the recorded storage location.
- Additional examples disclosed herein are directed to a computing device, comprising: a memory; and a processor configured to: obtain an item identifier corresponding to an item for placement in a container, the item being associated with a delivery destination; in response to placement of the item at a storage position within the container, record a storage location including coordinates of the storage position; monitor transit information of the container during transport of the container; when the transit information of the container corresponds to the delivery destination, retrieve the recorded storage location of the item within the container; and control an output assembly to generate item retrieval guidance based on the recorded storage location.
-
FIG. 1 illustrates asystem 100 for guided item delivery operations. A wide variety of items, such as packages and other freight, are transported from origin locations to destination locations, often via a variety of intermediate locations. In the illustrated example, items 104-1, 104-2, and 104-3 (collectively referred to as the items 104, and generically referred to as an item 104; similar nomenclature is also employed for other components in the discussion below) may be transported from afacility 108 such as a warehouse to destination locations, such as residences 112-1, 112-2, and 112-3. The number of items 104, and the number of destination locations 112, can vary, and need not be equal in other examples (e.g., more than one item 104 can be delivered to a single destination location 112). - The items 104 can be delivered from the
facility 108 to the residences 112 in a mobile container, which in the illustrated example is integrated with avehicle 116, such as a delivery van. Thevehicle 116 can be implemented as a van, a box truck, a tractor-trailer, or the like and can be controlled by an operator 120 (e.g., a human) or can be autonomous. In general, a plurality of items 104 are placed in the storage container of thevehicle 116, e.g., by one or more loading staff at thefacility 108. Thevehicle 116 then travels (e.g., under the control of anoperator 120 or autonomously) to the residences 112. At each residence, theoperator 120 or an autonomous apparatus such as a drone retrieves the relevant item(s) 104 destined for that residence 112, removes the relevant item(s) 104 from thevehicle 116, and delivers the relevant item(s) 104 to the relevant residence 112 before proceeding to the next residence 112. Associations between items 104 and residences 112 can be stored in acentral repository 124, which can also contain data defining a delivery route that specifies a sequence in which thevehicle 116 is to travel to the residences 112. Therepository 124 can also contain a variety of other data defining the items 104, such as sender identities and locations, item identifiers (e.g., uniquely distinguishing each item 104 from other items 104), item dimensions (e.g., one or more of width, length, and height), item weights, and the like. - The number of items 104 in the
vehicle 116 can impede the speed with which theoperator 120 or drone, or recipient (e.g., in implementations in which thevehicle 116 travels to destinations autonomously, and recipients retrieve items from the vehicle 116) can locate and retrieve the items 104 from thevehicle 116 for delivery at each residence 112. For example, early in a delivery run, thevehicle 116 may contain tens or hundreds of items 104. Locating specific items 104 among the total load of thevehicle 116 may be time-consuming, and certain items 104 may therefore consume suboptimal periods of time to be delivered. In some examples, delivery of an item 104 may be abandoned by theoperator 120 or drone, e.g., if locating the item 104 within thevehicle 116 consumes more than a threshold time period (e.g., thirty seconds, although a wide variety of other thresholds are also contemplated). - The
system 100 includes certain components and functionality to track the storage locations of items 104 within thevehicle 116, e.g., according to a coordinate system established within thevehicle 116. Tracking the storage locations of the items 104 within thevehicle 116 allows thesystem 100 to generate retrieval guidance, e.g., via visual and/or audible output(s) perceptible by theoperator 120 or drone. The retrieval guidance facilitates searching and retrieval of the items 104 from thevehicle 116 by theoperator 120 or drone, and may therefore reduce the time consumed by each retrieval and delivery operation (e.g., at a given residence 112). - As described in greater detail below, the
system 100 includes acomputing device 128 associated with thevehicle 116 and/or theoperator 120. Thecomputing device 128 includes, or is communicatively coupled with, sensors disposed within thevehicle 116 as well as an output assembly controllable to generate the retrieval guidance. As illustrated inFIG. 1 , thecomputing device 128 can also exchange data with thecentral repository 124, e.g., via anetwork 132 implemented as any suitable combination of local and wide-area networks. The sensors disposed within thevehicle 116 enable thecomputing device 128 to track each item 104 as each item 104 is placed within thevehicle 116, e.g., on a support structure such as a shelf. Thecomputing device 128 can therefore record the storage locations of each item 104. Thecomputing device 128 can further obtain storage locations and use the obtained storage locations to generate retrieval guidance upon determining that thevehicle 116 has arrived at a delivery destination (e.g., a residence 112). - Turning to
FIG. 2 , certain components of thecomputing device 128, sensors, and output assembly are shown. As illustrated inFIG. 2 , thecomputing device 128 includes aprocessor 200, such as a central processing unit (CPU), a graphics processing unit (GPU), or a combination thereof. Theprocessor 200 is communicatively coupled with a non-transitory computer-readable storage medium such as amemory 204, implemented as a suitable combination of volatile and non-volatile memory elements. Thememory 204 can store a plurality of computer-readable instructions, e.g., in the form of adelivery guidance application 208 executable by theprocessor 200 to perform functionality discussed in greater detail below. Theapplication 208, in other examples, can be implemented as a suite of distinct applications, or as a dedicated hardware element (e.g., an application-specific integrated circuit (ASIC)). - The
computing device 128 also includes acommunications interface 212 enabling communication between thedevice 128 and other computing devices (e.g., a server hosting the central repository 124), via suitable short-range links, networks such as thenetwork 132, and the like. Theinterface 212 therefore includes suitable hardware elements, executing suitable software and/or firmware, to communicate over thenetwork 132 and/or other communication links. - The
computing device 128 includes, or is otherwise communicatively coupled with, a trigger sensor, such as a radio frequency identification (RFID)reader 224, barcode scanner, or the like. TheRFID reader 224 can be disposed at a doorway to the storage container of thevehicle 116. TheRFID reader 224 can include, for example, a directional reader configured to detect RFID tags affixed to items 104, or affixed to bins or containers carrying the items 104, as the items 104 pass through the doorway, as well as to detect the direction in which the RFID tags are traveling (i.e., whether an RFID tag is entering or exiting the vehicle 116). - The
computing device 128 also includes, or is otherwise communicatively coupled with, asensor assembly 216, and anoutput assembly 220. Thesensor assembly 216 includes any one of, or any suitable combination of, sensors configured to track items 104 a within thevehicle 116. In this example, thesensor assembly 216 includes one ormore load sensors 228, e.g., an array of pressure sensors or optical sensors disposed on a support structure (e.g., a shelf) within thevehicle 116. Theload sensor 228 can generate data indicating the presence of an item 104 thereon, and optionally one or more of a weight of the item 104, and dimensions of the item 104 (e.g., a width and length of the item 104 resting on the load sensor 228). - The
sensor assembly 216 can further include one ormore cameras 232, such as color and/or depth cameras disposed within thevehicle 116 to observe at least a portion of the storage container of thevehicle 116. Via the camera(s) 232, thecomputing device 128 can obtain sequences of images in which the movement and placement of items 104 within thevehicle 116 can be tracked. Thesensor assembly 216 can further include one or moremobile cameras 232 a, e.g., implemented as components of a wearable computing device such as a pair of smart glasses or the like. In further examples, thesensor assembly 216 can include a microphone configured to capture voice or other audible signals describing a position of an item 104 within thevehicle 116. - The
output assembly 220 can include any one of, or any combination of, indicator lights such as one ormore laser pointers 236, e.g., controllable to direct a beam of light towards various positions within the vehicle to provide a visible indication of the location. The indicator light(s) can also include one or moreelectronic labels 240, e.g., disposed on an edge of a shelf within thevehicle 116. The label(s) 240 can include a controllable display, an array of addressable light emitting diodes (LEDs), a reflecting or fluorescing surface, or the like. In further examples, the label(s) 240 can be affixed to items 104, e.g., as distinct controllable electronic labels affixed to respective items 104 prior to loading into thevehicle 116, and connectable with thecomputing device 128 via short-range radio technologies (e.g., RFID, Bluetooth, or the like). Theoutput assembly 220 can further include one ormore speakers 244, e.g., mounted within thevehicle 116, controllable to generate output audible to theoperator 120. Further, theoutput assembly 220 can include one ormore displays 248, e.g., flat panel or other suitable displays, disposed within thevehicle 116 and controllable by theprocessor 200 to present various information to theoperator 120. - In some examples, certain components of the
sensor assembly 216 and/or theoutput assembly 220 can be integrated with a further computing device, distinct from and in communication with thecomputing device 128. For example, thecomputing device 128 can be deployed in thevehicle 116 as a tablet computer affixed to thevehicle 116. Theoperator 120, meanwhile, can carry a mobile computing device such as a wrist-mounted computer and/or a head-mounted device 308 (e.g., smart glasses, or the like). Thedevice 128 carried by theoperator 120 can include one or more sensors and/or output devices, such as a camera, a display, and a speaker. Thecomputing device 128 can therefore obtain sensor data directly from some sensors, and control some output devices directly, while obtaining sensor data from other sensors via thecomputing device 128 carried by theoperator 120. Thecomputing device 128 can also control some output devices directly, while controlling other output devices by sending instructions to thecomputing device 128 carried by theoperator 120. - Turning to
FIG. 3 , an example arrangement of certain elements of thesensor assembly 216 and theoutput assembly 220 within thevehicle 116 is illustrated. As shown inFIG. 3 , which illustrates a side view of the vehicle 116 (bottom) and an overhead view of the cargo area of the vehicle 116 (middle), the sensors 216 (e.g., camera 232) and output devices 220 (e.g., laser pointer 236) can be disposed at various positions within thevehicle 116. For example, the RFID sensor(s) 224 can be disposed at adoorway 300 into the cargo area. Invehicles 116 with more than one doorway to the cargo area, additional RFID sensors may be provided. - The
vehicle 116 can include at least one support structure such as a shelf 304 (two shelves 304 at approximately the same height are shown in FIG. 3), onto which items 104 can be placed. At least one of the shelves 304 (and in the illustrated example, both of the shelves 304) may carry a load sensor 228, enabling the locations of items 104 on the shelves 304 to be detected and reported to the computing device 128. The vehicle 116 can also, in some examples, include load sensors 228 on other support structures, such as a floor of the vehicle 116. - The
vehicle 116 can further include at least one camera 232, e.g., mounted to a ceiling of the cargo area, with a field of view (FOV) that encompasses at least a portion of the cargo area. In some examples, as illustrated, more than one camera 232 can be deployed to provide greater coverage of the cargo area. In some examples, the cameras 232 can be mounted to walls instead of, or in addition to, the ceiling. The cameras 232 may also have movable lens assemblies, to redirect their FOVs. The laser pointer 236 is also shown as being ceiling-mounted in the present example, and can also have a movable emitter, enabling the computing device 128 to control the direction in which the laser pointer 236 emits a beam of light. The electronic label 240 is disposed on an edge of a shelf 304 (each shelf edge in the vehicle can carry an electronic label 240, in some examples), such that the label 240 faces into the aisle of the cargo area, and is thus visible to the operator 120 when the operator 120 is in the cargo area. - As noted above, some sensors and/or output devices can be integrated with a mobile computing device associated with the
vehicle 116. For example, the operator 120 can carry a wearable computing device such as a wrist-mounted device (not shown) or a pair of smart glasses 308. The glasses 308 can include the display 248, e.g., to implement a heads-up display mechanism. The glasses 308 can also include the mobile camera 232a, and/or in some implementations can include a mobile laser pointer 236. Because the glasses 308 are mobile, the camera 232a is not at a fixed, predetermined position within the vehicle 116, as is the case with the cameras 232, the laser pointer 236, and the like. The glasses 308 can therefore also include a motion sensor such as an inertial measurement unit (IMU). Via the IMU and the camera 232a affixed to the glasses 308, the glasses 308 can track their own location within the cargo area of the vehicle 116, and thereby register images captured by the glasses 308 to a predetermined coordinate system of the cargo area. - Turning to
FIG. 4, a method 400 for guided item delivery operations is shown. The method 400 will be described in conjunction with its performance in the system 100 (e.g., by the computing device 128 in conjunction with the sensor assembly 216 and the output assembly 220). In general, via performance of the method 400, the computing device 128 determines and records a location of an item 104 within the vehicle 116 (e.g., according to a predetermined coordinate system representing the space within the cargo area of the vehicle 116), and later generates perceptible output indicating that location, to facilitate retrieval of the item by the operator 120, e.g., reducing time spent by the operator 120 searching for the item 104. - At
block 405, the computing device 128 is configured to obtain an identifier of an item 104 to be placed within the cargo area of the vehicle 116. A plurality of items 104 may be placed in a staging area of the facility 108, for example, in preparation for loading the items 104 into the vehicle 116. Each item 104 is previously assigned an identifier (e.g., an alphanumeric string) that uniquely distinguishes the item 104 from other items 104 in transit. - The
computing device 128 can be configured to obtain the item identifier via the RFID reader 224, e.g., when the item 104 crosses the doorway 300 into the vehicle 116. In some examples, the item 104 includes an RFID tag storing the item identifier (e.g., embedded in a label affixed to the item 104), and the RFID tag is configured to transmit the item identifier to the reader 224 upon interrogation by the reader 224. In other examples, the computing device 128 can obtain the item identifier from a barcode scanner, e.g., implemented by the glasses 308, incorporated into the computing device 128, or another imaging device associated with the vehicle 116 or the facility 108. - Upon obtaining the item identifier, the
computing device 128 is configured to initiate tracking of the item 104, via at least one sensor of the sensor assembly 216. For example, the computing device 128 can control the cameras 232 to begin capturing respective sequences of images depicting the interior of the vehicle 116. In some examples, the images from the cameras 232 may be combined, using stored calibration data defining the location of each camera 232 within the vehicle 116, to generate a composite image. The location of the camera 232a mounted on the glasses 308 is generally not stored in predetermined calibration data, as the glasses 308 are mobile relative to the vehicle 116. Images captured by the glasses 308 can instead be registered to a coordinate system corresponding to the glasses themselves, and the glasses 308 can include processing hardware configured to track the location of the glasses 308 (and therefore the camera 232a included thereon) within the vehicle 116, e.g., via use of images from the camera 232a and the IMU or other motion sensor. Various mechanisms will occur to those skilled in the art to combine image and/or motion sensor data to generate pose estimations. Examples of such mechanisms include those implemented by the ARCore software development kit provided by Google LLC, and the ARKit software development kit provided by Apple Inc. Various other mechanisms can also be implemented by the glasses 308 to implement simultaneous localization and mapping (SLAM) functionality, e.g., to generate a three-dimensional representation of the cargo area of the vehicle, using images from the camera 232a and the motion sensor, substantially in real time. The vehicle 116 can include markers or other visual anchors with predetermined coordinates in the vehicle-specific coordinate system, enabling the glasses 308 to register such tracked poses with the vehicle's coordinate system, and thereby register images captured by the glasses with the vehicle coordinate system. Example locationing mechanisms for mobile devices are set out in U.S. Patent Publication No. US 2021/0233256, the contents of which are incorporated herein by reference.
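- The anchor-based registration described above can be illustrated with a short sketch. The following is a minimal example, not taken from the disclosure (all names and values are illustrative assumptions), of mapping a wearable camera's pose into the vehicle coordinate system given a visual anchor whose pose in the vehicle frame was surveyed in advance, using 4x4 homogeneous transforms:

    import numpy as np

    # Pose of a visual anchor (e.g., a fiducial marker) expressed in the
    # vehicle coordinate system; surveyed in advance and stored.
    T_vehicle_anchor = np.array([
        [1.0, 0.0, 0.0, 0.5],   # anchor 0.5 m along the vehicle X axis
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 1.8],   # mounted 1.8 m above the cargo floor
        [0.0, 0.0, 0.0, 1.0],
    ])

    def register_device_pose(T_device_anchor: np.ndarray) -> np.ndarray:
        """Given the anchor pose observed in the device (glasses) frame,
        return the device pose expressed in the vehicle frame."""
        # T_vehicle_device = T_vehicle_anchor @ inverse(T_device_anchor)
        return T_vehicle_anchor @ np.linalg.inv(T_device_anchor)

    def to_vehicle_frame(T_vehicle_device: np.ndarray, p_device) -> np.ndarray:
        """Map a point detected in device coordinates into vehicle coordinates."""
        p = np.append(np.asarray(p_device, dtype=float), 1.0)  # homogeneous
        return (T_vehicle_device @ p)[:3]

Once the transform is known, any item detection in the glasses' images can be expressed in the same coordinates as detections from the fixed cameras 232.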
- In other examples, only a subset of the cameras 232 may be activated at block 405. For example, in some implementations the ceiling-mounted cameras 232 may be omitted, or their activation may be omitted from block 405. In such implementations, the computing device 128 may transmit a tracking request or other suitable instruction to a mobile computing device such as the glasses 308. - Tracking of the item 104 as initiated at
block 405 can include processing the image(s) captured by the camera(s) 232 to detect features such as edges, colors, or the like, and to determine the location of the item 104 in each image based on such features. For example, a ceiling-mounted camera 232 can be configured to capture a sequence of images depicting a shelf 304, and to compare each image with the preceding image to identify edges, blocks of color, or the like, that differ from the preceding image.
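- As a rough illustration of such frame-to-frame comparison (a sketch only, not the disclosed implementation; the preprocessing steps and thresholds are assumptions), OpenCV-style differencing can flag the regions that changed between consecutive images:

    import cv2

    def changed_regions(prev_frame, frame, min_area=500):
        """Return bounding boxes of regions that differ between two frames
        (OpenCV 4.x API assumed)."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(prev_gray, gray)
        # Blur and threshold to suppress sensor noise (values illustrative).
        diff = cv2.GaussianBlur(diff, (5, 5), 0)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(mask, None, iterations=2)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours
                if cv2.contourArea(c) >= min_area]

Each returned box is a candidate item location in that camera's image, to be mapped into the vehicle coordinate system via the camera's calibration data.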
- At block 410, the computing device 128 is configured to determine whether a tracked item 104 has been placed, e.g., on a shelf 304. The determination at block 410 can include, for example, determining whether an item 104 was detected in the above-mentioned sequence of images, and has not moved for a threshold period of time and/or a threshold number of images in the sequence. - For example, turning to
FIG. 5, a sequence of images 500, 504, and 508 captured by the camera 232 is illustrated. In the image 500 (the earliest in the sequence) a shelf 304 is visible, and no items 104 are detected thereon. In the image 504, the item 104-1 is detected adjacent to the shelf 304 (the images omit the operator 120 for simplicity), but the determination at block 410 is negative, as the detected item has not remained in a detected location for a threshold period of time (e.g., five image frames in the sequence captured by the camera 232, or any other suitable threshold). The computing device 128 therefore continues tracking the item 104-1 at block 405. The image 508, captured after the image 504, also depicts the item 104-1, having been positioned on the shelf 304. If the item 104-1 remains in the illustrated position for a threshold period of time as noted above, the determination at block 410 is affirmative.
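- A compact way to express the determination at block 410 is to count consecutive frames in which a tracked item's detected position stays within a small tolerance. The following is a minimal sketch (the five-frame threshold mirrors the example above; the distance tolerance is an assumption):

    import math

    class PlacementDetector:
        """Declare an item 'placed' once its detected position has stayed
        within tolerance_m for threshold_frames consecutive frames."""

        def __init__(self, threshold_frames: int = 5, tolerance_m: float = 0.05):
            self.threshold_frames = threshold_frames
            self.tolerance_m = tolerance_m
            self.last_position = None
            self.stationary_frames = 0

        def update(self, position) -> bool:
            """Feed one per-frame (x, y, z) detection; True means 'placed'."""
            if self.last_position is not None:
                moved = math.dist(position, self.last_position)
                if moved <= self.tolerance_m:
                    self.stationary_frames += 1
                else:
                    self.stationary_frames = 0  # item still being handled
            self.last_position = position
            return self.stationary_frames >= self.threshold_frames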
- At block 415, upon determining that an item 104 has been placed, the computing device 128 can optionally (as indicated by the dashed lines defining block 415) validate the location of the item detected via the tracking process initiated at block 405. For example, the computing device 128 can monitor sensor data from the load sensor 228, and determine whether the load sensor data indicates a matching location for the item 104 as determined from the image sequence. Turning to FIG. 6, a first storage location 600 is indicated in an overhead view of the cargo area of the vehicle 116, as determined from the images 500, 504, and 508. A second location 604 is also illustrated, as determined from the load sensor 228. The computing device 128 can be configured, at block 415, to determine whether the locations 600 and 604 match, e.g., are within a threshold distance of one another. The computing device 128 can also be configured to retrieve item 104 dimensions (e.g., height and width) and/or an item 104 weight from the repository 124, and compare the retrieved data to dimensions and/or weight from the load sensor 228. When the locations 600 and 604 do not match, the determination at block 415 is negative. When the determination at block 415 is affirmative, the computing device 128 proceeds to block 420.
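- The comparison at block 415 might look like the following sketch; the offset threshold and weight tolerance are assumptions for illustration, not values from the disclosure:

    import math

    def validate_placement(camera_loc, load_loc, expected_weight_kg,
                           measured_weight_kg, max_offset_m=0.15,
                           weight_tolerance=0.10) -> bool:
        """Return True when the camera-derived and load-sensor-derived
        storage locations agree, and the measured weight is within a
        relative tolerance of the weight retrieved from the repository."""
        offset = math.dist(camera_loc, load_loc)
        if offset > max_offset_m:
            return False
        if expected_weight_kg > 0:
            error = abs(measured_weight_kg - expected_weight_kg) / expected_weight_kg
            if error > weight_tolerance:
                return False
        return True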
- In some examples, the load sensor 228 can be employed instead of the cameras 232 to track item 104 locations and determine whether an item 104 has been placed at block 410. For example, if the location, dimensions, and/or weight of an item 104 reported by the load sensor 228 remain stable for a threshold time period, the determination at block 410 is affirmative. - At
block 420, having determined (and optionally validated) a storage location for the item 104 within the vehicle 116, the computing device 128 is configured to record the storage location. The computing device 128 can, for example, store coordinates in a coordinate system 608 (as shown in FIG. 6) corresponding to the item 104. The coordinates can be stored in the memory 204, and/or can be uploaded to the central repository 124 for storage. FIG. 6 illustrates an example data record 612 containing the storage location. The storage location (e.g., expressed in coordinates in the coordinate system 608) is stored in association with the item identifier obtained at block 405, and can also be stored in association with an identifier of the vehicle 116. Various other data can be stored in association with the item identifier, including item dimensions, weight, and a delivery destination (e.g., a mailing address, global positioning system (GPS) coordinates, or the like).
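- One plausible shape for such a record is sketched below. This is illustrative only; the field names, units, and example values are assumptions, not the layout of the data record 612 in the disclosure:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class StorageRecord:
        item_id: str                 # identifier obtained at block 405
        vehicle_id: str
        location: tuple              # (x, y, z) in coordinate system 608, metres
        dimensions_cm: tuple = None  # optional (height, width, depth)
        weight_kg: float = None
        destination: str = None      # mailing address or GPS coordinates

    record = StorageRecord(
        item_id="ITEM-0001",
        vehicle_id="VEH-42",
        location=(1.2, 0.4, 0.9),
        weight_kg=2.5,
        destination="123 Example St.",
    )
    # Serialized form suitable for local storage in memory 204 or upload
    # to the central repository 124.
    payload = json.dumps(asdict(record))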
- Returning to FIG. 4, the computing device 128 is configured to repeat the performances of blocks 405 to 420 for each item placed in the vehicle 116. For example, the computing device 128 can be configured to return from block 420 to block 405 until an instruction is received indicating that loading of the vehicle 116 is complete. - Following completion of the loading process, and recordal of the storage locations of each item 104 in the
vehicle 116, the computing device 128 proceeds to block 425. At block 425, the computing device 128 is configured to determine whether a current place (e.g., a latitude/longitude, mailing address, or GPS data) of the vehicle 116 matches a delivery destination associated with any of the items 104 for which storage locations were recorded via block 420. The current place of the vehicle 116 is also referred to as current transit information. In general, the computing device 128 stores route data defining an order in which the items 104 are to be delivered. The computing device 128 can therefore, in some examples, determine a distance between the current transit information of the vehicle 116 and the delivery destination associated with the next item 104 in the route data. When the distance is below a predetermined threshold, the computing device 128 can proceed to block 430. Otherwise, the computing device 128 can continue monitoring the current transit information of the vehicle 116.
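- A minimal sketch of the proximity check at block 425 follows; the 50 m threshold is an assumption for illustration, and any comparable distance computation could stand in for the haversine formula used here:

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def haversine_m(lat1, lon1, lat2, lon2) -> float:
        """Great-circle distance in metres between two lat/lon points."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def at_next_stop(current, destination, threshold_m=50.0) -> bool:
        """True when the current transit information (lat, lon) is within
        the predetermined threshold of the next delivery destination."""
        return haversine_m(*current, *destination) <= threshold_m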
- At block 430, following an affirmative determination at block 425, the computing device 128 is configured to retrieve the storage locations previously recorded for any items 104 associated with the current delivery destination. At block 435, the computing device 128 is configured to control the output assembly 220 to generate retrieval guidance to facilitate retrieval of the item(s) 104 from the vehicle 116 by the operator 120 for delivery. - The retrieval guidance generated via the
output assembly 220 can take various forms. For example, as shown in FIG. 7, the computing device 128 can retrieve the storage location 700 of the item 104-1, and control the laser pointer 236 to emit a beam 704 of light towards the storage location 700, e.g., to illuminate the item 104-1. In some examples, when another storage location indicates that the item 104-1 is behind other items 104, the computing device 128 may control the laser pointer 236 to generate a flashing beam rather than a solid beam, e.g., to indicate that the relevant item 104 is behind other items 104.
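- Steering the beam 704 reduces to computing pan and tilt angles from the pointer's mounting position to the recorded storage location. A sketch follows; the coordinate conventions and the example mounting position are assumptions, not taken from the disclosure:

    import math

    def aim_angles(pointer_pos, target_pos):
        """Pan/tilt (radians) from a ceiling-mounted pointer to a target,
        both expressed in the vehicle coordinate system (z up)."""
        dx = target_pos[0] - pointer_pos[0]
        dy = target_pos[1] - pointer_pos[1]
        dz = target_pos[2] - pointer_pos[2]   # negative: target below pointer
        pan = math.atan2(dy, dx)              # rotation about the vertical axis
        tilt = math.atan2(dz, math.hypot(dx, dy))
        return pan, tilt

    # E.g., a pointer at (1.0, 0.8, 2.1) aiming at a storage location.
    pan, tilt = aim_angles((1.0, 0.8, 2.1), (2.4, 0.2, 0.9))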
- In other examples, instead of or in addition to the beam 704, the computing device 128 can be configured to control an electronic label 240 to indicate a portion thereof adjacent to the recorded storage location of the relevant item 104. For example, as shown in the detail 708 of FIG. 7, the computing device 128 can control a portion 712 of the label 240 adjacent to the item 104-1 to illuminate, flash, modify a graphic, or the like. Control of the portion 712 can include storing, e.g., in the memory 204, a mapping between the coordinate system of the cargo area in the vehicle 116 and the electronic labels 240. The mapping can include, for example, an indication of the coordinates of each of a plurality of segments of an electronic label 240. The computing device 128 can then select the closest segment (or set of segments) to the recorded storage location, and transmit a command to the selected segment(s) of the e-label 240 to illuminate or generate other suitable output signals. In examples in which some or all items 104 carry respective controllable electronic labels, e.g., with LEDs or other output devices, the computing device 128 can transmit a command to illuminate or generate other perceptible output to the relevant controllable label.
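- The segment selection described above might be sketched as follows; the mapping structure, segment granularity, and coordinate values are illustrative assumptions:

    import math

    # Mapping from (label id, segment index) to the segment's coordinates
    # in the cargo-area coordinate system, e.g., loaded from memory 204.
    SEGMENT_COORDS = {
        ("label-front-left", 0): (0.0, 0.5, 1.2),
        ("label-front-left", 1): (0.2, 0.5, 1.2),
        ("label-front-left", 2): (0.4, 0.5, 1.2),
    }

    def nearest_segment(storage_location):
        """Return the (label id, segment index) closest to a storage location."""
        return min(SEGMENT_COORDS,
                   key=lambda seg: math.dist(SEGMENT_COORDS[seg], storage_location))

    segment = nearest_segment((0.35, 0.6, 1.1))
    # A command to illuminate or flash would then be sent to that segment.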
- In further examples, as shown in FIG. 8, the computing device 128 can generate retrieval guidance by transmitting guidance data to a mobile device worn by the operator 120, such as a wrist-mounted computer and/or the glasses 308. For example, the retrieval guidance can include an indication 800 of the relevant item identifier (and may also include, in some examples, other information corresponding to the item 104, such as a product name, a product image, a bin identifier, an indication of how many items 104 are to be retrieved for the current destination, and the like). The retrieval guidance data can also include an overlay 804 or other visual indicator, rendered on the display 248 at a location corresponding to the actual location of the item 104-1. - When a given delivery destination corresponds to more than one item 104, the
computing device 128 can be configured to await receipt of a confirmation command, e.g., from the operator 120, that a first item 104 has been retrieved before generating further retrieval guidance for the next item 104 for the same delivery destination. - In further examples, the retrieval guidance can include audible guidance, such as text-to-speech playback of a zone identifier. For example, the computing device can maintain a set of zone definitions, and can determine which zone contains the storage location for an item 104 to be delivered. The
computing device 128 can retrieve a name of that zone (e.g., “back-right, top shelf”), and generate audible output naming the relevant zone, via the speaker 244.
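- The zone lookup can be sketched as a search over axis-aligned boxes; the zone names and extents below are illustrative assumptions:

    # Each zone maps a name to ((x_min, y_min, z_min), (x_max, y_max, z_max))
    # in the cargo-area coordinate system.
    ZONES = {
        "back-right, top shelf":    ((2.0, 0.0, 1.2), (4.0, 1.0, 2.0)),
        "back-right, bottom shelf": ((2.0, 0.0, 0.0), (4.0, 1.0, 1.2)),
        "front-left, top shelf":    ((0.0, 1.0, 1.2), (2.0, 2.0, 2.0)),
    }

    def zone_for(location):
        """Return the name of the zone containing a storage location, if any."""
        for name, (lo, hi) in ZONES.items():
            if all(lo[i] <= location[i] <= hi[i] for i in range(3)):
                return name
        return None

    # The returned name would be passed to a text-to-speech engine for
    # playback via the speaker 244.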
- The computing device 128 can continue to generate retrieval guidance until, for example, detecting that the item(s) 104 corresponding to the current delivery destination have been removed from the vehicle 116. The computing device 128 can detect such removal, for example, via the RFID reader 224, and/or via updated sensor data from the load sensor 228 indicating removal of the relevant item(s) 104 from the shelf 304.
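- A sketch of this stop condition follows. The reader and output-assembly interfaces (read_tags, stop_guidance) are hypothetical names introduced for illustration, as are the polling interval and timeout:

    import time

    def guide_until_removed(item_ids, rfid_reader, output_assembly,
                            poll_s=0.5, timeout_s=300.0):
        """Keep retrieval guidance active until every expected item has
        been read leaving the doorway (or a timeout elapses)."""
        remaining = set(item_ids)
        deadline = time.monotonic() + timeout_s
        while remaining and time.monotonic() < deadline:
            for tag_id in rfid_reader.read_tags():  # tags seen at doorway 300
                remaining.discard(tag_id)
            time.sleep(poll_s)
        output_assembly.stop_guidance()
        return not remaining  # True when all items were detected as removed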
- Following the generation of retrieval guidance at block 435, at block 440 the computing device 128 is configured to determine whether items 104 remain to be delivered. When the determination at block 440 is affirmative, the computing device 128 returns to block 425, awaiting arrival of the vehicle 116 at the next delivery destination. When the determination is negative, performance of the method 400 ends. - In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present teachings.
- The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
- Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a”, “has . . . a”, “includes . . . a”, or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
- Certain expressions may be employed herein to list combinations of elements. Examples of such expressions include: “at least one of A, B, and C”; “one or more of A, B, and C”; “at least one of A, B, or C”; “one or more of A, B, or C”. Unless expressly indicated otherwise, the above expressions encompass any combination of A and/or B and/or C.
- It will be appreciated that some embodiments may be comprised of one or more specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
- Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
- The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/888,308 US20230055289A1 (en) | 2021-08-17 | 2022-08-15 | Systems and methods for guided item delivery operations |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163234146P | 2021-08-17 | 2021-08-17 | |
US17/888,308 US20230055289A1 (en) | 2021-08-17 | 2022-08-15 | Systems and methods for guided item delivery operations |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230055289A1 true US20230055289A1 (en) | 2023-02-23 |
Family
ID=84357798
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/888,308 Pending US20230055289A1 (en) | 2021-08-17 | 2022-08-15 | Systems and methods for guided item delivery operations |
US17/888,322 Active 2043-08-10 US12315185B2 (en) | 2021-08-17 | 2022-08-15 | Object identification using surface optical artifacts |
US19/196,694 Pending US20250259328A1 (en) | 2021-08-17 | 2025-05-01 | Object Identification Using Surface Optical Artifacts |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/888,322 Active 2043-08-10 US12315185B2 (en) | 2021-08-17 | 2022-08-15 | Object identification using surface optical artifacts |
US19/196,694 Pending US20250259328A1 (en) | 2021-08-17 | 2025-05-01 | Object Identification Using Surface Optical Artifacts |
Country Status (3)
Country | Link |
---|---|
US (3) | US20230055289A1 (en) |
BE (1) | BE1029651B1 (en) |
WO (1) | WO2023023000A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230252399A1 (en) * | 2021-07-25 | 2023-08-10 | Trackonomy Systems, Inc. | System and method for detection and tracking of assets in a vehicle |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080055084A1 (en) * | 2006-08-25 | 2008-03-06 | William Kress Bodin | Item position indicator and optimized item retrieval for a sensor equipped storage unit |
US20170330144A1 (en) * | 2016-05-11 | 2017-11-16 | Amazon Technologies, Inc. | Mobile pickup units |
US20210158241A1 (en) * | 2019-11-21 | 2021-05-27 | Intelligrated Headquarters, Llc | Methods and systems for task execution in a workplace |
US20230267406A1 (en) * | 2020-05-07 | 2023-08-24 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems and methods for a smart product handoff integrated platform |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040247823A1 (en) | 2000-12-08 | 2004-12-09 | Hansen Roger A. | Decal organization tool |
US7566004B2 (en) | 2004-10-29 | 2009-07-28 | Symbol Technologies Inc. | Method and apparatus for extending the range of a product authentication device |
US9208394B2 (en) * | 2005-09-05 | 2015-12-08 | Alpvision S.A. | Authentication of an article of manufacture using an image of the microstructure of it surface |
WO2009115611A2 (en) | 2008-03-20 | 2009-09-24 | Universite De Geneve | Secure item identification and authentication system and method based on unclonable features |
US8534543B1 (en) * | 2012-05-18 | 2013-09-17 | Sri International | System and method for authenticating a manufactured product with a mobile device |
US20140160337A1 (en) * | 2012-07-24 | 2014-06-12 | GVBB Holdings, S.A.R.L | Camera viewfinder comprising a projector |
FR3002057B1 (en) * | 2013-02-11 | 2016-09-30 | Novatec | METHOD FOR MAKING AN IDENTIFICATION AND AUTHENTICATION LABEL AND DEVICE THEREOF |
US9595038B1 (en) * | 2015-05-18 | 2017-03-14 | Amazon Technologies, Inc. | Inventory confirmation |
WO2020044373A1 (en) * | 2018-09-01 | 2020-03-05 | Ali Sharique | System and method for preventing counterfeiting of a product |
2022
- 2022-08-15 WO PCT/US2022/040372 patent/WO2023023000A1/en active Application Filing
- 2022-08-15 US US17/888,308 patent/US20230055289A1/en active Pending
- 2022-08-15 US US17/888,322 patent/US12315185B2/en active Active
- 2022-08-17 BE BE20225642A patent/BE1029651B1/en active IP Right Grant
2025
- 2025-05-01 US US19/196,694 patent/US20250259328A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2023023000A1 (en) | 2023-02-23 |
BE1029651A1 (en) | 2023-02-27 |
US12315185B2 (en) | 2025-05-27 |
US20250259328A1 (en) | 2025-08-14 |
US20230058995A1 (en) | 2023-02-23 |
BE1029651B1 (en) | 2023-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210383320A1 (en) | Object location in a delivery vehicle | |
US11841452B2 (en) | Identifying an asset sort location | |
US11858010B2 (en) | Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same | |
US11935169B2 (en) | Displaying items of interest in an augmented reality environment | |
US20220042814A1 (en) | Hands-free augmented reality system for picking and/or sorting assets | |
US11769107B1 (en) | Methods, apparatuses and computer program products for generating logistics zones | |
US11803803B2 (en) | Electronically connectable packaging systems configured for shipping items | |
KR102047048B1 (en) | Logistics delivery method based on IoT | |
CN105059327A (en) | Train number identification method and device | |
US20230055289A1 (en) | Systems and methods for guided item delivery operations | |
US20230356966A1 (en) | Systems and Methods for Optimized Container Loading Operations | |
US20220237557A1 (en) | Shipment delivery system that optimizes routes for parcel delivery | |
JP2005170579A (en) | Information management system | |
US20250307759A1 (en) | Dynamic Generation and Updating of Item Storage Mapping Data | |
CA3046378A1 (en) | Identifying an asset sort location | |
WO2023192020A1 (en) | Package identification using multiple rfid signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: ZEBRA TECHNOLOGIES CORPORATION, ILLINOIS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BIANCULLI, TOM D;O'HAGAN, JAMES J.;HUBBARD, STUART PETER;SIGNING DATES FROM 20220810 TO 20230809;REEL/FRAME:067287/0394 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION COUNTED, NOT YET MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |