US20090109295A1 - Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code - Google Patents
- Publication number
- US20090109295A1 (application Ser. No. 12/262,114)
- Authority
- US
- United States
- Prior art keywords
- container
- estimate
- sensed
- camera
- create
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G1/00—Storing articles, individually or in orderly arrangement, in warehouses or magazines
- B65G1/02—Storage devices
- B65G1/04—Storage devices mechanical
- B65G1/137—Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/0755—Position control; Position detectors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/12—Platforms; Forks; Other load supporting or gripping members
- B66F9/18—Load gripping or retaining means
- B66F9/186—Container lifting frames
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G2203/00—Indexing code relating to control or detection of the articles or the load carriers during conveying
- B65G2203/04—Detection means
- B65G2203/041—Camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G63/00—Transferring or trans-shipping at storage areas, railway yards or harbours or in opening mining cuts; Marshalling yard installations
- B65G63/002—Transferring or trans-shipping at storage areas, railway yards or harbours or in opening mining cuts; Marshalling yard installations for articles
- B65G63/004—Transferring or trans-shipping at storage areas, railway yards or harbours or in opening mining cuts; Marshalling yard installations for articles for containers
Definitions
- This invention relates to an apparatus on a container handler that operates at least one camera to create an image of a container for use in estimating the container's code.
- Optical characteristic systems have been in use for several years in container shipping and storage yards, but have had some problems. The cameras have tended to be rigidly mounted to the container handlers and unresponsive to the actual position of the containers with respect to the cameras, which leads to the cameras being operated far more often than would be needed if the container's position were known and used. Methods and apparatus are needed to address this issue and take advantage of the opportunity that solving these problems provides.
- At least one camera is configured to mount on a container handler and is operated so that the camera is active only when a container being handled is in range to create the container images. Further, the camera may be operated so that the container can actively be found for image capture. A position estimate of the container is created and the camera is controlled with at least one directive in response to the position estimate to create at least one container image used to create an estimate of the container code of the container.
- The apparatus embodying the invention may include a first module generating the position estimate received by a second module to create directives for controlling the cameras.
- The first module may at least partly provide the means for creating the position estimate and the second module may at least partly provide the means for controlling at least one camera with at least one directive in response to the position estimate.
- The first module may further receive an estimate of the container size of the container, further affecting the directives.
- The first module may communicate with a handler interface to receive at least one of the following from sensors on or in the container handler: a sensed container presence, a sensed stack height, a container size estimate, a twistlock sensed state, a spreader sensed state, a sensed landing state, and/or a sensed hoist height.
- The camera may be stationary or capable of directed movement.
- The camera may be operated by any combination of: initiating image capture, adjusting the focal length, altering the shutter speed, pivoting in one or two angular degrees of freedom, and/or positioning on a track. At least two cameras may preferably be operated with separate directives.
- The second module may use at least one camera and lighting module containing the camera and a light source, possibly with light enabling and/or flash controls.
- The optical characteristic system may or may not include the first module and/or the second module.
- The optical characteristic system may be configured to mount on the container handler, or be at a distance with a wireless transceiver employed to deliver the container images from the container handler to the optical characteristic system.
- FIG. 1 shows an example of the apparatus and method operating at least one camera by creating a position estimate of a container being handled by a container handler and controlling the camera with a directive in response to the position estimate to create a container image used by an optical characteristic system to create an estimate of the container's code for further use by a container management system.
- FIG. 2 shows some possible details of the position estimate of FIG. 1 .
- FIG. 3 shows some possible details of the directive for the camera of FIG. 1 .
- FIG. 4 shows a refinement of some aspects of FIG. 1 showing a second camera and a wireless transceiver for sending at least the container image to the optical characteristic system.
- FIG. 5 shows a handler interface communicating with sensors on or in the container handler to aid in creating the position estimate by the first module of FIGS. 1 and 4 .
- FIGS. 6A and 6B show examples of the container of FIG. 1 and its container code.
- FIG. 6C shows an example of a container code estimate of the container code of FIG. 6B .
- FIG. 7 shows an example of a stack of containers and a sensed stack height.
- FIGS. 8A to 8D show examples of the use of the directives for the camera.
- FIG. 9 shows an example of a camera and lighting module for use in or with the second module of FIGS. 1 and 4 .
- FIGS. 10A and 10B show the directive of the camera to position it on a track.
- FIG. 11 shows various combinations of the first module and second module, possibly included in the optical characteristic system of FIGS. 1 and 4 , possibly including at least one instance of at least one of a neural network, an inferential engine, a finite state machine and/or a computer instructed by a program system in a computer readable memory.
- FIG. 12 shows a flowchart of the program system of FIG. 11 including two program steps, that may themselves be distinct program systems residing in separate computer readable memories in some embodiments of the invention.
- FIG. 13 shows various combinations of the first module, second module and/or the optical characteristic system including the handler interface of FIG. 5 and/or including an interface to two instances of the camera and lighting module of FIG. 9 and/or including an enhanced container image.
- FIGS. 14 to 16 show flowcharts of some details of the first program system or program step of FIG. 12 , creating the position estimate.
- FIG. 17 shows a flowchart of some details of the second program system or program step of FIG. 12 , controlling the camera with a directive in response to the position estimate.
- FIG. 18 shows a refinement of the program system of FIG. 12 to include using the enhanced container image of FIG. 13 to create the container code estimate.
- This invention relates to an apparatus on a container handler that operates at least one camera to create an image of a container for use in estimating the container's code.
- The camera is operated so that it is active only when a container being handled is in focal range of the camera lens to create the container images. Further, the camera may be operated so that the container can actively be found by the camera for image capture.
- A position estimate of the container is created and the camera is controlled with at least one directive in response to the position estimate to create at least one container image used to create an estimate of the container code of the container.
- FIG. 1 shows the operation of at least one camera 40 configured to mount on a container handler 2 by creating a position estimate 20 of the position 14 of a container 10 being handled by the container handler and controlling the camera with at least one directive 50 in response to the position estimate to create at least one container image 42 .
- The container image is used to create a container code estimate 70 of the container code 12 of the container.
- The container code estimate may be generated by an optical characteristic system 60 and sent to a container management system 6 for a container facility, such as a terminal shipyard, a railway terminal, a container storage facility and/or a factory.
- The container position indicated by the position estimate 20 may be based upon a position reference 18 that may or may not coincide with the location of the camera.
- The apparatus embodying the invention may include a first module 100 generating the position estimate 20 used by a second module 200 to create the directive 50 used by the camera 40 .
- The first module may at least partly provide the means for creating the position estimate and the second module may at least partly provide the means for controlling at least one camera with at least one directive in response to the position estimate.
- The apparatus may further include at least one light source 4 .
- The container images 42 may sometimes be unreadable by the optical characteristic system 60 , whether or not it is mounted on the container handler 2 . These container images may be sent to a second optical characteristic system that may use a human operator to determine the container code estimate 70 for the container management system 6 .
- FIG. 2 shows that the position estimate 20 of FIG. 1 may include at least one of the following: a first angular estimate 22 , a second angular estimate 24 , a distance estimate 26 , a height estimate 28 , an X-axis estimate 30 , a Y-axis estimate 32 , a Z-axis estimate 34 , and/or at least one fixed location estimate 36 .
- FIG. 3 shows some details of the directive 50 used to control one or more of the cameras 40 of FIG. 1 , which may include at least one of the following: an image capture directive 51 , a focal length 52 , a shutter speed 53 , a track position 54 , a first angular directive 56 , and/or a second angular directive 58 .
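As a concrete illustration of how a directive might be derived from a position estimate, the sketch below mirrors the fields of FIGS. 2 and 3: it points a camera at the estimated container position and enables capture only when the container is in range. All names, units, and the range threshold are illustrative assumptions, not taken from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class PositionEstimate:
    """A subset of the FIG. 2 fields: container offset from the position
    reference (illustrative units: metres)."""
    x: float  # X-axis estimate
    y: float  # Y-axis estimate
    z: float  # Z-axis estimate (height)

@dataclass
class Directive:
    """A subset of the FIG. 3 fields."""
    capture: bool        # image capture directive
    focal_length: float  # focal length setting
    pan: float           # first angular directive, radians
    tilt: float          # second angular directive, radians

def directive_for(est: PositionEstimate, in_range: float = 15.0) -> Directive:
    """Aim the camera at the estimated position; capture only when the
    container being handled is within the (assumed) in-range distance."""
    distance = math.sqrt(est.x**2 + est.y**2 + est.z**2)
    return Directive(
        capture=distance <= in_range,
        focal_length=distance,                              # focus at the estimate
        pan=math.atan2(est.y, est.x),                       # first angular degree of freedom
        tilt=math.atan2(est.z, math.hypot(est.x, est.y)),   # second angular degree of freedom
    )
```

This keeps the camera dormant when no container is in range, matching the stated goal of operating the cameras far less often than a rigidly mounted, always-on arrangement.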
- The first module 100 may further receive an estimate of the container size 16 of the container 10 , as shown in FIG. 1 .
- The container size may be a member of (but is not limited to) the container size group consisting of ten feet, twenty feet, twenty-four feet, thirty-three feet, forty-five feet and fifty-three feet.
- The container handler 2 may be one or more of the following: a drayman truck, a UTR type truck, a bomb cart, a wheeled over-the-road chassis, a chassis rotator, a quay crane, a side picker, a top loader, a straddle carrier, a reach stacker and a Rubber Tire Gantry (RTG) crane.
- The invention includes specific embodiments suited to individual container handlers, which will be discussed later.
- A drayman truck may be used to haul containers on chassis over open roads, whereas a UTR type truck is restricted to operating in a container terminal such as a shipyard or rail yard.
- Some embodiments of the invention send the container image 42 to the optical characteristic system 60 to create the container code estimate 70 as shown in FIGS. 1 and 4 .
- The optical characteristic system may be configured to mount on the container handler 2 , or be at a distance with a wireless transceiver 90 employed to deliver 92 the container images from the container handler to the optical characteristic system.
- The optical characteristic system may include the first module 100 and/or the second module 200 as shown in FIGS. 11 and 13 .
- FIG. 5 shows the first module 100 may communicate with a handler interface 140 to receive at least one of the following from the container handler 2 : a sensed container presence 104 , a sensed stack height 108 , a container size estimate 112 , a twistlock sensed state 116 , a spreader sensed state 120 , a sensed landing state 124 , a sensed hoist height 128 , and/or a sensed container weight 132 .
- FIGS. 6A and 6B show two examples of containers 10 and their container codes 12 , the first written vertically and the second written horizontally.
- FIG. 6C shows a container code estimate 70 of a container code. Note that the container code estimate of FIG. 6C does not completely agree with the container code of FIG. 6B . Enhancing the container image 42 to create an enhanced container image 76 shown in FIG. 13 can reduce these discrepancies. This will be discussed with regard to FIGS. 13 , 17 and 18 hereafter.
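One way such discrepancies could be detected automatically, before resorting to image enhancement, is to verify a container code estimate against the check digit that standard container codes carry under ISO 6346. The patent does not describe this step, so the sketch below is purely illustrative of how an optical characteristic system might flag an implausible estimate.

```python
import string

# ISO 6346 letter values: A=10, counting upward but skipping multiples of 11
# (so B=12, ..., K=21, L=23, ..., U=32, V=34, ..., Z=38).
_VALUES = {}
_v = 10
for _ch in string.ascii_uppercase:
    while _v % 11 == 0:
        _v += 1
    _VALUES[_ch] = _v
    _v += 1
_VALUES.update({d: int(d) for d in string.digits})

def check_digit(owner_serial: str) -> int:
    """Check digit over the first 10 characters (owner code, category,
    serial): weighted sum with weights 2**position, modulo 11, modulo 10."""
    total = sum(_VALUES[c] * (2 ** i) for i, c in enumerate(owner_serial))
    return (total % 11) % 10

def is_plausible(code_estimate: str) -> bool:
    """True when an 11-character container code estimate passes the check digit."""
    return (len(code_estimate) == 11
            and check_digit(code_estimate[:10]) == int(code_estimate[10]))
```

An estimate that fails this test could be routed to image enhancement or to the second (human-assisted) optical characteristic system mentioned earlier.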
- FIG. 7 shows an example of a stack of containers 10 and the sensed stack height 108 .
- Containers may be stacked more than four containers high and, as shown in this Figure, may typically be stacked up to seven containers high.
- Containers range between eight and ten feet in height, usually between eight and a half and nine and a half feet.
- FIGS. 8A to 8D show some examples of the directives 50 used to operate the camera 40 .
- The camera may be operated by any combination of the following: using a fixed camera for forty-foot containers and a fixed camera for twenty-foot containers, pivoting the camera in a first angular degree of freedom 202 by a first angular directive 56 , pivoting the camera in a second angular degree of freedom 204 by a second angular directive 58 , and/or adjusting the focal length 206 to 208 of the camera.
- FIG. 9 shows a camera and lighting module 230 that may be included in the second module 200 .
- The camera and lighting module includes a camera 40 and a light source 4 .
- The camera may be operated based upon one or more of the following:
- The directive 50 may also enable a lighting directive 55 stimulating a lighting control 220 to trigger the light source either to strobe or to be steadily turned on.
- The light source may include flood lights, infra-red sources, arrays of Light Emitting Diodes (LEDs) and/or Xenon light sources.
- The camera 40 may be positioned on a track 230 in response to the track position 54 at a first track position 234 as shown in FIG. 10A .
- FIG. 10B shows the camera on the track at a second track position 236 .
- The track may include one rail as shown in FIG. 10B or more than one rail as shown in FIG. 10A .
- FIG. 11 shows an optical characteristic system 60 for mounting on a container handler as previously shown and including the first module 100 and/or the second module 200 .
- The optical characteristic system and/or the first module and/or the second module may include at least one instance of a neural network 70 and/or an inferential engine 72 and/or a finite state machine 74 and/or a computer 80 accessibly coupled 84 to a computer readable memory 82 and instructed by a program system 300 including program steps residing in the memory.
- A neural network 70 maintains a collection of neurons and a collection of synaptic connections between the neurons. Neural networks are stimulated at their neurons, leading through their synaptic connections to the firing of other neurons. Examples of neural networks include but are not limited to aromatic chemical compound detectors used to detect the presence of bombs and drugs.
- An inferential engine 72 maintains a collection of inference rules and a fact database and responds to queries and assertions by invoking rules and accessing the fact database.
- Inferential engines include fuzzy logic controllers and constraint-based decision engines used to determine paths through networks based upon the network constraints, such as the current location of parked and moving vehicles and available storage locations for containers.
- A finite state machine 74 receives at least one input, maintains and updates at least one state and generates at least one output based upon the value of at least one of the inputs and/or the value of at least one of the states.
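A minimal sketch of such a finite state machine, applied to the camera-enabling behavior described earlier (the camera active only when a container being handled is in range). The state names and inputs are illustrative assumptions rather than the patent's design.

```python
# Illustrative states of a camera-enabling state machine.
IDLE, TRACKING, CAPTURING = "IDLE", "TRACKING", "CAPTURING"

def next_state(state: str, container_present: bool, in_range: bool) -> str:
    """One update step: the machine maintains one state, receives two
    inputs, and its output (whether the camera runs) follows from the
    new state."""
    if not container_present:
        return IDLE          # no container being handled: camera dormant
    if in_range:
        return CAPTURING     # container in range: camera active
    return TRACKING          # container handled but out of range: wait

def camera_active(state: str) -> bool:
    """Output of the machine: run the camera only while capturing."""
    return state == CAPTURING
```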
- A computer 80 includes at least one data processor and at least one instruction processor instructed by the program system 300 , where each of the data processors is instructed by at least one of the instruction processors.
- The boxes denote steps or program steps of at least one of the invention's methods and may further denote at least one dominant response in a neural network 70 , and/or at least one state transition of the finite state machine 74 , and/or at least one inferential link in the inferential engine 72 , and/or a program step, or operation, or program thread, executing upon the computer 80 .
- Each of these steps may at least partly support the operation to be performed as part of a means for an operation or use.
- Other circuitry such as network interfaces, radio transmitters, radio receivers, specialized encoders and/or decoders, sensors, memory management and so on may also be involved in performing the operation further providing the means for the operation.
- Starting in a flowchart is denoted by a rounded box with the word “Start” in it and may refer to at least one of the following: entering a subroutine or a macro instruction sequence in the computer 80 , and/or directing a state transition of the finite state machine 74 , possibly pushing a return state, and/or entering a deeper node of the inferential engine 72 , and/or stimulating a list of neurons in the neural network 70 .
- Termination in a flowchart is denoted by a rounded box with the word “Exit” in it and may refer to completion of those operations, which may result in at least one of the following: a return to dormancy of the firing of the neurons in the neural network 70 , and/or traversal to a higher node in the inferential engine 72 of the fact database and/or the rules collection, and/or possibly a return to a previously pushed state in the finite state machine 74 , and/or a subroutine return in the computer 80 .
- FIG. 12 shows a flowchart of the program system 300 of FIG. 11 , including at least one of the following:
- FIG. 13 shows a refinement of various embodiments shown in FIG. 12 , where the computer 80 is first communicatively coupled 142 to the handler interface 140 , which may be preferred for the first module 100 whether or not included in the optical characteristic system 60 . Also, the computer is second communicatively coupled 234 , possibly through a camera interface 232 , to first and second camera and lighting modules, which may be preferred for the second module 200 whether or not included in the optical characteristic system. The computer is third communicatively coupled 94 to the wireless transceiver 90 , which may be preferred for the second module, again whether or not included in the optical characteristic system.
- At least one of the first 142 , second 234 and third 94 communicative couplings may include a wireline communications protocol, which may further include at least one of the following: a Synchronous Serial Interface protocol, an Ethernet protocol, a Serial Peripheral Interface protocol, an RS-232 protocol, an Inter-IC protocol (sometimes abbreviated as I2C), a Universal Serial Bus (USB) protocol, a Controller Area Network (CAN) protocol, a FireWire protocol, which may include implementations of a version of the IEEE 1394 protocol, an RS-485 protocol and/or an RS-422 protocol.
- The wireless transceiver 90 may include a radio frequency tag terminal and/or a radio frequency transmitter and receiver compliant with at least one wireless signaling convention that may implement at least one of a Time Division Multiple Access (TDMA) scheme, a Frequency Division Multiple Access (FDMA) scheme, and/or a spread spectrum scheme, such as:
- The first module 100 may use two position references 18 as shown in FIG. 1 , one near the first camera and lighting module 230 and the second near the second camera and lighting module, calculating two position estimates to readily generate components of their directives 50 , such as the first angular estimate 22 to be used as the first angular directive 56 , and so on.
- The computer readable memory 82 may further include various combinations of some or all of the following: the position estimate 20 , the container image 42 , the second container image 46 , a directive 50 preferably for the first camera and lighting module 230 , a second directive preferably for the second camera and lighting module, an enhanced container image 76 , and/or the container code estimate 70 .
- The first and second container images may be created by the first camera and lighting module, and may be used to create the enhanced container image.
- FIG. 14 shows some details of the first program system as the program step 150 of FIG. 12 , and may include at least one of the following program steps:
- FIG. 15 further shows some details of the first program system as the program step 150 , and may further include program step 166 to create the position estimate 20 based upon at least one of the following: the sensed container presence 104 , the sensed stack height 108 , the container size estimate 112 , the twistlock sensed state 116 , the spreader sensed state 120 , the sensed landing state 124 , the sensed hoist height 128 , and/or sensed container weight 132 .
- Various individual sensed states, and combinations of them, may be used, for instance, to determine a fixed location, such as landing on the bed of the bomb cart 84 .
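For illustration, here is one way the sensed states above might be fused into the height component of a position estimate. The function name, units, and the fixed bed height are assumptions for the sketch, not the patent's method.

```python
def height_estimate(sensed_hoist_height: float,
                    sensed_stack_height: float,
                    sensed_landing_state: bool,
                    bed_height: float = 1.5) -> float:
    """Height component of a position estimate, in metres (illustrative units).

    When the landing state is sensed, the container sits at a fixed location
    (here an assumed bomb-cart bed height); otherwise its height is taken as
    the sensed hoist height above the stack beneath it.
    """
    if sensed_landing_state:
        return bed_height                           # fixed location estimate
    return sensed_stack_height + sensed_hoist_height  # hoisted above the stack
```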
- FIG. 16 shows a further refinement of the first program step as the program step 150 , and may include at least one of the following program steps:
- FIG. 17 shows some details of the second program system as the program step 250 , and may include at least one of the following program steps:
- FIG. 18 shows some further details of the program system 300 of FIG. 12 , including the program step 302 that uses the enhanced container image 76 to create the container code estimate 70 .
- The handler interface 140 may vary for different container handlers 2 .
- The container handler may include a Programmable Logic Controller (PLC) interface, coupled via a wireline protocol used in creating the position estimate 20 , to get crane spreader interface status and position. It may further, possibly separately, couple sensors to a crane hoist and trolley drum for estimates of the spreader's vertical and horizontal position relative to the dock, and/or a sensor for determining the hoist and trolley position, for instance by using a tachometer signal from the trolley and hoist motors, proximity switches, optical encoders, or a laser beam.
- The handler interface may include a wireline network interface to at least one of the sensors of the container handler.
- A wireline network interface may implement an interface to at least one of the wireline communications protocols mentioned previously. Additional sensors of the RTG and Quay Crane may require sensing the hoist position (the vertical height) by coupling to the hoist drum with a tachometer sensor, proximity or optical sensors, and/or digital encoders.
- Other sensors may be accessible to the handler interface 140 through separate wireline network interfaces and/or wireline network couplings; alternatively, a single wireline network interface may serve all the accessed sensors.
- The handler interface 140 may further receive any or all of the following information that may be forwarded to the container management system 6 : the location of the container 10 , a sensed operator identity of the operator operating the container handler 2 , a container radio frequency tag, a container weight, a container damage estimate, an indication of the container handler moving in a reverse motion, a frequent stops count, a fuel level estimate, a compass reading, a collision state, a wind speed estimate, a vehicle speed, and an estimate of the state of a vehicle braking system.
- The location of the container may be in terms of a three-dimensional location and/or a stack or tier location.
- The handler interface 140 may include a second radio transceiver providing a radio frequency tag interface capable of locating the container handler 2 and/or identifying the container 10 and/or its container code 12 .
- The handler interface 140 may include a third radio transceiver using a Global Positioning System and/or a Differential Global Positioning System to determine the location of the container handler 2 .
- Two transceivers may be employed, one for transmitting the optical characteristics and container images, and the other for monitoring and controlling the system's power-up and power-down processes.
- The handler interface 140 may include an interface to a short-range and/or low-power sonar, radar, or laser that may provide a position estimate 20 of the container 10 .
- The radar preferably is harmless to humans, and possibly to livestock and other animals, in or near the containers.
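The distance such a sonar, radar, or laser provides typically comes from time-of-flight: the echo travels out and back, so the one-way distance is half the round trip at the wave's propagation speed. A brief sketch, with illustrative parameter names (the patent does not specify the ranging method):

```python
def range_from_echo(round_trip_s: float, wave_speed_m_s: float = 343.0) -> float:
    """Distance estimate, in metres, from an echo's round-trip time.

    The default speed is that of sound in air (sonar); for radar or laser
    ranging, pass the speed of light instead.
    """
    return wave_speed_m_s * round_trip_s / 2.0  # halve the out-and-back path
```

For example, a sonar echo returning after 20 ms implies a container surface about 3.43 m away.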
Description
- This patent application claims the benefit of Provisional Patent Application No. 60/983,888 filed Oct. 30, 2007, which is incorporated herein by reference.
- This invention relates to operating at least one camera to create an image of a container by an apparatus on a container handler for use in estimating the container's code.
- Optical characteristic systems have been in use for several years in container shipping and storage yards, but have had some problems. The cameras have tended to be rigidly mounted to the container handlers and unresponsive to the actual position of the containers with respect to the cameras, which leads to the cameras being operated far more often than if the container's position was known and used. Methods and apparatus are needed to address this issue and take advantage of the opportunity that solving these problems provides.
- At least one camera is configured to mount on a container handler, the camera is operated so that the camera is active only when a container being handled is in range to create the container images. Further, the camera may be operated so that the container can actively be found for image capture. A position estimate of the container is created and the camera controlled with at least one directive in response to the position estimate to create at least one container image used to create an estimate of the container code of the container.
- The apparatus embodying the invention may include a first module generating the position estimate received by a second module to create directives for controlling the cameras. The first module may at least partly provide the means for creating the position estimate and the second module may at least partly provide the means for controlling at least one camera with at least one directive in response to the position estimate. The first module may further receive an estimate of the container size of the container further affecting the directives.
- The first module may communicate with a handler interface to receive at least one of the following from sensors on or in the container handler: a sensed container presence, a sensed stack height, a container size estimate, a twistlock sensed state, a spreader sensed state, a sensed landing state, and/or a sensed hoist height.
- The camera may be stationary or capable of directed movement. The camera may be operated by any combination of: initiating image capture, adjusting the focal length, altering the shutter speed, pivoting in one or two angular degrees of freedom, and/or positioning on a track. At least two cameras may preferably be operated with separate directives.
- The second module may use at least one camera and lighting module containing the camera and a light source, possibly with light enabling and/or flash controls.
- The optical characteristic system may or may not include the first module and/or the second module. The optical characteristic system may be configured to mount on the container handler, or be at a distance with a wireless transceiver employed to deliver the container images from the container handler to the optical characteristic system.
-
FIG. 1 shows an example of the apparatus and method operating at least one camera by creating a position estimate of a container being handled by a container handler and controlling the camera with a directive in response to the position estimate to create a container image used by an optical characteristic system to create an estimate of the container's code for further use by a container management system. -
FIG. 2 shows some possible details of the position estimate of FIG. 1. -
FIG. 3 shows some possible details of the directive for the camera of FIG. 1. -
FIG. 4 shows a refinement of some aspects of FIG. 1, showing a second camera and a wireless transceiver for sending at least the container image to the optical characteristic system. -
FIG. 5 shows a handler interface communicating with sensors on or in the container handler to aid in creating the position estimate by the first module of FIGS. 1 and 4. -
FIGS. 6A and 6B show examples of the container of FIG. 1 and its container code. -
FIG. 6C shows an example of a container code estimate of FIG. 6B's container code. -
FIG. 7 shows an example of a stack of containers and a sensed stack height. -
FIGS. 8A to 8D show examples of the use of the directives for the camera. -
FIG. 9 shows an example of a camera and lighting module for use in or with the second module of FIGS. 1 and 4. -
FIGS. 10A and 10B show the directive of the camera to position it on a track. -
FIG. 11 shows various combinations of the first module and second module, possibly included in the optical characteristic system of FIGS. 1 and 4, possibly including at least one instance of at least one of a neural network, an inferential engine, a finite state machine and/or a computer instructed by a program system in a computer readable memory. -
FIG. 12 shows a flowchart of the program system of FIG. 11 including two program steps that may themselves be distinct program systems residing in separate computer readable memories in some embodiments of the invention. -
FIG. 13 shows various combinations of the first module, second module and/or the optical characteristic system including the handler interface of FIG. 5 and/or including an interface to two instances of the camera and lighting module of FIG. 9 and/or including an enhanced container image. -
FIGS. 14 to 16 show flowcharts of some details of the first program system or program step of FIG. 12, creating the position estimate. -
FIG. 17 shows a flowchart of some details of the second program system or program step of FIG. 12, controlling the camera with a directive in response to the position estimate. - And
FIG. 18 shows a refinement of the program system of FIG. 12 to include using the enhanced container image of FIG. 13 to create the container code estimate. - This invention relates to operating at least one camera to create an image of a container by an apparatus on a container handler for use in estimating the container's code. Rather than overusing at least one camera configured to mount on a container handler, the camera is operated so that it is active only when a container being handled is in focal range of the camera lens to create the container images. Further, the camera may be operated so that the container can actively be found by the camera for image capture. A position estimate of the container is created and the camera controlled with at least one directive in response to the position estimate to create at least one container image used to create an estimate of the container code of the container.
- Referring to the drawings more particularly by reference numbers,
FIG. 1 shows the operation of at least one camera 40 configured to mount on a container handler 2 by creating a position estimate 20 of the position 14 of a container 10 being handled by the container handler and controlling the camera with at least one directive 50 in response to the position estimate to create at least one container image 42. The container image is used to create a container code estimate 70 of the container code 12 of the container. The container code estimate may be generated by an optical characteristic system 60 and sent to a container management system 6 for a container facility, such as a terminal shipyard, a railway terminal, a container storage facility and/or a factory. The container position indicated by the position estimate 20 may be based upon a position reference 18 that may or may not coincide with the location of the camera. - The apparatus embodying the invention may include a first module 100 generating the position estimate 20 used by a second module 200 to create the directive 50 used by the camera 40. The first module may at least partly provide the means for creating the position estimate and the second module may at least partly provide the means for controlling at least one camera with at least one directive in response to the position estimate. The apparatus may further include at least one light source 4. - Note that in certain embodiments of the invention, the
container images 42 may sometimes be unreadable by the optical characteristic system 60, whether or not mounted on the container handler 2. These container images may be sent to a second optical characteristic system that may use a human operator to determine the container code estimate 70 for the Container Management System 6. -
FIG. 2 shows the position estimate 20 of FIG. 1 may include at least one of the following: a first angular estimate 22, a second angular estimate 24, a distance estimate 26, a height estimate 28, an X-axis estimate 30, a Y-axis estimate 32, a Z-axis estimate 34, and/or at least one fixed location estimate 36. -
FIG. 3 shows some details of the directive 50 used to control one or more of the cameras 40 of FIG. 1, and may include at least one of the following: an image capture directive 51, a focal length 52, a shutter speed 53, a track position 54, a first angular directive 56, and/or a second angular directive 58. - The
first module 100 may further receive an estimate of the container size 16 of the container 10, as shown in FIG. 1. By way of example, the container size may be a member of (but is not limited to) the container size group consisting of ten feet, twenty feet, twenty four feet, thirty three feet, forty five feet and fifty three feet. - The
container handler 2 may include one or more of the following: a drayman truck, a UTR type truck, a bomb cart, a wheeled over-the-road chassis, a chassis rotator, a quay crane, a side picker, a top loader, a straddle carrier, a reach stacker and a Rubber Tire Gantry (RTG) crane. The invention includes specific embodiments suited to individual container handler types, which will be discussed later. As used herein, a drayman truck may be used to haul containers on chassis over open roads, whereas a UTR type truck is restricted to operating in a container terminal such as a shipyard or rail yard. - Some embodiments of the invention send the
container image 42 to the optical characteristic system 60 to create the container code estimate 70 as shown in FIGS. 1 and 4. The optical characteristic system may be configured to mount on the container handler 2, or be at a distance with a wireless transceiver 90 employed to deliver 92 the container images from the container handler to the optical characteristic system. The optical characteristic system may include the first module 100 and/or the second module 200 as shown in FIGS. 11 and 13. -
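The fields enumerated for the position estimate 20 (FIG. 2) and the camera directive 50 (FIG. 3) can be pictured as two plain records: one holding what the first module estimates, one holding what the second module commands. A minimal sketch — the field names paraphrase the figure labels and are not drawn verbatim from the patent:

```python
# Hypothetical records mirroring the fields of FIGS. 2 and 3.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PositionEstimate:
    """Fields enumerated for the position estimate 20 (FIG. 2)."""
    first_angular: Optional[float] = None   # first angular estimate 22
    second_angular: Optional[float] = None  # second angular estimate 24
    distance: Optional[float] = None        # distance estimate 26
    height: Optional[float] = None          # height estimate 28
    x: Optional[float] = None               # X-axis estimate 30
    y: Optional[float] = None               # Y-axis estimate 32
    z: Optional[float] = None               # Z-axis estimate 34
    fixed_location: Optional[str] = None    # fixed location estimate 36


@dataclass
class Directive:
    """Fields enumerated for the camera directive 50 (FIG. 3)."""
    capture: bool = False                   # image capture directive 51
    focal_length: Optional[float] = None    # focal length 52
    shutter_speed: Optional[float] = None   # shutter speed 53
    track_position: Optional[float] = None  # track position 54
    first_angular: Optional[float] = None   # first angular directive 56
    second_angular: Optional[float] = None  # second angular directive 58
```

Keeping the two records separate mirrors the division of labor in the text: the first module fills in a `PositionEstimate`, and the second module translates it into a `Directive` per camera.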
FIG. 5 shows the first module 100 may communicate with a handler interface 140 to receive at least one of the following from the container handler 2: -
- a
presence sensor 102 may create a sensed container present 104, the sensed container present may be a form of "Yes" or "No", or may further at least partly delineate the container size 16, and/or the landed sensed state 124 or twistlock sensed state 116 to determine container presence. Note that the sensed container present may further delineate the presence of one or both of dual twenty foot containers in certain embodiments; - a stack height sensor 106 may create a sensed stack height 108, the sensed stack height is shown in FIG. 7; - a size sensor 110 may create a container size estimate 112 and/or a spreader sensor 118 may create a spreader sensed state 120, the container sensed size and/or the spreader sensed state may indicate the container size 16 of FIGS. 1 and 4; - a twistlock sensor 114 may create a twistlock sensed state 116, the twistlock sensed state may be a form of "Yes" or "No" indicating whether the twistlock is engaged with the container 10 or not; - a landing sensor 122 may create a sensed landing state 124, the landing state may be a form of "Yes" or "No"; - a hoist sensor 126 may create a sensed hoist height 128, the sensed hoist height is shown in FIG. 4; and/or - a weight sensor 130 may create a sensed container weight 132. Note that in some embodiments of the invention, the weight sensor may include a strain gauge and the sensed container weight may be measured in terms of a strain reading from the strain gauge.
-
-
FIGS. 6A and 6B show two examples of containers 10 and their container codes 12, the first written vertically and the second written horizontally. FIG. 6C shows a container code estimate 70 of a container code. Note that the container code estimate of FIG. 6C does not completely agree with the container code of FIG. 6B. Enhancing the container image 42 to create an enhanced container image 76 shown in FIG. 13 can reduce these discrepancies. This will be discussed with regard to FIGS. 13, 17 and 18 hereafter. -
FIG. 7 shows an example of a stack of containers 10 and the sensed stack height 108. In some environments, containers may be stacked higher than four containers and, as shown in this Figure, may typically be stacked up to seven containers high. Containers typically range between eight and ten feet in height, most commonly between eight and a half and nine and a half feet. -
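Because boxes in a stack fall within the narrow height range just described, a sensed hoist height can be converted into an approximate stack count — one way the sensed stack height 108 and sensed hoist height 128 can relate in practice. A sketch under the simplifying assumption of a uniform nominal box height (real stacks mix 8.5 ft and 9.5 ft boxes, so this is only an estimate):

```python
# Hypothetical conversion from a hoist height reading to a stack count,
# assuming every box in the stack has the same nominal height.
def stack_count_from_hoist_height(hoist_height_ft: float,
                                  nominal_box_height_ft: float = 9.5) -> int:
    """Approximate the number of containers under the hoist."""
    if hoist_height_ft <= 0:
        return 0
    return round(hoist_height_ft / nominal_box_height_ft)


# e.g. a hoist reading of 28.5 ft over 9.5 ft boxes suggests a 3-high stack
print(stack_count_from_hoist_height(28.5))  # 3
```

With mixed 8.5 ft and 9.5 ft boxes the quotient is no longer exact, which is why the description later treats the sensed stack height and the sensed hoist height as related but distinct quantities.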
FIGS. 8A to 8D show some examples of the directives 50 used to operate the camera 40. The camera may be operated by any combination of the following: a fixed camera for forty foot containers and a fixed camera for twenty foot containers, pivoting the camera in a first angular degree of freedom 202 by a first angular directive 56, pivoting the camera in a second angular degree of freedom 204 by a second angular directive 58, and adjusting the focal length 206 to 208 of the camera. -
FIG. 9 shows a camera and lighting module 230 that may be included in the second module 200. The camera and lighting module includes a camera 40 and a light source 4. The camera may be operated based upon one or more of the following: -
- a first
angular directive 56 stimulating a first pivot control 212 to pivot the camera in the first angular degree of freedom 202 as shown in FIGS. 8A to 8C; - a second angular directive 58 stimulating a second pivot control 214 to pivot the camera in the second angular degree of freedom 204; - a focal length 52 stimulating the focal length control 216 to alter the camera's focal length as shown in FIGS. 8A and 8D; - a shutter speed 53 stimulating a shutter speed control 218; and - an image capture directive 51 stimulating an image capture control 210.
-
- The
directives 50 may also enable a lighting directive 55 for stimulating a lighting control 220 to trigger the light source either to strobe or to be steadily turned on. The light source may include flood lights, infra-red sources, arrays of Light Emitting Diodes (LEDs) and/or Xenon light sources. - The
camera 40 may be positioned on a track 230 in response to the track position 54 at a first track position 234 as shown in FIG. 10A. FIG. 10B shows the camera on the track at a second track position 236. The track may include one rail as shown in FIG. 10B or more than one rail as shown in FIG. 10A. -
FIG. 11 shows an optical characteristic system 60 for mounting on a container handler as previously shown and including the first module 100 and/or the second module 200. The optical characteristic system and/or the first module and/or the second module may include at least one instance of a neural network 70 and/or an inferential engine 72 and/or a finite state machine 74 and/or a computer 80 accessibly coupled 84 to a computer readable memory 82 and instructed by a program system 300 including program steps residing in the memory. -
characteristic system 60, the first module 100, and the second module 200, the Figures and discussion will focus on implementations of the invention's embodiments and methods in terms of just one computer 80 and, unless otherwise useful, will refrain from going beyond summarizing salient details in the interest of clarifying this disclosure. However, this effort to clarify the invention is not meant to limit the scope of the claims. - As used herein, a
neural network 70 maintains a collection of neurons and a collection of synaptic connections between the neurons. Neural networks are stimulated at their neurons, leading through their synaptic connections to the firing of other neurons. Examples of neural networks include but are not limited to aromatic chemical compound detectors used to detect the presence of bombs and drugs. - As used herein, an
inferential engine 72 maintains a collection of inference rules and a fact database and responds to queries and assertions by invoking rules and accessing the fact database. Examples of inferential engines include fuzzy logic controllers and constraint-based decision engines used to determine paths through networks based upon the network constraints, such as the current location of parked and moving vehicles and available storage locations for containers. - As used herein, a
finite state machine 74 receives at least one input, maintains and updates at least one state, and generates at least one output based upon the value of at least one of the inputs and/or the value of at least one of the states. - As used herein, a
computer 80 includes at least one data processor and at least one instruction processor instructed by the program system 300, where each of the data processors is instructed by at least one of the instruction processors. - Some of the following figures show flowcharts of at least one embodiment of at least one of the methods of the invention, which may include arrows signifying a flow of control, and sometimes data, supporting various implementations.
- The boxes denote steps or program steps of at least one of the invention's methods and may further denote at least one dominant response in a
neural network 70, and/or at least one state transition of the finite state machine 74, and/or at least one inferential link in the inferential engine 72, and/or a program step, or operation, or program thread, executing upon the computer 80. - Each of these steps may at least partly support the operation to be performed as part of a means for an operation or use. Other circuitry such as network interfaces, radio transmitters, radio receivers, specialized encoders and/or decoders, sensors, memory management and so on may also be involved in performing the operation, further providing the means for the operation.
- The operation of starting in a flowchart is denoted by a rounded box with the word “Start” in it and may refer to at least one of the following: entering a subroutine or a macro instruction sequence in the
computer 80, and/or of directing a state transition of the finite state machine 74, possibly pushing a return state, and/or entering a deeper node of the inferential engine 72 and/or stimulating a list of neurons in the neural network 70. - The operation of termination in a flowchart is denoted by a rounded box with the word "Exit" in it and may refer to completion of those operations, which may result in at least one of the following: a return to dormancy of the firing of the neurons in the
neural network 70, and/or traversal to a higher node in the inferential engine 72 of the fact database and/or the rules collection, and/or possibly a return to a previously pushed state in the finite state machine 74, and/or a subroutine return in the computer 80. -
FIG. 12 shows a flowchart of the program system 300 of FIG. 11, including at least one of the following: -
-
program step 150 creates the position estimate 20 of the container 10, which may implement a first program system instructing a first computer in the first module 100; and -
program step 250 controls at least one camera 40 with the directive 50 in response to the position estimate, which may implement a second program system instructing a second computer in the second module 200.
-
-
FIG. 13 shows a refinement of various embodiments shown in FIG. 12, where the computer 80 is first communicatively coupled 142 to the handler interface 140, which may be preferred for the first module 100 whether or not included in the optical characteristic system 60. Also, the computer is second communicatively coupled 234, possibly through a camera interface 232, to first and second camera and lighting modules, which may be preferred for the second module 200 whether or not included in the optical characteristic system. The computer is third communicatively coupled 94 to the wireless transceiver 90, which may be preferred for the second module, again whether or not included in the optical characteristic system. - At least one of the first 142, second 234 and third 94 communicative couplings may include a wireline communications protocol, which may further include at least one of the following: a Synchronous Serial Interface protocol, an Ethernet protocol, a Serial Peripheral Interface protocol, an RS-232 protocol, an Inter-IC protocol (sometimes abbreviated as I2C), a Universal Serial Bus (USB) protocol, a Controller Area Network (CAN) protocol, a FireWire protocol, which may include implementations of a version of the IEEE 1394 protocol, an RS-485 protocol and/or an RS-422 protocol.
- The
wireless transceiver 90 may include a radio frequency tag terminal and/or a radio frequency transmitter and receiver compliant with at least one wireless signaling convention that may implement at least one of a Time Division Multiple Access (TDMA) scheme, a Frequency Division Multiple Access (FDMA) scheme, and/or a spread spectrum scheme, such as: -
- examples of the TDMA scheme may include the GSM access scheme;
- examples of the FDMA scheme may include the AMPS scheme;
- the spread spectrum scheme may use at least one of a Code Division Multiple Access (CDMA) scheme, a Frequency Hopping Multiple Access (FHMA) scheme, a Time Hopping Multiple Access (THMA) scheme and an Orthogonal Frequency Division Multiple Access (OFDM) scheme;
- examples of the CDMA scheme may include, but are not limited to, an IS-95 access scheme and/or a Wideband CDMA (W-CDMA) access scheme;
- examples of the OFDM scheme may include, but are not limited to, a version of the IEEE 802.11 access scheme; and
- another example of a spread spectrum scheme is the ANSI 371.1 scheme for radio frequency identification and/or location tags.
- In certain embodiments, the
first module 100 may use two position references 18 as shown in FIG. 1, one near the first camera and lighting module 230 and the second near the second camera and lighting module, calculating two position estimates to readily generate components of their directives 50, such as the first angular estimate 22 to be used as the first angular directive 56, and so on. - In various embodiments of the invention, the computer
readable memory 82 may further include various combinations of some or all of the following: the position estimate 20, the container image 42, the second container image 46, a directive 50, preferably for the first camera and lighting module 230, a second directive, preferably for the second camera and lighting module, an enhanced container image 76, and/or the container code estimate 70. The first and second container images may be created by the first and second camera and lighting modules, and may be used to create the enhanced container image. -
FIG. 14 shows some details of the first program system as the program step 150 of FIG. 12, and may include at least one of the following program steps: -
-
program step 152 senses the presence of the container 10 to create the sensed container presence 104, possibly through the handler interface 140 communicating with a presence sensor 102 on or in the container handler 2 as shown in FIG. 5; -
program step 154 senses the stack height of the container to create the sensed stack height 108, possibly through the handler interface communicating with the stack height sensor 106 or the sensed hoist height 128; -
program step 156 senses the size of the container to create the container size estimate 112, possibly through the handler interface communicating with the size sensor 110; -
program step 158 senses the twistlock state of the twistlock controlled by the container handler to create the twistlock sensed state 116, possibly through the handler interface communicating with the twistlock sensor 114. The twistlock state and its sensed state may preferably take values indicating "twistlock on" and "twistlock off"; -
program step 160 senses the spreader state of the spreader controlled by the container handler to create the spreader sensed state 120, possibly through the handler interface communicating with the spreader sensor 118. The spreader state and the spreader sensed state may indicate the container size 16 of FIGS. 1 and 4; -
program step 162 senses the landing state of the spreader on a container to create the sensed landing state 124, possibly through the handler interface communicating with the landing sensor 122. The landing state and sensed landing state may indicate "landed" and "not landed" in some form, possibly further indicating if a spreader is "landed" on top of a container such that the twistlocks may be activated; -
program step 164 senses the height of the hoist controlled by the container handler to create the sensed hoist height 128, possibly through the handler interface communicating with the hoist sensor 126; and/or -
program step 165 senses the weight of the container to create the sensed container weight 132. Note that in some embodiments of the invention, a strain gauge may be used and the sensed container weight may be measured in terms of a strain reading from the strain gauge. Note that in some embodiments the hoist height and the stack height may be considered essentially the same. As used herein, the stack height refers to the number of containers (typically an assortment of 8.5 foot and 9.5 foot boxes) in a stack, whereas the hoist height refers to the actual distance from the hoist to the ground. In many situations, the stack height may be determined from the hoist height 128.
-
-
FIG. 15 further shows some details of the first program system as the program step 150, and may further include program step 166 to create the position estimate 20 based upon at least one of the following: the sensed container presence 104, the sensed stack height 108, the container size estimate 112, the twistlock sensed state 116, the spreader sensed state 120, the sensed landing state 124, the sensed hoist height 128, and/or the sensed container weight 132. Various of these sensed states, individually and in combination, may be used, for instance, to determine a fixed location, such as landing on the bed of the bomb cart 84. -
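Program step 166's fusion of sensed states can be pictured as a small rule table: certain combinations of the sensed states pin the container to a known fixed location, such as the bed of the bomb cart. The rules below are purely illustrative — the patent does not enumerate them — and the location labels are invented for the sketch:

```python
# Hypothetical rule table mapping sensed states to a fixed location
# estimate. The specific rules and labels are illustrative only.
from typing import Optional


def fixed_location_estimate(container_present: bool,
                            landed: bool,
                            twistlock_on: bool) -> Optional[str]:
    """Return a fixed-location label, or None when the container is
    in motion and a hoist-height-based estimate should be used."""
    if not container_present:
        return None
    if landed and not twistlock_on:
        return "bomb_cart_bed"    # box set down and released by spreader
    if landed and twistlock_on:
        return "spreader_landed"  # spreader seated on top of the box
    return None                   # in motion: fall back to hoist height


print(fixed_location_estimate(True, True, False))  # bomb_cart_bed
```

The value of a fixed-location rule is that it can replace several continuous estimates at once: when the box is known to be on the bomb cart bed, the X/Y/Z estimates follow from the geometry of the handler rather than from live sensing.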
FIG. 16 shows a further refinement of the first program system as the program step 150, and may include at least one of the following program steps: -
-
program step 168 calculates the first angular estimate 22 based upon the X-axis estimate 30, the Y-axis estimate 32, and/or the Z-axis estimate 34; -
program step 170 calculates the second angular estimate 24 based upon the X-axis estimate, the Y-axis estimate, and/or the Z-axis estimate; -
program step 172 calculates the distance estimate 26 based upon the X-axis estimate, the Y-axis estimate, and/or the Z-axis estimate; -
program step 174 calculates the focal length 52 based upon the distance estimate; - program step 176 uses the fixed location estimate 36 to determine at least one of the X-axis estimate, the Y-axis estimate, the Z-axis estimate, the first angular estimate, the second angular estimate, the distance estimate, and/or the focal length;
-
-
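The calculations in steps 168 through 174 amount to converting Cartesian estimates into pan and tilt angles plus a range: a pan angle from X and Y, a tilt angle from Z and the ground range, a distance from all three, and a focal length derived from that distance. A sketch with assumed axis conventions and an assumed linear focal-length model (the patent fixes neither):

```python
# Hypothetical realization of steps 168-174: derive angular estimates
# and a focal length from X/Y/Z estimates. Axis conventions and the
# focal-length model are assumptions for illustration.
import math


def directives_from_xyz(x: float, y: float, z: float,
                        focal_gain: float = 2.0) -> dict:
    first_angular = math.degrees(math.atan2(y, x))              # step 168: pan
    ground_range = math.hypot(x, y)
    second_angular = math.degrees(math.atan2(z, ground_range))  # step 170: tilt
    distance = math.sqrt(x * x + y * y + z * z)                 # step 172: range
    focal_length = focal_gain * distance                        # step 174: zoom
    return {"first_angular": first_angular,
            "second_angular": second_angular,
            "distance": distance,
            "focal_length": focal_length}


d = directives_from_xyz(3.0, 4.0, 12.0)
print(round(d["distance"], 1))  # 13.0
```

Step 176 then short-circuits this arithmetic: a fixed location estimate can supply any of these quantities directly from known handler geometry instead of from the live X/Y/Z estimates.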
FIG. 17 shows some details of the second program system as the program step 250, and may include at least one of the following program steps: -
-
program step 252 initiates the image capture 51 of the container image 42, possibly by using the image capture control 210; this program step may be used with fixed-position cameras as well as cameras that may be positioned on a track or pivoted; - program step 254 adjusts the camera based upon the focal length 52 as shown in FIGS. 8A and 8D, possibly using the focal length control 216; -
program step 256 fixes the shutter speed 53, possibly by using a shutter speed control 218; -
program step 258 powers at least one light source 4 based upon a lighting enable 55, possibly by using a lighting control 220; -
program step 260 pivots the camera 40 in a first angular degree of freedom 202 by a first angular directive 56 as shown in FIGS. 8A to 8C. The step may be implemented using the first pivot control 212 as shown in FIG. 9; -
program step 262 pivots the camera in a second angular degree of freedom 204 by a second angular directive 58, possibly using the second pivot control 214; -
program step 264 moves the camera on the track 230 of FIGS. 10A and 10B to a track position 54, possibly by using a track position control 232; -
program step 266 uses at least two container images, for example 42 and 46 as shown in FIGS. 4 and 13, to create an enhanced container image 76. By way of example, the two images may be used to remove motion-induced blurring or noise in the enhanced image, or to increase contrast about the characters of the container code 12 as shown in FIGS. 6A and 6B, or to refine and/or infer the edges of the characters. - Note that in some embodiments of the invention, the container images 42 may be compressed, possibly by the container handler 2, the first module 100, the second module 200, the optical characteristic system 60 and/or the container management system 6. Any or all of these apparatus components may store the container images as is or in a compressed format.
-
-
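One simple way to realize program step 266's enhancement is to average two registered captures, which suppresses uncorrelated noise before a container code estimate is attempted. A sketch assuming the two grayscale images are already aligned (registration and the blur-removal variants mentioned above are outside its scope):

```python
# Hypothetical pixel-wise average of two aligned grayscale images,
# illustrating one possible enhancement under program step 266.
def enhance(image_a: list, image_b: list) -> list:
    """Average two equal-sized grayscale images pixel by pixel to
    reduce uncorrelated noise. Assumes prior registration."""
    return [[(pa + pb) // 2 for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(image_a, image_b)]


a = [[100, 110], [120, 130]]
b = [[102, 108], [118, 134]]
print(enhance(a, b))  # [[101, 109], [119, 132]]
```

Averaging n aligned exposures reduces zero-mean sensor noise roughly by a factor of the square root of n, which is why even two captures can noticeably sharpen the characters of a container code.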
FIG. 18 shows some further details of the program system 300 of FIG. 12, including the program step 302 that uses the enhanced container image 76 to create the container code estimate 70. - The
handler interface 140 may vary for different container handlers 2. For example, when the container handler is a quay crane or an RTG crane, the container handler may include a Programmable Logic Controller (PLC) interface coupled via a wireline protocol to support the position estimate 20 by getting crane spreader interface status and position, and may further, possibly separately, couple sensors to a crane hoist and trolley drum for estimates of the spreader's vertical and horizontal position relative to the dock, and/or a sensor for determining the hoist and trolley position, for instance by using a tachometer signal from the trolley and hoist motors, proximity switches, optical encoders, or a laser beam. Also, the handler interface may include a wireline network interface to at least one of the sensors of the container handler. Any of these interface approaches may provide a sensor reading of a hoist or trolley position. As used herein, a wireline network interface may implement an interface to at least one of the wireline communications protocols mentioned previously. Additional sensors of the RTG and quay crane may require sensing the hoist position (the vertical height) by coupling to the hoist drum with a tachometer sensor, proximity or optical sensors, and/or digital encoders. - Another example, when the
container handler 2 is a side picker, a top loader (also referred to as a top handler), a straddle carrier or a reach stacker, the handler interface 140 may include a wireline network interface to at least one of the sensors of the container handler. Other sensors may be accessible to the handler interface through separate wireline network interfaces and/or wireline network couplings. - A third example, when the
container handler 2 is a UTR type truck or a bomb cart, the handler interface 140 may include a wireline network interface to at least one, and possibly all, of the accessed sensors of the container handler. Alternatively, more than one wireline network interface and/or wireline network coupling may be used. - The
handler interface 140 may further receive any or all of the following information that may be forwarded to the container management system 6: the location of the container 10, a sensed operator identity of the operator operating the container handler 2, a container radio frequency tag, a container weight, a container damage estimate, an indication of the container handler moving in a reverse motion, a frequent stops count, a fuel level estimate, a compass reading, a collision state, a wind speed estimate, a vehicle speed, and an estimate of the state of a vehicle braking system. The location of the container may be in terms of a three dimensional location and/or a stack or tier location. - The
handler interface 140 may include a second radio transceiver providing a radio frequency tag interface capable of locating the container handler 2 and/or identifying the container 10 and/or its container code 12. - The
handler interface 140 may include a third radio transceiver using a Global Positioning System and/or a Differential Global Positioning System to determine the location of the container handler 2. In certain preferred embodiments, two transceivers may be employed, one for transmitting the optical characteristics and container images, and the other for monitoring and controlling the system's powering up and powering down processes. - The
handler interface 140 may include an interface to a short range and/or low power sonar, radar, or laser that may provide a position estimate 20 of the container 10. The radar may preferably be non-toxic for humans and possibly livestock and other animals in or near the containers. - The preceding embodiments provide examples of the invention, and are not meant to constrain the scope of the following claims.
Claims (22)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/262,114 US20090109295A1 (en) | 2007-10-30 | 2008-10-30 | Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US98388807P | 2007-10-30 | 2007-10-30 | |
| US12/262,114 US20090109295A1 (en) | 2007-10-30 | 2008-10-30 | Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20090109295A1 true US20090109295A1 (en) | 2009-04-30 |
Family
ID=40581554
Family Applications (5)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/262,114 Abandoned US20090109295A1 (en) | 2007-10-30 | 2008-10-30 | Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code |
| US12/262,130 Active 2029-11-17 US8146813B2 (en) | 2007-10-30 | 2008-10-30 | Methods and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars |
| US12/262,125 Active 2032-01-12 US8488884B2 (en) | 2007-10-30 | 2008-10-30 | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
| US13/422,442 Active US8720777B2 (en) | 2007-10-30 | 2012-03-16 | Method and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars |
| US13/943,667 Active US9530114B2 (en) | 2007-10-30 | 2013-07-16 | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
Family Applications After (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/262,130 Active 2029-11-17 US8146813B2 (en) | 2007-10-30 | 2008-10-30 | Methods and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars |
| US12/262,125 Active 2032-01-12 US8488884B2 (en) | 2007-10-30 | 2008-10-30 | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
| US13/422,442 Active US8720777B2 (en) | 2007-10-30 | 2012-03-16 | Method and apparatus processing container images and/or identifying codes for front end loaders or container handlers servicing rail cars |
| US13/943,667 Active US9530114B2 (en) | 2007-10-30 | 2013-07-16 | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
Country Status (3)
| Country | Link |
|---|---|
| US (5) | US20090109295A1 (en) |
| EP (3) | EP2215585A4 (en) |
| WO (3) | WO2009058382A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090110283A1 (en) * | 2007-10-30 | 2009-04-30 | Henry King | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
| CN108137230A (en) * | 2016-09-12 | 2018-06-08 | 爱鸥自动化系统有限公司 | Picking auxiliary device |
| US20190026915A1 (en) * | 2017-07-21 | 2019-01-24 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
Families Citing this family (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9177210B2 (en) * | 2007-10-30 | 2015-11-03 | Hki Systems And Service Llc | Processing container images and identifiers using optical character recognition and geolocation |
| WO2011116334A2 (en) * | 2010-03-18 | 2011-09-22 | Paceco | Object ocr and location tagging systems |
| KR101243882B1 (en) * | 2010-09-07 | 2013-03-20 | 동아대학교 산학협력단 | System for remote controlled RTGC using wireless network |
| EP2636620A1 (en) * | 2012-03-07 | 2013-09-11 | The Procter and Gamble Company | Apparatus for handling layers of goods. |
| US9147175B2 (en) | 2013-03-14 | 2015-09-29 | Mi-Jack Products, Inc. | Dynamic inventory tracking and transfer system |
| CN103612860B (en) * | 2013-11-23 | 2015-11-18 | 冶金自动化研究设计院 | Based on Wide and Thick Slab warehouse for finished product warehouse-in location and the position-recognizing system of machine vision |
| US9981812B2 (en) | 2014-06-05 | 2018-05-29 | Steve Foldesi | Automated handling of shipping containers and connectors |
| RU2705509C2 (en) | 2014-09-29 | 2019-11-07 | Авери Деннисон Корпорейшн | Radio frequency identification mark for bus tracking |
| FI20155171A7 (en) | 2015-03-13 | 2016-09-14 | Conexbird Oy | Container inspection arrangement, method, equipment and software |
| CN105739380A (en) * | 2016-01-29 | 2016-07-06 | 广东信源物流设备有限公司 | Automatic sorting matrix system based on intelligent robot |
| US11142442B2 (en) | 2017-02-10 | 2021-10-12 | Arrow Acquisition, Llc | System and method for dynamically controlling the stability of an industrial vehicle |
| AU2018315631B2 (en) | 2017-08-11 | 2023-11-09 | Bucher Municipal Pty Ltd | A refuse collection system |
| US10519631B2 (en) | 2017-09-22 | 2019-12-31 | Caterpillar Inc. | Work tool vision system |
| WO2019133477A1 (en) | 2017-12-29 | 2019-07-04 | Ooo Iss-Soft | Systems and methods for image stitching |
| US10340002B1 (en) | 2018-03-30 | 2019-07-02 | International Business Machines Corporation | In-cell differential read-out circuitry for reading signed weight values in resistive processing unit architecture |
| US11157810B2 (en) | 2018-04-16 | 2021-10-26 | International Business Machines Corporation | Resistive processing unit architecture with separate weight update and inference circuitry |
| CN110164533B (en) * | 2019-04-29 | 2021-11-09 | 成都莱孚科技有限责任公司 | Test paper tracing management system for pet disease detection |
| CN110659704A (en) * | 2019-09-29 | 2020-01-07 | 西安邮电大学 | Logistics express mail information identification system and method |
| CN111163574A (en) * | 2020-01-03 | 2020-05-15 | 浙江方大智控科技有限公司 | All-weather state detection control system of intelligent street lamp controller |
| CN111289144A (en) * | 2020-03-27 | 2020-06-16 | 杭州电力设备制造有限公司 | Bus fault monitoring system and method for high-voltage equipment |
| CA3115408A1 (en) | 2020-04-17 | 2021-10-17 | Oshkosh Corporation | Refuse vehicle with spatial awareness |
| WO2021258195A1 (en) * | 2020-06-22 | 2021-12-30 | Canscan Softwares And Technologies Inc. | Image-based system and method for shipping container management with edge computing |
| CN111951006A (en) * | 2020-08-10 | 2020-11-17 | 链博(成都)科技有限公司 | Alliance chain consensus method, system and terminal |
| CN112498270B (en) * | 2020-11-17 | 2022-08-16 | 广州小鹏汽车科技有限公司 | Vehicle identity recognition method, vehicle, intelligent terminal and storage medium |
Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5917602A (en) * | 1998-04-30 | 1999-06-29 | Inex Inc. | System and method for image acquisition for inspection of articles on a moving conveyor |
| US5926268A (en) * | 1996-06-04 | 1999-07-20 | Inex, Inc. | System and method for stress detection in a molded container |
| US20030191555A1 (en) * | 2002-04-09 | 2003-10-09 | Paceco Corp. | Method and apparatus for quay container crane-based automated optical container code recognition with positional identification |
| US20040015264A1 (en) * | 2002-05-31 | 2004-01-22 | Siemens Aktiengesellschaft, Munchen, Germany | Apparatus and method for verification of container numbers during unloading and loading of ships by container cranes in container terminals |
| US20040126015A1 (en) * | 2002-12-31 | 2004-07-01 | Hadell Per Anders | Container identification and tracking system |
| US20040215367A1 (en) * | 2000-08-04 | 2004-10-28 | King Henry S. | Method and apparatus supporting container identification for multiple quay cranes |
| US20050201592A1 (en) * | 2004-03-15 | 2005-09-15 | Peach Christopher S. | Method and apparatus for controlling cameras and performing optical character recognition of container code and chassis code |
| US20060153455A1 (en) * | 2001-08-02 | 2006-07-13 | Paceco Corp. | Method and apparatus of automated optical container code recognition with positional identification for a transfer container crane |
| US20060220851A1 (en) * | 2004-08-12 | 2006-10-05 | Wherenet Corp | System and method for tracking containers in grounded marine terminal operations |
| US20070040911A1 (en) * | 2005-01-31 | 2007-02-22 | Riley Larry E | Under vehicle inspection system |
| US20070050115A1 (en) * | 2005-08-24 | 2007-03-01 | Rockwell Automation Technologies, Inc. | Model-based control for crane control and underway replenishment |
| US20080002916A1 (en) * | 2006-06-29 | 2008-01-03 | Luc Vincent | Using extracted image text |
| US7953247B2 (en) * | 2007-05-21 | 2011-05-31 | Snap-On Incorporated | Method and apparatus for wheel alignment |
Family Cites Families (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6237053B1 (en) * | 1998-06-30 | 2001-05-22 | Symbol Technologies, Inc. | Configurable operating system having multiple data conversion applications for I/O connectivity |
| US6356802B1 (en) * | 2000-08-04 | 2002-03-12 | Paceco Corp. | Method and apparatus for locating cargo containers |
| JP2003232888A (en) * | 2001-12-07 | 2003-08-22 | Global Nuclear Fuel-Japan Co Ltd | Integrity confirmation inspection system and integrity confirmation method for transported object |
| US7508956B2 (en) * | 2003-06-04 | 2009-03-24 | Aps Technology Group, Inc. | Systems and methods for monitoring and tracking movement and location of shipping containers and vehicles using a vision based system |
| JP2005280940A (en) * | 2004-03-30 | 2005-10-13 | Mitsui Eng & Shipbuild Co Ltd | Container terminal management system |
| EP1748944A4 (en) * | 2004-05-14 | 2010-10-06 | Paceco Corp | Method and apparatus for making status reporting devices for container handlers |
| ATE337245T1 (en) * | 2004-07-06 | 2006-09-15 | Perpetuma | METHOD AND DEVICE FOR TRANSFER OF CARGO |
| US7813540B1 (en) * | 2005-01-13 | 2010-10-12 | Oro Grande Technologies Llc | System and method for detecting nuclear material in shipping containers |
| EP1861866A4 (en) * | 2005-02-25 | 2010-05-26 | Maersk Inc | System and process for improving container flow in a port facility |
| US7769221B1 (en) * | 2005-03-10 | 2010-08-03 | Amazon Technologies, Inc. | System and method for visual verification of item processing |
| JP4716322B2 (en) * | 2005-12-01 | 2011-07-06 | 株式会社日本環境プロジェクト | Container management system |
| JP4087874B2 (en) * | 2006-02-01 | 2008-05-21 | ファナック株式会社 | Work picking device |
| US7646336B2 (en) * | 2006-03-24 | 2010-01-12 | Containertrac, Inc. | Automated asset positioning for location and inventory tracking using multiple positioning techniques |
| US7755541B2 (en) * | 2007-02-13 | 2010-07-13 | Wherenet Corp. | System and method for tracking vehicles and containers |
| US20090109295A1 (en) * | 2007-10-30 | 2009-04-30 | Henry King | Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code |
| US8189919B2 (en) * | 2007-12-27 | 2012-05-29 | Chung Mong Lee | Method and system for container identification |
- 2008
- 2008-10-30 US US12/262,114 patent/US20090109295A1/en not_active Abandoned
- 2008-10-30 EP EP08844149A patent/EP2215585A4/en not_active Withdrawn
- 2008-10-30 EP EP08845888A patent/EP2215826A4/en not_active Withdrawn
- 2008-10-30 WO PCT/US2008/012402 patent/WO2009058382A1/en not_active Ceased
- 2008-10-30 US US12/262,130 patent/US8146813B2/en active Active
- 2008-10-30 US US12/262,125 patent/US8488884B2/en active Active
- 2008-10-30 EP EP08844783A patent/EP2215825A4/en not_active Withdrawn
- 2008-10-30 WO PCT/US2008/012383 patent/WO2009058371A1/en not_active Ceased
- 2008-10-30 WO PCT/US2008/012405 patent/WO2009058384A2/en not_active Ceased
- 2012
- 2012-03-16 US US13/422,442 patent/US8720777B2/en active Active
- 2013
- 2013-07-16 US US13/943,667 patent/US9530114B2/en active Active
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5926268A (en) * | 1996-06-04 | 1999-07-20 | Inex, Inc. | System and method for stress detection in a molded container |
| US5917602A (en) * | 1998-04-30 | 1999-06-29 | Inex Inc. | System and method for image acquisition for inspection of articles on a moving conveyor |
| US20040215367A1 (en) * | 2000-08-04 | 2004-10-28 | King Henry S. | Method and apparatus supporting container identification for multiple quay cranes |
| US20060153455A1 (en) * | 2001-08-02 | 2006-07-13 | Paceco Corp. | Method and apparatus of automated optical container code recognition with positional identification for a transfer container crane |
| US20030191555A1 (en) * | 2002-04-09 | 2003-10-09 | Paceco Corp. | Method and apparatus for quay container crane-based automated optical container code recognition with positional identification |
| US20040015264A1 (en) * | 2002-05-31 | 2004-01-22 | Siemens Aktiengesellschaft, Munchen, Germany | Apparatus and method for verification of container numbers during unloading and loading of ships by container cranes in container terminals |
| US20040126015A1 (en) * | 2002-12-31 | 2004-07-01 | Hadell Per Anders | Container identification and tracking system |
| US20050201592A1 (en) * | 2004-03-15 | 2005-09-15 | Peach Christopher S. | Method and apparatus for controlling cameras and performing optical character recognition of container code and chassis code |
| US7231065B2 (en) * | 2004-03-15 | 2007-06-12 | Embarcadero Systems Corporation | Method and apparatus for controlling cameras and performing optical character recognition of container code and chassis code |
| US20060220851A1 (en) * | 2004-08-12 | 2006-10-05 | Wherenet Corp | System and method for tracking containers in grounded marine terminal operations |
| US20070040911A1 (en) * | 2005-01-31 | 2007-02-22 | Riley Larry E | Under vehicle inspection system |
| US20070050115A1 (en) * | 2005-08-24 | 2007-03-01 | Rockwell Automation Technologies, Inc. | Model-based control for crane control and underway replenishment |
| US20080002916A1 (en) * | 2006-06-29 | 2008-01-03 | Luc Vincent | Using extracted image text |
| US7953247B2 (en) * | 2007-05-21 | 2011-05-31 | Snap-On Incorporated | Method and apparatus for wheel alignment |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090110283A1 (en) * | 2007-10-30 | 2009-04-30 | Henry King | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
| US8488884B2 (en) * | 2007-10-30 | 2013-07-16 | Paceco Corp. | Method and apparatus for operating, interfacing and/or managing for at least one optical characteristic system for container handlers in a container yard |
| CN108137230A (en) * | 2016-09-12 | 2018-06-08 | 爱鸥自动化系统有限公司 | Picking auxiliary device |
| US10504199B2 (en) | 2016-09-12 | 2019-12-10 | Aioi Systems Co., Ltd. | Picking assistant system |
| US20190026915A1 (en) * | 2017-07-21 | 2019-01-24 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
| WO2019014764A1 (en) | 2017-07-21 | 2019-01-24 | Blackberry Limited | MAPPING METHOD AND SYSTEM FOR FACILITATING DISTRIBUTION |
| US10546384B2 (en) * | 2017-07-21 | 2020-01-28 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
| EP3655903A4 (en) * | 2017-07-21 | 2021-05-12 | BlackBerry Limited | MAPPING PROCESS AND SYSTEM TO FACILITATE DISTRIBUTION |
| US11689700B2 (en) | 2017-07-21 | 2023-06-27 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
| US20230276029A1 (en) * | 2017-07-21 | 2023-08-31 | Blackberry Limited | Method and system for mapping to facilitate dispatching |
| US12267629B2 (en) * | 2017-07-21 | 2025-04-01 | Malikie Innovations Limited | Image sensor apparatus, method and non-transitory computer readable medium for capturing image data of shipping containers |
Also Published As
| Publication number | Publication date |
|---|---|
| US8488884B2 (en) | 2013-07-16 |
| US8146813B2 (en) | 2012-04-03 |
| EP2215826A1 (en) | 2010-08-11 |
| WO2009058384A3 (en) | 2009-07-02 |
| US9530114B2 (en) | 2016-12-27 |
| US8720777B2 (en) | 2014-05-13 |
| EP2215585A4 (en) | 2012-08-08 |
| US20090110283A1 (en) | 2009-04-30 |
| EP2215585A2 (en) | 2010-08-11 |
| US20140147045A1 (en) | 2014-05-29 |
| US20090108065A1 (en) | 2009-04-30 |
| US20130062408A1 (en) | 2013-03-14 |
| EP2215825A4 (en) | 2012-08-08 |
| EP2215826A4 (en) | 2012-08-08 |
| WO2009058382A1 (en) | 2009-05-07 |
| WO2009058384A2 (en) | 2009-05-07 |
| WO2009058371A1 (en) | 2009-05-07 |
| EP2215825A1 (en) | 2010-08-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20090109295A1 (en) | Method and apparatus for operating at least one camera from a position estimate of a container to estimate its container code | |
| AU2024204528B2 (en) | Systems and methods for vehicle position calibration using rack leg identification | |
| US12055946B2 (en) | Systems and methods for vehicle position calibration using rack leg identification and mast sway compensation | |
| US12326733B2 (en) | Systems and methods for out of aisle localization and vehicle position calibration using rack leg identification |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: PACECO CORP., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KING, HENRY;TAKEHARA, TORU;REEL/FRAME:021766/0498 Effective date: 20081030 |
|
| AS | Assignment |
Owner name: HKI SYSTEMS AND SERVICE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PACECO CORP.;REEL/FRAME:031339/0295 Effective date: 20131001 |
|
| AS | Assignment |
Owner name: HKI SYSTEMS AND SERVICE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PACECO CORP.;REEL/FRAME:032068/0128 Effective date: 20131001 Owner name: HKI SYSTEMS AND SERVICE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PACECO CORP.;REEL/FRAME:032068/0230 Effective date: 20131001 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |