US20170285644A1 - Forklift - Google Patents
- Publication number
- US20170285644A1 (U.S. application Ser. No. 15/471,107)
- Authority
- US
- United States
- Prior art keywords
- pallet
- forklift
- fork
- sensor
- openings
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/063—Automatically guided
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/0755—Position control; Position detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G05D2201/0216—
Definitions
- the forklift 10 is an unmanned forklift, and comprises a vehicle body 12 , a mast 20 , a fork 22 , a lift chain 24 , a sensor 26 , and a controller 30 (shown in FIG. 3 ).
- the vehicle body 12 comprises a front wheel 28 and a rear wheel 29 at each of both lateral surfaces.
- the front wheels 28 and the rear wheels 29 are rotatably supported to the vehicle body 12 .
- One of the rear wheels 29 has a drive wheel motor (not shown) connected thereto via a drive mechanism, and is configured to be driven by the drive wheel motor to rotate.
- the rear wheel 29 connected to the drive wheel motor is also connected to a steering device (not shown), and has an orientation of the rear wheel 29 adjusted by the steering device.
- the other of the rear wheels 29 is a caster wheel, and is rotated and steered following a movement of the vehicle body 12 .
- With the controller 30 controlling the drive wheel motor and the steering device, the vehicle body 12 is allowed to run on a road and change its running direction.
- the mast 20 is a post attached to a front surface of the vehicle body 12 , and its axis extends in an upward-and-downward direction.
- the fork 22 is attached to the mast 20 movably in the upward-and-downward direction.
- the fork 22 comprises a pair of tines 22 a and 22 b .
- the tines 22 a and 22 b are disposed at positions spaced apart from each other in a right-and-left direction of the vehicle body 12 , and extend forward of the vehicle body 12 from a mast 20 side.
- the fork 22 may be swingable relative to the mast 20 by a tilting mechanism (not shown).
- the lift chain 24 is provided at the mast 20 , and engages with the fork 22 .
- When the lift chain 24 is driven by a fork lifting and lowering device 40 (shown in FIG. 3 ), the fork 22 is accordingly lifted and lowered.
- a position of the fork 22 in the upward-and-downward direction can be identified by an amount by which the fork lifting and lowering device 40 drives the lift chain 24 .
- the sensor 26 is attached to the fork 22 , and is lifted and lowered in the upward-and-downward direction together with the fork 22 .
- a position to which the sensor 26 is attached is between the tine 22 a and the tine 22 b , and on a backward side (on a vehicle body 12 side) relative to a backrest surface of the fork 22 .
- the sensor 26 is a one-dimensional scanning-type sensor that scans laser light in one direction (a horizontal direction in the present embodiment).
- the sensor 26 radiates laser light, and measures a distance to a peripheral object based on reflected light of the radiated laser light.
- the sensor 26 radiates the laser light to a region 50 having a predetermined angular range and set forward of the forklift 10 (see FIG. 1 ).
- the distance data acquired by the sensor 26 is inputted into the controller 30 .
- the sensor 26 is also lifted and lowered, and hence as shown in FIG. 2 , a position, in a height direction, of the laser light radiated from the sensor 26 is changeable. Due to this, the distance data at an arbitrary height in a movable range of the fork 22 can be acquired by the sensor 26 .
- As the sensor 26 , a UTM-30LX made by HOKUYO AUTOMATIC CO., LTD., an LMS 100 made by SICK AG, or the like can be used, for example.
- a position of the sensor 26 in the upward-and-downward direction can be identified by a sensor position detecting unit 36 (shown in FIG. 3 ).
- the controller 30 is constituted of a microprocessor that comprises a CPU and the like.
- the controller 30 is mounted in the vehicle body 12 .
- the controller 30 is connected to the sensor 26 , the drive wheel motor that drives the one of the rear wheels 29 , the steering device that adjusts a steering angle of the rear wheel 29 connected to the drive wheel motor, the fork lifting and lowering device 40 that lifts and lowers the fork 22 , and the like, and controls their operations.
- the controller 30 controls the running direction and a running speed of the forklift 10 by driving the drive wheel motor and the steering device.
- the controller 30 drives the one of the rear wheels 29 by outputting a control command value to the drive wheel motor and the steering device.
- the running direction, the running speed, and a running path of the forklift 10 are controlled.
- the controller 30 causes the fork 22 to move in the upward-and-downward direction by driving the fork lifting and lowering device 40 .
- the running direction and the running speed of the forklift 10 can be controlled by conventionally known methods, and hence the detailed description thereof will be omitted.
- the controller 30 performs, by executing a program stored in a memory, processing of coordinate-converting the distance data acquired by the sensor 26 , processing of identifying a position and a direction of a pallet 100 based on a group of coordinate-converted observed points, and the like.
- the controller 30 functions as a coordinate conversion unit 32 and a computing unit 34 .
- By the controller 30 functioning as each of the coordinate conversion unit 32 and the computing unit 34 , the group of observed points acquired from a space forward of the forklift 10 is generated, and based on the generated group of observed points, the position and the direction of the pallet 100 are identified. Details of the coordinate conversion unit 32 and the computing unit 34 will be described below, along with the processing performed by the controller 30 .
- This processing is performed in a vicinity of the pallet 100 that is to be lifted, where the pallet 100 is observable by the sensor 26 .
- the controller 30 initially drives the one of the rear wheels 29 such that the pallet 100 is located forward of the vehicle body 12 , and brings the forklift 10 closer to the pallet 100 .
- the controller 30 moves the forklift 10 to an observation start position in the vicinity of the pallet 100 .
- a position where the load (pallet 100 ) is placed is predetermined.
- the controller 30 automatically moves the forklift 10 to the predetermined observation start position. It should be noted that, if the forklift 10 is driven by a driver, the forklift 10 may be moved to the observation start position by the driver, and then the following processing (processing of step S 14 and the subsequent steps shown in FIG. 4 ) may be started.
- the observation target region 60 refers to a region where there may be the pallet 100 .
- Dimensions of the pallet 100 and a base or the like on which the pallet 100 is placed are known, and hence the position in the height direction where the pallet 100 exists (in detail, the position in the height direction of a portion of the front surface of the pallet 100 where openings 110 are provided) can be preset.
- Specifically, the controller 30 adjusts the height at which the laser light is radiated by driving the fork lifting and lowering device 40 such that the laser light is radiated at the height where the pallet 100 exists within the observation target region 60 . It should be noted that a function of the controller 30 realized by this processing corresponds to a sensor movement control unit 38 shown in FIG. 3 .
- the controller 30 acquires distance data 42 by the sensor 26 (S 14 ).
- the sensor 26 radiates laser light while scanning the laser light in the horizontal direction, and detects reflected light of the radiated laser light.
- the distance data 42 in a radiating direction of the laser light can be acquired.
- the controller 30 coordinate-converts the acquired distance data 42 into a group of observed points 44 in a three-dimensional space (S 16 ).
- the distance data 42 can be coordinate-converted into the group of observed points 44 based on: the distance data 42 acquired in step S 14 ; a number of observation steps (number of measurement cycles (steps)) of the laser light of the sensor 26 and step intervals (scan angle in each observation step) of the laser light of the sensor 26 ; the position of the laser light in the height direction; and the like.
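The coordinate conversion of step S 16 can be sketched as follows. This is a minimal illustration, assuming a planar polar scan at the current fork height; the function name, argument names, and the simple sensor model are illustrative, not taken from the patent.

```python
import math

def to_observed_points(distances, start_angle, step_angle, sensor_height):
    """Convert one horizontal scan of range readings into 3-D points.

    distances     -- range reading (metres) for each observation step
    start_angle   -- angle of the first step (radians) in the sensor frame
    step_angle    -- angular interval between consecutive observation steps
    sensor_height -- current height of the sensor 26 (z of every point)
    """
    points = []
    for i, d in enumerate(distances):
        theta = start_angle + i * step_angle
        # x forward, y lateral in the sensor frame; z comes from the fork height
        points.append((d * math.cos(theta), d * math.sin(theta), sensor_height))
    return points
```

In practice a further rigid transform from the sensor frame to the forklift (or world) frame would follow, using the known mounting position of the sensor 26 between the tines.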
- the coordinate conversion can be performed using conventionally known methods, and hence the detailed description thereof will be omitted.
- the function of the controller 30 realized by the processing of step S 16 described above corresponds to the coordinate conversion unit 32 shown in FIG. 3 .
- the controller 30 extracts only observed points in the observation target region 60 from the acquired group of observed points 44 (S 18 ). Due to this, observed points outside the observation target region 60 are excluded, and hence in the following processing erroneous recognition of the pallet 100 can be prevented.
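The extraction of step S 18 amounts to a box filter over the observed points. A sketch under the assumption that the observation target region 60 is an axis-aligned box in the forklift frame (the actual bounds depend on where the pallet is expected):

```python
def filter_to_region(points, x_range, y_range, z_range):
    """Keep only observed points inside the observation target region 60.

    Each range is a (lo, hi) pair; points are (x, y, z) tuples.
    """
    (xlo, xhi), (ylo, yhi), (zlo, zhi) = x_range, y_range, z_range
    return [(x, y, z) for (x, y, z) in points
            if xlo <= x <= xhi and ylo <= y <= yhi and zlo <= z <= zhi]
```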
- the controller 30 extracts a line from the extracted group of observed points (S 20 ).
- a group of observed points made by the reflected light reflected from the front surface of the pallet 100 is located on a same plane. Due to this, in step S 20 , a line that is a result of scanning the front surface of the pallet 100 is extracted from the group of observed points extracted in step S 18 .
- robust estimation such as RANSAC (random sample consensus) can be used, for example.
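A minimal RANSAC line extraction for step S 20 might look as follows; the iteration count and inlier tolerance are illustrative parameters, not values from the patent.

```python
import random

def ransac_line(points, iterations=200, tol=0.01):
    """Extract the dominant line from 2-D observed points with RANSAC.

    Repeatedly fits a line through two randomly sampled points and keeps
    the model with the most inliers (points within `tol` of the line).
    Returns the inlier set and the line as (a, b, c) with a*x + b*y + c = 0.
    """
    best_inliers, best_line = [], None
    for _ in range(iterations):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # normal of the candidate line
        c = -(a * x1 + b * y1)
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / norm <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers, best_line = inliers, (a, b, c)
    return best_inliers, best_line
```

The inlier set plays the role of "the group of points constituting the line" in the steps that follow; points reflected from objects beside or behind the pallet are rejected as outliers.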
- the controller 30 extracts positions of both end points among the group of points constituting the line extracted in step S 20 (S 22 ).
- the positions of both the end points can be found from the positions of observed points among the group of points constituting the line, i.e., a point having a maximum value p_xmax in the x direction and a minimum value p_ymin in the y direction, and a point having a minimum value p_xmin in the x direction and a maximum value p_ymax in the y direction.
- the controller 30 performs clustering on the group of points constituting the extracted line (matching the line) by a Euclidean distance (S 24 ).
- the front surface of the pallet 100 has the two openings 110 (holes into which the tines 22 a and 22 b of the fork 22 are to be inserted) provided therein. Accordingly, there is a possibility that the line extracted from the front surface of the pallet 100 may be divided by the openings 110 in the front surface of the pallet 100 . Due to this, it is determined as to whether or not the group of points constituting the line extracted in step S 20 is divided by a distance that corresponds to a width of each of the openings 110 of the pallet 100 .
- the conventionally known method such as a k-means method or a Ward's method can be used, for example.
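Because the scan points are ordered along the line, the clustering of step S 24 can also be illustrated with a simpler gap-based split than k-means or Ward's method: start a new cluster wherever two consecutive points are separated by more than a threshold on the order of the opening width. This sketch assumes a non-empty list of points sorted along the line; the names and threshold are illustrative.

```python
def split_by_gaps(line_points, gap):
    """Split the points of the extracted line into clusters wherever two
    consecutive points are further apart (Euclidean distance) than `gap`."""
    clusters = [[line_points[0]]]
    for prev, cur in zip(line_points, line_points[1:]):
        dist = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        if dist > gap:
            clusters.append([])   # an opening 110 breaks the line here
        clusters[-1].append(cur)
    return clusters
```

At the height of the openings 110 this yields three clusters, which is exactly the condition tested in the following step S 26.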
- the controller 30 performs processing of step S 26 . Specifically, the controller 30 initially counts a number of lines obtained through the clustering (i.e., number of clusters), and determines whether or not the counted number of clusters is three (S 26 ). As mentioned above, at the positions (height) where the openings 110 are provided in the front surface of the pallet 100 , the line (group of observed points) that extends approximately in the horizontal direction due to the horizontal scan of the laser light is divided by the openings 110 into three sections.
- step S 26 by determining whether or not the number of clusters is three, it can be determined whether or not the extracted line corresponds to the front surface of the pallet 100 (in details, a portion of the front surface of the pallet 100 that corresponds to the positions (height) of the openings 110 ).
- the controller 30 proceeds to step S 28 .
- the controller 30 returns to step S 14 , and performs the processing of acquiring the distance data 42 again. In other words, the controller 30 performs the distance measurement by the sensor 26 (processing in S 14 and the subsequent steps) again.
- the controller 30 may be configured to determine, in a case where the determination of NO is repeated a predetermined number of times in the processing in step S 26 , that the pallet 100 does not exist in the observation target region 60 , and terminate the processing.
- the controller 30 selects two clusters that respectively include one and the other points of the both end points extracted in step S 22 out of the three clusters of points (S 28 ). In other words, the controller 30 selects two clusters located on an outer side, out of the three clusters.
- the controller 30 detects, from each of the selected two clusters, a position of a respective end point (i.e., an inner end point) that is not extracted in step S 22 (S 30 ).
- the positions of the inner end points are positions S 1 and S 2 of sidewalls of the openings 110 of the pallet 100 (shown in FIG. 7 ).
- the controller 30 detects each of the positions S 1 and S 2 of the sidewalls of the two openings 110 in the pallet 100 .
- the positions S 1 and S 2 of the inner end points of the two selected clusters are less likely to be influenced by the noise of the occlusion boundary, and detection accuracy therefor can be improved.
- the controller 30 identifies a center M of the front surface of the pallet 100 from the positions of the two inner points detected in step S 30 (S 32 ).
- the two openings 110 of the pallet 100 are provided at positions symmetric with respect to the center M of the front surface of the pallet 100 . Therefore, it is possible to identify the center M of the front surface of the pallet 100 by finding a midpoint between the position S 1 and the position S 2 of the sidewalls of the openings 110 in the pallet 100 .
- the controller 30 identifies the position and the direction of the pallet 100 from the center M of the front surface of the pallet 100 identified in step S 32 , and the line extracted in step S 20 (S 34 ). Specifically, as shown in FIG. 7 , by finding a normal vector N of the extracted line on an xy plane, the direction of the pallet 100 can be found. Thereby, the position and the direction of the pallet 100 (the pallet data 46 ) can be identified. It should be noted that a function of the controller 30 realized by the processing in steps S 18 to S 34 described above corresponds to the computing unit 34 shown in FIG. 3 .
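Steps S 28 to S 34 can be condensed into a small sketch: select the two outer clusters, take their inner end points as the sidewall positions S1 and S2, take the midpoint as the center M, and derive the normal N of the front surface. It assumes 2-D points on the xy plane and clusters whose points are sorted along the scan; all names are illustrative.

```python
def pallet_pose(clusters):
    """Return the center M and the unit normal N of the pallet front
    from the three clusters found at the height of the openings 110."""
    left, _, right = sorted(clusters, key=lambda c: c[0][0])
    s1 = left[-1]    # inner end point of the left cluster (sidewall position S1)
    s2 = right[0]    # inner end point of the right cluster (sidewall position S2)
    mx, my = (s1[0] + s2[0]) / 2.0, (s1[1] + s2[1]) / 2.0
    # direction along the front surface, and its normal on the xy plane
    dx, dy = s2[0] - s1[0], s2[1] - s1[1]
    norm = (dx * dx + dy * dy) ** 0.5
    normal = (-dy / norm, dx / norm)
    return (mx, my), normal
```

The symmetry of the two openings about the center is what makes the midpoint of S1 and S2 coincide with M, as described above.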
- the controller 30 identifies the positions S 1 and S 2 of the sidewalls of the two openings 110 in the front surface of the pallet 100 , based on the distance data 42 measured by the sensor 26 .
- the two openings 110 are provided at the positions symmetric with respect to the center M of the front surface of the pallet 100 , and hence the controller 30 can identify the center M of the front surface of the pallet 100 based on the positions S 1 and S 2 of the sidewalls of the two openings 110 .
- the positions of the two openings 110 can be detected based on the distance data 42 measured by the sensor 26 , irrespective of presence or absence of a space around the pallet 100 . Accordingly, the relative position and direction of the pallet 100 with respect to the forklift 10 can be detected accurately. In other words, the position and the direction of the pallet 100 can be identified, irrespective of what environment the pallet 100 is placed in.
- the controller 30 initially moves the forklift 10 to the observation start position while moving the sensor 26 by the fork lifting and lowering device 40 such that laser light is radiated to an upper limit of the observation target region 60 .
- the controller 30 acquires the distance data 42 by the sensor 26 while lowering the fork 22 .
- the controller 30 performs the processing of steps S 16 to S 24 of the above-mentioned embodiment.
- the controller 30 determines that the current height is not the height where the openings 110 of the pallet 100 exist, and acquires the distance data 42 by the sensor 26 while lowering the fork 22 again.
- the controller 30 determines that the current height is the height where the openings 110 of the pallet 100 exist.
- the fork lifting and lowering device 40 is an example of a movement mechanism in the claims.
- the computing unit 34 is an example of a processor in the claims.
- the controller 30 selects, in step S 28 , two clusters each of which includes a respective end point of the group of points constituting the line.
- Alternatively, the processing of step S 28 may be modified such that the controller 30 selects a central cluster out of the three clusters in step S 28 .
- In that case, the processing at the following step S 30 may be modified such that the controller 30 extracts two points that are both ends of the central cluster (an inner sidewall of each of the openings 110 ), finds a position of a midpoint between the two points, and thereby identifies the center M of the front surface of the pallet 100 .
- a three-dimensional image of a range irradiated with laser light may be generated based on the distance data 42 measured by the sensor 26 by scanning the laser light in the horizontal direction while moving the fork 22 in the upward-and-downward direction by driving the fork lifting and lowering device 40 .
- the controller 30 initially may scan laser light radiated from the sensor 26 two-dimensionally, namely in the horizontal direction and the height direction, by radiating the laser light to the region 50 that is set forward of the forklift 10 and has the predetermined angular range, while lifting and lowering the sensor 26 in the upward-and-downward direction.
- the controller 30 may acquire the distance data 42 of a three-dimensional space forward of the forklift 10 , from the reflected light of the laser light.
- the controller 30 may identify, from the three-dimensional image generated based on this distance data 42 , the positions S 1 and S 2 of the sidewalls of the openings 110 of the pallet 100 . Even with such a configuration, the positions S 1 and S 2 of the sidewalls can be identified, and hence the center of the front surface of the pallet 100 can be identified preferably.
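The modified, volume-scanning acquisition described above can be sketched as accumulating one horizontal scan per height into a single cloud. Here `measure_scan` is a hypothetical stand-in for moving the fork to a height and reading the sensor 26 there; it is assumed to return (x, y) points for that scan.

```python
def scan_volume(measure_scan, heights):
    """Accumulate one horizontal scan per fork height into a 3-D point cloud."""
    cloud = []
    for z in heights:
        cloud.extend((x, y, z) for (x, y) in measure_scan(z))
    return cloud
```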
Abstract
A forklift capable of carrying a load placed on a pallet having two openings into which a fork of the forklift is inserted is provided. The forklift includes a sensor configured to irradiate laser light toward a predetermined space forward of the fork, and measure a distance from the sensor to an object located in the predetermined space based on reflected light of the laser light reflected by the object; and a processor configured to identify positions of sidewalls of the two openings of the pallet that is to be lifted based on distance data measured by the sensor, the processor being further configured to identify a center of a front surface of the pallet based on the positions of the sidewalls of the two openings.
Description
- An art disclosed herein relates to a forklift capable of carrying a load placed on a pallet.
- There has been known an art of recognizing a pallet that is to be lifted by a sensor during a load-lifting operation by a forklift. In a forklift of Japanese Patent Application Publication No. 2013-230903, a relative position of a pallet with respect to the forklift is computed by allowing a two-dimensional laser rangefinder to measure distances to and angles with both ends of a front surface of the pallet in a width direction.
- To perform an accurate load-lifting operation by a forklift, the forklift needs to be moved accurately to a lifting position of a pallet that is to be lifted. To do so, a relative position of the pallet with respect to the forklift needs to be detected accurately. In the forklift of Japanese Patent Application Publication No. 2013-230903, however, for example in a case where there is no space around the pallet, namely, in a case where an object exists in contact with a lateral surface of the pallet, or in a case where pallets are placed side by side without clearances therebetween, it had been difficult to detect both ends of the pallet in the width direction. If the relative position of the pallet cannot be detected, the forklift cannot be accurately positioned to the lifting position of the pallet. The disclosure herein provides an art that allows a relative position of a pallet with respect to a forklift to be accurately detected irrespective of what environment the pallet is placed in.
- A forklift disclosed herein is capable of carrying a load placed on a pallet having two openings into which a fork of the forklift is inserted. The forklift comprises a sensor configured to irradiate laser light toward a predetermined space forward of the fork, and measure a distance from the sensor to an object located in the predetermined space based on reflected light of the laser light reflected by the object; and a processor configured to identify positions of sidewalls of the two openings of the pallet that is to be lifted based on distance data measured by the sensor, the processor being further configured to identify a center of a front surface of the pallet based on the positions of the sidewalls of the two openings.
- Generally, in a lateral surface of a pallet on which a load is to be placed, two openings for fork inserting are provided for a load-lifting operation by a forklift. In the above-described forklift, the processor identifies the positions of the sidewalls of the two openings in the front surface (a surface facing the forklift) of the pallet, based on the distance data measured by the sensor. The two openings are provided at positions having a known positional relationship with the center of the front surface of the pallet (e.g., at positions symmetric with respect to the center), and hence the processor identifies the center and a direction of the front surface of the pallet based on the positions of the sidewalls of the two openings. As such, the positions of the two openings can be detected based on the distance data measured by the sensor, irrespective of the environment around the pallet. Accordingly, the relative position and direction of the pallet can be detected accurately. In other words, according to the forklift described above, the relative position and direction of the pallet with respect to the forklift can be detected accurately, irrespective of what environment the pallet is placed in. It should be noted that a sidewall of an opening in the disclosure herein means a surface located at an end of the opening in a horizontal direction when the pallet is placed with a broader surface put horizontally to a floor surface (i.e., a surface extending approximately vertically relative to a floor surface).
-
FIG. 1 is a perspective view that schematically shows a configuration of a forklift in an embodiment; -
FIG. 2 is a diagram that schematically shows an example of a state in which laser light is scanned by the forklift in the embodiment; -
FIG. 3 is a block diagram that shows a control configuration of the forklift in the embodiment; -
FIG. 4 is a flowchart that shows a procedure of processing of identifying a position and a direction of a pallet by a controller of the forklift in the embodiment; -
FIG. 5 is a diagram that shows a state in which distance data is being acquired by a sensor of the forklift in the embodiment; -
FIG. 6 is a diagram for describing a method of extracting a group of points constituting a line from a group of observed points; and -
FIG. 7 is a diagram that shows the pallet together with coordinate axes set for the pallet.
- Some of the features of an embodiment described below will be listed. It should be noted that the respective technical features described below are independent of one another, and are useful solely or in combinations. The combinations thereof are not limited to those described in the claims as originally filed.
- (Feature 1) In a forklift disclosed herein, a processor may be further configured to extract only distance data of a front surface of a pallet from distance data measured by a sensor, and identify a center and a direction of the front surface of the pallet based on positions of sidewalls of two openings identified by the extracted distance data. According to such a configuration, erroneous recognition of the positions of the sidewalls of the openings can be suppressed.
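One way to realize such front-surface extraction is a robust line fit over the horizontal scan points, keeping only the points lying on the front-face line; the embodiment below mentions RANSAC for this step. A minimal RANSAC-style sketch (the iteration count and inlier tolerance are illustrative assumptions, not values from the disclosure):

```python
import random

def ransac_line(points, iters=200, tol=0.01):
    """Fit a 2D line to (x, y) points with RANSAC and return the inlier
    points, i.e., the points taken to lie on the pallet front surface."""
    best = []
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        # Line through the two samples: a*x + b*y + c = 0, normalized
        # so that |a*x + b*y + c| is the point-to-line distance.
        a, b = y2 - y1, x1 - x2
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue  # degenerate sample; try again
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) <= tol]
        if len(inliers) > len(best):
            best = inliers
    return best
```

Points far from the fitted line (e.g., background returns behind or beside the pallet) are discarded as outliers, which is what suppresses the erroneous recognition noted above.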
- (Feature 2) The forklift disclosed herein may further comprise a movement mechanism configured to move a fork in a first direction. Moreover, the sensor may be attached to the fork, and be configured to scan laser light in a second direction orthogonal to the first direction and move along with the fork by the movement mechanism. According to such a configuration, distance data in a three-dimensional space can be acquired.
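Under this configuration, each scan in the second direction, taken at the fork's current position in the first direction, contributes one slice of 3D points. A sketch of the per-scan conversion, assuming the sensor reports one range reading per angular step (all names are illustrative):

```python
import math

def scan_to_points(ranges, start_angle, step_angle, sensor_height):
    """Convert one horizontal laser scan into 3D observed points.

    ranges: distance readings (m), one per observation step;
    start_angle/step_angle: scan geometry in radians;
    sensor_height: current position of the fork/sensor (m).
    """
    points = []
    for i, r in enumerate(ranges):
        theta = start_angle + i * step_angle  # scan angle of this step
        x = r * math.cos(theta)               # forward of the sensor
        y = r * math.sin(theta)               # lateral
        points.append((x, y, sensor_height))
    return points
```

Stacking the slices produced at successive fork positions yields distance data covering a three-dimensional space.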
- (Feature 3) In the forklift disclosed herein, the processor may be further configured to perform: (1) measuring distance data by scanning laser light in the second direction with the fork set at a predetermined position of the first direction; (2) in a case where the positions of the sidewalls of the openings of the pallet cannot be identified based on the measured distance data, moving the fork a predetermined distance in the first direction to a position, and measuring distance data at the position by scanning laser light in the second direction; and (3) repeating the moving and measuring of (2) until the positions of the sidewalls of the openings of the pallet can be identified. According to such a configuration, in a case where the position of the pallet in the first direction is unknown, the positions of the sidewalls of the openings can be identified.
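This measure-and-move loop can be sketched as follows; `try_identify` is a hypothetical callback standing in for one scan-and-identify attempt (scan in the second direction at the given position and report whether the sidewall positions could be identified):

```python
def search_fork_position(try_identify, start, lower_limit, step):
    """Move the fork stepwise in the first direction until an attempt
    at identifying the opening sidewalls succeeds, as in Feature 3."""
    pos = start
    while pos >= lower_limit:
        if try_identify(pos):
            return pos            # sidewall positions identified here
        pos -= step               # move the fork and measure again
    return None                   # pallet not found in the searched range
```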
- With reference to the drawings, a
forklift 10 in an embodiment will hereinafter be described. As shown in FIG. 1, the forklift 10 is an unmanned forklift, and comprises a vehicle body 12, a mast 20, a fork 22, a lift chain 24, a sensor 26, and a controller 30 (shown in FIG. 3). - The
vehicle body 12 comprises a front wheel 28 and a rear wheel 29 at each of both lateral surfaces. The front wheels 28 and the rear wheels 29 are rotatably supported on the vehicle body 12. One of the rear wheels 29 has a drive wheel motor (not shown) connected thereto via a drive mechanism, and is configured to be driven by the drive wheel motor to rotate. Moreover, the rear wheel 29 connected to the drive wheel motor is also connected to a steering device (not shown), and has its orientation adjusted by the steering device. The other of the rear wheels 29 is a caster wheel, and is rotated and steered following a movement of the vehicle body 12. By the controller 30 controlling the drive wheel motor and the steering device, the vehicle body 12 is allowed to run on a road and change its running direction. - The
mast 20 is a post attached to a front surface of the vehicle body 12, and its axis extends in an upward-and-downward direction. - The
fork 22 is attached to the mast 20 movably in the upward-and-downward direction. The fork 22 comprises a pair of tines 22a and 22b. The tines 22a and 22b are disposed at positions spaced apart from each other in a right-and-left direction of the vehicle body 12, and extend forward of the vehicle body 12 from a mast 20 side. It should be noted that the fork 22 may be swingable relative to the mast 20 by a tilting mechanism (not shown). - The
lift chain 24 is provided at the mast 20, and engages with the fork 22. When the lift chain 24 is driven by a fork lifting and lowering device 40 (shown in FIG. 3), the fork 22 is accordingly lifted and lowered. A position of the fork 22 in the upward-and-downward direction can be identified by an amount by which the fork lifting and lowering device 40 drives the lift chain 24. - The
sensor 26 is attached to the fork 22, and is lifted and lowered in the upward-and-downward direction together with the fork 22. A position to which the sensor 26 is attached is between the tine 22a and the tine 22b, and on a backward side (on a vehicle body 12 side) relative to a backrest surface of the fork 22. The sensor 26 is a one-dimensional scanning-type sensor that scans laser light in one direction (a horizontal direction in the present embodiment). The sensor 26 radiates laser light, and measures a distance to a peripheral object based on reflected light of the radiated laser light. The sensor 26 radiates the laser light to a region 50 having a predetermined angular range and set forward of the forklift 10 (see FIG. 1). Distance data in the horizontal direction is thereby acquired. The distance data acquired by the sensor 26 is inputted into the controller 30. Moreover, as the fork 22 is lifted and lowered, the sensor 26 is also lifted and lowered, and hence, as shown in FIG. 2, a position, in a height direction, of the laser light radiated from the sensor 26 is changeable. Due to this, the distance data at an arbitrary height in a movable range of the fork 22 can be acquired by the sensor 26. As the sensor 26, a UTM-30LX made by HOKUYO AUTOMATIC CO., LTD., an LMS 100 made by SICK AG, or the like can be used, for example. It should be noted that a position of the sensor 26 in the upward-and-downward direction can be identified by a sensor position detecting unit 36 (shown in FIG. 3). - The
controller 30 is constituted of a microprocessor that comprises a CPU and the like. The controller 30 is mounted in the vehicle body 12. The controller 30 is connected to the sensor 26, the drive wheel motor that drives the one of the rear wheels 29, the steering device that adjusts a steering angle of the rear wheel 29 connected to the drive wheel motor, the fork lifting and lowering device 40 that lifts and lowers the fork 22, and the like, and controls their operations. In other words, the controller 30 controls the running direction and a running speed of the forklift 10 by driving the drive wheel motor and the steering device. Specifically, the controller 30 drives the one of the rear wheels 29 by outputting a control command value to the drive wheel motor and the steering device. Thereby, the running direction, the running speed, and a running path of the forklift 10 are controlled. Moreover, the controller 30 causes the fork 22 to move in the upward-and-downward direction by driving the fork lifting and lowering device 40. It should be noted that the running direction and the running speed of the forklift 10 can be controlled by conventionally known methods, and hence the detailed description thereof will be omitted. - Moreover, the
controller 30 performs, by executing a program stored in a memory, processing of coordinate-converting the distance data acquired by the sensor 26, processing of identifying a position and a direction of a pallet 100 based on a group of coordinate-converted observed points, and the like. In other words, as shown in FIG. 3, the controller 30 functions as a coordinate conversion unit 32 and a computing unit 34. By the controller 30 functioning as each of the coordinate conversion unit 32 and the computing unit 34, the group of observed points acquired from a space forward of the forklift 10 is generated, and based on the generated group of observed points, the position and the direction of the pallet 100 are identified. Details of the coordinate conversion unit 32 and the computing unit 34 will be described below along with the processing performed in the controller 30. - Next, the processing of identifying the position and the direction of the pallet 100 (pallet data 46) by the
controller 30 will be described. This processing is performed in a vicinity of the pallet 100 that is to be lifted, where the pallet 100 is observable by the sensor 26. In other words, the controller 30 initially drives the one of the rear wheels 29 such that the pallet 100 is located forward of the vehicle body 12, and brings the forklift 10 closer to the pallet 100. That is, to observe the pallet 100 by the sensor 26, the controller 30 moves the forklift 10 to an observation start position in the vicinity of the pallet 100. Regarding the forklift 10 that carries a load in a factory, for example, a position where the load (pallet 100) is placed is predetermined. Accordingly, the position where the forklift 10 starts observing the pallet 100 is predetermined based on the approximate position where the pallet 100 is placed. Therefore, the controller 30 automatically moves the forklift 10 to the predetermined observation start position. It should be noted that, if the forklift 10 is driven by a driver, the forklift 10 may be moved to the observation start position by the driver, and then the following processing (the processing of step S14 and the subsequent steps shown in FIG. 4) may be started. - It should be noted that, while the
forklift 10 moves to the observation start position, the sensor 26 is moved in the upward-and-downward direction by the fork lifting and lowering device 40 such that an observation target region 60 is irradiated with the laser light. The observation target region 60 refers to a region where there may be the pallet 100. Dimensions of the pallet 100 and a base or the like on which the pallet 100 is placed are known, and hence the position in the height direction where the pallet 100 exists (in detail, the position in the height direction of a portion of the front surface of the pallet 100 where openings 110 are provided) can be preset. Specifically, as shown in FIG. 5, for example, if a load 130 is placed on the pallet 100 and the pallet 100 is placed on a base 120, the region where there may be the pallet 100 is determined by the dimensions of the base 120 and the pallet 100. Due to this, the controller 30 adjusts the height at which the laser light is radiated by driving the fork lifting and lowering device 40 such that the laser light is radiated at the height where the pallet 100 exists within the observation target region 60. It should be noted that a function of the controller 30 realized by this processing corresponds to a sensor movement control unit 38 shown in FIG. 3. - Next, the
controller 30 acquires distance data 42 by the sensor 26 (S14). In other words, the sensor 26 radiates laser light while scanning the laser light in the horizontal direction, and detects reflected light of the radiated laser light. Thereby, the distance data 42 in a radiating direction of the laser light can be acquired. - Next, the
controller 30 coordinate-converts the acquired distance data 42 into a group of observed points 44 in a three-dimensional space (S16). For example, the distance data 42 can be coordinate-converted into the group of observed points 44 based on: the distance data 42 acquired in step S14; a number of observation steps (number of measurement cycles (steps)) of the laser light of the sensor 26 and step intervals (scan angle in each observation step) of the laser light of the sensor 26; the position of the laser light in the height direction; and the like. It should be noted that the coordinate conversion can be performed using conventionally known methods, and hence the detailed description thereof will be omitted. The function of the controller 30 realized by the processing of step S16 described above corresponds to the coordinate conversion unit 32 shown in FIG. 3. - Next, the
controller 30 extracts only observed points in the observation target region 60 from the acquired group of observed points 44 (S18). Due to this, observed points outside the observation target region 60 are excluded, and hence erroneous recognition of the pallet 100 in the following processing can be prevented. - Next, the
controller 30 extracts a line from the extracted group of observed points (S20). That is, a group of observed points made by the reflected light reflected from the front surface of the pallet 100 is located on a same plane. Due to this, in step S20, a line that is a result of scanning the front surface of the pallet 100 is extracted from the group of observed points extracted in step S18. It should be noted that, for the extraction of the line, a known robust estimation algorithm such as RANSAC (random sample consensus) can be used, for example. - Next, the
controller 30 extracts positions of both end points among the group of points constituting the line extracted in step S20 (S22). For example, as shown in FIG. 6, the positions of both end points can be found from the observed points among the group of points constituting the line, i.e., a point having a maximum value pxmax in an x direction and a minimum value pymin in a y direction, and a point having a minimum value pxmin in the x direction and a maximum value pymax in the y direction. - Next, the
controller 30 performs clustering on the group of points constituting the extracted line (the points matching the line) by a Euclidean distance (S24). Here, as shown in FIG. 7, the front surface of the pallet 100 has the two openings 110 (holes into which the tines 22a and 22b of the fork 22 are to be inserted) provided therein. Accordingly, there is a possibility that the line extracted from the front surface of the pallet 100 may be divided by the openings 110 in the front surface of the pallet 100. Due to this, it is determined whether or not the group of points constituting the line extracted in step S20 is divided by a distance that corresponds to a width of each of the openings 110 of the pallet 100. It should be noted that, for the clustering, a conventionally known method such as a k-means method or Ward's method can be used, for example. - Next, the
controller 30 performs processing of step S26. Specifically, the controller 30 initially counts a number of lines obtained through the clustering (i.e., a number of clusters), and determines whether or not the counted number of clusters is three (S26). As mentioned above, at the positions (height) where the openings 110 are provided in the front surface of the pallet 100, the line (group of observed points) that extends approximately in the horizontal direction due to the horizontal scan of the laser light is divided by the openings 110 into three sections. Accordingly, in step S26, by determining whether or not the number of clusters is three, it can be determined whether or not the extracted line corresponds to the front surface of the pallet 100 (in detail, a portion of the front surface of the pallet 100 that corresponds to the positions (height) of the openings 110). In a case where the number of clusters is three (S26: YES), the controller 30 proceeds to step S28. On the other hand, in a case where the number of clusters is not three (S26: NO), the controller 30 returns to step S14, and performs the processing of acquiring the distance data 42 again. In other words, the controller 30 performs the distance measurement by the sensor 26 (the processing in S14 and the subsequent steps) again. It should be noted that the controller 30 may be configured to determine, in a case where the determination of NO is repeated a predetermined number of times in the processing in step S26, that the pallet 100 does not exist in the observation target region 60, and terminate the processing. - Next, the
controller 30 selects, out of the three clusters of points, the two clusters that respectively include one and the other of the end points extracted in step S22 (S28). In other words, the controller 30 selects the two clusters located on the outer sides, out of the three clusters. - Next, the
controller 30 detects, from each of the selected two clusters, a position of the respective end point (i.e., the inner end point) that was not extracted in step S22 (S30). The positions of the inner end points are the positions S1 and S2 of the sidewalls of the openings 110 of the pallet 100 (shown in FIG. 7). In other words, in step S30, the controller 30 detects each of the positions S1 and S2 of the sidewalls of the two openings 110 in the pallet 100. It should be noted that there may be a case where the positions of the outer end points of the selected two clusters cannot be detected accurately due to an influence of noise at an occlusion boundary. On the other hand, the positions S1 and S2 of the inner end points of the two selected clusters are less likely to be influenced by the noise at the occlusion boundary, and detection accuracy therefor can be improved. - Next, the
controller 30 identifies a center M of the front surface of the pallet 100 from the positions of the two inner end points detected in step S30 (S32). The two openings 110 of the pallet 100 are provided at positions symmetric with respect to the center M of the front surface of the pallet 100. Therefore, it is possible to identify the center M of the front surface of the pallet 100 by finding a midpoint between the position S1 and the position S2 of the sidewalls of the openings 110 in the pallet 100. - Next, the
controller 30 identifies the position and the direction of the pallet 100 from the center M of the front surface of the pallet 100 identified in step S32, and the line extracted in step S20 (S34). Specifically, as shown in FIG. 7, by finding a normal vector N of the extracted line on an xy plane, the direction of the pallet 100 can be found. Thereby, the position and the direction of the pallet 100 (the pallet data 46) can be identified. It should be noted that a function of the controller 30 realized by the processing in steps S18 to S34 described above corresponds to the computing unit 34 shown in FIG. 3. - In the
forklift 10 in the above-mentioned embodiment, the controller 30 identifies the positions S1 and S2 of the sidewalls of the two openings 110 in the front surface of the pallet 100, based on the distance data 42 measured by the sensor 26. The two openings 110 are provided at the positions symmetric with respect to the center M of the front surface of the pallet 100, and hence the controller 30 can identify the center M of the front surface of the pallet 100 based on the positions S1 and S2 of the sidewalls of the two openings 110. As such, the positions of the two openings 110 can be detected based on the distance data 42 measured by the sensor 26, irrespective of presence or absence of a space around the pallet 100. Accordingly, the relative position and direction of the pallet 100 with respect to the forklift 10 can be detected accurately. In other words, the position and the direction of the pallet 100 can be identified, irrespective of what environment the pallet 100 is placed in. - It should be noted that, although the above-mentioned embodiment has been described on the assumption that the position in the height direction where the
pallet 100 exists is known, the position and the direction of the pallet 100 can be identified even in a case where the position in the height direction where the pallet 100 exists is unknown. Specifically, the controller 30 initially moves the forklift 10 to the observation start position while moving the sensor 26 by the fork lifting and lowering device 40 such that the laser light is radiated to an upper limit of the observation target region 60. Next, the controller 30 acquires the distance data 42 by the sensor 26 while lowering the fork 22. Subsequently, the controller 30 performs the processing of steps S16 to S24 of the above-mentioned embodiment. Afterwards, in the case where the determination of NO (i.e., the number of clusters being not three) is made in the processing of the subsequent step S26, the controller 30 determines that the current height is not the height where the openings 110 of the pallet 100 exist, and acquires the distance data 42 by the sensor 26 while lowering the fork 22 again. On the other hand, in the case where the determination of YES (i.e., the number of clusters being three) is made in the processing of step S26, the controller 30 determines that the current height is the height where the openings 110 of the pallet 100 exist. After performing the processing described above, by performing the processing of step S28 and the subsequent steps of the above-mentioned embodiment, the controller 30 can identify the position and the direction of the pallet 100. - Correspondence relationships between the above-mentioned embodiment and the claims will be described. The fork lifting and lowering
device 40 is an example of a movement mechanism in the claims. The computing unit 34 is an example of a processor in the claims. - While the embodiment of the technique disclosed herein has been described above in detail, this is merely illustrative and places no limitation on the scope of the claims. The technique described in the claims also encompasses various changes and modifications to the specific example described above.
- For example, in the above-mentioned embodiment, the
controller 30 selects, in step S28, two clusters each including a respective end point of the group of points constituting the line; however, the processing of step S28 may be modified such that the controller 30 selects the central cluster out of the three clusters. In this case, the processing at the following step S30 may be modified such that the controller 30 extracts the two points being both ends of the central cluster (an inner sidewall of each of the openings 110) and finds a position of a midpoint between the two points, whereby the center M of the front surface of the pallet 100 can be identified. - Moreover, a three-dimensional image of a range irradiated with laser light may be generated based on the
distance data 42 measured by the sensor 26 by scanning the laser light in the horizontal direction while moving the fork 22 in the upward-and-downward direction by driving the fork lifting and lowering device 40. Specifically, the controller 30 may initially scan the laser light radiated from the sensor 26 two-dimensionally, namely in the horizontal direction and the height direction, by radiating the laser light to the region 50 that is set forward of the forklift 10 and has the predetermined angular range, while lifting and lowering the sensor 26 in the upward-and-downward direction. Next, the controller 30 may acquire the distance data 42 of a three-dimensional space forward of the forklift 10 from the reflected light of the laser light. Then, the controller 30 may identify, from the three-dimensional image generated based on this distance data 42, the positions S1 and S2 of the sidewalls of the openings 110 of the pallet 100. Even with such a configuration, the positions S1 and S2 of the sidewalls can be identified, and hence the center of the front surface of the pallet 100 can preferably be identified. - The respective technical features described herein or in the drawings are independent of one another, and are useful solely or in combinations. The combinations thereof are not limited to those described in the claims as originally filed. Further, the art described herein and in the drawings may concurrently achieve a plurality of aims, and the technical significance thereof resides in achieving any one of such aims.
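The clustering of step S24 and the sidewall detection of steps S26 to S30 described above can be sketched together as follows. This illustration splits the scan wherever neighbouring points are farther apart than a gap threshold derived from the opening width, and assumes the points keep their order along the scan; it stands in for the k-means or Ward's method mentioned above, and all names are illustrative.

```python
def cluster_by_gap(line_points, gap):
    """Split scan points (ordered along the scan) into clusters wherever
    the Euclidean distance between neighbouring points exceeds `gap`
    (step S24). `gap` would be chosen from the known opening width."""
    clusters = [[line_points[0]]]
    for prev, cur in zip(line_points, line_points[1:]):
        d = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        if d > gap:
            clusters.append([])  # a fork opening: start a new cluster
        clusters[-1].append(cur)
    return clusters

def sidewall_positions(clusters):
    """Steps S26-S30: with exactly three clusters, the inner end points
    of the two outer clusters are the sidewall positions S1 and S2."""
    if len(clusters) != 3:
        return None              # not the opening height: measure again
    left, _, right = clusters
    return left[-1], right[0]
```

With three clusters, the inner end points of the two outer clusters give S1 and S2; any other cluster count would send the controller back to re-measure, as in step S26.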
Claims (4)
1. A forklift capable of carrying a load placed on a pallet having two openings into which a fork of the forklift is inserted, the forklift comprising:
a sensor configured to irradiate laser light toward a predetermined space forward of the fork, and measure a distance from the sensor to an object located in the predetermined space based on reflected light of the laser light reflected by the object; and
a processor configured to identify positions of sidewalls of the two openings of the pallet that is to be lifted based on distance data measured by the sensor, the processor being further configured to identify a center of a front surface of the pallet based on the positions of the sidewalls of the two openings.
2. The forklift according to claim 1, wherein
the processor is further configured to extract only distance data of the front surface of the pallet from the distance data measured by the sensor, and identify the center and a direction of the front surface of the pallet based on the positions of the sidewalls of the two openings identified by the extracted distance data.
3. The forklift according to claim 1, further comprising
a movement mechanism configured to move the fork in a first direction;
wherein
the sensor is provided on the fork and is configured to scan laser light in a second direction orthogonal to the first direction, and move along with the fork by the movement mechanism.
4. The forklift according to claim 3, wherein
the processor is further configured to perform:
(1) measuring distance data by scanning laser light in the second direction with the fork set at a predetermined position of the first direction;
(2) in a case where the positions of the sidewalls of the two openings of the pallet cannot be identified based on the measured distance data, moving the fork a predetermined distance in the first direction to a position, and measuring distance data at the position by scanning laser light in the second direction; and
(3) repeating the moving and measuring of (2) until the positions of the sidewalls of the two openings of the pallet can be identified.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-069454 | 2016-03-30 | ||
| JP2016069454A JP2017178567A (en) | 2016-03-30 | 2016-03-30 | forklift |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170285644A1 true US20170285644A1 (en) | 2017-10-05 |
Family
ID=58454973
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/471,107 Abandoned US20170285644A1 (en) | 2016-03-30 | 2017-03-28 | Forklift |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20170285644A1 (en) |
| EP (1) | EP3225584A1 (en) |
| JP (1) | JP2017178567A (en) |
Cited By (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180293743A1 (en) * | 2017-04-05 | 2018-10-11 | Murata Machinery, Ltd. | Recess detection device, transport device, and recess detecting method |
| US20180319640A1 (en) * | 2017-05-02 | 2018-11-08 | Eric Flenoid | Distance Measuring System |
| US10435284B1 (en) * | 2016-07-22 | 2019-10-08 | Fozi Androus | Load laser guidance system for forklift |
| CN110780276A (en) * | 2019-10-29 | 2020-02-11 | 杭州易博特科技有限公司 | Tray identification method and system based on laser radar and electronic equipment |
| US10640347B2 (en) | 2017-12-22 | 2020-05-05 | X Development Llc | Pallet tracking during engagement and disengagement |
| KR20200101181A (en) * | 2019-02-19 | 2020-08-27 | 주식회사 드림팩로지스틱스 | Forklift |
| CN113104776A (en) * | 2021-05-13 | 2021-07-13 | 湖北奥瑞金制罐有限公司 | Tin printing scheduling system and method based on unmanned forklift |
| US11111121B2 (en) | 2019-09-10 | 2021-09-07 | Kabushiki Kaisha Toshiba | Conveyance apparatus |
| CN114105043A (en) * | 2020-08-31 | 2022-03-01 | 三菱物捷仕株式会社 | Pallet sensing device, forklift, pallet sensing method, and program |
| US20220144609A1 (en) * | 2020-11-06 | 2022-05-12 | Kabushiki Kaisha Toshiba | Autonomous mobile robot, transporter, autonomous mobile robot control method, and transporter control method |
| CN114545430A (en) * | 2022-02-21 | 2022-05-27 | 山东亚历山大智能科技有限公司 | Tray pose identification method and system based on laser radar |
| US20220259023A1 (en) * | 2021-02-16 | 2022-08-18 | Toyota Jidosha Kabushiki Kaisha | Transport system and transport method |
| US20220262132A1 (en) * | 2021-02-16 | 2022-08-18 | Mitsubishi Logisnext Co., LTD. | Control method for mobile object, mobile object, and computer-readable storage medium |
| US11427448B2 (en) * | 2018-10-04 | 2022-08-30 | Toyota Jidosha Kabushiki Kaisha | Conveying apparatus |
| US11604281B2 (en) | 2019-08-08 | 2023-03-14 | Kabushiki Kaisha Toyota Jidoshokki | Position and posture estimation apparatus of a forklift pallet |
| JP2023163605A (en) * | 2022-04-28 | 2023-11-10 | 株式会社豊田自動織機 | Cargo handling system |
| IT202200015888A1 (en) * | 2022-07-27 | 2024-01-27 | System Ceramics S P A | Operator vehicle with vertically movable safety sensor |
| US12215012B2 (en) * | 2021-10-12 | 2025-02-04 | Kabushiki Kaisha Toyota Jidoshokki | Forklift and control method for forklift |
| US12351441B2 (en) * | 2022-01-28 | 2025-07-08 | Kabushiki Kaisha Toyota Jidoshokki | Forklift and forklift controlling method |
| US12391527B2 (en) | 2021-02-05 | 2025-08-19 | Panasonic Intellectual Property Management Co., Ltd. | Load transport system, method of load transport system, and non-transitory computer-readable recording medium |
| EP4500286A4 (en) * | 2022-03-28 | 2025-12-17 | Seegrid Corp | VALIDATION OF THE POSTURE OF A ROBOTIC VEHICLE THAT ALLOWS IT TO INTERACT WITH AN OBJECT ON A FIXED INFRASTRUCTURE |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6750564B2 (en) * | 2017-05-29 | 2020-09-02 | トヨタ自動車株式会社 | Forklift control device |
| CN108152823B (en) * | 2017-12-14 | 2021-09-03 | 北京信息科技大学 | Vision-based unmanned forklift navigation system and positioning navigation method thereof |
| JP7129314B2 (en) * | 2018-02-02 | 2022-09-01 | 株式会社Ihi | Unloading device |
| JP7228800B2 (en) * | 2018-10-29 | 2023-02-27 | パナソニックIpマネジメント株式会社 | Conveying method, conveying system, program and pallet |
| CN109443201B (en) * | 2018-10-31 | 2020-10-16 | 广东嘉腾机器人自动化有限公司 | A pallet identification method based on S300 laser scanning sensor |
| CN109264638A (en) * | 2018-11-13 | 2019-01-25 | 上海辛格林纳新时达电机有限公司 | A kind of intelligent forklift |
| JP7186623B2 (en) * | 2019-01-16 | 2022-12-09 | 三菱電機株式会社 | LOCATION MANAGEMENT SYSTEM, LOCATION MANAGEMENT DEVICE, LOCATION MANAGEMENT METHOD AND PROGRAM |
| CA3129088A1 (en) | 2019-02-04 | 2020-08-13 | The Heil Co. | Semi-autonomous refuse collection |
| CA3137399A1 (en) | 2019-04-23 | 2020-10-29 | The Heil Co. | Refuse collection vehicle positioning |
| US11208262B2 (en) | 2019-04-23 | 2021-12-28 | The Heil Co. | Refuse container engagement |
| US11453550B2 (en) | 2019-04-23 | 2022-09-27 | The Heil Co. | Refuse collection vehicle controls |
| JP7215394B2 (en) * | 2019-10-25 | 2023-01-31 | 株式会社豊田自動織機 | Operation support device for cargo handling vehicle |
| JP7363705B2 (en) * | 2020-03-31 | 2023-10-18 | 株式会社豊田自動織機 | Cargo handling system |
| JP7306311B2 (en) * | 2020-04-16 | 2023-07-11 | 株式会社豊田自動織機 | recognition device |
| JP7424241B2 (en) * | 2020-07-31 | 2024-01-30 | 株式会社豊田自動織機 | edge detection device |
| JP7179102B2 (en) * | 2021-02-05 | 2022-11-28 | 三菱ロジスネクスト株式会社 | Mobile object control method, mobile object and program |
| JP7257431B2 (en) * | 2021-02-16 | 2023-04-13 | 三菱ロジスネクスト株式会社 | Mobile object control method, mobile object and program |
| JP7556308B2 (en) * | 2021-02-22 | 2024-09-26 | 株式会社豊田自動織機 | Position and orientation estimation device |
| JP7283853B2 (en) * | 2021-04-28 | 2023-05-30 | 三菱ロジスネクスト株式会社 | forklift |
| JP7661870B2 (en) * | 2021-11-24 | 2025-04-15 | 株式会社豊田自動織機 | Position and orientation estimation device |
| KR20230168516A (en) * | 2022-06-07 | 2023-12-14 | 현대자동차주식회사 | Smart logistics vehicle and method of controlling the same |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS5957898A (en) * | 1982-09-28 | 1984-04-03 | 株式会社豊田自動織機製作所 | Method of controlling cargo work in unmanned forklift |
| JPH0755797B2 (en) * | 1987-10-21 | 1995-06-14 | 株式会社小松製作所 | Unmanned forklift pallet gap detector |
| JPH11278799A (en) * | 1998-03-24 | 1999-10-12 | Mitsubishi Electric Corp | Unloading control device for unmanned forklift and unloading control method for unmanned forklift |
| BE1013354A3 (en) * | 2000-03-17 | 2001-12-04 | Egemin Nv | Method and device for installing an automatically guided vehicle. |
| US6952488B2 (en) * | 2001-08-27 | 2005-10-04 | Carnegie Mellon University | System and method for object localization |
| JP2005089013A (en) * | 2003-09-12 | 2005-04-07 | Nippon Yusoki Co Ltd | Forklift |
| JP4293565B2 (en) * | 2007-05-22 | 2009-07-08 | 日本輸送機株式会社 | Fork pitch automatic adjustment device |
| KR101059927B1 (en) * | 2009-03-30 | 2011-08-26 | 부산대학교 산학협력단 | Apparatus and method for pallet position recognition of unmanned conveying equipment |
| JP5908333B2 (en) | 2012-04-27 | 2016-04-26 | 株式会社日立製作所 | forklift |
| JP2015225450A (en) * | 2014-05-27 | 2015-12-14 | 村田機械株式会社 | Autonomous traveling vehicle, and object recognition method in autonomous traveling vehicle |
| JP6369131B2 (en) * | 2014-05-27 | 2018-08-08 | 村田機械株式会社 | Object recognition apparatus and object recognition method |
| JP6469506B2 (en) * | 2015-04-16 | 2019-02-13 | 株式会社豊田中央研究所 | forklift |
| JP6542574B2 (en) * | 2015-05-12 | 2019-07-10 | 株式会社豊田中央研究所 | forklift |
- 2016
  - 2016-03-30 JP JP2016069454A patent/JP2017178567A/en active Pending
- 2017
  - 2017-03-28 US US15/471,107 patent/US20170285644A1/en not_active Abandoned
  - 2017-03-29 EP EP17163631.9A patent/EP3225584A1/en not_active Withdrawn
Cited By (28)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10435284B1 (en) * | 2016-07-22 | 2019-10-08 | Fozi Androus | Load laser guidance system for forklift |
| US10692229B2 (en) * | 2017-04-05 | 2020-06-23 | Murata Machinery, Ltd. | Recess detection device, transport device, and recess detecting method |
| US20180293743A1 (en) * | 2017-04-05 | 2018-10-11 | Murata Machinery, Ltd. | Recess detection device, transport device, and recess detecting method |
| US20180319640A1 (en) * | 2017-05-02 | 2018-11-08 | Eric Flenoid | Distance Measuring System |
| US11180353B2 (en) | 2017-12-22 | 2021-11-23 | Boston Dynamics, Inc. | Pallet tracking during engagement and disengagement |
| US10640347B2 (en) | 2017-12-22 | 2020-05-05 | X Development Llc | Pallet tracking during engagement and disengagement |
| US11427448B2 (en) * | 2018-10-04 | 2022-08-30 | Toyota Jidosha Kabushiki Kaisha | Conveying apparatus |
| KR20200101181A (en) * | 2019-02-19 | 2020-08-27 | 주식회사 드림팩로지스틱스 | Forklift |
| KR102166631B1 (en) | 2019-02-19 | 2020-10-16 | 주식회사 드림팩로지스틱스 | Forklift |
| US11604281B2 (en) | 2019-08-08 | 2023-03-14 | Kabushiki Kaisha Toyota Jidoshokki | Position and posture estimation apparatus of a forklift pallet |
| US11111121B2 (en) | 2019-09-10 | 2021-09-07 | Kabushiki Kaisha Toshiba | Conveyance apparatus |
| CN110780276A (en) * | 2019-10-29 | 2020-02-11 | 杭州易博特科技有限公司 | Tray identification method and system based on laser radar and electronic equipment |
| CN114105043A (en) * | 2020-08-31 | 2022-03-01 | 三菱物捷仕株式会社 | Pallet sensing device, forklift, pallet sensing method, and program |
| EP3961259A1 (en) * | 2020-08-31 | 2022-03-02 | Mitsubishi Logisnext Co., Ltd. | Pallet detection device, forklift, pallet detection method, and program |
| US20220144609A1 (en) * | 2020-11-06 | 2022-05-12 | Kabushiki Kaisha Toshiba | Autonomous mobile robot, transporter, autonomous mobile robot control method, and transporter control method |
| US12391527B2 (en) | 2021-02-05 | 2025-08-19 | Panasonic Intellectual Property Management Co., Ltd. | Load transport system, method of load transport system, and non-transitory computer-readable recording medium |
| US20220259023A1 (en) * | 2021-02-16 | 2022-08-18 | Toyota Jidosha Kabushiki Kaisha | Transport system and transport method |
| US20220262132A1 (en) * | 2021-02-16 | 2022-08-18 | Mitsubishi Logisnext Co., LTD. | Control method for mobile object, mobile object, and computer-readable storage medium |
| US12347205B2 (en) * | 2021-02-16 | 2025-07-01 | Mitsubishi Logisnext Co., LTD. | Control method for mobile object, mobile object, and computer-readable storage medium |
| CN113104776A (en) * | 2021-05-13 | 2021-07-13 | 湖北奥瑞金制罐有限公司 | Tin printing scheduling system and method based on unmanned forklift |
| US12215012B2 (en) * | 2021-10-12 | 2025-02-04 | Kabushiki Kaisha Toyota Jidoshokki | Forklift and control method for forklift |
| US12351441B2 (en) * | 2022-01-28 | 2025-07-08 | Kabushiki Kaisha Toyota Jidoshokki | Forklift and forklift controlling method |
| CN114545430A (en) * | 2022-02-21 | 2022-05-27 | 山东亚历山大智能科技有限公司 | Tray pose identification method and system based on laser radar |
| EP4500286A4 (en) * | 2022-03-28 | 2025-12-17 | Seegrid Corp | VALIDATION OF THE POSTURE OF A ROBOTIC VEHICLE THAT ALLOWS IT TO INTERACT WITH AN OBJECT ON A FIXED INFRASTRUCTURE |
| JP2023163605A (en) * | 2022-04-28 | 2023-11-10 | 株式会社豊田自動織機 | Cargo handling system |
| JP7775779B2 (en) | 2022-04-28 | 2025-11-26 | 株式会社豊田自動織機 | Cargo Handling System |
| WO2024023658A1 (en) * | 2022-07-27 | 2024-02-01 | System Ceramics S.P.A. | Operating vehicle with vertically movable safety sensor |
| IT202200015888A1 (en) * | 2022-07-27 | 2024-01-27 | System Ceramics S P A | Operator vehicle with vertically movable safety sensor |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3225584A1 (en) | 2017-10-04 |
| JP2017178567A (en) | 2017-10-05 |
Similar Documents
| Publication | Title |
|---|---|
| US20170285644A1 (en) | Forklift |
| US10850961B2 (en) | Forklift | |
| JP6469506B2 (en) | forklift | |
| US10248123B2 (en) | Mobile apparatus | |
| US11718513B2 (en) | Position and posture estimation system | |
| JP7103077B2 (en) | Remote control system for forklifts | |
| US10692229B2 (en) | Recess detection device, transport device, and recess detecting method | |
| JP2021024728A (en) | Position/posture estimation device | |
| JP2022124817A (en) | Mobile object control method, mobile object and program | |
| US20240230903A9 (en) | Transport possibility determination device, distance measurement device, transport unit, transport possibility determination method, and transport possibility determination program | |
| JP2021169360A (en) | Recognition device | |
| JP7655176B2 (en) | forklift | |
| JP2021085828A (en) | Obstacle detector | |
| JP7488991B2 (en) | Industrial Vehicles | |
| US20250066172A1 (en) | Unloading control device | |
| JP7697769B2 (en) | Transport vehicle, distance calculation method, and distance calculation program | |
| US20230205213A1 (en) | Control method for mobile object, mobile object, and computer-readable storage medium | |
| JP2024154294A (en) | MOBILE BODY CONTROL METHOD, MOBILE BODY, AND PROGRAM | |
| US12515932B2 (en) | Position identification system, transport vehicle, position identification method and recording medium | |
| CN118723868A (en) | Transport vehicle, connecting part, distance determination method and storage medium | |
| JP7257431B2 (en) | Mobile object control method, mobile object and program | |
| JP7669902B2 (en) | Position and orientation estimation device | |
| JP2025101182A (en) | forklift |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: KABUSHIKI KAISHA TOYOTA JIDOSHOKKI, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ICHINOSE, MAKOTO;TANAKA, MINORU;TSUSAKA, YUUJI;AND OTHERS;SIGNING DATES FROM 20170214 TO 20170321;REEL/FRAME:041765/0139 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |