
WO2025111689A1 - Devices, systems and methods for processing or propagating plants - Google Patents


Info

Publication number
WO2025111689A1
WO2025111689A1 (PCT/CA2024/000015)
Authority
WO
WIPO (PCT)
Prior art keywords
wire
plant
image data
sensor
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CA2024/000015
Other languages
English (en)
Inventor
Mehrdad RAJI KERMANI
Moteaal ASADI SHIRZI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Western Ontario
Original Assignee
University of Western Ontario
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Western Ontario filed Critical University of Western Ontario
Publication of WO2025111689A1 publication Critical patent/WO2025111689A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G - HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G2/00 - Vegetative propagation
    • A01G2/30 - Grafting
    • A01G2/32 - Automatic apparatus therefor
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00 - Gripping heads and other end effectors
    • B25J15/02 - Gripping heads and other end effectors servo-actuated
    • B25J15/0206 - Gripping heads and other end effectors servo-actuated comprising articulated grippers
    • B25J15/0226 - Gripping heads and other end effectors servo-actuated comprising articulated grippers actuated by cams
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/021 - Optical sensing devices
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/50 - Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/776 - Validation; Performance evaluation
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects

Definitions

  • the present invention relates to agriculture and horticulture, and more particularly to tasks that typically involve handling or inspection of individual plants.
  • Plant processing and propagation includes many labour intensive tasks, such as pruning, localized spraying, clipping, and the like.
  • clipping of plants is a time-consuming and laborious task.
  • Clipping is the task of putting a rubber band or plastic clip around a plant’s main stem and a supporting structure, such as a stake at a particular point along the stem to provide additional support to the plant.
  • the clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to execute the clipping task.
  • a clipping device comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
  • a method for processing plants comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • a system for processing plants comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
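The claimed acquire-segment-localize-position loop can be sketched in Python as below. The segmentation rule, function names, and synthetic scene are illustrative assumptions, not details from the patent; a real system would substitute the trained machine vision component and a calibrated depth source.

```python
import numpy as np

def segment_stem(rgb):
    """Crude colour segmentation: mark pixels whose green channel dominates."""
    r, g, b = rgb[..., 0].astype(int), rgb[..., 1].astype(int), rgb[..., 2].astype(int)
    return (g > r) & (g > b)

def target_point(mask):
    """Pick a target point on the segmented structure, here simply its centroid."""
    ys, xs = np.nonzero(mask)
    return int(ys.mean()), int(xs.mean())

def control_signal(point, depth_map):
    """Pair the 2D target with its depth so a controller can position a robot."""
    y, x = point
    return {"pixel": (y, x), "depth": float(depth_map[y, x])}

# Synthetic 8x8 scene: a green vertical "stem" in column 3, one metre away.
rgb = np.zeros((8, 8, 3), dtype=np.uint8)
rgb[:, 3, 1] = 200                      # green stem pixels
depth = np.full((8, 8), 1.0)            # stand-in for a stereo depth map

mask = segment_stem(rgb)
cmd = control_signal(target_point(mask), depth)
```

The dictionary `cmd` plays the role of the control signal sent from the controller to position the robot at the target point.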
  • Figure 1A shows a clipping device with a clamp in an open position.
  • Figure 1B shows a clipping device with a clamp in a closed position.
  • Figure 2 shows wire feeding components of the clipping device.
  • Figure 3A shows a first view of wire curving components of the clipping device.
  • Figure 3B shows a second view of the wire curving components.
  • Figure 3C shows examples of various configurations of curved wire clips.
  • Figure 4A shows wire cutting components of the clipping device.
  • Figure 4B shows interaction of wire cutting components so that a rotating cam pivots a wire cutting lever to execute a cutting motion of a cutter.
  • Figures 5A-5D show clamping components of the clipping device formed as first and second opposing claw-shaped arms that are pivoted toward each other by a linear actuator to transition from an open position (Fig. 5A) through intermediate positions (Figs. 5B and 5C) to a closed position (Fig. 5D).
  • Figure 6 shows connection of wire feeding components, wire curving components, wire cutting components, and clamping components in absence of actuators and supporting structures.
  • Figure 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire to the feeder.
  • Figure 8 shows the relationship between the relative location of the Bender to the Wire Feeder and the diameter of the clip.
  • Figure 9 shows two different profiles of the bender: (a) changing a linear translational position of the surface and (b) changing an angular rotational position of the surface.
  • Figure 10 shows a schematic that demonstrates a change in Cam profile causing a different shaping of the Clip.
  • Figure 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide.
  • Figure 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
  • Figure 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
  • Figure 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 16 shows the clipping device with a stereo camera installed on a robotic arm.
  • Figure 17 shows examples of clipping points identified by expert farmer selections.
  • Figure 18 shows color values of pixels plotted in four different color spaces; for plant recognition, the concentration of pixels with similar color values in LAB is better than in other color spaces.
  • Figure 19 shows schematic steps of the stem recognition using adaptive color image segmentation based on optimized LAB color space.
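The LAB-based stem recognition above relies on green vegetation separating cleanly along the a* axis. A minimal sketch follows, assuming the standard sRGB-to-LAB conversion and an illustrative a* threshold; the patent's adaptive, optimized parameters are not reproduced here.

```python
import numpy as np

def rgb_to_lab(rgb):
    """Standard sRGB -> linear RGB -> XYZ (D65) -> CIELAB conversion."""
    srgb = rgb.astype(float) / 255.0
    lin = np.where(srgb <= 0.04045, srgb / 12.92, ((srgb + 0.055) / 1.055) ** 2.4)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T
    xyz /= np.array([0.95047, 1.0, 1.08883])        # D65 white point
    f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16 / 116)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def green_mask(rgb, a_thresh=-20.0):
    """Plant pixels sit at strongly negative a* (the green side of the axis)."""
    return rgb_to_lab(rgb)[..., 1] < a_thresh

green = np.array([[[40, 180, 40]]], dtype=np.uint8)   # leafy-green pixel
brown = np.array([[[120, 80, 40]]], dtype=np.uint8)   # stake-like brown pixel
```

A leafy-green pixel lands well below the threshold on a*, while a brown stake-coloured pixel does not, which is the separation the figure illustrates.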
  • Figure 20 shows schematic steps of the wooden stake segmentation/recognition.
  • Figure 21 shows comparison of the histogram (left) and kernel density estimation (right) constructed using the same data.
  • the dashed individual kernels make up the kernel density estimator.
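The kernel density estimate compared in the figure can be formed by summing one Gaussian kernel per data point (the "dashed individual kernels"). The bandwidth and sample values below are illustrative only.

```python
import numpy as np

def gaussian_kde(samples, xs, bw=0.3):
    """Average one Gaussian kernel per sample, evaluated on the grid xs."""
    diffs = (xs[:, None] - samples[None, :]) / bw
    kernels = np.exp(-0.5 * diffs**2) / (bw * np.sqrt(2 * np.pi))
    return kernels.mean(axis=1)

samples = np.array([1.0, 1.2, 2.8, 3.0, 3.1])
xs = np.linspace(-2, 6, 801)
density = gaussian_kde(samples, xs)

# Unlike a histogram, the estimate is smooth; like a histogram normalized to
# unit area, it is a proper density: non-negative and integrating to ~1.
dx = xs[1] - xs[0]
area = density.sum() * dx
```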
  • Figure 22 shows schematic steps for computing the Principal Orientation of the Histogram Gradient.
  • Figure 23 shows schematic steps of a multi-stage point density method to identify the most suitable clipping point along the seedling’s main stem for different types of vegetables including peppers, tomatoes, and cucumbers.
  • Figure 24 shows comparison of plant recognition for three types of seedlings (pepper, tomato, and cucumber) after applying four comparator automatic adaptive segmentation methods and the presently disclosed adaptive segmentation based on feature descriptors (entropy and variance).
  • Figure 25 shows stem and stake recognition of pepper (1), cucumber (2), bell pepper (3), and tomato (4) seedlings after applying the adaptive color image segmentation based on feature descriptors (entropy and variance) and the hybridization of the Otsu method and median filter; different cameras were used to take images in different lighting conditions and backgrounds to check the robustness of the algorithm.
  • Figure 26 shows suggested clipping points after applying the multi-stage point density method; the stereo camera matches left and right images to find the distance of the clipping point from the clipping device and calculates the orientation of the clipping device related to the suggested clipping point.
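The left/right matching described above yields a pixel disparity, which the pinhole stereo model converts to a distance. The calibration numbers in this sketch are invented for illustration and are not from the patent.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("point not matched in both images")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 60 mm baseline, 42 px disparity -> 1.0 m.
z = depth_from_disparity(700.0, 0.060, 42.0)
```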
  • Figure 27 shows suggested clipping points using multi-stage point density algorithm for samples 1 and 2; in sample 3, although a suitable clipping point could be identified on the seedling, the stake is behind a leaf and not accessible; sample 4 shows a case where neither the stem nor the stake was accessible.
  • Figure 28 shows the clipping device and stereo camera on a robotic arm; the stereo camera takes images from the seedling and stalk; after recognizing the suitable clipping point using the multi-stage point density method, the robot moves the clipping device near the recognized clipping point; and the clipping device makes a clip around the stem and stake.
  • Figure 29 shows an impedance controller block diagram of a servo motor in the clipping device.
  • Figure 30 depicts trajectory and torque profile of a servo motor in automated and hand-held clipping device platforms during a complete cycle of clip production.
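A joint-space impedance law of the kind such a servo-motor block diagram typically realizes can be sketched as follows; the gains are illustrative placeholders, not the patent's controller parameters.

```python
def impedance_torque(q_des, q, dq_des, dq, k=5.0, b=0.8):
    """tau = K (q_des - q) + B (dq_des - dq): virtual spring-damper about the
    desired trajectory, yielding compliant rather than stiff position tracking."""
    return k * (q_des - q) + b * (dq_des - dq)

# At the desired position with matched velocity, the commanded torque is zero.
tau_rest = impedance_torque(1.0, 1.0, 0.0, 0.0)
# Lagging 0.1 rad behind the target produces a restoring torque of K * 0.1.
tau_lag = impedance_torque(1.0, 0.9, 0.0, 0.0)
```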
  • Figure 31 shows 3D renderings depicting (a) a potential schematic design of a robotic arm, (b) a gantry system equipped with multiple robotic arms for the stem-stake coupling system, (c) a schematic of a robotic stem-stake coupling system with two gantries and nine robotic arms, and (d) a photograph of a working robotic system with a single gantry and two robotic arms.
  • Figure 32 shows a block diagram of a robotic system that can include multiple gantries (1 to m) supporting varying numbers of robotic arms (1 to n) equipped with an automatic clipping device (ACD), a robotic control unit (RCU), and a stereo camera.
  • This object-oriented design offers flexibility to adjust the number of gantries and arms as needed.
  • Figure 33 shows a graphical user interface (GUI) of the robotic stem-stake coupling system with fifteen distinct subsections.
  • Figure 34 shows evaluation results of five robotic arm configurations for the stem-stake coupling task based on eleven key parameters.
  • Figure 35 shows a 5-degree-of-freedom (5-DOF) robotic arm, custom-designed and fabricated to fulfill the requirements of an experimentally implemented robotic system; configuration and main components of the robotic arm are shown from the left (a) and right (b) perspectives.
  • Figure 36 shows a logic flowchart outline of the robotic arm shown in Fig. 35.
  • Figure 37 shows detailed analysis of the position, torque, and error for each joint in achieving the desired position of the robotic arm. This figure illustrates the discrepancies between the commanded and actual positions, along with the corresponding torque applied at each joint.
  • Figure 38 shows comprehensive examination of the angle, torque, and error for each joint in achieving the desired orientation of the ACD. This figure highlights the differences between the target and achieved joint angles, as well as the torque required for each joint to reach the specified orientation.
  • Figure 39 shows (a) a schematic representation of error calculations; (b) a box-and-whisker plot illustrating the robotic arm's and machine vision's accuracy in determining and reaching a specific point within the working space; and (c) the accuracy and repeatability of the robotic arm in reaching a specific point in the working space, where the black (darker colored) dots represent the accuracy and repeatability of the robotic arm alone and the blue (lighter colored) dots indicate the accuracy and repeatability of the integrated system using both machine vision and robotic arms.
  • Fig. 1 shows a perspective view of the clipping device (CD) 10 and its various parts.
  • the CD 10 makes a clip and places it around a wooden stake and the main stem of a seedling or a flower.
  • the clip provides additional support to the plant and avoids damage during transportation.
  • the working principle of the CD 10 is based on feeding a wire 21 or equivalent thin filament from a rotating wire feeder 12 through a rotating wheel (main feeder) 14 driven by an actuator.
  • the wire 21 is pulled from the wire feeder 12 by the actuated rotation of main feeder 14 and is pressed against a feeder supporter 16.
  • the main feeder 14 When the main feeder 14 rotates, it pulls the wire forward against the feeder supporter 16 and pushes the wire through an opening inside the wire guide 18 which defines a bore or channel shape with an input opening 19a proximal to the feeder supporter 16 surface that abuts wire 21 and an opposing output opening 19b proximal to a bender 20.
  • the main feeder 14 pushes the wire through and out of the wire guide 18, more specifically out of output opening 19b, to strike the bender 20 (which is optionally configured as a tunable/ adjustable mechanism that is adjusted to change the diameter or shape of a wire clip) and the force of pushing the wire against a strike surface 81 of the bender 20 shapes the curling wire into a ring shape clip 22.
  • a cutter 24 cuts the wire at the end of each cycle when a full clip is formed.
  • a first actuator such as an electric motor (Servo Motor 1) 26 or other types of actuators such as a pneumatic actuator, turns the rotating wheel of the main feeder 14 and as the wire 21 passes through the wire guide 18 the force applied by the bender 20 shapes the wire into the clip 22. The same actuator also drives the cutter 24 to cut the wire at the end of one cycle to release the clip 22 from the feeding wire.
  • the components of the wire feeder 12, main feeder 14, feeder supporter 16, wire guide 18, and first actuator 26 are interconnected in a desired orientation to one another by connection to a frame 28.
  • the wire By locating the clipping device near a suitable point near the stem of the plant, the wire can wrap around the stem and the wooden stake as the wire is shaped into a clip 22.
  • the size and shape of the clips can be changed with respect to the seedling or plant.
  • the wire can be fed continuously, with the bender 20 shaping the clip 22 and the cutter 24 releasing the clip 22, and therefore it is not necessary to use pre-made clips in a cartridge.
  • the bender 20 is tunable/adjustable, and a tuning screw 80 on the bender 20 allows tuning of the bender 20 to change an impact or strike point of the feeding wire 21 against the strike surface 81 of the bender 20 and therefore change the force applied to the wire - changing the applied force allows for changing the shape and diameter of the clip 22 as well as the number of overlaps.
  • the CD also includes a clamp comprising first and second opposing claw-shaped arms (30, 32) that have a specialized shape.
  • the claw-shaped arms are useful to bring the stem and wooden stake close to each other prior to the clipping.
  • a second actuator such as an electric motor (Servo Motor 2) 34 or other types of actuators such as a pneumatic actuator is used to close (and open) the claw-shaped arms.
  • Fig. 1A shows the claw-shaped arms in an open position
  • Fig. 1B shows a closed position.
  • the specific shapes of the claw-shaped arms bring the stem and wooden stake closer as it is closed by the second actuator.
  • the same actuator also brings the head or clip-forming portion of the clipping device to the appropriate position near the stem and wooden stake while closing the claw-shaped arms, i.e., the second actuator translates frame 28 and its connected components relatively closer to the claw-shaped arms concurrently during closing, while translating frame 28 and its connected components relatively farther away from the claw-shaped arms during opening.
  • the claw shaped arms (30, 32) can be considered more generally as an example of a clamp, and the claw shaped arms (30, 32) are first and second opposing jaws (30, 32) of the clamp.
  • the first and second opposing jaws are rotationally mounted to a clamp frame 40 of the device at first and second clamp rotation points (42, 44), respectively, the clamp aligned with the exit of the wire guide 18.
  • the second actuator 34 is pivotably coupled to both of the first and second opposing jaws (30, 32) at a common third clamp rotation point 46, the common third clamp rotation point located approximately equidistant from the first and second clamp rotation points, the second actuator 34 driving counter-rotation of first and second opposing jaws (30, 32) to circumferentially reduce an open space between the first and second opposing jaws (30, 32). More specifically, counter-rotation means that the first and second opposing jaws rotate in opposing directions such that one of the jaws rotates clockwise while the other jaw rotates counter clockwise.
  • the second actuator 34 is coupled to the common third clamp rotation point 46 by a rack-and-pinion transmission.
  • the second actuator 34 is directly connected to a pinion gear 48, and pinion gear 48 engages rack 50 which is attached in a fixed position in clamp frame 40.
  • Rotational motion of pinion gear 48 along rack 50 translates a linear guide holder 52 relative to rack 50, and consequently also translates linear guide holder 52 relative to clamp frame 40 and also translates the common third clamp rotation point 46 relative to clamp frame 40.
  • a linear arm holder 54 extending from the linear guide holder 52 has a proximal end connected to the linear guide holder 52 and a distal end pivotally connected to the common third clamp rotation point 46. Linear sliding of the linear guide holder 52 is aided by bushings 56 slidably engaging linear tracks 58 that are orientated parallel to rack 50.
  • first and second slots (60, 62) formed within the opposing first and second jaws, respectively.
  • Each slot extends from a first end proximal to a capturing surface of the jaw (ie., the claw surface that captures the stem) to a second end distal from the capturing surface and proximal to the linear guide holder 52 and its linear arm holder 54.
  • the first and second slots cross over each other and the crossing point of the first and second slots provides for the coupling of the common third clamp rotation point.
  • each of the first and second slots are rotationally coupled to the clamp frame 40 at the first and second clamp rotation points (42, 44), while the crossing point of the slots is coupled to the distal end of the linear arm holder 54 forming the common third clamp rotation point 46.
  • linear guide holder 52 is attached to frame 28, and therefore as linear guide holder 52 translates relative to clamp frame 40, frame 28 and its attached components also move linearly relative to clamp frame 40 (for example, comparing Fig. 1A to Fig. 1B shows that as linear guide holder 52 moves linearly relative to clamp frame 40 and claw-shaped arms (30, 32), frame 28 moves with linear guide holder 52).
  • the materials used to feed the CD can be different depending on the need of the user.
  • the feeding wire can be made of metals such as steel, copper, etc., or plastic material such as polyethylene or polyamide. If the plastic material is used, a heater is typically used for pre-heating the plastic material prior to being fed through the wire guide 18.
  • the energy required to move the mechanisms of the CD can be from AC or DC sources as well as pneumatic or hydraulic actuators.
  • the CD can be used by farmers as a handheld device, or it can be installed on automatic machines or robots. If used as a hand-held device, a single button on the CD allows a farmer to use the CD manually to perform clipping.
  • the manual CD could be used without claw-shaped arms (30, 32).
  • a control board of the clipping device can connect to the automatic machine or robot to follow their commands.
  • the control board can support different kinds of communication protocols such as CAN, I2C, RS232, and RS485 to connect a variety of devices both wired and wireless.
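The patent names the supported buses (CAN, I2C, RS232, RS485) but not a message format, so the frame below — start byte, command byte, payload length, payload, XOR checksum — is a purely hypothetical example of how a host machine or robot could command the control board over a serial link.

```python
import struct

START = 0xAA                  # hypothetical start-of-frame marker
CMD_MAKE_CLIP = 0x01          # hypothetical "produce one clip" command code

def encode_frame(cmd, payload=b""):
    """Frame a command as: start byte, command, payload length, payload, XOR checksum."""
    body = struct.pack("BBB", START, cmd, len(payload)) + payload
    checksum = 0
    for byte in body:
        checksum ^= byte
    return body + bytes([checksum])

frame = encode_frame(CMD_MAKE_CLIP)
```

On a real bus each protocol would impose its own framing (e.g. CAN arbitration IDs); the point here is only that a simple, checksummed command frame suffices to trigger a clipping cycle remotely.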
  • Fig. 2 shows wire feeding components of the CD.
  • the main feeder 14 is connected to and driven by servo motor 1.
  • the main feeder 14 is shaped as a wheel with a central groove 74 defined on the perimeter or circumference of the wheel.
  • the central groove 74 provides a wire path in the circumferential center of the main feeder 14.
  • transverse grooves 76 i.e., transverse to the central groove
  • These transverse grooves 76 make small indentations on the wire 21 to push it forward by the force created because of these indentations.
  • the number of transverse grooves and the diameter of the main feeder determine the length of the clip 22. A greater number of transverse grooves results in more wire being pulled, and thereby a longer clip.
  • Fig. 3 shows wire curving components of the CD 10.
  • a hole/bore (bore 19 communicatively extending from input opening 19a to output opening 19b) inside the wire guide 18 along the direction the wire is being pulled.
  • This hole/bore facilitates the feed wire 21 following a direct linear path and reduces and preferably prevents any bending of the wire before it reaches the bender 20.
  • the bender 20 applies a force to the wire 21 to bend it. The amount of the applied force depends on the pulling force of the main feeder. The surface and the angle of the profile of the bender 20 affect the direction of the applied force to the wire.
  • the tuning screw 80 on the bender 20 tunes the direction of the applied force to the wire 21 by adjusting the relative position of the bender (more specifically, a bender strike surface 81 that provides an impact point with wire 21) with respect to the wire guide 18.
  • the amount of applied force and its direction determine the shape and diameter (i.e., the curvature of the clip) as well as the number of wire overlaps in the clip 22.
  • Different surface profiles of the bender can be used to make different clip shapes.
  • the relationship between the relative position of the bender to the wire guide and the diameter of the clip is given in Fig. 8.
  • the number of transverse grooves 76 and diameter of the main feeder 14 determine how fast the wire is pulled, thereby determining the length of the wire in each clip.
  • the relationship between the number of transverse grooves and diameter of the main feeder and the length of a clip is given in Fig. 7.
  • Fig. 4A shows wire cutter components of the CD.
  • Servo motor 1 rotates the main feeder 14 and cutter guide with cam 84 (referenced for brevity as cam 84) - the wheel of the main feeder 14 and the cam 84 are fixed together so as to move co-rotationally.
  • the cam 84 abuts and engages a cutter lever with cam 86 at a first end and cutter 24 mounted on a second end (referenced for brevity as lever cam 86) throughout its rotation.
  • the lever cam 86 is pivotally coupled to the wire guide 18, with a first end of the lever cam engaging the cam 84 and a second end of the lever cam forming the cutter 24 with the bender 20 mounted on top of the cutter 24.
  • Wire 21 is pulled by capture within the circumferential central groove 74 and associated transverse grooves 76 during rotation of the main feeder 14.
  • a resting gap 88 is formed as a flattened circumferential portion on the main feeder.
  • the main feeder rotates about 320 degrees. After that, due to the resting gap on the main feeder, the wire is not pulled by the main feeder.
  • the cam 84 is aligned with the resting gap 88. Therefore, synchronized with a cessation of pulling force due to the resting gap 88, the cam 84 pushes a lever cam 86 downward, causing the lever cam 86 to rotate, which causes the cutter 24 to cut the wire 21 and separate the clip 22 from the wire 21.
  • FIG. 4B shows a resting position and a cutting position of the lever cam 86 and its associated cutter 24 throughout rotation of cam 84.
  • cam 84 For a majority of rotation of cam 84 (for example, approximately 320 degrees) the lever cam 86 and cutter 24 are biased towards a resting position, and when the resting gap 88 faces the wire 21 and the feeder supporter 16, the cam 84 engages the lever cam 86 to pivot the lever cam 86 and cutter 24 into a cutting position.
  • the cam 84 and resting gap 88 are aligned, and the cam 84 and lever cam 86 are also shaped and configured to engage and then disengage within the resting gap 88 portion of the rotation of the main feeder 14 so that the lever cam 86 and its associated cutter 24 pivot to a cutting position and clear the cutting position to return to a resting position aligned with and synchronous with the rotational portion of the resting gap 88 temporarily ceasing pulling of wire 21.
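The feed/cut synchronization described above reduces to a simple function of feeder angle: for roughly 320 degrees of the revolution the feeder pulls wire, and within the remaining resting-gap arc the cam drives the cutter instead. The exact gap angle (here taken as the ~40 degrees implied by the stated ~320 degrees of feeding) is an assumption for illustration.

```python
FEED_ARC_DEG = 320.0   # assumed feeding arc; resting gap occupies the remainder

def cycle_state(feeder_angle_deg):
    """Return 'feeding' or 'cutting' for a main-feeder angle, wrapped to [0, 360)."""
    angle = feeder_angle_deg % 360.0
    return "feeding" if angle < FEED_ARC_DEG else "cutting"

# Wire is pulled for most of the cycle; the cutter fires only inside the gap.
states = [cycle_state(a) for a in (0, 160, 319, 330, 359)]
```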
  • Fig. 5 shows the collector mechanism/clamping components of the CD. From the resting/open position (Fig. 5A), the servo motor 2 pushes the linear arm holder 54 forward and closes the claw-shaped arms. When the linear arm holder 54 comes forward, the claw-shaped arms close and bring together the main stem and wooden stake in the center of the claw-shaped arms. The linear arm holder 54 continues to move forward and closes the claw-shaped arms completely (Figs. 5B and 5C). The arms tightly hold the stem and wooden stake and collect them in the center with minimal damage to the stem due to the special shape and profile of the claws (Fig. 5D).
  • first and second slots (60, 62) are shaped biphasic with first and second portions and symmetrically mirrored (first portions distal from the capturing surface of the claw-shaped arms and second portions proximal to the capturing surface of the claw-shaped arms), so that when the common third clamp rotation point 46 moves forward (in a direction distal to proximal to the capturing surface of the claw-shaped arms) along the corresponding first portion of the slots (60, 62), the claw-shaped arms converge or close by rotating towards each other; conversely, when the common third clamp rotation point 46 moves backward (in a direction proximal to distal from the capturing surface of the claw-shaped arms) along this same first portion of the slots (60, 62), the claw-shaped arms expand or open by rotating away from each other.
  • Fig. 6 shows the various components shown in Figs. 1-5 assembled without actuators and support structures for convenience of illustration of interaction of these components.
  • Fig. 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire to the feeder, and modifying length of the clip according to equation
  • L is the length of the Clip
  • n is the number of Transverse Grooves
  • a is the angle of the Resting Gap
  • k_1 is a material-dependent coefficient, a weighting factor included in the mathematical model to tune the output value.
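The equation itself did not survive extraction. A hypothetical reconstruction from the variable definitions above and the stated dependencies (length grows with the number of transverse grooves and the feeder diameter, and wire is pulled only outside the resting gap) might read:

```latex
% Hypothetical form -- the patent's own equation is not in the extracted text.
% D is the main-feeder diameter; the factor (360 - a)/360 is the fraction of a
% revolution during which the wire is actually pulled; k_1 absorbs material slip.
L = k_1 \, n \, \pi D \, \frac{360^\circ - a}{360^\circ}
```

This is a sketch consistent with the surrounding description, not the patent's actual formula.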
  • Fig. 8 shows the relationship between the relative location of the bender 20 and its strike surface 81 to the feed wire 21 and the diameter of the clip.
  • Fig. 9 shows two different profiles of the bender. Changing the location of surface (a) and changing the angle of surface (b). Changing the location of the surface and the angle of the bender changes the direction and amount of force on the wire and the shape of the clip.
  • Fig. 10 shows how changing the cam profile causes the different shapes of the clip.
  • the varying diameter of the cam causes rotation of the lever cam, which in turn changes the position of the bender.
  • Changing the position of the bender will change the direction of the applied force on the wire.
  • the clip has a helical shape.
  • Fig. 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide.
  • a heater element such as a heating coil, heating bar and the like can be used to heat wire passing through the bore of the wire guide to make the wire more bendable, moldable, malleable, flexible, and the like for ease of reshaping wire that exits from the bore and strikes a bender surface.
  • the variant wire curving components shown in Fig. 11 provides a clipping device equipped with a plastic wire roll with a heating coil provided to heat the plastic wire.
  • the clipping device may be equipped with various materials for making a clip, for example copper, stainless steel, polyethylene, polyamide, polyesters, or different kinds of plastic wires.
  • a heating coil can be coupled to the wire guide to heat the plastic material as it passes through the bore of the wire guide, making the plastic material more bendable, moldable, malleable, flexible, etc., so as to ease reshaping of the plastic wire exiting the bore of the wire guide by the wire curving components to form a plastic clip.
  • Fig. 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Fig. 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Fig. 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
  • Fig. 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
  • Fig. 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
  • Experimental testing results demonstrate the ability of the currently disclosed device, system and method to benefit plant processing in autonomous, semi- autonomous and manual modes.
  • the following experimental examples are for illustration purposes only and are not intended to be a limiting description.
  • Clipping is the task of putting a rubber band or plastic clip around the seedling’s main stem and a wooden stake at a particular point along the stem to provide additional support to the seedling and avoid damage during transportation.
  • the clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to process a large number of seedlings in a propagation arena. As an example, in one propagation facility (e.g., Roelands Plant Farms, Lambton Shores, ON., Canada), more than 25 million seedlings grow and are clipped per year.
  • the robotic clipping solution contains two main parts: a mechatronic unit that performs the act of clipping and a vision unit that identifies the clipping points.
  • the focus of this paper is on the vision unit, which replicates human visual functionalities and perception to identify a suitable clipping point along the seedling’s main stem for different types of vegetables including peppers, tomatoes, and cucumbers.
  • Machine vision has been widely used to support precision agriculture by providing automated solutions for tasks that are traditionally performed manually.
  • Examples of such methods are optimized image registration and deep learning segmentation approach {11/kerkech2020vine}, adaptive multi-vision technology {12/chen2020three}, and image fusion technology {13/li2021recent}.
  • CNN convolutional neural network
  • transfer learning {14/bai2022multi} and point cloud using deep learning CNN {15/jayakumari2021object} need hundreds of labeled images to train the network for each type of seedling and have long processing times {16/kolar2018transfer}.
  • the analytical part of the multi-stage point density method is based on the point density variation, kernel density estimation, the principal orientation of the histogram gradient, and normalized cross-correlation for matching.
  • the point density variation calculates the disparity of intensity of colors on a map.
  • the kernel density estimator estimates the population of the finite data sample by smoothing the fundamental data. Using the principal orientation of the histogram gradient, and normalized cross-correlation for matching, the multi-stage point density method suggests a suitable clipping point.
  • the algorithm checks the accessibility of the wooden stake and stem for the robotic arm and clipping device and maps the stereo camera coordinate system to the robot coordinate system to provide necessary sensory feedback for the controller of the robot.
  • a mechatronic unit that includes the clipping device on a general purpose robotic arm (i.e., KUKA LWR IV).
  • Our novel clipping device curls a thin wire to simultaneously make and attach the clip to the plant.
  • An optimized stereo camera (Megapixel/USB5000W02M) has been placed on the clipping device to take images from the plants and send them to the vision algorithm.
  • Fig. 16 shows the installed clipping device and the stereo camera used for evaluation purposes.
  • An automated clipping system may include a plurality of specialized robotic arms equipped with such devices.
  • Suitable Clipping Point Finding a suitable clipping point on the seedlings is the most imperative, challenging, and time-consuming task of the machine vision of the robotic clipping system.
  • the clipping point can be on the highest point, higher than the uppermost node on the main stem. If the length of the main stem is short between two nodes, the clipping point is selected below the highest node or axial. However, the leaves are dense around the highest node, and some parts of the main stem are behind the leaves. Thus recognizing the main stem and petiole is difficult. The different shapes and types of seedlings make the recognition process even harder.
  • the selection of the clipping point is a cognitive process that relies on heuristic information.
  • Fig. 17 shows some seedlings and the preferred clipping points that expert farmers validated.
  • an ANN (artificial neural network) is used to predict the optimized cut-off values for the locally adaptive threshold for each pixel, based on the entropy and variance around the pixel.
  • the input features of the ANN are the variance and entropy of the sub-image around the pixel, and the output is the sub-range of the L, A, and B channels' cut-off values for the multilevel threshold.
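As an illustration of the mapping described above (variance and entropy in, per-channel cut-off values out), the forward pass of such a network could be sketched as follows. The layer sizes, activation, and random weights here are hypothetical placeholders; the patent does not disclose the trained network.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def ann_cutoff(features, W1, b1, W2, b2):
    # Forward pass of a small MLP: (variance, entropy) of a sub-image
    # -> three cut-off values, one per L, A, B channel.
    h = relu(W1 @ features + b1)
    return W2 @ h + b2

# Hypothetical, untrained weights for demonstration only.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), np.zeros(8)
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)
out = ann_cutoff(np.array([0.4, 5.2]), W1, b1, W2, b2)  # three cut-off values
```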
  • the variance is a measure of variability and provides an indication of how the pixel values are spread.
  • μ is the mean value, which for each sub-image around the pixel can be obtained as μ = (1/N) Σ_{i=0}^{L−1} i·n_i
  • h(i) = n_i / N
  • n_i is the number of pixels with a gray level of i
  • N represents the total number of the pixels in the sub-image
  • L is the maximum grey level.
  • the entropy measures the average uncertainty of the information source, defined as the corresponding states of the intensity level to which individual pixels can adapt. The higher the value of the entropy is, the more detailed the image is ⁇ 17/deng2009entropy ⁇ .
  • the entropy is defined as E = −Σ_{i=0}^{L−1} h(i)·log_2 h(i), where all parameters are as defined before.
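The two pixel-level features can be computed directly from the definitions above. The following is an illustrative NumPy sketch assuming an 8-bit grayscale sub-image; it is not code from the patent.

```python
import numpy as np

def subimage_features(patch, levels=256):
    """Variance and Shannon entropy of a grayscale sub-image patch:
    h(i) = n_i / N, E = -sum h(i) * log2 h(i) over occupied levels."""
    patch = np.asarray(patch, dtype=np.float64)
    variance = patch.var()
    hist, _ = np.histogram(patch, bins=levels, range=(0, levels))
    p = hist / patch.size          # normalized histogram h(i)
    p = p[p > 0]                   # drop empty levels (0*log 0 := 0)
    entropy = -np.sum(p * np.log2(p))
    return variance, entropy

# A uniform patch carries no variability and no information.
v, e = subimage_features(np.full((9, 9), 128))
```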
  • Fig. 19 shows the block diagram of the stem recognition algorithm. After camera calibration using Zhang's method ⁇ 20/zhang2000flexible ⁇ , the quality of the images was enhanced and restored using the pre-processing methods that were a combination of equalization techniques, high-boost filters, and morphological boundary enhancement ⁇ 21/thapar2012study ⁇ .
  • the morphological filtering techniques were used to remove noise from the segmented stem {22/ruchay2017impulsive}. Using the hit-and-miss, thinning, and convex-hull techniques, the leaves were then eliminated.
  • Wooden Stake Recognition Recognizing the wooden stake inserted beside a seedling is more straightforward.
  • the hybridization of the Otsu method and median filter ⁇ 23/pacifico2018hybrid ⁇ can be used for stake recognition.
  • Fig. 20 shows the schematic steps of the stake segmentation method. The wooden stake is almost vertically straight. Thus, hidden and covered parts of the stake can be found using simple partial spline matching.
  • the multi-stage point density method uses the boundary and skeleton of the seedling to find the region of interest and limit the search area.
  • the borders of the region of interest are computed using Equations (3), (4), and (5) below,
  • S_i and S_j are the mean values of the skeleton in the x and y directions for all non-null pixels
  • P_r, P_l, and P_t are the right, left, and top values of the seedling boundaries for non-null values
  • S(i) and S(j) are the values of the skeleton in pixel (i,j)
  • φ_r, φ_l, and φ_t are the number of null values for the right, left, and top of the boundary of the plant, respectively.
  • Point Density Variation shows the disparity of intensity of colors on a map ⁇ 24/lawin2018density ⁇ .
  • a Gaussian mixture model represents a distribution for each color channel i as P_i(x) = Σ_{k=1}^{K} π_{i_k} N(x | μ_{i_k}, Σ_{i_k}), where π_{i_k} are the mixing coefficients that meet the condition Σ_{k=1}^{K} π_{i_k} = 1,
  • the density P_i(x) is the Gaussian distribution of intensities in each color channel
  • N(x | μ_{i_k}, Σ_{i_k}) is the Gaussian density with the mean value μ_{i_k} and the variance Σ_{i_k}.
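To make the mixture density concrete, the sketch below evaluates a one-dimensional Gaussian mixture P_i(x) for a single color channel. The component parameters are hypothetical; only the functional form (weighted sum of Gaussians with coefficients summing to one) is taken from the text.

```python
import numpy as np

def gaussian(x, mu, var):
    # Univariate Gaussian density N(x | mu, var).
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gmm_density(x, weights, means, variances):
    """Mixture density P_i(x) = sum_k pi_k * N(x | mu_k, var_k)."""
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "mixing coefficients must sum to 1"
    return sum(w * gaussian(x, m, v)
               for w, m, v in zip(weights, means, variances))

# Two hypothetical intensity modes in one channel.
p = gmm_density(0.0, [0.3, 0.7], [0.0, 2.0], [1.0, 1.0])
```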
  • Kernel Density Estimator In most computer vision and pattern recognition applications, the feature space is complex, noisy, and rarely can be described by the common parametric models, and non-parametric density estimation techniques have been widely used to analyze arbitrarily structured feature spaces ⁇ 26/yang2003improved ⁇ .
  • the kernel density estimator as a non-parametric density estimation technique calculates the density of features in a neighborhood around those features and the density function is estimated by a sum of kernel functions (typically Gaussians) centered at the data points ⁇ 27/elgammal2002background ⁇ .
  • a bandwidth associated with the kernel function is chosen to control the smoothness of the estimated densities and more data points allow a narrower bandwidth and a better density estimate, and the kernel density estimator spreads the known quantity of the population for each point out from the point location of random non-parametric variables ⁇ 28/matioli2018new ⁇ .
  • point density variation estimates the density of stem intensity
  • kernel density estimation is a fundamental data smoothing technique, as shown in Fig. 21, that makes inferences about the population of a finite data sample {29/scaldelai2022multiclusterkde}.
  • the kernel density estimator is defined as f̂_h(x) = (1/(n·h)) Σ_{i=1}^{n} K((x − x_i)/h), where K is the kernel, h > 0 is a smoothing parameter called the bandwidth, and K_h(x) = (1/h)·K(x/h) is the scaled kernel.
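This estimator can be implemented directly from the definition. A minimal sketch with a Gaussian kernel (the kernel choice and bandwidth here are illustrative, not the values used in the study):

```python
import numpy as np

def kde(x, samples, h):
    """Kernel density estimate f(x) = (1/(n*h)) * sum K((x - x_i)/h)
    with a standard Gaussian kernel K."""
    samples = np.asarray(samples, dtype=float)
    u = (x - samples) / h
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)  # Gaussian kernel
    return k.sum() / (samples.size * h)

# Density at the center of three symmetric sample points.
density = kde(0.0, [-1.0, 0.0, 1.0], h=0.5)
```

A smaller bandwidth h makes the estimate spikier around the samples; a larger h smooths it out, which is the trade-off the text describes.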
  • the Principal Orientation of the Histogram Gradient represents a signature of the features of spatial regions ⁇ 31/lauria2018nonparametric ⁇ , ⁇ 32/wiangsamut2022fast ⁇ .
  • the region of interest was divided into sub-images (voxels) with a size of about 1.5 times the stem's average thickness. The boundary of the stem was used to calculate the stem's average thickness {33/wang2020fruit}. This value can also be assigned by the user.
  • the normalized correlation metric was used to match the histograms, where H_1 and H_2 are the histograms of the candidate voxel and ground voxels with the same normalized principal orientations of the histogram gradient.
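The exact normalized correlation formula is not reproduced in this excerpt; a common form, assumed here for illustration, is the cosine-style normalized cross-correlation of the two histogram vectors:

```python
import numpy as np

def normalized_correlation(h1, h2):
    """Assumed normalized cross-correlation of two orientation
    histograms: dot(H1, H2) / sqrt(|H1|^2 * |H2|^2)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return float(h1 @ h2 / np.sqrt((h1 @ h1) * (h2 @ h2)))

# Proportional histograms match perfectly; orthogonal ones do not.
score = normalized_correlation([1, 2, 3], [2, 4, 6])
```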
  • Fig. 23A summarizes the step-by-step of the proposed algorithm for bell pepper seedlings.
  • the image is transferred to the LAB color space, and the adaptive color image segmentation and hybridization of the Otsu method and median filter are applied to recognize the stem and stake.
  • the boundary and skeleton of the plant using morphological image processing operations are obtained next.
  • the region of interest is defined using the boundary and skeleton.
  • a combination of recursive dilation and erosion and morphological techniques such as Hit and Miss, convex hull, and thinning are used to eliminate the leaves from the plant to find the stem.
  • the third row contains the results of applying point density variation (Fig. 23B shows a magnification of the point density variation plot) and kernel density estimator for finding the most suitable clipping point on the stem.
  • the stake is checked to determine whether there is a corresponding point on the stake. If accessible, the multi-stage point density method suggests the point and calculates the distance and depth using the images from the stereo camera ⁇ 34/dandil2019computer ⁇ . Finally, the multi-stage point density algorithm calculates the most suitable orientation and position of the clipping device in the real coordinates of the robotic arm.
  • MSE mean square error
  • After adaptive segmentation, we applied the multi-stage point density method on three types of seedlings to find the correct position of the clipping points and evaluate the quality of the results.
  • the first category contained 120 images of bell pepper seedlings
  • the second category had 80 images of tomatoes
  • the third category contained 80 images of cucumbers.
  • Table 2 shows the success rate of finding the suitable clipping point for each type of seedling. The leaves of cucumbers are big, and access to the stem at the top of the plant is difficult. Thus, the success rate for cucumbers is lower than for the other seedlings.
  • Fig. 26 shows the recognized clipping points on the seedlings using the multi-stage point density method.
  • Fig. 28 shows examples of seedlings with clips on them.
  • Experimental Example 1 proposes a new approach for finding the most suitable clipping point for a new robotic clipping system under development.
  • the proposed approach is conceptually different from other feature detection methods in that it combines analytical image processing methods and data-driven learning algorithms. This allows us to solve the challenging problem of clipping point detection.
  • the success of our adaptive segmentation approach was in part due to the use of the variance and entropy of voxels as two effective features for tuning the local cut-off values.
  • the final results of the algorithm i.e., the identified clipping points, were verified by expert farmers to validate the efficacy of the algorithm. As a whole, the obtained results indicated satisfactory performance in finding the most suitable clipping point.
  • ACD automatic stem-stake coupling device
  • HCD stem-stake coupling device
  • ACD and HCD utilize interconnected mechanisms to create clips of various sizes and shapes from metallic wire. These mechanisms include Pushing Mechanism, Curving Mechanism, Cutter Mechanism, and, for the ACD specifically, Collector Mechanism. Both devices operate on the principle of feeding a thin wire from the Feeding Wire Spool.
  • the wire is pulled by the Main Feeder and is pressed against the Feeder Supporter. As the wire is pulled it moves through the Wire Guider while a Bender shapes it into a ring-shaped clip. At the end of this cycle, the Cutter cuts the wire when a full clip is formed.
  • a first actuator such as an electric motor (Servo Motor 1) turns the Main Feeder and drives the Cutter as well.
  • the clipping mechanism incorporates an Optical Sensor that sends a pulse to indicate the completion of each cycle. By positioning the clipping device near the stem of the plant, the wire wraps around the stem and wooden stake. The size and shape of the clips can be adjusted based on the seedling or plant using a Tuning Screw to adjust the Bender.
  • the ACD is equipped with a stereo camera and claw-shaped arms.
  • a second actuator such as an electric motor (Servo Motor 2), rotates the claw-shaped arms, bringing the head of the clipping device closer to the stem and wooden stake and closing the arms.
  • the ACD is integrated into a robotic system.
  • a vision system utilizes stereo images to provide real-time information about the optimal orientation and position of the stem-stake coupling point, as well as the 3D spatial coordinates of the ACD, which are then transmitted to the robotic arm.
  • the impedance control method is employed to regulate the speed and torque of the servo motors based on the desired shape and size of the clips.
  • Various materials can be used to produce the clips, including metals like steel and copper wire, as well as plastic materials such as polyethylene or polyamide wire.
  • copper wire is often selected based on growers’ preferences.
  • a heater is required to preheat the material before feeding it through the Wire Guider.
  • the pushing mechanism’s role is to exert force and propel the wire forward at a specified velocity.
  • the Main Feeder is connected to the Main Servo Motor (Servo Motor 1).
  • the Main Feeder has a Central Groove around its perimeter, which ensures the wire stays centered as it moves. Additionally, there are Transverse Grooves perpendicular to the Central Groove on the Main Feeder’s perimeter. These Transverse Grooves create small indentations on the wire, propelling it forward as the Main Feeder rotates.
  • the length of the clip depends on the diameter of the Main Feeder, the number of Transverse Grooves, and the arc length of the Resting Gap.
  • This empirical model states that the relationship between the mentioned parameters and the length of the clip may be expressed as Equation 1, discussed above with reference to Fig. 7.
  • Curving Mechanism bends the wire into the desired shape. As the wire moves forward, it passes through a Wire Guider and then encounters a Bender. There is a hole/bore/channel inside the Wire Guider along the direction the wire is being pushed. This hole ensures that the wire follows a straight path and prevents any bending before it reaches the Bender. The Bender applies a normal force to the Wire to bend it. The normal force applied on the wire is proportional to the pulling force of the Main feeder.
  • This force determines the shape and diameter (i.e., the curvature) of the clip, as well as the number of wire overlaps.
  • the surface and the profile angle of the Bender affect the direction of the applied normal force to the wire.
  • the Tuning Screw installed on the Bender allows tuning the position of the Bender relative to the Wire Guider, for adjusting the direction of the applied force.
  • the surface profile of the Bender produces different clip shapes.
  • Fig. 9 shows two examples of different mechanisms with different profiles that result in producing clips with different diameters.
  • a first illustrative mechanism involves the Rotational Mechanism with a flat surface of the Bender.
  • Adjusting the Bender’s angle changes the orientation of its flat surface relative to the wire, thereby altering the force exerted on the wire and resulting in a different radius of the clip. For instance, the clockwise rotation of the Bender increases the force on the wire, resulting in smaller clip radii, and vice versa.
  • Fig. 9A shows the Positional Mechanism in which the angle of the Bender does not change, instead the position of the Bender changes relative to the Wire Guider. In this mechanism, the surface of the Bender is curved. Thus, a vertical movement of the Bender alters the force applied to the wire, resulting in a change in the clip’s radius.
  • the relationship between the clip’s radius and the Bender’s position is illustrated in Fig. 8.
  • Additionally, Fig. 10 demonstrates how adjusting the Cam profile produces various clip shapes.
  • the Cam has different radii at different points. So, when the Cam pushes the Lever Cam, the rotational angle of the Lever Cam varies. The rotation of the Lever Cam changes the position of the Bender, thereby altering the amount of normal force applied to the Wire. As a result, the clip shape will be spiral.
  • the benefit of the spiral clip lies in its ability to securely grasp the stem and stake, even when they are not closely positioned, particularly when the clip’s initial radius is relatively larger. In the final step of making the clip, the smaller radius of the clip tightens the stem and stake together.
  • Both the ACD and HCD utilize a single servo motor to push, curve, and cut the wire through an integrated mechanism.
  • the servo motor rotates the Main Feeder and a Cam at the end of each cycle.
  • the Main Feeder rotates about 5.8 rad while pushing the wire forward and forming a clip.
  • the Main Feeder incorporates a Resting Gap to halt the wire feed before cutting. This allows the rotation of the Main Feeder to engage the Cutter via a Camshaft and Cam Lever, severing the wire.
  • the ACD is equipped with a unique collector mechanism that uses claw-shaped arms.
  • Fig. 5A-5E illustrate the collector mechanism and multiple steps of closing the claw-shape arms and repositioning of the ACD’s head.
  • a second servo motor, Servo Motor 2 causes the claw-shaped arms to close.
  • the Arm Holder continues to move forward to fully close the claw-shaped arms, securely holding the stem and stake in the center of the arms with minimal damage, due to the slots and shape of the claw-shaped arms. By closing these arms, the stem and stake are held together in the center of the arms. Additionally, the collector moves the ACD’s head closer to the stem and stake without further closing the claw-shaped arms before creating the clip, as shown in Fig.
  • MOTOR CONTROL The power for pushing, curving, and cutting the wire is produced by the Main Servo Motor in both HCD and ACD.
  • the load on the servo motor may vary dynamically at each step.
  • the rotational speed of the servo motor also influences both the shape and the quality of the clip. Therefore, maintaining a consistent force at a specific angular velocity in the presence of dynamic external torques benefits accurate bending of the wire.
  • torque control is significant to achieve optimal performance and prevent stalling or overloading. Torque control allows the motor to adjust its output torque to compensate for changes in the load, ensuring repeatable motion control with consistent performance.
  • the control objective of an impedance controller in our system is to impose a desired dynamic relationship between the servo motor’s position and the force of interaction with the wire.
  • Impedance is defined as the ratio of the force to the position.
  • impedance control enables the motor to behave as a mass-spring-damper system, commanding the desired position in response to the interaction force of the servo motor and external factors.
  • In response to an external force f_e, the impedance controller generates a modified position δθ as follows: δθ(s) = f_e(s) / (m_d·s² + b_d·s + k_d), where s represents the Laplace Transform variable and all other parameters are as defined previously.
  • the impedance controller remains stable as long as m_d, b_d and k_d are positive values.
  • the intrinsic inertia of the Main Feeder, arising from its mass, is taken as the desired inertia m_d, simplifying the choice of desired impedance to the selection of b_d and k_d.
  • Fig. 29 depicts the block diagram of the proposed impedance controller.
  • the outer loop naturally closes when the servo motor encounters external torques.
  • Using the estimated torque feedback, the impedance function generates δθ and commands the desired angle θ_d.
  • the inner loop consists of a PID controller to track the desired trajectory for achieving suitable movement of the Main Feeder and, consequently, other parts such as the Bender and Cutter.
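The outer impedance relation m_d·δθ̈ + b_d·δθ̇ + k_d·δθ = f_e can be sketched as a simple discrete-time simulation; the inner PID tracking loop is omitted for brevity, and the gains and time step below are illustrative values, not those used in the device.

```python
def impedance_delta(f_e, m_d, b_d, k_d, dt, steps):
    """Position correction delta_theta produced by the desired impedance
    m_d*dd(dtheta) + b_d*d(dtheta) + k_d*dtheta = f_e under a constant
    external force f_e (semi-implicit Euler integration)."""
    dtheta, vel = 0.0, 0.0
    for _ in range(steps):
        acc = (f_e - b_d * vel - k_d * dtheta) / m_d
        vel += acc * dt
        dtheta += vel * dt
    return dtheta

# With a constant force the correction settles near f_e / k_d,
# i.e. the spring term dominates at steady state.
steady = impedance_delta(f_e=1.0, m_d=0.1, b_d=2.0, k_d=5.0, dt=0.001, steps=20000)
```

Stability of this virtual mass-spring-damper requires m_d, b_d, and k_d to be positive, matching the stability condition stated above.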
  • Both ACD and HCD are capable of producing clips in various shapes with the required diameters, including simple and spiral forms.
  • Fig. 7 illustrates the correlation between clip length and the number of Transverse Grooves. Adjusting the bender allows for changes in the diameter and shape of the clip.
  • Fig. 30 depicts the trajectory and torque profile of the Main Servo Motor across both the HCD and ACD platforms during a complete cycle of clip production.
  • the ACD and HCD produce a clip in approximately 0.95 seconds.
  • the impedance controller regulates the Servo Motor to exert accurate force on the wire, maintaining a specific trajectory with the desired angular velocity during clip production.
  • the applied pushing force on the wire initiates from 0.08 seconds into the process and lasts until 0.72 seconds.
  • Wire cutting commences at 0.8 seconds and concludes at 0.92 seconds. This visualization highlights how the introduced impedance controller influences motor performance, affecting the precision and consistency of clip formation.
  • RSCS Robotic stem-stake coupling System
  • the robotic arm rotates around the seedling to locate an appropriate clipping point, recognizes this point, and positions the ACD correctly, a process that takes time.
  • the average duration of these steps is detailed in Table 4 for four primary seedlings: Beit Alpha cucumber, chili pepper, bell pepper, and tomato.
  • Table 4 also presents the acceptance rates of clip quality for each type of seedling. According to Table 4, two robotic arms equipped with ACD can achieve the speed of one grower.
  • the selected point should be the highest point, above the uppermost node on the main stem.
  • Pepper seedlings typically have a straight stem with sufficient spacing between nodes, resulting in a higher acceptance rate for bell peppers compared to other seedlings.
  • Tomato seedlings on the other hand, have a relatively short main stem between nodes. Additionally, cucumber leaves tend to be denser around the highest node, which can occasionally lead to issues during clipping as some parts of the leaves may get caught between the two claw-shaped arms, slightly reducing the clip quality.
  • Growers utilizing the HCD in propagation facilities and greenhouses can achieve approximately 86% greater efficiency compared to those employing plastic clips for the stem-stake coupling task.
  • HCD not only facilitates more efficient operations but also allows growers to allocate their time to other critical tasks.
  • HCD offers cost-saving benefits, further enhancing its value in greenhouse and propagation facility management.
  • the HCD simplified the clipping task, allowing farmers to work longer with a lesser risk of back injuries due to chronic bad posture.
  • automatic clipping in large propagation facilities using ACD accelerates the process for millions of seedlings and plants with fewer growers.
  • the developed robotic stem-stake coupling System featuring three gantries and 24 robotic arms equipped with ACDs, has the capacity to couple the stems of 12,000 bell pepper seedlings to stakes within an hour.
  • HCD and ACD utilize sustainable, eco-friendly materials, offering a viable alternative to traditional plastic clips.
  • recyclable or biodegradable materials By using recyclable or biodegradable materials, these systems reduce environmental impact and support broader efforts to minimize waste and enhance sustainability in agricultural and horticultural practices.
  • seedlings are cultivated using different methods.
  • One style of cultivation is on the concrete floor, also known as the folding floor.
  • Another style is cultivation on the tray system.
  • both styles are used, where seedlings are transferred from the concrete floor to the tray system semiautomatically using specialized equipment.
  • To prepare the seedlings for clipping before transportation, other machinery in greenhouses is used to rearrange the seedlings by automatically altering the spacing between them for easier access.
  • the tray system carrying the seedlings is passed in front of human workers, who affix a plastic clip around the seedlings and the wooden stake.
  • a solution benefits by meshing with existing technology used in propagation facilities, allowing for smooth integration with other automated machinery and devices while reducing disruption or cost increases. Therefore, the most recommended robotic solutions are those that can be installed directly where growers perform clippings.
  • Multiple methods may be employed to access seedlings for clipping tasks using a robotic system.
  • Options include employing a mobile robot, utilizing a gantry to maneuver a robotic arm around seedlings, or employing a fixed robotic arm with a carrier transporting seedlings within its workspace.
  • Each strategy offers advantages in terms of efficiency, adaptability, and precision in seedling handling.
  • One strategy is a concept involving compact robotic arms with a restricted operational range. A mobile gantry system moves the arm towards the seedlings, while the gantry glides along the rails of the tray system. This setup facilitates the development of small, simple, and lightweight robotic arms.
  • An alternative approach is to mount the robotic arm to a fixed point on the ground in front of the tray stream.
  • the tray system is passed in front of the robotic arm, the arm can access and clip the seedlings similar to human workers.
  • This approach simplifies operations and setup time for vision system operation. However, it relies on the moving tray system and cannot be easily adapted for clipping seedlings on the concrete floor.
  • Another approach is a mobile robot carrying the gantry and a robotic arm.
  • This configuration eliminates reliance on moving trays and can be used for tray systems of various sizes as well as the concrete floor to provide access to seedlings.
  • the robotic arm can operate autonomously among seedlings to perform the clipping task.
  • the disadvantage of this solution is the difficulty in managing multiple such robots. Additionally, the robotic arm’s stability may be compromised, potentially causing issues due to vibrations.
  • Another concept involves a four-wheel mobile system carrying a robotic arm near the seedlings. This method offers advantages in flexibility and adaptation to both cultivation styles.
  • each robotic arm can be strategically positioned around the seedlings as shown in Fig. 4.
  • Each robotic arm is equipped with a camera to determine the optimal stem-stake coupling point and assess the seedling from various angles. If a suitable point is identified, the clipping action is performed. If not, the other robotic arm attempts the task.
  • the gantry-style solution is more appropriate for accommodating a multi-arm solution.
  • the limited space around seedlings on the tray system may impede access to seedlings.
• the spacing between seedlings on trays can be adjusted to enhance accessibility for the robotic arms. This re-spacing is a common practice in manual stem-stake coupling as well.
• Robotic System Framework After evaluating various possibilities, it appears that adding a gantry system with multiple robotic arms onto the existing tray system is the optimal choice for the propagation facilities. This approach eliminates the need for modifying rails or trays, or rearranging seedlings, reducing the additional workload. While the approach is shown for the tray system, it can be adapted to the concrete-floor style. The number of gantries and robotic arms can be modified depending on the requirements of the facility and the volume of seedlings, owing to the object-oriented design of the robotic stem-stake coupling system.
• Fig. 31B illustrates a schematic representation of the robotic stem-stake coupling system, featuring two gantries and nine robotic arms equipped with automatic clipping devices.
  • the robotic stem-stake coupling system has six major components, including a Master Control Panel (MCP), Gantry, Robotic Arm, Automatic Clipping Device (ACD), Robotic Control Unit (RCU), and machine vision System.
  • Fig. 32 shows the interconnection of these components.
  • MCP Master Control Panel
  • the Master Control Panel is a centralized interface enabling monitoring and control of key components within the robotic system. It acts as a central hub for coordinating operations, offering users access to essential controls, data, and functionalities. Access to the MCP is available through four distinct interfaces.
  • the Graphical User Interface is a visual interface that allows users to interact with the entire system through graphical icons and visual indicators.
• Fig. 33 shows different modules of the GUI, and brief information on the numerically labelled GUI modules is as follows:
  • Each joint can move independently at varying speeds or synchronously at set speeds.
  • Motor Joystick The user can adjust the ACD’s position and orientation using buttons.
  • Sending Code Users can send commands to control motors and monitor sensors.
  • Clipping Device Indicating ACD status; users manage ACD at various control levels.
  • Motor/Sensors Monitoring motors and sensors.
  • Machine Vision Users can view stereo images and ML results, choose image processing methods, and direct the robotic arm to automate tasks with specified priorities, speeds, directions, and autonomy levels.
• Main Buttons Users select a port number, connect to the robotic arm, and access functions like emergency stop, reset, or exit the GUI.
  • Program/Code Users can input or modify code for compilation, execute it line by line (forward or backward), pause/resume execution, or halt at each step.
  • Control Tray Interface with third-party devices.
  • Compiler Panel Displaying the code and allowing users to track its compilation progress.
  • TCP/IP Connection Enabling remote robotic control via an internet connection, accessible from devices like Android.
  • the smartPAD serves as the interface for overseeing and managing the robotic system, equipped with touch-screen functionality and connectivity options such as wired or remote connections. This enables improved functionality and mobility of the robotic system. It assists the user in maneuvering around the trays and overseeing and managing the robotic system.
  • the TCP/IP offers the possibility of connecting the robotic system to the internet and controlling it remotely.
  • This setup permits remote communication between the robotic system and the GUI, allowing farmers to send commands and receive feedback from anywhere with internet access.
  • farmers can observe the real-time status and performance of the robotic system via remote interfaces, which involves monitoring sensor data, tracking the robot’s location, and overseeing task progress.
  • users can remotely control the robot’s movements and actions and can issue high-level commands to the robotic system, enabling it to autonomously execute predefined tasks.
  • Remote access to robotic systems via TCP/IP facilitates maintenance tasks and diagnostics.
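• The line-oriented TCP/IP command channel described above can be illustrated with a minimal sketch. The command names ("STATUS", "MOVE") and the single-command server are hypothetical placeholders for illustration, not the system's actual protocol:

```python
import socket
import threading

# Hypothetical command set; the real system's commands are not specified here.
HANDLERS = {
    "STATUS": lambda arg: "OK idle",
    "MOVE": lambda arg: f"OK moving to {arg}",
}

def handle_client(conn):
    """Read one newline-terminated command, dispatch it, send one reply."""
    with conn:
        line = conn.makefile("r").readline().strip()
        cmd, _, arg = line.partition(" ")
        reply = HANDLERS.get(cmd, lambda a: "ERR unknown command")(arg)
        conn.sendall((reply + "\n").encode())

def serve_once(host="127.0.0.1", port=0):
    """Serve a single command in a background thread; return the bound port."""
    srv = socket.socket()
    srv.bind((host, port))
    srv.listen(1)
    bound_port = srv.getsockname()[1]
    def run():
        conn, _ = srv.accept()
        handle_client(conn)
        srv.close()
    threading.Thread(target=run, daemon=True).start()
    return bound_port

def send_command(port, command):
    """Client side: send one command line and return the reply line."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall((command + "\n").encode())
        return c.makefile("r").readline().strip()
```

A newline-terminated text protocol of this kind is easy to drive from the GUI, the smartPAD, or any remote device with internet access.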
• the developed Compiler allows farmers to write high-level commands abstractly and intuitively, without needing to modify the low-level details of the robotic system’s hardware or communication protocols. Utilizing the compiler simplifies programming the robotic system and enhances accessibility for users of different technical skill levels. Users can remotely modify or enhance the functionality of the robotic system, while the system itself, equipped with high-level intelligence, can execute advanced commands in diverse conditions.
  • Gantry refers to the large and rigid framework that supports and guides the movement of the robotic arms, linear guides, rails, lights, cable carrier chains, and other tools.
  • the Gantry provides a stable and secure structure for mounting robotic components. It ensures that the components are properly aligned and supported during operation, minimizing vibrations and inaccuracies in movement. It is structurally robust to withstand the stresses associated with the dynamic movements and the weight of the robotic components as well as any payloads they are carrying.
  • the Gantry framework ensures synchronized motion of the various components and includes feedback sensors to adjust the position of the robotic arms with precision. It is designed with flexibility, allowing for customization and adaptation to different environments in propagation facilities.
  • gantries can be installed on automated tray conveyance lines transporting seedling trays.
  • the Gantry design avoids modifying conveyor lines during installation.
  • Each Gantry can accommodate up to four robotic arms on each side.
• the Gantry’s specialized design and object-oriented programming allow for adding up to eight robotic arms without requiring hardware or software modifications.
  • Robotic Arm The robotic arm’s responsibility is to place the ACD at the suitable stem-stake coupling point, ensuring it is correctly oriented while maneuvering along a specific route to avoid collisions with surrounding seedlings, all to complete the task efficiently.
  • the design of the robot arm incorporates considerations for speed, affordability, reliability, and ease of maintenance.
  • the robotic arm should afford simplicity, eliminating the necessity for robust and costly hardware for calculations of inverse kinematics, and path planning, as well as the control system. It should be lightweight and equipped with the features of an intelligent multi-agent system to facilitate communication with adjacent robotic arms. Choosing an optimal configuration for the robotic arm from a range of possibilities is advantageous to achieving a desired performance for the robotic system. In evaluating various robotic arm configurations for the stem-stake coupling task, we took into account eleven key parameters as follows:
  • Payload Capacity The maximum weight the robotic arm can handle without compromising performance or safety.
  • Dexterous Workspace The area within the robotic arm’s workspace where it can reach points from various orientations, offers increased flexibility and versatility in manipulation tasks.
• Reachability The ease with which the robotic arm can reach and operate within different areas of its workspace, considering obstacles, joint configurations, and potential collisions.
  • Precision The repeatability of the robotic arm’s positioning, ensuring consistent performance in manipulating objects.
  • Speed The rate at which the robotic arm can move impacts efficiency and cycle time for completing tasks.
  • Vibration The level of oscillatory motion or shaking exhibited by the robotic arm during operation, which can affect accuracy, precision, and performance.
  • Control System the hardware and software components responsible for programming, monitoring, and executing tasks with the robotic arm, ensuring efficient operation and coordination of movements.
  • Computation Cost The computational resources required to control and operate the robotic arm, including processing power, memory, and energy consumption.
• Five different configurations of robotic arms were selected and evaluated to determine the most suitable one for the robotic system, as shown in Fig. 34.
  • the PPPRRR configuration emerges as the most suitable option for the robotic system.
  • the initial three prismatic joints primarily handle the positioning of the ACD, while the subsequent three revolute joints govern its orientation.
• the robotic arm features three prismatic joints that adjust the position of the ACD relative to the main reference frame at the suitable clipping point. Additionally, there are two revolute joints responsible for orienting the ACD.
  • the arm can handle payloads of up to 20 Newtons.
  • the Robot Control Unit Controls the robotic arm and manages the movement of each joint to achieve desired configurations or trajectories, allowing the arm to perform its designated tasks effectively.
• the RCU utilizes PID controllers to regulate the movement of individual joints, and it incorporates linear segments with parabolic blends (LSPB) for trajectory planning.
  • the RCU ensures that the robotic arm maintains an accuracy of 0.1 mm and achieves a maximum speed of 400 mm/sec. Additionally, it guarantees the orientation accuracy of the ACD to be 0.2 degrees and enables a maximum angular velocity of 120 degrees/sec. Users can utilize a keypad to send commands to control the robot independently.
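• The LSPB profile used for trajectory planning can be sketched for a single joint as follows; the 25% blend fraction is an illustrative assumption, not the RCU's actual tuning:

```python
def lspb(q0, qf, tf, t, blend_frac=0.25):
    """Linear segment with parabolic blends (LSPB) position profile for one
    joint: a parabolic acceleration blend for tb seconds, a constant-velocity
    cruise, and a symmetric parabolic deceleration blend."""
    tb = blend_frac * tf                      # blend (acceleration) time
    a = (qf - q0) / (tb * (tf - tb))          # constant blend acceleration
    if t <= tb:                               # accelerating blend
        return q0 + 0.5 * a * t ** 2
    if t <= tf - tb:                          # constant-velocity cruise
        return q0 + 0.5 * a * tb ** 2 + a * tb * (t - tb)
    return qf - 0.5 * a * (tf - t) ** 2       # decelerating blend
```

By construction the profile is symmetric, so the joint passes through the midpoint of the move at half the trajectory time.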
  • Fig. 36 presents the robotic arm’s logic flowchart, outlining the sequence of steps along with inputs, outputs, and loops.
• seedlings are arranged in a predetermined order on the tray, and the position of each seedling on the tray is known with an accuracy of approximately 5 cm.
  • the MCP evaluates the distance between the seedlings and the robotic arms to determine which seedlings should be clipped by each specific robotic arm. Based on the approximate position of the seedling on the tray and the robotic arm’s position in relation to the tray, the robot places the ACD near the desired seedling. Afterward, it follows the steps outlined in the flowchart to identify the appropriate point for stem-stake coupling. If obstructed by dense foliage, the robotic arm adjusts the ACD’s height or repositions it around the seedling to pinpoint a suitable location and finalize the coupling process. Alternatively, it may assign the coupling task to other robotic arms.
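• The distance-based allocation performed by the MCP can be sketched as follows; the coordinates and the simple nearest-arm rule are illustrative assumptions, not the MCP's actual algorithm:

```python
import math

def assign_seedlings(arm_positions, seedling_positions):
    """Assign each seedling to the nearest robotic arm by Euclidean distance.

    Positions are (x, y) coordinates in the tray frame. Returns a dict
    mapping each arm index to the list of seedling indices it should clip.
    """
    assignments = {i: [] for i in range(len(arm_positions))}
    for s_idx, (sx, sy) in enumerate(seedling_positions):
        nearest = min(
            range(len(arm_positions)),
            key=lambda a: math.hypot(arm_positions[a][0] - sx,
                                     arm_positions[a][1] - sy),
        )
        assignments[nearest].append(s_idx)
    return assignments
```

With two arms at (0, 0) and (100, 0), a seedling at (55, 0) would be allocated to the second arm, since it is 45 units away versus 55.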
  • the ACD is equipped with a Megapixel/USB5000W02M stereo camera which captures images of the seedlings. These images are then transmitted to the machine vision System for further processing.
  • the machine vision System enables each robotic arm to analyze images and find the most suitable stem-stake coupling point using various algorithms and techniques.
  • the initial stage of the vision algorithm involves seedling recognition.
  • the machine vision System employs an adaptive feature-based plant recognition technique to accurately segment the seedling from other elements present in the image.
  • the algorithm utilizes four distinct techniques to determine the clipping point, considering factors like seedling type, environmental conditions such as lighting, seedling age, and the presence of other seedlings in the image.
  • the first technique is straightforward, as it finds the top of the seedling and identifies 3 to 5 centimeters below that point as the clipping point, depending on the type of the seedling.
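• The first technique can be sketched as follows, assuming a binary segmentation mask and a known millimetre-per-pixel scale (both illustrative assumptions):

```python
def find_clipping_point(mask, mm_per_pixel, offset_mm=40):
    """Locate the clipping point per the first technique: find the top of
    the segmented seedling and step a fixed 3-5 cm (40 mm by default)
    straight down. `mask` is a list of rows of 0/1 values; the scale and
    default offset are illustrative, chosen per seedling type in practice."""
    for row_idx, row in enumerate(mask):          # scan from the top row down
        if any(row):
            top_row, top_col = row_idx, row.index(1)
            break
    else:
        return None                               # no seedling in the image
    clip_row = top_row + round(offset_mm / mm_per_pixel)
    return (clip_row, top_col)
```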
  • the second technique uses real-time point recognition using kernel density estimators and pyramid histogram of oriented gradients (KDE-PHOG) [M. Asadi Shirzi, M. R. Kermani, Real-time point recognition for seedlings using kernel density estimators and pyramid histogram of oriented gradients, in: Actuators, Vol. 13, MDPI, 2024, p. 81.].
• Real-time point localization using feature-based soft margin SVM-PCA method (RTPL) [M. Asadi Shirzi, M. R. Kermani, Real-time point localization on plants using feature-based soft margin SVM-PCA method, in: IEEE Transactions on Instrumentation and Measurement, IEEE, under processing, 2024.] represents another rapid and precise method for identifying the most suitable stem-stake coupling point.
  • YOLO-v8 [T. Han, T. Cao, Y. Zheng, L. Chen, Y. Wang, B. Fu, Improving the detection and positioning of camouflaged objects in yolov8, Electronics 12 (20) (2023) 4213], a deep learning algorithm, presents another option for identifying the stem-stake coupling point.
  • users have the option to manually define the point on the image using Manual Cursor Selection. This allows users to move the cursor across the image using the mouse and click on the desired point, which is then designated as the appropriate coupling point.
  • a novel automatic stem-stake coupling device has been specifically designed and integrated into the robotic system.
  • the claw-shaped arms are closed to bring the stem and the stake together.
  • a thin wire is then fed and guided through specialized components that shape it into a desired circular clip. Then the wire wraps around the stem and wooden stake. Finally, the Cutter trims the wire upon completion of the clip formation.
  • the size and shape of the clips can be adjusted based on the seedling or plant.
  • the stereo camera is installed on the ACD to provide real-time images for the machine vision System.
  • RESULTS To assess the performance of the robotic system, we’ve developed a robotic stem-stake coupling system comprising one gantry and two robotic arms, which we installed on a specialized tray commonly used in propagation facilities for transporting seedlings.
  • the robotic system performed the stem-stake coupling task on three types of seedlings commonly grown in greenhouses: cucumber, pepper, and tomato.
• Precision = TP / (TP + FP) (18) where true positive (TP) represents the number of correctly detected stem-stake coupling points, false positive (FP) is the number of false alarms, and false negative (FN) is the number of missed detection points. Recall, defined as TP / (TP + FN), measures the system’s ability to identify relevant instances. Precision, conversely, reflects the success rate of correct detection, illustrating the system’s accuracy in identifying the desired outcomes. We enlisted the expertise of farmers to determine which clips corresponded to TP or FN. These results are presented in Table 5. As observed, the RTPL method exhibits superior performance compared to the other three methods.
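• These metrics can be computed directly; the recall formula below is the standard definition, assumed here since only the precision equation (18) appears explicitly in the text:

```python
def precision(tp, fp):
    """Precision = TP / (TP + FP): the share of detections that are correct
    (equation (18) in the text)."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Recall = TP / (TP + FN): the share of true coupling points that were
    found; standard definition, assumed rather than quoted from the text."""
    return tp / (tp + fn)
```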
  • Table 7 displays the numbers of seedlings per tray and the stem-stake coupling speed allocated by each grower across three distinct plants in a propagation facility arena (this data originates from the propagation facility at Roelands Plant Farms). Furthermore, Table 8 includes the stem-stake coupling speed for two distinct configurations of the robotic system.
  • the initial setup (Pl) comprises a single robotic arm mounted on a Gantry, while the second configuration (P2) features twenty-four robotic arms distributed across three gantries.
• the stem-stake coupling speed of a singular robotic arm within the robotic system is lower compared to the setup where multiple robotic arms are situated closely together on gantries, because in the latter case the robotic arms do not need to move far to reach the seedlings.
  • the stem-stake coupling speed of a grower is approximately twice as fast as that of a robotic arm.
• the robotic arm used machine vision calculations to reach these predetermined coordinates. After identifying and reaching the target point, we measured the error in the X, Y, and Z directions (e_x, e_y, e_z), which are due to errors in both positioning and orientation of the ACD (Fig. 39a). Then we calculated the robotic arm’s cumulative error e_a, which is: e_a = sqrt(e_x^2 + e_y^2 + e_z^2).
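• A short sketch of the cumulative error computation, assuming (as is standard for per-axis position errors) that e_a is the Euclidean norm of e_x, e_y, and e_z:

```python
import math

def cumulative_error(e_x, e_y, e_z):
    """Cumulative positioning error of the ACD, taken here as the Euclidean
    norm of the per-axis errors; an assumption standing in for the equation
    not reproduced in the text."""
    return math.sqrt(e_x ** 2 + e_y ** 2 + e_z ** 2)
```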
  • Fig. 39b displays the box-and-whisker plot illustrating the accuracy of the robotic arm and machine vision in determining and reaching the position and orientation of a specific point within the robotic arm’s working space.
  • Fig. 39c illustrates the accuracy and repeatability of the robotic arm in reaching a specific point in the working space (black dots), as well as the performance of the integrated system where machine vision calculates the spatial coordinates of the point from images, and the robotic arm moves the ACD to the identified point (blue dots). Determining the spatial coordinates of a point through stereo matching and disparity reduces the accuracy of the integrated robotic arm and machine vision system.
  • the cumulative error across all 5 joints results in the robotic arm’s accuracy being within less than 1 mm. If the integrated system of machine vision and the robotic arm achieve positional accuracy within 10 mm, the claw-shaped arms of the ACD can correct for any discrepancies by precisely aligning the seedling’s stem and the stake at the center. This integrated system demonstrates a positional accuracy within 10 mm in 97.5% of instances.
  • the proposed system featuring three gantries and 24 robotic arms, has the potential to replace the work of 12 expert farmers in a propagation facility. With an average success rate of 89%, the system can identify unsuccessful attempts and flag them, allowing two additional farmers to complete the remaining coupling tasks. This effectively reduces the reliance on expert farmers from 12 to just 2.
  • the RTPL technique employed within the machine vision system, exhibits superior performance in accurately identifying the stem-stake coupling point compared to other techniques. Each coupling task costs less than 1.5 cents when executed by the robotic system, in contrast to the 3 cents incurred when completed manually, which includes the labor cost and the cost of clip itself.
• the cost reduction achieved through the use of the ACD is attributed to its utilization of thin wire to produce clips, as opposed to pre-made plastic clips, as well as the reduction of labor costs.
  • This approach requires inexpensive materials and involves a clip-making process that consumes minimal energy.
  • the use of metallic wire clips enhances environmental sustainability by reducing plastic waste.
• due to the object-oriented design of the robotic system, it is feasible to modify the configuration to enhance the stem-stake coupling speed.
  • the robotic system is compatible with automated greenhouses and propagation facilities, requiring minimal alteration to their existing automatic lines.
• the robotic system ensures maximum efficiency and productivity by allowing the task to be carried out around the clock with consistent performance and reliability.
  • Each of the 24 robotic arms consumes 40 watts of power, resulting in a total system power usage of less than 1 kilowatt. This demonstrates the energy efficiency of the robotic system, as it minimizes power consumption while maintaining optimal performance.
  • the integrated machine vision and robotic arm system demonstrate a positional accuracy within 10 mm for 97.5% of attempts, with the cumulative error across all five joints resulting in the robotic arm achieving an accuracy of less than 1 mm. This level of precision makes the robotic system suitable for various precision agricultural applications.
  • the currently disclosed device, system and method provide plant processing in autonomous, semi-autonomous and manual modes.
• Image sensor and machine vision components combined with a robot and robot controller provide autonomous or semi-autonomous processing of plants, and specifically identify and effect target points in plants.
• the clipping device also provides a manual benefit as a hand-held clipping device/gun.
• the clipping device, used by either a farmer for semi-autonomous clipping or an automatic machine or robot for fully autonomous clipping, can have an immense impact on the efficiency of the process, the reduction of labour costs associated with clipping tasks, and the prevention of work-related injuries, such as back injuries, strains, and sprains, associated with awkward body positions during clipping.
• the clipping device can accommodate a variety of materials for making a clip, such as copper, stainless steel, polyethylene, polyamide, polyesters, or many varieties of plastic wire, including thermoplastic or thermoset materials. Many plastic blends, composites, or metal alloys may be used.
• a clipping point is an example of a target point, and the machine vision and robot components may be adapted to other target points, including for example spraying points, pruning points, harvesting points, or any other target point that may be relevant to horticultural or agricultural plant care or handling.
  • a desired target point may vary according to a specific plant type and specific implementation.
  • variation of a clipping point is contemplated.
• the clipping point may be proximal to the highest point of a plant, higher than the uppermost node on the main stem. If the length of the main stem between two nodes is short, the clipping point can be selected below the highest node or axil.
  • leaves may be dense around the highest node, and some parts of the main stem may be covered by leaves. Thus recognizing the main stem and petiole is difficult. The different shapes and types of seedlings make the recognition process even harder.
  • the selection of the clipping point is a cognitive process that relies on heuristic information. Therefore, a desired clipping point may vary to suit a particular implementation, and the machine vision component may be configured and trained accordingly.
• Various techniques may be used in machine vision to find a suitable stem-stake coupling point, such as: Real-Time Point Recognition for Seedlings Using Kernel Density Estimators and Pyramid Histogram of Oriented Gradients, Real-time Point Localization on Plants using Feature-based Soft Margin SVM-PCA Method, and YOLO v8/v10.
  • the machine vision component may analyze acquired image data using a feature descriptor.
  • the feature descriptor may be variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
  • Feature descriptors can be selected manually or automatically.
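• The scalar descriptors named above can be sketched as plain statistics over a patch of pixel intensities; this is an illustrative computation, not the system's actual feature-extraction code:

```python
import math

def patch_descriptors(patch):
    """Compute scalar feature descriptors named in the text (mean, variance,
    skewness, kurtosis, energy, entropy) for a flat list of pixel
    intensities in [0, 255]. Energy and entropy are taken over the
    normalized intensity histogram."""
    n = len(patch)
    mean = sum(patch) / n
    var = sum((p - mean) ** 2 for p in patch) / n
    std = math.sqrt(var)
    # Guard the standardized moments against a constant (zero-variance) patch.
    skew = sum((p - mean) ** 3 for p in patch) / n / std ** 3 if std else 0.0
    kurt = sum((p - mean) ** 4 for p in patch) / n / std ** 4 if std else 0.0
    hist = {}
    for p in patch:
        hist[p] = hist.get(p, 0) + 1
    probs = [c / n for c in hist.values()]
    energy = sum(q ** 2 for q in probs)
    entropy = -sum(q * math.log2(q) for q in probs)
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "energy": energy, "entropy": entropy}
```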
  • Correlation is a statistical measure that expresses the strength of the relationship between two variables. A positive correlation occurs for variables that move in the same direction, and a negative correlation occurs when two variables move in opposite directions.
  • Correlation is often used to determine whether there is a cause-and- effect relationship between two variables and it is often used in machine learning to identify multicollinearity, that is when two or more predictor variables are highly correlated with each other.
  • Multicollinearity can impact the accuracy of predictive models and it is an important indicator of suitability of the variables selected for training.
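• A minimal sketch of screening candidate feature descriptors for multicollinearity with pairwise Pearson correlation; the 0.9 cut-off is an illustrative assumption:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def multicollinear_pairs(features, threshold=0.9):
    """Flag pairs of feature columns whose |correlation| exceeds the
    threshold, indicating candidate variables to drop before training."""
    names = list(features)
    flagged = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if abs(pearson(features[a], features[b])) > threshold:
                flagged.append((a, b))
    return flagged
```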
  • extracting the correlation coefficient between two sets of stochastic variables is nontrivial, in particular, when canonical correlation analysis indicates such degraded correlations due to heavy noise contributions. Considering this fact, we avoided using the Fourier shape descriptor and SAR because of the heavy noise contributions.
  • the currently disclosed device, system and method can accommodate plant processing in any type of environment including, for example, horticulture, agriculture, outdoor farming, indoor greenhouse, and the like.
  • Embodiments disclosed herein, or portions thereof, can be implemented by programming one or more computer systems or devices with computer-executable instructions embodied in a non- transitory computer-readable medium. When executed by a processor, these instructions operate to cause these computer systems and devices to perform one or more functions particular to embodiments disclosed herein. Programming techniques, computer languages, devices, and computer-readable media necessary to accomplish this are known in the art.
  • a non-transitory computer readable medium embodying a computer program for processing plants may comprise: computer program code for acquiring image data of a plant with an image sensor; computer program code for analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; computer program code for identifying a target point in the plant based on the segmented image data; computer program code for determining distance/depth from the image sensor to the target point in the plant; and computer program code for generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • the computer readable medium further comprises computer program code for applying a kernel density estimator to the segmented image data to determine the target point.
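• Applying a kernel density estimator to segmented coordinates can be sketched in one dimension as follows; the Gaussian kernel, bandwidth, and grid resolution are illustrative assumptions, not the claimed implementation:

```python
import math

def kde_peak(samples, bandwidth=2.0, grid_step=0.5):
    """Gaussian kernel density estimate over 1-D coordinates (e.g. the row
    coordinates of segmented stem pixels) and its densest location, used
    here as a proxy for the target point."""
    lo, hi = min(samples), max(samples)

    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples)

    # Evaluate the KDE on a coarse grid and keep the argmax.
    best_x, best_d = lo, density(lo)
    x = lo
    while x <= hi:
        d = density(x)
        if d > best_d:
            best_x, best_d = x, d
        x += grid_step
    return best_x
```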
  • the computer readable medium further comprises computer program code for validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
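• The validation step can be sketched as a magnitude-weighted orientation histogram over a small patch; the bin count and central-difference gradients are illustrative assumptions, not the claimed implementation:

```python
import math

def principal_orientation(patch, n_bins=8):
    """Magnitude-weighted histogram of gradient orientations for an image
    patch (list of rows of intensities). Returns the dominant bin's central
    angle in degrees (the first principal orientation) and that bin's share
    of the total gradient magnitude (its size)."""
    h, w = len(patch), len(patch[0])
    bins = [0.0] * n_bins
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = patch[r][c + 1] - patch[r][c - 1]   # central differences
            gy = patch[r + 1][c] - patch[r - 1][c]
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            angle = math.degrees(math.atan2(gy, gx)) % 180  # unsigned angle
            bins[int(angle / (180 / n_bins)) % n_bins] += mag
    total = sum(bins) or 1.0
    k = max(range(n_bins), key=bins.__getitem__)
    return k * (180 / n_bins), bins[k] / total
```

For a patch containing a single vertical edge, all gradient energy is horizontal, so the dominant orientation is 0 degrees with share 1.0.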
• the computer readable medium is a data storage device that can store data, which can thereafter be read by a computer system.
• Examples of a computer readable medium include read-only memory, random-access memory, CD-ROMs, magnetic tape, optical data storage devices and the like.
  • the computer readable medium may be geographically localized or may be distributed over a network coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
  • Computer-implementation of the system or method typically comprises a memory, an interface and a processor.
  • the interface may include a software interface that communicates with an end-user computing device through an Internet connection.
  • the interface may also include a physical electronic device configured to receive requests or queries from a device sending digital and/or analog information.
  • the interface can include a physical electronic device configured to receive signals and/or data relating to the plant processing method and system, for example from an imaging sensor or camera or image processing device.
  • Any suitable processor type may be used depending on a specific implementation, including for example, a microprocessor, a programmable logic controller or a field programmable logic array.
  • any conventional computer architecture may be used for computer-implementation of the system or method including for example a memory, a mass storage device, a processor (CPU), a graphical processing unit (GPU), a Read-Only Memory (ROM), and a Random-Access Memory (RAM) generally connected to a system bus of data-processing apparatus.
  • Memory can be implemented as a ROM, RAM, a combination thereof, or simply a general memory unit.
  • Software modules in the form of routines and/or subroutines for carrying out features of the system or method can be stored within memory and then retrieved and processed via processor to perform a particular task or function. Similarly, one or more method steps may be encoded as a program component, stored as executable instructions within memory and then retrieved and processed via a processor.
  • a user input device such as a keyboard, mouse, or another pointing device, can be connected to PCI (Peripheral Component Interconnect) bus.
  • the software may provide an environment that represents programs, files, options, and so forth by means of graphically displayed icons, menus, and dialog boxes on a computer monitor screen. For example, any number of plant images or clipping device characteristics or robot arm characteristics may be displayed.
  • Computer-implementation of the system or method may accommodate any type of end-user computing device including computing devices communicating over a networked connection.
  • the computing device may display graphical interface elements for performing the various functions of the system or method, including for example display of a clipping device characteristic or a robot arm characteristic during a stem-stake coupling task.
  • the computing device may be a server, desktop, laptop, notebook, tablet, personal digital assistant (PDA), PDA phone or smartphone, and the like.
  • the computing device may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication. Communication can occur over a network, for example, where remote control of the system is desired.
  • the system or method may accommodate any type of network.
  • the network may be a single network or a combination of multiple networks.
  • the network may include the internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks.
  • the network may comprise a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet.
  • the network may comprise a computer network that makes use of a TCP/IP protocol (including protocols based on TCP/IP protocol, such as HTTP, HTTPS or FTP).
• Example 1 A stem-to-stake clipping device comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
  • Example 2 The device of example 1, wherein a circumference of the cam is eccentric to a rotation point of the cam, the lever moving from a first pivot position to a second pivot position by following the circumference of the cam.
  • Example 3 The device of example 2, wherein the cam is a drop cam with a single projecting tab/tooth, and the lever maintains the first pivot position for a major portion of each cam rotation, and the lever moves to the second pivot position when following the single projecting tab/tooth.
  • Example 4 The device of example 3, wherein the lever moves to the second pivot position when following in sequence a base to an apex of the single projecting tab/tooth, and returns to the first pivot position when following in sequence the apex to the base of the single projecting tab/tooth.
  • Example 5 The device of any one of examples 1-4, wherein the intermediate pivot point of the lever is rotationally coupled to the wire guide.
  • Example 6 The device of any one of examples 1-5, wherein the cutter is mounted on the second end of the lever, and the bender is mounted on the cutter.
  • Example 7 The device of any one of examples 1-6, wherein the feeder wheel pulls the wire from the spool and pushes the wire through the wire guide.
  • Example 8 The device of any one of examples 1-7, wherein a central groove is circumferentially formed in the feeder wheel for receiving the wire, and a plurality of transverse grooves intersect the central groove, intersecting edges of each of the plurality of transverse grooves and the central groove forming a friction surface for engaging the wire.
  • Example 9 The device of example 8, wherein an increase in the number of transverse grooves is positively correlated with the length of the wire pulled from the spool in a single rotation cycle of the feeder wheel.
  • Example 10 The device of example 9, wherein the length of the wire pulled from the spool is determined by
  • Example 11 The device of any one of examples 1-9, wherein a gap is formed in a circumference of the feeder wheel, the gap reducing frictional force of the feeder wheel on the wire, the gap rotating to face the wire in overlapping coordination with the second pivot position of the lever.
  • Example 12 The device of example 11, wherein the gap faces the wire simultaneously with the second pivot position.
  • Example 13 The device of example 11, wherein the feeder wheel is a planar disc and the gap is formed as a flattened portion of the circumference of the feeder wheel.
  • Example 14 The device of example 11, wherein the gap provides a smooth surface devoid of a central groove and devoid of a transverse groove.
  • Example 15 The device of any one of examples 1-14, wherein the wire guide is formed as a body with a bore extending through the body, the bore defining a lumen having a diameter sized to receive the wire, the lumen communicative with a first open end and an opposing second open end, the first open end is the entrance of the wire guide and the second open end is the exit of the wire guide.
  • Example 16 The device of any one of examples 1-15, wherein the bender strike surface is adjustable and tunable, and a change in a position of the strike surface changes the curving of the wire and the size or shape of the circular clip.
  • Example 17 The device of any one of examples 1-15, wherein the first pivot position is variable in a single rotational cycle by the lever following a varied radius of the cam.
  • Example 18 The device of example 17, wherein variation of the first pivot position in the single rotation cycle changes the curving of the wire and produces a spiral shape/configuration of the circular clip.
  • Example 19 The device of any one of examples 1-18, wherein the rotational actuator is a servo motor, a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator.
  • Example 20 The device of any one of examples 1-19, further comprising a clamp comprising first and second opposing jaws rotationally mounted to a frame of the device at first and second clamp rotation points, respectively, the clamp aligned with the exit of the wire guide.
  • Example 21 The device of example 20, further comprising a linear actuator pivotably coupled to both of the first and second opposing jaws at a common third clamp rotation point, the linear actuator driving counter-rotation of first and second opposing jaws to circumferentially reduce an open space between the first and second opposing jaws.
  • Example 22 The device of example 21, wherein the linear actuator is a servo motor (or a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator) driving a pinion that engages a rack, an end of the rack coupled to the common third clamp rotation point.
  • Example 23 The device of any one of examples 20-22, further comprising a camera mounted to the frame of the device, an orientation and a field of view of the camera configured to capture the clamp.
  • Example 24 The device of example 23, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 25 The device of example 23 or 24, wherein the camera includes a light detection and ranging (LIDAR) sensor.
  • Example 26 The device of example 23 or 24, wherein the camera includes a time-of-flight (TOF) sensor.
  • Example 27 The device of any one of examples 1-26, further comprising an optical sensor positioned proximal to a circumference of the feeder wheel, the optical sensor triggered by an indicator on the feeder wheel, the optical sensor sending a pulse signal after completion of each cycle of clipping.
  • Example 28 The device of any one of examples 1-27, further comprising a heater element coupled to the wire guide.
  • Example 29 The device of any one of examples 1-28, wherein the wire is a metal material that is copper, stainless steel, or any alloy thereof.
  • Example 30 The device of any one of examples 1-28, wherein the wire is a plastic material that is polyethylene, polyamide, polyester, any blend thereof, or any composite thereof.
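The device examples above describe a cam-driven cycle in which the feeder wheel advances wire for most of each rotation and the drop cam's single tab briefly snaps the lever into the cutting position (examples 3-4 and 7-9). The sketch below illustrates that cycle logic; the function names, the tab angle, and the feed-length relation (wheel circumference scaled by the engaged fraction of the cycle) are illustrative assumptions, not taken from the application:

```python
import math

def wire_fed_per_cycle(wheel_radius_mm: float, feed_fraction: float) -> float:
    """Assumed relation: wire advanced in one rotation cycle equals the
    feeder-wheel circumference times the fraction of the cycle during
    which the grooved (frictional) surface engages the wire."""
    return 2 * math.pi * wheel_radius_mm * feed_fraction

def lever_state(cam_angle_deg: float, tab_start_deg: float = 330.0) -> str:
    """Drop-cam follower: the lever holds the first pivot position
    (bender aligned, wire feeding/curving) for most of the rotation and
    moves to the second position (cutter sweeping) only while following
    the single projecting tab near the end of the cycle."""
    return "cut" if cam_angle_deg >= tab_start_deg else "bend"

# One full rotation sampled every 30 degrees: feed/bend, then cut once.
states = [lever_state(a) for a in range(0, 360, 30)]
fed_mm = wire_fed_per_cycle(wheel_radius_mm=10.0, feed_fraction=330.0 / 360.0)
```

Because the gap in the feeder wheel (example 11) faces the wire while the lever is in the second pivot position, the cut happens while no wire is being pushed, which is what the single "cut" state at the end of the sampled rotation mirrors.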
  • Example 31 A method for processing plants, the method comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • Example 32 The method of example 31, wherein the image sensor is part of a camera.
  • Example 33 The method of example 32, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 34 The method of any one of examples 31-33, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
  • Example 35 The method of any one of examples 31-33, wherein the image sensor is a time-of-flight (TOF) sensor.
  • Example 36 The method of any one of examples 31-35, wherein the acquired image data is transferred to a LAB color space.
  • Example 37 The method of any one of examples 31-36, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
  • Example 38 The method of example 37, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • Example 39 The method of example 37 or 38, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
  • Example 40 The method of any of examples 37-39, wherein the machine vision component comprises an artificial neural network.
  • Example 41 The method of any one of examples 37-40, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
  • Example 42 The method of any one of examples 37-41, wherein the feature descriptor is selected by an automated algorithm.
  • Example 43 The method of any one of examples 31-42, further comprising applying morphological image processing to the segmented image data.
  • Example 44 The method of example 43, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
  • Example 45 The method of example 43, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
  • Example 46 The method of example 43, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
  • Example 47 The method of any one of examples 31-46, further comprising applying a point density variation to the segmented image data to determine the target point.
  • Example 48 The method of any one of examples 31-46, further comprising applying a kernel density estimator to the segmented image data to determine the target point.
  • Example 49 The method of any one of examples 31-48, further comprising validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
  • Example 50 The method of example 49, further comprising normalizing the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
  • Example 51 The method of any one of examples 31-50, further comprising projecting a coordinate map of the image sensor to a coordinate map of the robot.
  • Example 52 The method of example 51, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generating and communicating the control signal to the robot to minimize a difference between the real-time location data and the target point location.
  • Example 53 The method of any one of examples 31-52, further comprising mounting an end-effector to the robot.
  • Example 54 The method of example 53, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
  • Example 55 The method of any one of examples 31-54, further comprising mounting a sensor to the robot.
  • Example 56 The method of example 55, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
  • Example 57 The method of any one of examples 31-56, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
  • Example 58 The method of any one of examples 31-56, wherein the plant is grown inside a greenhouse.
  • Example 59 The method of any one of examples 31-56, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
  • Example 60 The method of any one of examples 31-58, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
  • Example 61 The method of any one of examples 31-59, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
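The method examples above outline a segmentation-then-targeting pipeline: per-pixel feature descriptors feed a multilevel threshold (examples 37-41), and a density estimate over the segmented pixels locates the target point (example 48). The NumPy sketch below illustrates the flavor of two of those steps — a local-variance descriptor (one of the descriptors listed in example 41) and a 1-D Gaussian kernel density estimate whose peak is taken as a target coordinate. The threshold, kernel bandwidth, and toy image are illustrative assumptions, not the application's trained machine vision component:

```python
import numpy as np

def local_variance(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Per-pixel variance over a k x k neighborhood, a simple stand-in
    for the feature descriptors (variance, entropy, kurtosis, ...)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].var()
    return out

def kde_peak(xs: np.ndarray, bandwidth: float = 2.0) -> float:
    """Gaussian kernel density estimate along one axis; the density
    peak is taken as the target-point coordinate."""
    grid = np.linspace(xs.min(), xs.max(), 200)
    dens = np.exp(-0.5 * ((grid[:, None] - xs[None, :]) / bandwidth) ** 2).sum(axis=1)
    return float(grid[np.argmax(dens)])

# Toy image: a bright vertical "stem" on a dark background.
img = np.zeros((20, 20))
img[:, 9:11] = 255.0
mask = local_variance(img) > 0          # crude threshold stand-in
rows, cols = np.nonzero(mask)
target_col = kde_peak(cols.astype(float))  # lands on the stem centerline
```

In the described method, the equivalent of `target_col` would then be validated against a histogram-of-gradients signature (examples 49-50) before the robot is commanded to it.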
  • Example 62 A system for processing plants, the system comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
  • Example 63 The system of example 62, wherein the image sensor is part of a camera.
  • Example 64 The system of example 63, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 65 The system of any one of examples 62-64, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
  • Example 66 The system of any one of examples 62-64, wherein the image sensor is a time-of-flight (TOF) sensor.
  • Example 67 The system of any one of examples 62-66, wherein the acquired image data is transferred to a LAB color space.
  • Example 68 The system of any one of examples 62-67, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
  • Example 69 The system of example 68, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • Example 70 The system of example 68 or 69, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
  • Example 71 The system of any of examples 68-70, wherein the machine vision component comprises an artificial neural network.
  • Example 72 The system of any one of examples 68-71, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
  • Example 73 The system of any one of examples 68-72, wherein the feature descriptor is selected by an automated algorithm.
  • Example 74 The system of any one of examples 62-73, wherein the processor is configured to apply morphological image processing to the segmented image data.
  • Example 75 The system of example 74, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
  • Example 76 The system of example 75, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
  • Example 77 The system of example 74, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
  • Example 78 The system of any one of examples 62-77, wherein the processor is configured to apply a point density variation to the segmented image data to determine the target point.
  • Example 79 The system of any one of examples 62-77, wherein the processor is configured to apply a kernel density estimator to the segmented image data to determine the target point.
  • Example 80 The system of any one of examples 62-79, wherein the processor is configured to validate the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
  • Example 81 The system of example 80, wherein the processor is configured to normalize the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
  • Example 82 The system of any one of examples 62-81, wherein the processor is configured to project a coordinate map of the image sensor to a coordinate map of the robot.
  • Example 83 The system of example 82, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generates and communicates the control signal to the robot to minimize a difference between the real-time location data and the target point location.
  • Example 84 The system of any one of examples 62-83, further comprising an end-effector mounted to the robot.
  • Example 85 The system of example 84, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
  • Example 86 The system of any one of examples 62-85, further comprising a sensor mounted to the robot.
  • Example 87 The system of example 86, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
  • Example 88 The system of any one of examples 62-87, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
  • Example 89 The system of any one of examples 62-87, wherein the plant is grown inside a greenhouse.
  • Example 90 The system of any one of examples 62-87, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
  • Example 91 The system of any one of examples 62-90, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
  • Example 92 The system of any one of examples 62-91, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
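Examples 82-83 (and their method counterparts 51-52) describe projecting the image-sensor coordinate map onto the robot's coordinate map and a controller that minimizes the difference between the robot's real-time location and the target point. A minimal numeric sketch of those two steps follows; the extrinsic transform values and the proportional gain are illustrative assumptions:

```python
import numpy as np

def camera_to_robot(p_cam: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map a 3-D point from the image-sensor frame into the robot frame
    using a 4x4 homogeneous transform (rotation + translation)."""
    p_h = np.append(p_cam, 1.0)
    return (T @ p_h)[:3]

def control_step(current: np.ndarray, target: np.ndarray, gain: float = 0.5) -> np.ndarray:
    """One proportional-control step: move a fraction of the remaining
    error, so repeated steps drive |current - target| toward zero."""
    return current + gain * (target - current)

# Assumed extrinsics: sensor frame offset 0.1 m along robot x, no rotation.
T = np.eye(4)
T[0, 3] = 0.1
target = camera_to_robot(np.array([0.2, 0.0, 0.5]), T)

pos = np.zeros(3)                 # robot end-effector start position
for _ in range(20):               # closed loop: re-step toward the target
    pos = control_step(pos, target)
error = float(np.linalg.norm(pos - target))
```

In the described system the loop would instead consume real-time image data and real-time robot location data on each iteration, but the structure — transform, compare, command — is the same.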

Abstract

A stem-to-stake clipping device is disclosed, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire. Methods and systems for operating the device are also disclosed.
PCT/CA2024/000015 2023-11-30 2024-11-29 Dispositifs, systèmes et procédés de traitement ou de propagation de plantes Pending WO2025111689A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363604596P 2023-11-30 2023-11-30
US63/604,596 2023-11-30

Publications (1)

Publication Number Publication Date
WO2025111689A1 true WO2025111689A1 (fr) 2025-06-05

Family

ID=95895861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2024/000015 Pending WO2025111689A1 (fr) 2023-11-30 2024-11-29 Dispositifs, systèmes et procédés de traitement ou de propagation de plantes

Country Status (1)

Country Link
WO (1) WO2025111689A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5778946A (en) * 1995-09-12 1998-07-14 Pellenc (Societe Anonyme) Apparatus for placing ties, for example, for tying vines
US20080127787A1 (en) * 2005-01-18 2008-06-05 Jan Bosmans Device And A Method For Fitting An Elastic Element Around A Particularly Rectilinear Element



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24895249

Country of ref document: EP

Kind code of ref document: A1