WO2022180276A1 - Autonomous precision landing system, method and program for drones - Google Patents
- Publication number
- WO2022180276A1 (PCT/ES2021/070126)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- landing
- drone
- template
- image
- detected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
- B64D45/04—Landing aids; Safety measures to prevent collision with earth's surface
- B64D45/08—Landing aids; Safety measures to prevent collision with earth's surface optical
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/18—Visual or acoustic landing aids
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U70/00—Launching, take-off or landing arrangements
- B64U70/90—Launching from or landing on platforms
- B64U70/97—Means for guiding the UAV to a specific location on the platform, e.g. platform structures preventing landing off-centre
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
Definitions
- the present invention falls within the field of autonomous landing systems for drones.
- Infrared light detection is based on an array of infrared LEDs with respect to which the drone positions itself by seeking its center (e.g. the "IR-Lock" autonomous landing system).
- the problem with using infrared is the contamination of the light footprint that may exist, especially with direct sunlight, which makes it difficult to make precise landings in certain environmental conditions.
- Code detection uses for landing the detection of a code (e.g. the ArUco code used by the "Flytbase" autonomous landing system) printed on the landing zone, which can be used both to identify the landing zone (to check, for example, that it is the one corresponding to the drone) and as a navigation reference during landing.
- in the current landing systems based on code detection (e.g. ArUco, QR, ARtag), an RTK-type GNSS navigation data source is used.
- an RTK system has the disadvantage of depending on external systems, such as GNSS constellations (which do not guarantee operational persistence for critical systems, as is the case of safe drone landing systems) and the telemetry link with the base station used to receive RTK corrections, which can also fail. This dependence on external systems can cause failures (e.g. in data communication), as well as requiring additional hardware elements.
- the present invention proposes an autonomous precision landing system that solves these problems.
- the invention relates to an autonomous precision landing system and method for drones.
- the invention is based on the detection of a certain landing template.
- the main components of the system are a landing template detector and a system to command corrections to the drone.
- the autonomous precision landing system for drones of the present invention comprises the following elements:
- a camera installed on board a drone and oriented towards the ground.
- a landing template detection unit on board the drone and configured to detect, in an image captured by the camera, a landing template formed by a plurality of concentric circular crowns of decreasing thickness by detecting a predetermined number N of concentric circles in the image.
- the landing template detection unit analyzes an input image and returns whether or not the template is present and the coordinates or relative position of the center of the template on the input image.
- a command system embedded in the drone and configured to perform an autonomous landing of the drone on the landing template using as reference the positions of the landing template detected in images captured by the camera during landing.
- the landing template detection unit can be used in real environments on a drone, with real-time detection rates greater than 25 Hz, which is important in order to counteract movements of the drone during its displacement, caused both by the corrections applied to it and by external disturbances such as wind.
- the template detection algorithm can be run on a small, inexpensive device such as the Nvidia Jetson Nano.
- the detection carried out provides robustness against changing light conditions, being able to work with clear skies, with clouds and conditions in which shadows are produced on part of the landing template.
- the template detection of the present invention provides robustness against occlusions, being able to resist small elements that can cover the landing template, such as leaves that fall on top of the template. Detection is also robust against drone vibrations, as drone flight will induce vibrations on the camera mount. Furthermore, the degree of robustness of the detection can be adapted by regulating the predetermined number N of concentric circles in the image to be detected; for example, detections can be made more robust by enforcing a larger number N of concentric detections. By using a visual pattern (unlike RTK-based autonomous landing systems), the landing system can ensure the necessary operational persistence to ensure accurate landings.
- the landing template detection unit is capable of detecting the template at a minimum distance of about 15 cm (which is necessary to maintain guidance down to a sufficiently low height) and at a maximum distance greater than 10 m, which is necessary to start the automated control when the drone's return position has a static error of a couple of meters.
- the landing template detection procedure works with a simple, light camera (monocular RGB, less than 50 g), so as not to reduce the payload capacity or autonomy of the drone.
- the size of the detection template is limited by the dimensions of the landing platform on which the drone is to land. In one embodiment, for example, a size of 80x80 cm is considered.
- the minimum size of the detection template depends on the maximum distance at which detections are to be made; for example, in an embodiment where the size is reduced to 60x60 cm, detections at a distance of 10 m can still be guaranteed. Detection distance and template size are linearly proportional, so if the required detection distance is shorter, the size can be reduced accordingly.
- the absolute minimum size of the detection template is the one at which the circular annuluses remain visually separable, which is around 20x20 cm.
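The linear relationship between template size and detection distance described above can be sketched as follows. The 60x60 cm at 10 m calibration point and the ~20x20 cm floor come from the text; the helper function itself is an illustrative assumption, not code from the patent.

```python
# Illustrative sketch of the linear template-size / detection-distance
# relationship. The 60 cm -> 10 m reference point and the ~20 cm lower
# bound are taken from the text; the function is an assumption.

def min_template_size_cm(max_distance_m: float,
                         ref_size_cm: float = 60.0,
                         ref_distance_m: float = 10.0) -> float:
    """Scale the template side linearly with the required detection
    distance, never going below the ~20 cm floor at which the
    annuluses can no longer be kept visually separable."""
    size = ref_size_cm * (max_distance_m / ref_distance_m)
    return max(size, 20.0)
```

For example, a 5 m detection requirement would yield a 30x30 cm template, while very short requirements bottom out at the 20x20 cm minimum.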
- the drone command system is in charge of performing the automatic landing safely, generating an appropriate set of movements. To do this, it uses a valid communication system (e.g. communication through MAVROS) with the chosen autopilot (e.g. PX4).
- the command system is capable of executing the control procedure at a real-time rate greater than 25Hz.
- the command system is capable of detecting the moment at which the drone must pass from the autopilot's own control (e.g. PX4) to the automatic precision landing control of the present invention. To do this, it evaluates whether a series of conditions are met.
- the command system is responsible for repeatedly managing drone descent actions and centering actions during landing, since even if the drone is initially centered with respect to the landing template before starting the descent, alterations in the centering may occur during descent (produced, for example, by small gusts of wind) that require new corrections to keep the drone correctly centered at the moment of touchdown.
- the command system is capable of managing recovery actions, such as ascending to a predetermined level in the event that the landing template is no longer visible (e.g. lateral displacement caused by the wind, a large object covering the template, etc.), or alternative actions to precision landing (e.g. landing in an alternative zone, passing manual control to the pilot, etc.) when the landing process exceeds a pre-established time limit.
- the present invention also relates to an autonomous precision landing method for drones, comprising the following steps:
- a further aspect of the present invention relates to a program product comprising program instruction means for performing the autonomous precision landing method for drones.
- Figure 1 shows a first embodiment of a circular shaped landing template.
- Figure 2 illustrates a second embodiment of a circular shaped landing template.
- Figure 3 represents a third embodiment of a circular shaped landing template.
- Figure 4 represents a fourth embodiment of a circular shaped landing template.
- Figure 5A shows a landing template detection procedure in which the landing template of Figure 4 is used.
- Figure 5B illustrates, according to one embodiment, the detection process of N concentric circles in the image.
- Figures 6A-6D show an example of the processing and analysis of an image captured by a drone camera for the detection of the landing template.
- Figure 7 shows an example of cropping an image to eliminate a circumference detected in it.
- Figure 8 shows, according to one embodiment, an autonomous precision landing method for drones.
- Figure 9 illustrates an autonomous precision landing system for drones according to a possible realization.
- Figure 10 shows a drone prepared to initiate a precision landing on a landing template arranged on a landing pad.
- Figures 11 and 12 show different checks that are carried out, according to a possible embodiment, before initiating a precision landing.
- Figure 13 shows a status check of the drone performed during a precision landing according to a possible embodiment.
- the present invention relates to an autonomous precision landing system for drones based on the detection of a landing template used as a visual reference for landing.
- the landing template must be identified by the drone during landing in order to land safely on top of it.
- the autonomous precision landing system comprises a landing template detection unit (or landing template detector) configured to detect the presence and position of the landing template in images captured by a ground-facing drone camera, and a drone command system in charge of executing the maneuvers and movements of the drone during landing and commanding the drone the appropriate corrections for the autonomous landing of the drone on the landing template based on the detections made by the landing template detector, and in particular taking as reference the positions of the landing template detected in images captured by the camera during the landing sequence.
- Figures 1 to 4 present different embodiments of landing templates, and algorithms used for their detection, using circles as shapes to detect.
- Figure 1 shows a circular landing template 100, formed by two concentric circular annuluses (102,104), used by a landing template detector based on YOLO ("You Only Look Once"), an object detection system in real time.
- the Tiny YOLOv2 detection system is used: a convolutional neural network (CNN) for object detection and classification.
- Tiny YOLOv2 is a smaller version of YOLOv2, with shorter execution times.
- the initialization time of the detector is long, since it must load a large amount of data into memory. From launching the detector until it is fully loaded can take more than 8 seconds.
- Figure 2 shows, according to a second embodiment, a landing template 200 based on circular shapes and colors.
- the landing template detector is responsible for the detection of concentric circular rings or annuluses (202,204,206) of different colors in the landing template 200.
- the colors of the different rings allow filtering of false positives.
- edge extraction and shape checks are performed.
- Template scalability: with a "closed" design, making the template large enough to be detected at a distance of 10 m causes detection to be lost when the drone is close (since only the inner circles remain visible). To solve this problem, checks must be added to or removed from the algorithm to adapt it to the final size.
- Figure 3 represents a third embodiment of the landing template 300, formed by a plurality of concentric circular crowns (302,304,306,308) of a dark tone, preferably black.
- the landing template detector is in this case based on detection of circular shapes and filtering with comparison of templates (“Template Matching”).
- the template comparison technique consists of searching an image for an input template, being a common technique to perform sign and license plate detection.
- Figure 4 depicts a fourth embodiment of the landing template 400, in which the landing template detector is based on recursive detection of circular shapes.
- the landing template 400 comprises a plurality of concentric annuluses 402, of decreasing thickness as the annuluses are closer to their center 406.
- the thickness of the annuluses, defined by the difference between the outer radius and the inner radius of each circular annulus, is selected so that the annuluses remain clearly identifiable when the landing template 400 is a certain distance away from the drone camera.
- the concentric circular annuluses 402 have a dark tone, preferably black, and are separated by spaces 404 in a light tone (white or another low-intensity color), also of decreasing thickness, in order to avoid an optical merging effect between the black concentric annuluses 402 as the landing template 400 moves away.
- the landing template 400 could also be considered to be formed by a plurality of consecutive two-tone concentric rings of decreasing thickness: first concentric annuluses 402 in a dark tone (for example, black) separated from each other by second concentric circular annuluses (or spaces 404) in a light tone (for example, white).
- This particular design of the landing template 400 allows the number of concentric annuluses 402 to be increased or decreased to make it larger or smaller to suit the required situation without having to change the detection procedure used by the landing template detector or with some minimum parameter changes, such as establishing the maximum and minimum radius (in pixels) of the circles to search.
- the total number of concentric circular annuluses 402 makes it possible to control the robustness against occlusions of some of the annuluses (leaves, reflections produced by the Sun, etc.), as well as against cases where only part of the template is visible because it lies at one of the edges of the image.
- a greater number of concentric annulus 402 offers greater robustness.
- Landing template 400 can obviously use a different number of concentric annuluses 402 and thicknesses, depending on the particular application (maximum landing template 400 size, minimum sensing distance, maximum sensing distance, etc.).
- the basis of the landing template detector's operation is to be able to detect a predetermined number N of concentric annuluses 402, where N is equal to or preferably less than the total number of concentric annuluses 402 in the landing template 400. In this way, the algorithm manages to detect both far and near distances.
- the number N of concentric circular annuluses 402 that the detector must search for is fixed for the entire process, both for near and far distances.
- N is a parameter whose value can be modified in real time, for example, to require a different number of concentric circles to be detected depending on the distance between the drone and the landing template 400 (e.g. the vertical distance measured by a height sensor, corrected for the drone's tilt angles).
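A minimal sketch of such a real-time adjustment policy might look like the following; the height thresholds and the total of six annuluses are assumptions for illustration, not values from the patent.

```python
# Hypothetical policy for adapting the parameter N with distance, as the
# text suggests. Thresholds and the total annulus count are assumptions.

def required_concentric_circles(height_m: float, n_total: int = 6) -> int:
    """Demand more concentric detections (more robustness) when the whole
    template is visible, and fewer when the drone is so close that only
    the innermost annuluses remain in the camera's field of view."""
    if height_m > 5.0:
        return n_total                 # whole template visible
    if height_m > 1.0:
        return max(n_total - 2, 2)     # intermediate approach
    return 2                           # very close: inner annuluses only
```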
- the template is easily scalable to different sizes, depending on the maximum distances to be detected.
- FIG. 5A depicts, according to one embodiment, a flowchart of a landing template detection procedure 500 executed by the landing template detector when landing template 400 of Figure 4 is used.
- a camera installed on board the drone and preferably oriented along the positive Z axis (axis directed downwards in normal flight attitude) in the body axis system of the aircraft acquires 501 an image 502.
- In normal flight attitude, the camera is oriented towards the ground; it is not necessary for the camera to be perpendicular to the ground, that is, the camera can be oriented at a certain angle with respect to the positive Z axis of the body axis system. In the event that the camera is not perpendicular to the ground but oriented at a small angle when it acquires the image, the circles on the landing template 400 will start to look like ellipses in the captured image, but the Hough transform will still detect them as circles and the detected center will remain valid.
- as the camera angle increases, the circular shapes of the landing template 400 will appear in the image as ellipses in which the semi-major and semi-minor axes differ increasingly in size, until a point is reached from which the Hough transform can no longer detect the circles.
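The flattening of the circles under an oblique view can be approximated with a simple orthographic model: the semi-minor axis shrinks roughly by the cosine of the tilt angle. This is a first-order sketch for intuition, not the patent's camera model.

```python
import math

# First-order (orthographic) sketch of how a circle of radius r, viewed
# at a tilt angle, projects to an ellipse: the semi-major axis stays ~r
# while the semi-minor axis shrinks by cos(tilt). Illustrative assumption.

def apparent_axes(radius_px: float, tilt_deg: float):
    """Return (semi_major, semi_minor) of the projected ellipse in pixels."""
    return radius_px, radius_px * math.cos(math.radians(tilt_deg))
```

At 0 degrees the shape remains a circle; at 60 degrees the minor axis is already half the major axis, illustrating how detection degrades with viewing angle.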
- Figure 6A shows, by way of example, an image 502 captured by a drone camera at the initial moment of landing (for example, at an initial height greater than 10 meters), before beginning the landing process.
- the image will contain the entire landing template 400, in addition of other details and forms (601,603,605) present in the landing zone.
- a detection process of N concentric circles in the image 503 is started, and the detection 504 of at least N concentric circles in the image, where N is a predetermined number, is checked.
- if at least N concentric circles are detected, detection 506 of the landing template 400 in the image is confirmed, and the position (Xc, Yc) of the center 406 of the landing template, located at the center of the detected concentric circles, is determined 508. If at least N concentric circles are not detected, it is considered that the landing template has not been detected 510 and is therefore not present in the image.
- the detection of the circumferences in the image is preferably carried out by means of the Hough transform applied to circles.
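As a rough illustration of the circle Hough transform mentioned above, the following self-contained sketch votes over candidate centers and radii on a tiny synthetic edge map. Real implementations (e.g. OpenCV's `HoughCircles`) use gradient information and are far faster; everything here, including the vote sampling step, is an illustrative assumption.

```python
import math

def draw_circle_edges(size: int, cx: int, cy: int, r: int):
    """Synthetic edge map: mark the pixels lying on one circle."""
    img = [[0] * size for _ in range(size)]
    for deg in range(360):
        x = round(cx + r * math.cos(math.radians(deg)))
        y = round(cy + r * math.sin(math.radians(deg)))
        if 0 <= x < size and 0 <= y < size:
            img[y][x] = 1
    return img

def hough_circle(edges, r_min: int, r_max: int):
    """Brute-force circle Hough voting: every edge pixel votes for each
    center (a, b) and radius r it could belong to; the accumulator peak
    is returned as the detected circle (a, b, r)."""
    h, w = len(edges), len(edges[0])
    acc = {}
    for y in range(h):
        for x in range(w):
            if not edges[y][x]:
                continue
            for r in range(r_min, r_max + 1):
                for deg in range(0, 360, 10):
                    a = round(x - r * math.cos(math.radians(deg)))
                    b = round(y - r * math.sin(math.radians(deg)))
                    if 0 <= a < w and 0 <= b < h:
                        acc[(a, b, r)] = acc.get((a, b, r), 0) + 1
    return max(acc, key=acc.get)

# A circle of radius 5 centered at (10, 10) on a 21x21 edge map.
edges = draw_circle_edges(21, 10, 10, 5)
best = hough_circle(edges, 3, 8)
```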
- Figure 5B shows a flow diagram of the detection process 503 of N concentric circles in the image, according to a preferred embodiment.
- circumferences 520 are detected in the image 502 captured by a drone camera by means of the Hough transform, obtaining detected circumferences 522.
- the Hough transform would detect, as shown in Figure 6B, the circumferences 522a and 522b, which correspond respectively to the outer contour of the landing template 400 and the outer contour of the circle 601.
- the first 402a, second 402b and third 402c outermost circular annuluses of the landing template 400 are represented and numbered, as are the first 404a, second 404b and third 404c outermost spaces of the landing template 400.
- the Hough transform favors the detection of the outermost circumference in the case of concentric circumferences (as is the case with landing template 400), since the outer circumference is the first to receive positive votes. Since the first step in applying the Hough transform is to perform edge extraction on the image, depending on the lighting conditions the strongest edge of each black annulus could be either the inner edge or the outer edge. The algorithm works in both cases.
- the Hough transform applied to image 502 provides the outermost circumference 522a of landing template 400, corresponding to the outer edge of outermost annulus 402a.
- a recursive search is performed for the N-1 remaining inner concentric circumferences in order to detect the N required concentric circumferences.
- the recursive search comprises cropping 524 the image 502 to remove the detected circumference 522, obtaining a cropped image 526, as shown in Figure 6C when the crop is applied to the detected circumference 522a corresponding to the landing template 400.
- each cropped image (526,538) is cropped 536 again to eliminate the corresponding additional circumference 530, obtaining a further cropped image 538 (which is then analyzed again to check whether an additional circumference 530 is detected 528).
- it is checked whether N-1 additional circumferences 530 have already been detected (through, for example, a counter of detected additional circumferences or an iteration counter), in which case it is determined that the landing template has been detected 506; otherwise, the cropped image is cropped 536 and the iterative process restarts with the detection 528 of an additional circumference 530 in the cropped image 538.
- first, an additional circumference 530a is detected 528, which corresponds to the outer edge of the outermost circular annulus 402b that appears complete (not cropped) in the cropped image 526, said annulus 402b being the second outermost annulus of landing template 400, as illustrated in Figure 6B.
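The recursive detect-crop-repeat loop described above can be sketched with the detection and cropping steps abstracted away. The toy stand-ins below just walk a list of radii, so only the control flow reflects the text; `detect_fn` and `crop_fn` are placeholders for the Hough step and the sub-image crop.

```python
# Sketch of the recursive "detect outer circle, crop it away, search
# again" loop of the template detector. The toy detect/crop functions
# are assumptions used only to exercise the control flow.

def detect_template(image, n_required: int, detect_fn, crop_fn):
    """Return the n_required concentric circles found (outermost first),
    or None if the template cannot be confirmed."""
    circles = []
    current = image
    for _ in range(n_required):
        found = detect_fn(current)
        if found is None:
            return None                    # fewer than N circles found
        circles.append(found)
        current = crop_fn(current, found)  # keep searching inside
    return circles

# Toy stand-ins: an "image" is just the list of circle radii still visible.
def toy_detect(img):
    return img[0] if img else None

def toy_crop(img, found):
    return img[1:]
```

With four visible circles and N = 3 the detection succeeds; with a single visible circle it fails, mirroring the "template not detected" branch.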
- the landing template detection unit also determines 508 the position (Xc, Yc) of the center 406 of the landing template 400.
- the coordinates (Xc, Yc) of the center of the landing template 400 can be taken as the coordinates of the center (606a, 606b, 606c) of any of the concentric circles (522a, 530a, 530b) or can be calculated from them, for example as the average of the coordinates of the centers (606a, 606b, 606c) of the detected concentric circles (522a, 530a, 530b).
- the cuts (524,536) of the images are made in such a way that the detected circumference is eliminated, to avoid detecting the same element again when the Hough transform is applied again to the cut image (526,538).
- the cropped image 526 is obtained by removing, at least partially, the outermost annulus 402a, so that none of its edges (inner or outer) can be detected when the Hough transform is applied to the cropped image 526.
- the specific cropping to be applied once a circumference is detected and the next one is to be detected can be adjusted based on the ratios of the distances between the concentric circular crowns 402 and the spaces 404.
- Figure 7 shows an example of cropping an image 700 (rectangular matrix) to remove a detected circumference 702.
- the cropped image 704 (or sub-image) is selected such that the detected circumference 702 is removed, to prevent the same element from being detected again in the cropped image 704.
- the detected circumference is defined by the parameters (x, y, r), where 'x' is the position of the center 706 of the circumference 702 on the X axis, 'y' is the position of the center 706 of the circumference 702 on the Y axis, and 'r' is the radius of the circumference 702.
- the rectangle 708 (or square, when the camera is arranged perpendicular to the landing template 400) containing the detected circumference 702 is calculated.
- x1, x2, y1, y2 are the minimum and maximum positions of the rectangle on the X and Y axes
- 'W' is the width of the image 700 in pixels
- 'H' is the height of the image 700 in pixels.
- a crop function is applied to the rectangle 708 to remove the detected circumference 702, reducing the size of the image 700 by a certain percentage.
- the chosen crop is such that the cropped image 704 maintains a certain percentage (e.g. 70%) of the dimensions of the rectangle 708 that encompasses the circumference 702.
- Figure 7 shows half (To/2) of the initial size (To) of the rectangle 708 on the X axis, half (Ti/2) of the final size (Ti) of the cropped image 704, and the crop percentage (%R) applied.
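A sketch of this cropping step: take the bounding square of the detected circle and shrink it so that the circle's own edge falls outside the sub-image. The 70% keep-ratio comes from the example above; the clamping to image bounds is an added assumption.

```python
# Illustrative crop around a detected circle (x, y, r) inside a W x H
# image. Keeping ~70% of the bounding square (per the example in the
# text) excludes the detected circumference while preserving the inner
# annuluses for the next detection pass.

def crop_rect(x: int, y: int, r: int, w: int, h: int, keep: float = 0.70):
    """Return (x1, y1, x2, y2) of the cropped sub-image, clamped to the
    image bounds (the clamping is an assumption, not from the patent)."""
    half = r * keep
    x1 = max(int(x - half), 0)
    y1 = max(int(y - half), 0)
    x2 = min(int(x + half), w - 1)
    y2 = min(int(y + half), h - 1)
    return x1, y1, x2, y2
```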
- the search for a circumference is carried out again.
- the cropped image 704 is smaller, so the computational cost of applying the Hough transform is reduced: in the new sub-image the maximum radius to be searched is smaller than in the previous image, and there are fewer pixels on which to apply edge and circumference detection.
- once an internal circumference is identified, it is checked, within a predetermined margin of error, whether it is concentric. This can be done by projecting the center of each circumference found onto the general coordinates of the input image, calculating the discrepancy of its position and verifying that it is less than a margin of error considered acceptable. The projection is done by adding, for each axis, the coordinates of the newly detected center to the coordinates of the sub-image clipping rectangle.
- if the concentricity check fails, the landing template detection algorithm discards the circumference. If the conditions can be validated for the required N total circumferences, the landing template detection is taken as valid.
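The concentricity test described above, projecting a center found in a sub-image back to full-image coordinates and comparing it with the reference center, can be sketched as follows; the 3-pixel tolerance is an assumed example value.

```python
# Sketch of the concentricity check: the center detected in the cropped
# sub-image is projected back to full-image coordinates by adding the
# crop rectangle's origin, then compared against the reference center.
# The default tolerance is an illustrative assumption.

def is_concentric(sub_center, crop_origin, ref_center, tol_px: float = 3.0):
    """True if the projected center lies within tol_px of the reference."""
    gx = sub_center[0] + crop_origin[0]   # project X to full-image coords
    gy = sub_center[1] + crop_origin[1]   # project Y to full-image coords
    dx, dy = gx - ref_center[0], gy - ref_center[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol_px
```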
- Landing template 400 can be supplemented by adding more internal shapes, such as squares or diamonds, for greater robustness against false positives.
- FIG 8 shows the flowchart of an autonomous precision landing method 800 for drones according to an embodiment of the present invention, which employs the process of landing template detection explained in Figures 5 to 7.
- the autonomous precision landing method 800 comprises the following steps:
- Detect 506 in each image 502 captured by the camera, a landing template 400 formed by a plurality of concentric circular crowns 402 of decreasing thickness by detecting 503 a predetermined number N of concentric circumferences in the corresponding image 502.
- Figure 9 illustrates the elements that make up an autonomous precision landing system 900 for drones according to an embodiment of the present invention.
- the autonomous precision landing system 900 includes elements mounted on a drone and responsible for the execution of the autonomous precision landing method 800 (FIG. 8) on board the drone during the landing sequence.
- the autonomous precision landing system 900 comprises the following elements: a ground-facing camera 910, a landing template detection unit 920, and a command system 930. As shown in Figure 9, the landing template detection unit 920 and the command system 930 can be incorporated in a control unit 940 (implemented for example in a CPU, a processor or a microcontroller), or they can be totally independent elements on board the drone, not included in a common device.
- the autonomous precision landing system 900 may also comprise a height sensor 912 to provide the command system 930 with the height HD of the drone relative to the ground. The height is preferably measured along the plumb line or along the Down axis of a NED (North East Down) system centered on the drone's center of mass.
- the landing template detection unit 920 is configured to detect, in each image 502 received from the camera 910, a landing template 400 formed by a plurality of concentric circular annuluses 402 of decreasing thickness by detecting a predetermined number N of concentric circumferences in the corresponding image 502. Each time it detects a landing template 400 in a received image 502, the landing template detection unit 920 calculates the position 508 of the landing template, determined for example by the position (Xc, Yc) of the center 406 of the landing template 400, and provides it to the command system 930.
- the command system 930 is configured to perform the autonomous landing of the drone on the landing template 400 using as reference the positions 508 of the landing template 400 detected by the landing template detection unit 920 in images captured by the camera 910 during landing.
- Figure 10 shows a drone 1010 that incorporates the autonomous precision landing system 900 of Figure 9, where the camera 910 is focused towards a landing zone 1020 in which a landing platform 1030 is arranged that supports a landing template 400 arranged horizontally and oriented upwards.
- Landing template 400 may be attached or adhered in any way (e.g. by means of a fastening or adhesive element) to landing platform 1030, or painted on its upper outer surface.
- the autonomous precision landing system 1000 comprises, in addition to the elements on board the drone and described in Figure 9, the landing template 400 and/or the landing platform 1030.
- the drone 1010 is ready to initiate a precision landing 1002, in a position and at a height where the camera 910 can capture the landing template 400, which will be used as a visual aid during the execution of the precision landing.
- the command system 930 is in turn made up of several modules: a command start management unit 932; a movement control unit 934, in charge of the general control of the movements of the drone 1010 based on its inputs; and a speed controller 936 configured to generate the corrections necessary to center the drone with respect to the landing template 400 according to the detections made by the landing template detection unit 920 during landing.
- the command system 930 remains on standby until a series of conditions are met that allow the precision landing 1002 to be initiated. Said precision landing initiation conditions are automatically detected by the command start management unit 932. In the event that any of the modules required for operation is not available, for example due to connection errors, the program ends.
- the flow chart in Figure 11 shows the different checks carried out by the command start management unit 932 before starting a precision landing 1002. First, the connection 1102 with the autopilot and the connection 1104 with the landing template detection unit 920 are checked; the latter unit is responsible for managing the camera 910, detecting the landing template 400 and returning the results related to its detection and its position. Such checks may include a maximum number of connection attempts.
- a reference code (e.g. an ArUco tag, QR code, ARtag or similar)
- the reference code can be incorporated into the landing template 400 itself (exterior to the concentric annuli 402) or, as shown in Figure 10 (reference code 1050), into the landing pad 1030 itself. For the detection of the reference code, any detection method known in the state of the art can be used.
- All 'OFFBOARD' conditions 1106 have to be fulfilled one by one, each for at least a specific time, according to a chained check following a certain flow (for example, according to the flow chart of Figure 12). In the event that any of the conditions fails, all the checks are repeated from the beginning.
- precision landing 1002 is initiated. If the precision landing conditions are not met, an alternative landing 1110 is initiated, in which the drone is directed automatically, using guidance techniques already known in the state of the art, to an alternative place to land, for example a point defined by the user in the system configuration. This alternative point or location can be defined by setting a distance from the takeoff point (HomePos) to the desired point and an orientation (North, South, East, West, or a more specific heading in degrees). Once the alternate landing site is reached, the drone goes into the autopilot's landing mode.
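The alternate landing point described above, defined by a distance from the takeoff point (HomePos) plus a heading, can be resolved into a local position offset. The sketch below assumes a flat local North/East frame; the function and parameter names are hypothetical, not taken from the patent:

```python
import math

def alternate_landing_point(home_north, home_east, distance_m, heading_deg):
    """Resolve (distance, heading) from HomePos into a local North/East
    position. heading_deg follows compass convention: 0 = North, 90 = East.
    Illustrative sketch only; the frame convention is an assumption."""
    rad = math.radians(heading_deg)
    return (home_north + distance_m * math.cos(rad),
            home_east + distance_m * math.sin(rad))
```

For instance, a 10 m distance with a 90-degree heading yields a point 10 m East of HomePos in the local frame.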
- This precision landing allows the integration of other systems to automate the maintenance of the drone or leave it protected, such as integration with so-called drone nests ("Drone Nest"), or landing in spaces with limited dimensions. A non-precision landing system or alternate landing, by contrast, is aimed at landing safely in use cases where it is acceptable to assume the errors of around 3 m that a conventional GNSS receiver can have.
- a flowchart of the state control performed by the motion control unit 934 during a precision landing 1002 is shown in Figure 13.
- the drone initiates the precision landing 1002 in a stationary state 1302 (no movement, according to the check made in 1204).
- the detection of the landing template by the landing template detection unit 920 is checked 1304 by analyzing at least one image captured by the camera 910 based on the previously described detection algorithm. If the landing template is not detected, it is checked 1306 whether the drone is positioned at a given maximum height; in the affirmative case, it returns to the stationary state 1302 to check again 1304 the detection of the landing template, and otherwise the drone goes to an ascent state 1308 to reach the determined maximum height from which it can detect the landing template and start the landing sequence.
- In the event that the drone remains hovering 1302 for a certain time without detecting the landing template, the drone abandons the precision landing 1002 and initiates an alternate landing 1110 instead. Similarly, if from the start of the precision landing 1002 an internal time counter reaches a certain maximum value, the precision landing 1002 is canceled and the alternate landing 1110 proceeds.
- the drone enters a horizontal approach state 1310 in order to center itself horizontally with respect to the landing template 400 (i.e., center drone 1010 at the coordinates (Xc, Yc) of the center 406 of the landing template 400).
- the drone checks 1312 the detection of the landing template in at least one image captured by the camera 910. If it does not detect the landing template 400 for a certain period of time (during the checks the drone remains stationary), it is checked 1306 whether the drone is positioned at the maximum height.
- In the event that the landing template is detected, it proceeds to check 1314 whether the drone is centered on it. If it is not centered, it returns to the horizontal approach state 1310 to issue the appropriate speed commands that allow the drone 1010 to center on the detected landing template 400. This horizontal approach process is repeated until the drone is centered on the landing template, at which point the drone enters a vertical approach state 1316 in order to gradually descend and vertically approach the landing template 400.
- the drone descends vertically, repeatedly checking during the descent whether the template is still detected 1318 and whether the drone is still centered 1320. In case the landing template is no longer detected (for example, when a strong crosswind gust has substantially displaced the drone and prevents it from detecting the landing template at its current height), the drone enters the climb state 1308 until template detections 1304 are obtained again or the established maximum flight height 1306 is reached, in which case it returns to the stationary state 1302, as at the beginning of the precision landing 1002. If during the descent it is verified that the drone is no longer centered, it returns to the horizontal approach state 1310 to center the drone on the landing template.
- as the drone 1010 descends and remains centered on the landing template 400, it is checked 1322 whether the drone is positioned at the landing height (e.g. at the height of the landing template), at which point it enters a landing state 1324 and terminates the precision landing sequence 1002.
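The transitions between the stationary, climb, horizontal approach, vertical approach and landing states described above can be summarized as a small state machine. The following Python sketch is a simplified illustration (timeout handling and the alternate landing 1110 branch are omitted, and all names are assumptions rather than the patent's implementation):

```python
from enum import Enum, auto

class State(Enum):
    STATIONARY = auto()           # hovering, checking for the template
    ASCEND = auto()               # climbing toward the maximum height
    HORIZONTAL_APPROACH = auto()  # centering over the template
    VERTICAL_APPROACH = auto()    # descending while centered
    LAND = auto()                 # final landing state

def next_state(state, template_detected, centered, at_max_height,
               at_landing_height):
    """One transition step of the landing state machine (cf. Figure 13)."""
    if not template_detected:
        # Template lost: climb, unless already at the flight ceiling
        return State.STATIONARY if at_max_height else State.ASCEND
    if not centered:
        return State.HORIZONTAL_APPROACH
    if at_landing_height:
        return State.LAND
    return State.VERTICAL_APPROACH
```

Calling `next_state` once per detection cycle reproduces the loop of re-centering and descending until the landing height is reached.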
- the motion control unit 934 uses the speed controller 936 in the horizontal approach state 1310 to center the drone 1010 on the landing template 400 during precision landing 1002.
- the speed controller 936 is responsible for generating the correction speeds necessary to center the drone, depending on the position (Xc, Yc) of the landing template detected at each instant.
- the speed controller 936 is also responsible for generating the vertical ascent (in climb state 1308) and descent (in vertical approach state 1316) speeds of the drone.
- the implemented speed controller 936 is preferably based on a proportional controller with maximum and minimum output cutoff. This cutoff can be set based on the input offset instead of being a fixed preset value.
- the inputs to the speed controller 936 are as follows:
- Offset on the X axis (offset_x), in pixels, between the center of the image 502 and the center (Xc, Yc) of the landing template 400.
- Offset on the Y axis (offset_y), in pixels, between the center of the image 502 and the center (Xc, Yc) of the landing template 400.
- A percentage MaxSpeedPer.
- The maximum output speed value on the horizontal X axis (MaxSpeed_x) and on the horizontal Y axis (MaxSpeed_y) is calculated as:
- MaxSpeed_x = abs(deviation_x) + MaxSpeedPer × abs(deviation_x)
- MaxSpeed_y = abs(deviation_y) + MaxSpeedPer × abs(deviation_y)
- the speed to command on each horizontal axis (X,Y) is calculated as follows:
- the accuracy of the calculation largely depends on the quality of the height measurement as well as the perpendicularity between the camera 910 (the main lens axis) and the landing template 400.
- the distance measured by the height sensor 912 is corrected when the drone 1010 is not perpendicular to the plane of the landing template 400.
- the pitch angles of the drone 1010, when moving to apply the corrections or due to the need to overcome the wind, are small, so they do not pose a problem for the calculation of the physical error with the camera.
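A common way to implement the height correction mentioned above is to project the distance measured by the rangefinder onto the vertical using the drone's attitude angles. The cosine projection below is an assumption about the sensor geometry (a body-fixed, downward-pointing rangefinder), not the method stated in the patent:

```python
import math

def corrected_height(measured_distance, pitch_rad, roll_rad):
    """Project a body-fixed rangefinder measurement onto the vertical.
    Valid for small tilt angles over a roughly flat landing plane;
    the geometry is an illustrative assumption."""
    return measured_distance * math.cos(pitch_rad) * math.cos(roll_rad)
```

With zero tilt the measurement is unchanged; at a 60-degree pitch the same slant distance corresponds to half the vertical height.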
- V_x = PhysicalError_x × g
- V_y = PhysicalError_y × g
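Putting the pieces together, a proportional controller whose output cutoff depends on the input offset, as described for speed controller 936, might look like the following sketch. The gain `k_p` and the percentage `max_speed_per` are hypothetical tuning values, not values from the patent:

```python
def horizontal_velocity_command(offset_x, offset_y, k_p=0.5,
                                max_speed_per=0.2):
    """Proportional controller with an offset-dependent output cutoff.
    offset_x / offset_y: pixel offsets between the image center and the
    detected template center. Returns (v_x, v_y) velocity commands."""
    def axis_command(offset):
        # Cutoff scales with the input offset rather than being fixed:
        # MaxSpeed = abs(deviation) + MaxSpeedPer * abs(deviation)
        max_speed = abs(offset) + max_speed_per * abs(offset)
        v = k_p * offset  # proportional term
        return max(-max_speed, min(max_speed, v))
    return axis_command(offset_x), axis_command(offset_y)
```

With a small gain the command stays proportional to the offset; with a large gain the offset-dependent cutoff caps it.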
Abstract
Description
AUTONOMOUS PRECISION LANDING SYSTEM, METHOD AND PROGRAM PRODUCT FOR DRONES

DESCRIPTION

Field of the invention

The present invention falls within the field of autonomous landing systems for drones.
Background of the invention

Currently, there are autonomous landing systems for drones that use different technologies to land in a specific landing zone, among which the following stand out:

• Infrared light detection: based on the use of an array of infrared LEDs with respect to which the drone positions itself by seeking its center (e.g. the "IRLock" autonomous landing system). The problem with using infrared is the possible contamination of the light footprint, especially with direct sunlight, which makes it difficult to perform precise landings in certain environmental conditions.

• Code detection: uses for landing the detection of a code (e.g. the ArUco code employed by the "Flytbase" autonomous landing system) printed on the landing zone, which can be used both to identify the landing zone (to check, for example, that it is the one corresponding to the drone) and as a navigation reference during landing. However, current landing systems based on code detection (e.g. ArUco, QR, ARtag) have the disadvantage that they can only detect the code when it is completely visible, so they do not work correctly with partial detections of the code. This happens very frequently when, during the landing operation, the drone is at a low altitude, already close to the landing zone, or when external elements (such as leaves fallen from trees) partially cover the code.

In the field of precise landing systems for drones, where the drone needs to land in a small and specific space (for example, on a small landing platform so that the drone can be charged automatically), an RTK-type GNSS navigation data source is currently used to achieve landings with errors of a few centimeters. But an RTK system has the disadvantage of needing external systems, such as GNSS constellations (which do not provide guarantees of operational persistence for critical systems, as is the case with safe drone landing systems) and the telemetry link with the base for receiving the RTK corrections, which can also fail. This dependence on external systems can cause failures (e.g. in data communication), in addition to requiring additional hardware elements.

The present invention proposes an autonomous precision landing system that solves these problems.
Description of the invention

The invention relates to an autonomous precision landing system and method for drones. The invention is based on the detection of a specific landing template. The main components of the system are a landing template detector and a system for commanding corrections to the drone.

According to one embodiment, the autonomous precision landing system for drones of the present invention comprises the following elements:

A camera installed on board a drone and oriented towards the ground.

A landing template detection unit, on board the drone and configured to detect, in an image captured by the camera, a landing template formed by a plurality of concentric circular annuli of decreasing thickness, by detecting a predetermined number N of concentric circles in the image. The landing template detection unit analyzes an input image and returns whether or not the template is present and the coordinates or relative position of the center of the template in the input image.

A command system, on board the drone and configured to perform an autonomous landing of the drone on the landing template, using as reference the positions of the landing template detected in images captured by the camera during landing.
Advantageously, the landing template detection unit can be used in real environments on a drone, with real-time detection rates above 25 Hz, which is important in order to counteract movements of the drone in flight, caused both by the corrections applied to it and by external disturbances such as the wind.

The template detection algorithm can run on a small, low-cost device, such as the Nvidia Jetson Nano. The detection provides robustness against changing lighting conditions, being able to work with clear skies, with clouds, and in conditions in which shadows fall on part of the landing template.

The template detection of the present invention provides robustness against occlusions, being able to withstand small elements that may cover the landing template, such as leaves falling on top of it. The detection is also robust against drone vibrations, since the flight of the drone induces vibrations on the camera mount. In addition, the degree of robustness of the detection can be adapted by adjusting the predetermined number N of concentric circles that must be detected in the image; for example, detections can be made more robust by requiring a larger number N of concentric detections. By using a visual pattern (unlike autonomous landing systems based on RTK), the landing system can guarantee the operational persistence necessary to ensure precise landings.

Regarding the detection distance, the landing template detection unit is capable of detecting the template at a minimum distance of about 15 cm (which is necessary to maintain guidance down to a sufficiently low height) and at a maximum distance of more than 10 m, which is necessary to be able to start the automated control when the drone's return position has a static error of a couple of meters.

The landing template detection procedure works with a simple (monocular RGB) and light (less than 50 g) camera, so as not to reduce the payload capacity or autonomy of the drone.
The size of the detection template is limited by the dimensions of the landing platform on which the drone is to land. In one embodiment, a size of 80x80 cm is considered, for example. The minimum size of the detection template depends on the maximum distance at which detections are to be made; for example, in an embodiment where the size is reduced to 60x60 cm, detections at a distance of 10 m could still be guaranteed. The detection distance and the template size are related in a linearly proportional manner. If the required detection distance is shorter, the size can be reduced. The absolute minimum size of the detection template is that which prevents the circular annuli from merging, which can be around 20x20 cm.
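Since the detection distance and the template size are stated to be linearly proportional, the minimum template size for a required detection distance can be estimated from the 60x60 cm at 10 m figure given above, with the 20x20 cm absolute floor. The function below is an illustrative sketch of that scaling (the name and parameter defaults are assumptions):

```python
def min_template_size_cm(max_detection_distance_m,
                         ref_size_cm=60.0, ref_distance_m=10.0,
                         floor_cm=20.0):
    """Scale the template side length linearly with the required
    detection distance, never going below the absolute minimum at
    which the annuli would merge."""
    scaled = ref_size_cm * max_detection_distance_m / ref_distance_m
    return max(floor_cm, scaled)
```

For example, guaranteeing detections up to 5 m would call for roughly a 30x30 cm template under this linear model.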
Regarding the drone command system, it is in charge of performing the automatic landing in a safe manner, generating an appropriate set of movements. To do this, it uses a valid communication system (e.g. communication via MAVROS) with the chosen autopilot (e.g. PX4). The command system is capable of executing the control procedure at a real-time rate greater than 25 Hz.

The command system is capable of detecting the moment at which the drone must switch from the autopilot's own control (e.g. PX4) to the automatic precision landing control of the present invention. To do this, it evaluates that a series of conditions are met.

The command system repeatedly manages descent actions and centering actions of the drone during landing, since even if an initial centering of the drone with respect to the landing template is performed before starting the descent, alterations in the centering of the drone may occur during the descent (produced, for example, by small gusts of wind) that make new corrections necessary for the drone to be correctly centered at the moment of landing.

Likewise, the command system is capable of managing recovery actions, such as ascending to a predetermined level in the event that the landing template is no longer visible (e.g. due to lateral displacement caused by the wind, a large object covering the template, etc.), or actions alternative to the precision landing (e.g. landing in an alternative zone, handing manual control to the pilot, etc.) when the landing process exceeds a pre-established time limit.
The present invention also relates to an autonomous precision landing method for drones, comprising the following steps:

- Repeatedly acquiring images by means of a camera installed on board a drone and oriented towards the ground.

- Detecting, in each image captured by the camera, a landing template formed by a plurality of concentric circular annuli of decreasing thickness, by detecting a predetermined number N of concentric circles in the image.

- Performing an autonomous landing of the drone on the landing template, using as reference the positions of the landing template detected in images captured by the camera during landing.

A further aspect of the present invention relates to a program product comprising program instruction means for carrying out the autonomous precision landing method for drones.
Brief description of the drawings

Below, a series of drawings is briefly described that help to better understand the invention and that expressly relate to an embodiment of said invention, presented as a non-limiting example thereof.
Figure 1 shows a first embodiment of a circular landing template.

Figure 2 illustrates a second embodiment of a circular landing template.

Figure 3 represents a third embodiment of a circular landing template.

Figure 4 represents a fourth embodiment of a circular landing template.

Figure 5A shows a landing template detection procedure in which the landing template of Figure 4 is used. Figure 5B illustrates, according to one embodiment, the process of detecting N concentric circles in the image.

Figures 6A-6D show an example of the processing and analysis of an image captured by a drone camera for the detection of the landing template.

Figure 7 shows an example of cropping an image to eliminate a circle detected in it.

Figure 8 shows, according to one embodiment, an autonomous precision landing method for drones.

Figure 9 illustrates an autonomous precision landing system for drones according to a possible embodiment.

Figure 10 shows a drone prepared to initiate a precision landing on a landing template arranged on a landing platform.

Figures 11 and 12 show different checks that are carried out, according to a possible embodiment, before initiating a precision landing.

Figure 13 shows a state control of the drone performed during a precision landing, according to a possible embodiment.
Detailed description of the invention

The present invention relates to an autonomous precision landing system for drones based on the detection of a landing template used as a visual reference for landing. The landing template must be identified by the drone during landing in order to land safely on top of it.

The autonomous precision landing system comprises a landing template detection unit (or landing template detector) configured to detect the presence and position of the landing template in images captured by a ground-facing camera of the drone, and a drone command system in charge of executing the maneuvers and movements of the drone during landing and of commanding to the drone the appropriate corrections for its autonomous landing on the landing template, based on the detections made by the landing template detector, and in particular taking as reference the positions of the landing template detected in images captured by the camera during the landing sequence.
In the field of shape detection, there are multiple methods to detect the shapes of elements present in a digital image. In the case of circles, the most common classical computer vision methods are based on edge extraction followed by further processing, such as contour extraction and the Hough transform. Alternatives include connected-component extraction and pattern search by template matching, among others. Another option is to use computer vision techniques based on artificial intelligence, such as convolutional neural networks. As for the shapes to detect in the template, some of the most effective are squares (e.g., ArUco, QR or ARTag markers) and circles. When no information needs to be encoded inside the template, circles have the advantage of being invariant under rotation, as well as being easier to distinguish when not the entire template is visible.
Figures 1 to 4 present different embodiments of landing templates, and of the algorithms used for their detection, using circles as the shapes to detect.
Figure 1 shows a circular landing template 100, formed by two concentric annuli (102, 104), used by a landing template detector based on YOLO ("You Only Look Once"), a real-time object detection system. This embodiment uses Tiny YOLOv2, a convolutional neural network (CNN) for object detection and classification. Tiny YOLOv2 is a smaller version of YOLOv2 with shorter execution times.
In order to detect the template, the neural network must be retrained, which is achieved by generating a set of examples in which the template is labeled and assigned a class, in this case the only class to be detected. Algorithms like YOLO have the advantage of good resistance to noise and occlusions, but they present the following problems:
• They need to run on a device with a GPU (e.g. the Nvidia Jetson family of products), ruling out the possibility of using them on simpler computers such as an Odroid.
• They need neural network optimization libraries (e.g. TensorRT) that change frequently and are not stable enough for production use.
• The initialization time of the detector is long, since it must load a large amount of data into memory. More than 8 seconds can pass between launching the detector and it being ready.
• Limited execution speed, with values around 10-12 Hz on the Jetson Nano.
• The need to generate training and test datasets and to validate the training runs, which is very time-consuming.

Figure 2 shows, according to a second embodiment, a landing template 200 based on circular shapes and colors. The landing template detector is responsible for detecting concentric annuli (202, 204, 206) of different colors in the landing template 200. The colors of the different rings allow filtering out false positives. The circles are detected by means of edge extraction and shape checks.
The main advantages of this embodiment are:
• A low number of false positives when proper color management is used.
• Good execution speed (above 20 FPS) on CPU.
• It only needs to rely on the OpenCV library.
• Good resistance to occlusions.
The problems it presents are:
• Difficulty interpreting colors under real lighting conditions (changes in white balance and exposure), which means that automatic parameter search methods have to be implemented for each situation, and these can lead to an increase in false positives.
• Template scalability. Given its "closed" design, making the template large enough to be detected by the drone from 10 m away causes detections to be lost when the drone is close (since only the inner circles are visible). To solve this problem, checks must be added to or removed from the algorithm to adapt it to the final size.
Figure 3 shows a third embodiment of the landing template 300, formed by a plurality of concentric annuli (302, 304, 306, 308) of a dark tone, preferably black. The landing template detector is in this case based on the detection of circular shapes and filtering by template matching. The template matching technique consists of searching an image for an input template, and is a common technique for detecting traffic signs and license plates.
The main problems with this technique are:
• Managing the scale of the template (it must be very similar to that of the object present in the image to be analyzed).
• Managing occlusions, since these produce a low confidence in the metric used to compare the template with the section of the input image.
If, to help the algorithm, templates of multiple sizes or with predefined occlusions are searched for, the execution times become too high. Furthermore, the method is not rotation-invariant, so adding additional shapes to the template to increase robustness greatly complicates the implementation.
Figure 4 shows a fourth embodiment of the landing template 400, in which the landing template detector is based on the recursive detection of circular shapes. The landing template 400 comprises a plurality of concentric annuli 402 of decreasing thickness towards its center 406. The thickness of each annulus, defined as the difference between its outer and inner radii, is selected so that the annuli remain clearly identifiable when the landing template 400 is a certain distance away from the drone's camera.
The concentric annuli 402 have a dark tone, preferably black, and are separated by spaces 404 of a light tone (white, or another low-intensity color) and of decreasing thickness, so that there is no optical overlapping effect between the black concentric annuli 402 as the landing template 400 moves away. Alternatively, the landing template 400 can also be considered to be formed by a plurality of consecutive two-tone concentric rings of decreasing thickness: first concentric annuli 402 in a dark tone (for example, black) separated from each other by second concentric annuli (the spaces 404) in a light tone (for example, white).
This particular design of the landing template 400 allows the number of concentric annuli 402 to be increased or decreased to make the template larger or smaller and adapt it to the required situation, without having to change the detection procedure used by the landing template detector, or with minimal parameter changes, such as setting the maximum and minimum radii (in pixels) of the circles to search for.
The total number of concentric annuli 402 controls the robustness against occlusions of some of the annuli (leaves, glare produced by the Sun, etc.), as well as against cases where only part of the template is visible because it lies at one of the edges of the image. A larger number of concentric annuli 402 offers greater robustness.
In the example shown in Figure 4, in which a maximum landing template size of 80x80 cm is used and a minimum detection distance of about 15 cm and a maximum detection distance greater than 10 m are required, 7 concentric annuli 402 have been considered. To maximize the landing template 400, the concentric annuli 402 are designed so that the outermost annulus uses the 40 cm maximum radius allowed for the template. The inner and outer radii of the different annuli selected in the example of Figure 4 are as follows (units in cm):
The landing template 400 can obviously use a different number of concentric annuli 402 and thicknesses, depending on the particular application (maximum size of the landing template 400, minimum detection distance, maximum detection distance, etc.).
The basis of the landing template detector's operation is its ability to detect a predetermined number N of concentric annuli 402, where N is equal to or, preferably, less than the total number of concentric annuli 402 in the landing template 400. In this way, the algorithm achieves detections at both far and near distances.
According to one embodiment, the number N of concentric annuli 402 that the detector must search for is fixed for the entire process, both for near and far distances. However, since N is a parameter, its value can be modified in real time, for example to require a different number of concentric circles to be detected depending on the distance between the drone and the landing template 400 (e.g. the vertical distance measured by a height sensor, corrected for the drone's tilt angles). The minimum value of the parameter N is N=2 (that is, detecting one pair of concentric circles), and the maximum value of N depends on the number of visually identifiable circles in an image extracted from the drone's camera, which in turn depends on the initial image resolution, the size of the template and the distance from the drone to the template. Increasing the parameter N implies a decrease in the overall performance of the algorithm, since a greater number of checks must be performed; however, given that the image to be analyzed becomes smaller and smaller (due to the successive crops), the cost of each additional check grows less than linearly.
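As a purely illustrative sketch of the idea above (the text does not prescribe a concrete policy; the function name, the linear mapping and the default values below are assumptions), one could map the tilt-corrected distance reported by the height sensor to a required circle count, clamped between the N=2 minimum and the template's total annulus count:

```python
import math

def circles_to_require(altitude_m, roll_rad, pitch_rad,
                       n_total=7, full_view_distance_m=10.0):
    """Hypothetical policy: estimate the camera-to-template distance from the
    height sensor, correcting for the drone's tilt angles, and require fewer
    concentric circles as the drone gets closer (the outer annuli leave the
    frame). Clamped to [2, n_total]."""
    # Slant distance along the camera axis for a downward-facing camera.
    distance_m = altitude_m / (math.cos(roll_rad) * math.cos(pitch_rad))
    n = round(n_total * distance_m / full_view_distance_m)
    return max(2, min(n_total, n))
```

For example, at 10 m in level flight this asks for all 7 annuli of the Figure 4 template, while at 1 m it falls back to the N=2 minimum.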
To detect the concentric annuli of the landing template 400 of Figure 4, the Hough transform algorithm applied to circles is used. The basis of the algorithm's operation is as follows:
1. Detect circles in the image captured by the drone's camera.
2. For each detected circle, perform N-1 recursive searches for concentric circles inside it.
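The two steps above can be sketched in miniature. The sketch below is a simplification, not the patent's implementation: a Hough pass over a (cropped) image is replaced by a stand-in that picks the largest circle from a pre-detected list of (cx, cy, r) tuples, and "cropping to the interior" is modeled as restricting the search to circles fully contained in the current one:

```python
import math

def largest_circle_inside(circles, outer=None):
    """Stand-in for one Hough detection pass over a cropped image: return the
    largest circle, optionally restricted to those strictly inside `outer`."""
    if outer is not None:
        ox, oy, orad = outer
        circles = [(x, y, r) for (x, y, r) in circles
                   if math.hypot(x - ox, y - oy) + r < orad]
    return max(circles, key=lambda c: c[2], default=None)

def detect_template(circles, n):
    """Step 1: take each detected circle as a candidate outer circle.
    Step 2: recursively search for n-1 concentric circles inside it."""
    for outer in sorted(circles, key=lambda c: -c[2]):
        chain = [outer]
        while len(chain) < n:
            inner = largest_circle_inside(circles, chain[-1])
            if inner is None:
                break  # fewer than n nested circles: not the template
            chain.append(inner)
        if len(chain) == n:
            return chain  # template detected; chain[0] is its outer circle
    return None
```

An isolated distractor circle elsewhere in the image fails the inner searches, while the template's nested annuli produce a full chain of n circles.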
The main advantages of this technique are:
• High robustness against partial occlusions, since detections are achieved with up to 50% occlusion when the template is cut along one of its axes. When the template is cut along both axes, detections are still achieved when only 30-40% of the template is visible.
• Good robustness against lighting conditions. The detection algorithm works correctly under multiple real conditions, in direct light and in shadow.
• High execution speed. It is a computationally simple technique capable of running in real time at more than 25 Hz (e.g. more than 40 Hz on the Nvidia Jetson Nano).
• It works correctly with different camera models without parameter changes, so extending it to new cameras is straightforward.
• The template is easily scalable to different sizes, depending on the maximum distances at which it must be detected.
To increase robustness against false positives, the detection algorithm is started at the moment the drone is positioned above the landing zone and the landing sequence is to begin. In addition, this can be complemented with checks of additional codes present in the landing zone, such as ArUco or QR tags, which also make it possible to link a given drone with a given landing zone.

Figure 5A shows, according to one embodiment, a flowchart of a landing template detection procedure 500 executed by the landing template detector when the landing template 400 of Figure 4 is used. First, a camera installed on board the drone, preferably oriented along the positive Z axis (the axis pointing downwards in normal flight attitude) of the aircraft's body axis system, acquires 501 an image 502. In normal flight attitude, the camera is oriented towards the ground; the camera does not need to be perpendicular to the ground in normal flight attitude, that is, it can be oriented at a certain angle with respect to the positive Z axis of the body axis system. If the camera is not perpendicular to the ground but oriented at a small angle with respect to it when the image is acquired, the circles of the landing template 400 will begin to appear as ellipses in the captured image, but the Hough transform will still represent them as circles and the detected center will remain valid. As the tilt of the camera with respect to the ground at the moment of capture increases, the circles of the landing template 400 will appear in the image as ellipses whose semi-major and semi-minor axes differ more and more in size, until a point is reached beyond which the Hough transform is no longer able to detect the circles.
Figure 6A shows, by way of example, an image 502 captured by a drone camera at the initial moment of landing (for example, at an initial height greater than 10 meters), before the landing process begins. When the drone is in a landing start position above the landing zone, at a certain height with respect to the landing template 400 (e.g. at a height of 10 m), the image will contain the entire landing template 400, in addition to other details and shapes (601, 603, 605) present in the landing zone.
Next, a process 503 of detecting N concentric circles in the image is started, and the detection 504 of at least N concentric circles in the image is checked, where N is a predetermined number. If at least N concentric circles are detected, the detection 506 of the landing template 400 in the image is determined, and the position (Xc, Yc) of the center 406 of the landing template, located at the center of the detected concentric circles, is determined 508. If at least N concentric circles are not detected, the landing template is considered not detected 510 and, therefore, not present in the image. The circles in the image are preferably detected by means of the Hough transform applied to circles.
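As an illustrative complement (the grouping tolerance, the averaging of centers and the function name are assumptions, not taken from the text), checking for N concentric circles (step 504) and computing the center (Xc, Yc) (step 508) from a list of Hough-detected circles could look like:

```python
import math

def template_center(circles, n, tol_px=5.0):
    """If at least n detected circles (cx, cy, r) share a common center
    (within tol_px pixels), report the template as detected and return its
    center (Xc, Yc) as the mean of the grouped centers; otherwise return
    None (template not detected, step 510)."""
    for (cx, cy, _r) in circles:
        group = [(x, y) for (x, y, _) in circles
                 if math.hypot(x - cx, y - cy) <= tol_px]
        if len(group) >= n:
            xs = [x for x, _ in group]
            ys = [y for _, y in group]
            return (sum(xs) / len(xs), sum(ys) / len(ys))
    return None
```

Averaging the grouped centers slightly smooths out the pixel-level noise of the individual Hough detections.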
Figure 5B shows a flowchart of the process 503 of detecting N concentric circles in the image, according to a preferred embodiment. First, circles 520 are detected in the image 502 captured by a drone camera by means of the Hough transform, obtaining detected circles 522. In the example of Figure 6A, the Hough transform would detect, as shown in Figure 6B, the circles 522a and 522b, which correspond respectively to the outer contour of the landing template 400 and the outer contour of the circle 601. Figure 6B shows, numbered, the first 402a, second 402b and third 402c outermost annuli of the landing template 400, and the first 404a, second 404b and third 404c outermost spaces of the landing template 400.
The use of the Hough transform favors the detection of the outermost circle when concentric circles exist (as is the case with the landing template 400), since the outer circle is the first to receive positive votes. Given that the first step in applying the Hough transform is to perform edge extraction on the image, depending on the lighting conditions the strongest edge of each black annulus could be either its inner or its outer edge. The algorithm works in both cases. In the example of Figure 6B, the Hough transform applied to the image 502 provides the outermost circle 522a of the landing template 400, corresponding to the outer edge of the outermost annulus 402a.
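To make the voting mechanism concrete, here is a deliberately minimal, pure-Python circle Hough transform (practical implementations such as OpenCV's cv2.HoughCircles work on an edge image and use gradient information; the brute-force voting below is an illustrative simplification). Each edge point votes for every candidate center lying at each candidate radius from it, and the accumulator cell with the most votes wins:

```python
import math

def hough_circles(edge_points, radii, angle_steps=360):
    """Minimal circle Hough transform: every edge point (px, py) votes for
    the candidate centers (px - r*cos t, py - r*sin t) at each candidate
    radius r; the accumulator cell with the most votes is returned as
    (cx, cy, r)."""
    votes = {}
    for (px, py) in edge_points:
        for r in radii:
            for k in range(angle_steps):
                t = 2 * math.pi * k / angle_steps
                cell = (round(px - r * math.cos(t)),
                        round(py - r * math.sin(t)), r)
                votes[cell] = votes.get(cell, 0) + 1
    return max(votes, key=votes.get)

# Synthetic "edge image": 40 points on a circle of radius 12 centered at (30, 25).
edges = [(30 + 12 * math.cos(2 * math.pi * k / 40),
          25 + 12 * math.sin(2 * math.pi * k / 40)) for k in range(40)]
```

Running `hough_circles(edges, radii=[10, 12, 14])` recovers `(30, 25, 12)`, since every edge point contributes votes to the true center at the true radius, while votes at the wrong radii are smeared over a ring and never concentrate.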
For each detected circumference (522a and 522b in the example of Figure 6B), a recursive search is performed for the N-1 remaining inner concentric circumferences, in order to detect the N required concentric circumferences. The recursive search comprises cropping 524 the image 502 to remove the detected circumference 522, obtaining a cropped image 526, as represented in Figure 6C when the crop is applied to the detected circumference 522a corresponding to the landing template 400.
Once the first cropped image 526 has been obtained, the following steps are repeated iteratively until N-1 concentric circumferences inner to the detected circumference have been detected (or until no further inner concentric circumferences are detected, in the event that there are fewer than N concentric circumferences for the detected circumference -522a, 522b- being analyzed):
- Detect 528 an additional circumference 530 in each cropped image 526 by means of the Hough transform. If no circumference (i.e. additional circumference 530) is found in the cropped image, the landing template is considered not detected 510.
- Check 532 that each additional circumference 530 is concentric with the detected circumference 522, within a margin of error. If it is not, the landing template is considered not detected 510.
- Crop 536 each cropped image (526, 538) to remove the corresponding additional circumference 530, obtaining a further cropped image 538 (which will in turn be analyzed to check whether an additional circumference 530 is detected 528).
During the iterative process it is checked 534 (for example, after checking 532 the concentricity of the additional circumference 530 with respect to the detected circumference 522) whether N-1 additional circumferences 530 have already been detected (by means of, for example, a counter of detected additional circumferences or an iteration counter), in which case it is determined that the landing template has been detected 506; otherwise, the cropped image is cropped 536 again and the iterative process restarts, beginning again with the detection 528 of an additional circumference 530 in the cropped image 538.
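The iterative search just described can be sketched as follows. This is a simplified illustration under stated assumptions: `detect_circle`, `crop` and `is_concentric` are hypothetical helpers standing in for the Hough-based detector, the crop function and the concentricity check, and are not names from the patent:

```python
def detect_template(image, detect_circle, crop, is_concentric, n_required):
    """Search for n_required concentric circumferences.

    detect_circle(img)           -> (x, y, r) or None  (Hough-based, stubbed)
    crop(img, circle)            -> sub-image with the circle removed
    is_concentric(outer, inner)  -> bool, within a margin of error
    """
    outer = detect_circle(image)          # outermost circumference (522)
    if outer is None:
        return False                      # template not detected (510)
    sub = crop(image, outer)              # crop 524 -> cropped image 526
    for _ in range(n_required - 1):       # iteration counter (check 534)
        inner = detect_circle(sub)        # detect 528
        if inner is None or not is_concentric(outer, inner):
            return False                  # template not detected (510)
        sub = crop(sub, inner)            # crop 536 -> cropped image 538
    return True                           # template detected (506)

# Toy run: a fake detector that returns three concentric circles in turn
circles = [(50, 50, 40), (50, 50, 25), (50, 50, 12)]
fake = lambda img: circles.pop(0) if circles else None
same_center = lambda a, b: abs(a[0] - b[0]) <= 2 and abs(a[1] - b[1]) <= 2
print(detect_template(None, fake, lambda img, c: img, same_center, 3))  # True
```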
Applying this iterative process to the cropped image 526 of Figure 6C for N=3, a first additional circumference 530a is detected 528, corresponding to the outer edge of the outermost complete (not cropped) circular annulus 402b in the cropped image 526, said annulus 402b being the second outermost annulus of the landing template 400, as illustrated in Figure 6B. Next, it is checked 532 that the first additional circumference 530a is indeed concentric with the detected circumference 522a, and that N-1 (i.e. 2) additional circumferences have not yet been detected 534, so the cropped image 526 (Figure 6C) is cropped 536 again, obtaining a new cropped image 538 (Figure 6D), in which a second additional circumference 530b concentric with the detected circumference 522a is detected, corresponding to the outer edge of the outermost circular annulus 402c in the cropped image 538 (the third outermost annulus of the landing template 400, Figure 6B). This ends the iterative process, since the two required additional circumferences (530a, 530b) have been detected, and the detection 506 of the landing template 400 is determined.
The landing template detection unit also determines 508 the position (Xc, Yc) of the center 406 of the landing template 400. The coordinates (Xc, Yc) of the center of the landing template 400 can be taken as the coordinates of the center (606a, 606b, 606c) of any of the concentric circumferences (522a, 530a, 530b), or can be calculated from them, for example as the mean value of the coordinates of the centers (606a, 606b, 606c) of the detected concentric circumferences (522a, 530a, 530b).
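Taking the template center as the mean of the detected circle centers can be written, for instance, as the following sketch (the center coordinates are illustrative values, not from the patent):

```python
def template_center(circles):
    """Estimate the template center (Xc, Yc) as the mean of the centers
    (x, y) of the detected concentric circumferences."""
    xs = [c[0] for c in circles]
    ys = [c[1] for c in circles]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Centers of circumferences 522a, 530a, 530b (illustrative pixel values)
print(template_center([(320, 242), (322, 240), (321, 241)]))  # (321.0, 241.0)
```

Averaging smooths out small per-circle localization errors, at the cost of being sensitive to a single badly localized circle; taking the center of any single circumference, as the text also allows, avoids that dependency.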
The crops (524, 536) of the images are made in such a way that the detected circumference is removed, to avoid detecting the same element again when the Hough transform is applied to the cropped image (526, 538). For example, in Figure 6C the cropped image 526 is obtained by removing, at least partially, the outermost circular annulus 402a, so that none of its edges (inner or outer) can be detected when the Hough transform is applied to the cropped image 526. The specific crop to apply once a circumference has been detected and the next one is to be searched for can be adjusted according to the ratios of the distances between the concentric circular annuluses 402 and the gaps 404.
Figure 7 shows an example of cropping an image 700 (a rectangular matrix) to remove a detected circumference 702. The cropped image 704 (or sub-image) is selected in such a way that the detected circumference 702 is removed, to avoid detecting the same element again in the cropped image 704. The detected circumference is defined by the parameters (x, y, r), where 'x' is the position of the center 706 of the circumference 702 on the X axis, 'y' is the position of the center 706 of the circumference 702 on the Y axis, and 'r' is the radius of the circumference 702.
To apply the crop, the rectangle 708 (or square, when the camera is arranged perpendicular to the landing template 400) containing the detected circumference 702 is calculated. The points defining the four corners of the rectangle 708 are calculated as follows:

x1 = max(x - r, 0)
x2 = min(x + r, W)
y1 = max(y - r, 0)
y2 = min(y + r, H)
Where x1, x2, y1, y2 are the minimum and maximum positions of the rectangle on the X and Y axes, 'W' is the width of the image 700 in pixels and 'H' is the height of the image 700 in pixels.
Depending on the distance between the circular annuluses 402 and the gaps 404, a crop function is applied to the rectangle 708 to remove the detected circumference 702, reducing the size of the image 700 by a certain percentage. For example, in the crops shown in Figures 6C, 6D and 7, the chosen crop is such that the cropped image 704 keeps a certain percentage (e.g. 70%) of the dimensions of the rectangle 708 enclosing the circumference 702. Figure 7 shows half (To/2) of the initial size (To) of the rectangle 708 on the X axis, half (Ti/2) of the final size (Ti) of the cropped image 704, and the crop percentage (%R) applied.
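The corner formulas and the percentage reduction above might be implemented as in the following sketch; the function names are assumptions for the example, and 70% is the example value given in the text:

```python
def bounding_rect(x, y, r, W, H):
    """Rectangle 708 enclosing the circle (x, y, r), clamped to the
    W-by-H image, per the corner formulas x1..y2."""
    x1 = max(x - r, 0)
    x2 = min(x + r, W)
    y1 = max(y - r, 0)
    y2 = min(y + r, H)
    return x1, y1, x2, y2

def shrink_rect(x1, y1, x2, y2, keep=0.70):
    """Keep `keep` of the rectangle's dimensions around its center, so the
    detected circumference near the border falls outside the sub-image."""
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    hw = (x2 - x1) * keep / 2
    hh = (y2 - y1) * keep / 2
    return cx - hw, cy - hh, cx + hw, cy + hh

rect = bounding_rect(x=100, y=80, r=50, W=640, H=480)
print(rect)                # (50, 30, 150, 130)
print(shrink_rect(*rect))  # roughly (65.0, 45.0, 135.0, 115.0)
```

Clamping to the image borders matters when the template lies near the edge of the frame; the shrink factor then determines whether the next inner circumference survives the crop, which is why the text ties it to the annulus/gap ratios.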
Once the image 700 has been cropped, the search for a circumference is performed again. Moreover, at each recursive step the cropped image 704 is smaller, so the computational cost of applying the Hough transform is reduced: in the new sub-image the maximum radius to search for is smaller than in the previous image, and there are fewer pixels on which to apply edge and circumference detection.
When an inner circumference is identified, it is determined, within a predetermined margin of error, whether it is concentric. This can be done by projecting the centers of the circumferences found onto the general coordinates of the input image, calculating the discrepancy between their positions and checking that it is below a margin of error considered acceptable. The projection is performed by adding, for each axis, the coordinates of the newly detected center to the coordinates of the cropping rectangle of the sub-image.
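The projection back to full-image coordinates and the concentricity check can be sketched as follows (a minimal illustration; the function names and the 3-pixel margin are assumptions, not values from the patent):

```python
def to_global(center, crop_origin):
    """Project a center detected in a sub-image back to the coordinates of
    the input image by adding, on each axis, the origin (x1, y1) of the
    cropping rectangle of that sub-image."""
    return (center[0] + crop_origin[0], center[1] + crop_origin[1])

def concentric(center_a, center_b, max_error=3.0):
    """Accept the pair as concentric if the discrepancy between the two
    projected centers is below the acceptable margin of error (pixels)."""
    dx = center_a[0] - center_b[0]
    dy = center_a[1] - center_b[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_error

outer = (321, 240)                       # center in full-image coordinates
inner = to_global((71, 41), (248, 198))  # center found inside the sub-image
print(inner, concentric(outer, inner))   # (319, 239) True
```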
If any of the steps of the inner search fails, the landing template detection algorithm discards the circumference. If the conditions can be validated for the N total required circumferences, the landing template detection is taken as valid.
The landing template 400 can be complemented by adding more internal figures, such as squares or diamonds, to obtain greater robustness against false positives.
Figure 8 shows the flow diagram of an autonomous precision landing method 800 for drones according to an embodiment of the present invention, which employs the landing template detection process explained in Figures 5 to 7. The autonomous precision landing method 800 comprises the following steps:
- Acquire 501 images 502 repeatedly during the landing sequence by means of a camera installed on board a drone and oriented towards the ground.
- Detect 506, in each image 502 captured by the camera, a landing template 400 formed by a plurality of concentric circular annuluses 402 of decreasing thickness, by detecting 503 a predetermined number N of concentric circumferences in the corresponding image 502.
- Perform the autonomous landing 810 of the drone on the landing template 400, using as reference the positions 508 of the landing template 400 detected in images captured by the camera during the landing.
Figure 9 illustrates the elements that make up an autonomous precision landing system 900 for drones according to an embodiment of the present invention. The autonomous precision landing system 900 includes elements on board a drone which are in charge of executing the autonomous precision landing method 800 (Figure 8) on board the drone during the landing sequence.
The autonomous precision landing system 900 comprises the following elements: a camera 910 oriented towards the ground, a landing template detection unit 920 and a command system 930. As shown in Figure 9, the landing template detection unit 920 and the command system 930 may be incorporated in a control unit 940 (implemented for example in a CPU, a processor or a microcontroller), or they may be entirely independent elements on board the drone, not included in a common device. The autonomous precision landing system 900 may also comprise a height sensor 912 to provide the command system 930 with the height HD of the drone with respect to the ground. The direction of the height measurement would preferably be along the plumb line, or along the Down axis of a NED (North East Down) frame centered on the center of mass of the drone.
The landing template detection unit 920 is configured to detect, in each image 502 received from the camera 910, a landing template 400 formed by a plurality of concentric circular annuluses 402 of decreasing thickness, by detecting a predetermined number N of concentric circumferences in the corresponding image 502. Each time it detects a landing template 400 in a received image 502, the landing template detection unit 920 calculates the position 508 of the landing template, determined for example by the position (Xc, Yc) of the center 406 of the landing template 400, and provides it to the command system 930.
The command system 930 is configured to perform the autonomous landing of the drone on the landing template 400, using as reference the positions 508 of the landing template 400 detected by the landing template detection unit 920 in images captured by the camera 910 during the landing.
Figure 10 shows a drone 1010 incorporating the autonomous precision landing system 900 of Figure 9, where the camera 910 is focused towards a landing zone 1020 in which a landing platform 1030 is arranged, supporting a landing template 400 arranged horizontally and oriented upwards. The landing template 400 may be fixed or adhered in any way (e.g. by fixing means or an adhesive element) to the landing platform 1030, or painted on its external upper surface. In one embodiment, the autonomous precision landing system 1000 comprises, in addition to the elements on board the drone described in Figure 9, the landing template 400 and/or the landing platform 1030.
The drone 1010 is ready to initiate a precision landing 1002, in a position and at a height at which the camera 910 can capture the landing template 400, which will be used as a visual aid during the execution of the precision landing.
According to the embodiment shown in Figure 9, the command system 930 is in turn composed of several modules: a command start management unit 932; a movement control unit 934, in charge of the general control of the movements of the drone 1010 as a function of the inputs; and a speed controller 936 configured to generate the corrections needed to center the drone with respect to the landing template 400, according to the detections of the landing template 400 made by the landing template detection unit 920 during the landing.
The command system 930 remains on standby until a series of conditions allowing the precision landing 1002 to be initiated are met. Said precision landing start conditions are detected automatically by the command start management unit 932. In the event that any of the modules required for operation is not available, for example due to connection errors, the program ends. The flow diagram of Figure 11 represents different checks carried out by the command start management unit 932 before initiating a precision landing 1002. First, the connection 1102 with the autopilot is checked, as well as the connection 1104 with the landing template detection unit 920, which is in charge of managing the camera 910, performing the detections of the landing template 400 and returning the results regarding its detection and its position. Said checks may include a maximum number of connection attempts.
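A connection check with a maximum number of attempts might look like the following sketch (the function name and the attempt limit are assumptions for illustration):

```python
def connect_with_retries(try_connect, max_attempts=5):
    """Attempt to establish a connection (e.g. with the autopilot or with
    the landing template detection unit) up to max_attempts times; if all
    attempts fail, the caller ends the program."""
    for attempt in range(1, max_attempts + 1):
        if try_connect():
            return attempt      # connected on this attempt
    return None                 # give up after max_attempts failures

# Toy connection that succeeds on the third attempt
tries = iter([False, False, True])
print(connect_with_retries(lambda: next(tries)))  # 3
```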
It is then checked whether the conditions for switching to the automatically controlled flight mode ('OFFBOARD' conditions 1106) are met. In one embodiment, shown in the flow diagram of Figure 12, said conditions include the following checks:
1. Being close to the takeoff point (HomePos) and flying in the RTL return-to-home mode 1202 ("Return to Launch", a mode that allows the drone to return to the area from which it took off and land automatically).
2. Not moving 1204 (or moving minimally, within a threshold), in order to know whether it is already in the final position. This is done by evaluating the velocities on each axis of the drone 1010.
3. Having an orientation similar to the orientation it had at the moment of takeoff 1206 (takeoff orientation), since the RTL mode leaves the drone oriented towards the same place from which it took off.
4. Optionally, it is checked 1208 whether or not the detection of an established reference code (e.g. an ArUco tag, QR code, ARtag or similar) present in the landing zone 1020 is required as an additional check, in which case it is checked whether the reference code has been detected 1210 in an image (502) captured by the camera (910). The reference code can be incorporated into the landing template 400 itself (outside the concentric circular annuluses 402) or, as shown in Figure 10 (reference code 1050), on the landing platform 1030 itself. Any detection method known in the state of the art can be used to detect the reference code.
All the 'OFFBOARD' conditions 1106 have to be met one by one, each for at least a specific time, in a chained check following a certain flow of checks (for example, according to the flow diagram of Figure 12). In the event that any of the conditions fails, all the checks are repeated from the beginning. Returning to the flow diagram of Figure 11, once it is determined that all the 'OFFBOARD' conditions 1106 are met, it is checked 1108 whether certain conditions for finally proceeding with the precision landing are met (precision landing conditions), which is determined as a function of whether the wind speed levels exceed a certain limit or whether gusts of wind have occurred in the last moments (e.g. a certain number of minutes) prior to the landing.
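The chained check, in which the conditions are evaluated in order and any failure restarts the whole sequence, can be sketched as follows (the condition names and the `state` dictionary are illustrative assumptions, not identifiers from the patent):

```python
def detected_ref_code():
    """Stub for the reference code detection (check 1210)."""
    return False

def offboard_conditions_met(checks):
    """Evaluate the chain of 'OFFBOARD' condition checks in order. If any
    check fails, the chain fails and, in the real system, all checks would
    be repeated from the beginning."""
    for name, check in checks:
        if not check():
            return False, name   # report which condition broke the chain
    return True, None

state = {"near_home_in_rtl": True, "not_moving": True,
         "takeoff_orientation": True, "ref_code_required": False}
checks = [
    ("near_home_in_rtl", lambda: state["near_home_in_rtl"]),       # check 1202
    ("not_moving", lambda: state["not_moving"]),                   # check 1204
    ("takeoff_orientation", lambda: state["takeoff_orientation"]), # check 1206
    ("reference_code",                                             # checks 1208/1210
     lambda: (not state["ref_code_required"]) or detected_ref_code()),
]
print(offboard_conditions_met(checks))  # (True, None)
```

A real implementation would also require each condition to hold for a minimum time before moving on to the next, as the text specifies.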
If the conditions for precision landing are met (e.g. wind intensity below a certain threshold and/or no wind gusts during a certain time prior to landing), the precision landing 1002 is initiated. If the precision landing conditions are not met, an alternative landing 1110 is initiated, in which the drone is directed automatically, using guidance techniques already known in the state of the art, to an alternative landing site, for example a point defined by the user in the system configuration. This alternative point or site can be defined by setting a distance from the takeoff point (HomePos) to the desired point and an orientation (North, South, East, West, or a more specific heading in degrees). Once the alternative landing site is reached, the drone switches to the autopilot's landing mode. The precision landing, with maximum errors of a few centimeters, makes it possible to integrate other systems that automate drone maintenance or shelter the drone, such as integration with so-called drone nests ("Drone Nest"), or to land in spaces of limited dimensions, whereas a non-precision landing system or the alternative landing is aimed at landing safely, for use cases where it is acceptable to assume the errors of around 3 m that a conventional GNSS receiver can produce.
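As an illustration of how such an alternative point could be derived from HomePos plus a distance and a heading, the following sketch uses a flat-Earth approximation that is adequate for short distances. The function name and the use of latitude/longitude coordinates are assumptions; the document does not specify the coordinate representation.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters

def offset_point(lat_deg, lon_deg, distance_m, heading_deg):
    """Return the (lat, lon) of the point `distance_m` away from the given
    position along `heading_deg` (0 = North, 90 = East).
    Flat-Earth approximation, valid for distances of a few hundred meters."""
    heading = math.radians(heading_deg)
    d_north = distance_m * math.cos(heading)
    d_east = distance_m * math.sin(heading)
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

# e.g. a point 50 m due East of a hypothetical HomePos
alt_lat, alt_lon = offset_point(40.0, -3.0, 50.0, 90.0)
```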
Figure 13 shows a flow chart of the state control performed by the motion control unit 934 during a precision landing 1002. The drone starts the precision landing 1002 in a stationary state 1302 (no movement, according to the check made in 1204). The detection of the landing template by the landing template detection unit 920 is checked 1304 by analyzing at least one image captured by the camera 910 with the previously described detection algorithm. If the landing template is not detected, it is checked 1306 whether the drone is positioned at a given maximum height; if so, it returns to the stationary state 1302 to check again 1304 for detection of the landing template; otherwise, the drone enters an ascent state 1308 to reach the given maximum height, from which it can detect the landing template and start the landing sequence. If the drone remains in the stationary state 1302 for a certain time without detecting the landing template, it abandons the precision landing 1002 and initiates an alternative landing 1110 instead. Similarly, if an internal timer started at the beginning of the precision landing 1002 reaches a given maximum value, the precision landing 1002 is canceled and the alternative landing 1110 proceeds.
If the landing template 400 is detected, the drone enters a horizontal approach state 1310 in order to center itself horizontally with respect to the landing template 400 (that is, to center the drone 1010 on the coordinates (Xc, Yc) of the center 406 of the landing template 400). Once the drone enters the horizontal approach state 1310, it checks 1312 for detection of the landing template in at least one image captured by the camera 910. If it does not detect the landing template 400 within a certain time margin (during these checks the drone remains stationary), it is checked 1306 whether the drone is positioned at the maximum height.
If the landing template is detected, it is checked 1314 whether the drone is centered on it. If it is not centered, the drone returns to the horizontal approach state 1310 so that the appropriate speed commands are issued to center the drone 1010 on the detected landing template 400. This horizontal approach process is repeated until the drone is centered on the landing template, at which point the drone enters a vertical approach state 1316 in order to descend gradually and approach the landing template 400 vertically.
During the vertical approach state 1316 the drone descends vertically, repeatedly checking during the descent whether the template is still detected 1318 and whether the drone is still centered 1320. If the landing template is no longer detected (for example, because a strong crosswind gust has substantially displaced the drone and prevents it from detecting the landing template at its current height), the drone enters the ascent state 1308 until detections of the template are obtained again 1304 or the established maximum flight height is reached 1306, in which case it returns to the stationary state 1302, as at the start of the precision landing 1002. If during the descent it is found that the drone is no longer centered, it returns to the horizontal approach state 1310 to center itself on the landing template. As the drone 1010 descends and remains centered on the landing template 400, it is checked 1322 whether the drone is positioned at the landing height (e.g. the height at which the landing template is located), at which point it enters a landing state 1324 and the precision landing sequence 1002 is considered complete.
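The state transitions of Figure 13 can be summarized in code. This is a minimal sketch of the decision logic only, with no timers or motion commands; the state names mirror the reference numerals in the text, and the boolean inputs are assumed to come from the detection and height checks described above.

```python
from enum import Enum, auto

class State(Enum):
    STATIONARY = auto()           # 1302
    ASCENT = auto()               # 1308
    HORIZONTAL_APPROACH = auto()  # 1310
    VERTICAL_APPROACH = auto()    # 1316
    LANDED = auto()               # 1324

def next_state(state, template_detected, centered, at_max_height, at_landing_height):
    """One step of the Figure 13 state machine (timers omitted)."""
    if state in (State.STATIONARY, State.ASCENT):
        if template_detected:
            return State.HORIZONTAL_APPROACH
        # no template: hold at max height, otherwise keep climbing
        return State.STATIONARY if at_max_height else State.ASCENT
    if state == State.HORIZONTAL_APPROACH:
        if not template_detected:
            return State.STATIONARY if at_max_height else State.ASCENT
        return State.VERTICAL_APPROACH if centered else State.HORIZONTAL_APPROACH
    if state == State.VERTICAL_APPROACH:
        if not template_detected:
            return State.ASCENT  # climb until the template reappears
        if not centered:
            return State.HORIZONTAL_APPROACH
        return State.LANDED if at_landing_height else State.VERTICAL_APPROACH
    return state  # LANDED is terminal
```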
The motion control unit 934 uses the speed controller 936 in the horizontal approach state 1310 to center the drone 1010 on the landing template 400 during the precision landing 1002. The speed controller 936 is responsible for generating the correction speeds needed to center the drone, based on the position (Xc, Yc) of the landing template detected at each instant. The speed controller 936 is also responsible for generating the drone's vertical ascent speed (in the ascent state 1308) and descent speed (in the vertical approach state 1316).
The implemented speed controller 936 is preferably based on a proportional controller with maximum and minimum output cutoff. This cutoff can be set as a function of the input deviation instead of being a fixed preset value.
In one embodiment, the inputs of the speed controller 936 are the following:
- Deviation on the X axis (deviation_x), in pixels, between the center of the image 502 and the center (Xc, Yc) of the landing template 400.
- Deviation on the Y axis (deviation_y), in pixels, between the center of the image 502 and the center (Xc, Yc) of the landing template 400.
- Height (H_D) of the drone with respect to the ground, acquired by a height sensor 912 installed on the drone, such as a small laser rangefinder.
- Proportional gain (g).
- Percentage (MaxSpeedPer) used to set the maximum and minimum cutoff value as a function of the input deviation.
The maximum output speed on the horizontal X axis (MaxSpeed_x) and on the horizontal Y axis (MaxSpeed_y) is calculated as:

MaxSpeed_x = abs(deviation_x) + MaxSpeedPer × abs(deviation_x)
MaxSpeed_y = abs(deviation_y) + MaxSpeedPer × abs(deviation_y)

The speed to be commanded on each horizontal axis (X, Y) is calculated as follows:
1. Calculate the physical distance between the drone 1010 and the center (Xc, Yc) of the landing template 400 on each horizontal axis from deviation_x and deviation_y. This distance is referred to as the PhysicalError. It is obtained by computing the Ground Sampling Distance (GSD): from a height value H_D and the properties of the camera (resolution, field of view (FOV) and focal length of the lens), this calculation establishes how much physical distance corresponds to each pixel of the image, so that, knowing the difference in pixels (deviation_x, deviation_y) between the center of the image 502 and the center (Xc, Yc) of the landing template 400, the physical distance in the horizontal plane (X and Y axes) can be calculated. The accuracy of the calculation depends largely on the quality of the height measurement and on the perpendicularity between the camera 910 (the axis of the main lens) and the landing template 400. The distance measured by the height sensor 912 is corrected when the drone 1010 is not perpendicular to the plane of the landing template 400. In real landing situations, the tilt angles of the drone 1010 when moving to apply corrections, or due to the need to counteract the wind, are small, so they do not pose a problem for computing the PhysicalError with the camera.
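A common way to compute the ground sampling distance from the height and camera parameters is the following. The exact formula used in the patent is not given, so this sketch assumes the standard pinhole/FOV relation and a camera looking straight down; the function names are illustrative.

```python
import math

def gsd_m_per_px(height_m, fov_deg, resolution_px):
    """Meters on the ground covered by one image pixel, assuming a
    nadir-pointing camera (pinhole model, FOV along the given axis)."""
    ground_width_m = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    return ground_width_m / resolution_px

def physical_error_m(deviation_px, height_m, fov_deg, resolution_px):
    """Convert a pixel deviation from the image center into meters."""
    return deviation_px * gsd_m_per_px(height_m, fov_deg, resolution_px)

# e.g. at 10 m height with a 90-degree FOV and 1280 px image width,
# each pixel covers 2*10*tan(45 deg)/1280 of a meter on the ground
```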
2. The horizontal output speed (V_x, V_y) to be commanded on each horizontal axis (X, Y) is calculated by applying the proportional controller:

V_x = PhysicalError_x × g
V_y = PhysicalError_y × g
3. If the speed to be commanded on an axis (OutputSpeed_x, OutputSpeed_y) exceeds the maximum output speed (MaxSpeed_x, MaxSpeed_y) or the minimum (its opposite value) for that axis, the speed is limited to the maximum output speed corresponding to that axis.
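Steps 1 to 3 can be combined into a single sketch of the proportional controller with deviation-dependent cutoff. Variable names follow the text; the pixel-to-meter conversion assumes the standard FOV-based GSD relation, and the cutoff follows the text's formula literally (computed from the deviation in pixels), so the units are those of the document's equations rather than a physically derived speed limit.

```python
import math

def command_speed(dev_x_px, dev_y_px, height_m, fov_deg, res_px, gain, max_speed_per):
    """Proportional speed command with deviation-dependent output cutoff
    (steps 1-3 of the text). Returns (Vx, Vy)."""
    # Step 1: pixel deviation -> physical error via ground sampling distance
    gsd = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0) / res_px
    err_x, err_y = dev_x_px * gsd, dev_y_px * gsd
    # Cutoff as a function of the input deviation, not a fixed preset value
    max_x = abs(dev_x_px) + max_speed_per * abs(dev_x_px)
    max_y = abs(dev_y_px) + max_speed_per * abs(dev_y_px)
    # Step 2: proportional controller
    vx, vy = err_x * gain, err_y * gain
    # Step 3: clamp each axis to [-max, +max]
    vx = max(-max_x, min(max_x, vx))
    vy = max(-max_y, min(max_y, vy))
    return vx, vy
```

With a large gain the output saturates at the deviation-dependent cutoff instead of a fixed value, which is the behavior described for the controller 936.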
Claims
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/ES2021/070126 WO2022180276A1 (en) | 2021-02-23 | 2021-02-23 | Autonomous precision landing system, method and program for drones |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022180276A1 true WO2022180276A1 (en) | 2022-09-01 |
Family
ID=75497955
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/ES2021/070126 Ceased WO2022180276A1 (en) | 2021-02-23 | 2021-02-23 | Autonomous precision landing system, method and program for drones |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022180276A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018035835A1 (en) * | 2016-08-26 | 2018-03-01 | SZ DJI Technology Co., Ltd. | Methods and system for autonomous landing |
| CN109270953A (en) * | 2018-10-10 | 2019-01-25 | 大连理工大学 | A kind of multi-rotor unmanned aerial vehicle Autonomous landing method based on concentric circles visual cues |
| CN110569838A (en) * | 2019-04-25 | 2019-12-13 | 内蒙古工业大学 | A method for autonomous landing of quadrotor UAV based on vision positioning |
| CN110618691A (en) * | 2019-09-16 | 2019-12-27 | 南京信息工程大学 | Machine vision-based method for accurately landing concentric circle targets of unmanned aerial vehicle |
| KR102069240B1 (en) * | 2018-10-23 | 2020-01-22 | 한국항공우주연구원 | Drone landing apparatus for ship and control method for drone landing using the same |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230343087A1 (en) * | 2016-08-06 | 2023-10-26 | SZ DJI Technology Co., Ltd. | Automatic terrain evaluation of landing surfaces, and associated systems and methods |
| US12217500B2 (en) * | 2016-08-06 | 2025-02-04 | Sz Dji Technology Co, Ltd. | Automatic terrain evaluation of landing surfaces, and associated systems and methods |
| WO2024058890A1 (en) * | 2022-09-16 | 2024-03-21 | Wing Aviation Llc | Pixel-by-pixel segmentation of aerial imagery for autonomous vehicle control |
| US12242282B2 (en) | 2022-09-16 | 2025-03-04 | Wing Aviation Llc | Pixel-by-pixel segmentation of aerial imagery for autonomous vehicle control |
| EP4484302A1 (en) * | 2023-06-26 | 2025-01-01 | The Boeing Company | Vision-based vehicle guidance |
| WO2026009090A1 (en) * | 2024-07-01 | 2026-01-08 | Drptech S.R.L. | Method for moving a drone of a drone-in-a-box system and a drone-in-a-box system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2022180276A1 (en) | Autonomous precision landing system, method and program for drones | |
| US12428147B2 (en) | Unmanned aerial vehicle inspection system | |
| US12007761B2 (en) | Unmanned aerial vehicle inspection system | |
| CN110991207B (en) | Precise landing method of UAV based on H pattern recognition and AprilTag QR code recognition | |
| CN106054929B (en) | A kind of unmanned plane based on light stream lands bootstrap technique automatically | |
| US9513635B1 (en) | Unmanned aerial vehicle inspection system | |
| ES2893959T3 (en) | Autonomous landing methods and system | |
| CN109387186B (en) | Surveying and mapping information acquisition method, device, electronic device and storage medium | |
| EP3668792A1 (en) | Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets | |
| JP2012071645A (en) | Automatic taking-off and landing system | |
| WO2019189381A1 (en) | Moving body, control device, and control program | |
| CN118603103A (en) | UAV indoor navigation system based on visual SLAM | |
| CN113867373A (en) | UAV landing method, device, apron and electronic equipment | |
| Mondragón et al. | Omnidirectional vision applied to Unmanned Aerial Vehicles (UAVs) attitude and heading estimation | |
| CN109613926A (en) | Multi-rotor unmanned aerial vehicle land automatically it is High Precision Automatic identification drop zone method | |
| KR101778209B1 (en) | Digital map making system using auto generated digital map data of polygon type | |
| Masselli et al. | A novel marker based tracking method for position and attitude control of MAVs | |
| JP7795547B2 (en) | Image processing device, image processing method and program | |
| Masselli et al. | A cross-platform comparison of visual marker based approaches for autonomous flight of quadrocopters | |
| US20200073409A1 (en) | Uav landing system and landing method thereof | |
| Liu et al. | Vision-guided autonomous landing of multirotor UAV on fixed landing marker | |
| ES2895461T3 (en) | Object recognition procedure and system by analyzing digital image signals of a scene | |
| US20210327283A1 (en) | Systems and Methods for Mobile Aerial Flight Planning and Image Capturing Based on Structure Footprints | |
| Gabdullin et al. | Analysis of onboard sensor-based odometry for a quadrotor uav in outdoor environment | |
| JP6714802B2 (en) | Control device, flying body, control method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21718627; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21718627; Country of ref document: EP; Kind code of ref document: A1 |