CN106927059A - UAV landing method and device based on monocular vision - Google Patents
A UAV landing method and device based on monocular vision
- Publication number
- CN106927059A CN106927059A CN201710216117.7A CN201710216117A CN106927059A CN 106927059 A CN106927059 A CN 106927059A CN 201710216117 A CN201710216117 A CN 201710216117A CN 106927059 A CN106927059 A CN 106927059A
- Authority
- CN
- China
- Prior art keywords
- flight
- unmanned plane
- point
- pilot point
- image information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 27
- 230000007613 environmental effect Effects 0.000 claims description 13
- 238000000605 extraction Methods 0.000 claims description 8
- 230000004927 fusion Effects 0.000 claims description 7
- 238000006243 chemical reaction Methods 0.000 claims description 6
- 230000001960 triggered effect Effects 0.000 claims description 5
- 238000003860 storage Methods 0.000 claims description 4
- 238000012937 correction Methods 0.000 claims description 3
- 230000008054 signal transmission Effects 0.000 claims description 3
- 230000010006 flight Effects 0.000 description 6
- 238000005516 engineering process Methods 0.000 description 5
- 238000001914 filtration Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 238000004891 communication Methods 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000007781 pre-processing Methods 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000003702 image correction Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- 230000000750 progressive effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
- B64D45/04—Landing aids; Safety measures to prevent collision with earth's surface
- B64D45/08—Landing aids; Safety measures to prevent collision with earth's surface optical
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Navigation (AREA)
Abstract
The invention discloses a UAV landing method and device based on monocular vision, including: when the UAV prepares to return, determining a target flight guide point and controlling the UAV to fly to it; when the UAV reaches the target flight guide point, taking it as the current flight point and judging whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information stored for that point in the flight map; if they do not match, adjusting the UAV's position; if they match, taking the next flight guide point as the target flight guide point and controlling the UAV to fly to it, until the UAV lands on the ground. Thus, when the UAV returns, its return path can be corrected against the previously created flight map, ensuring autonomous landing and making the landing of the UAV more accurate and intelligent.
Description
Technical field
The present invention relates to the technical field of unmanned aerial vehicles (UAVs), and more specifically to a UAV landing method and device based on monocular vision.
Background art
Today, thanks to their ease of use, maneuverability, and controllability, UAVs have become a research focus in the aviation field and are widely used in settings such as video surveillance, reconnaissance, aerial survey and photography, disaster search and rescue, and film and television production. However, autonomous landing of UAVs in complex environments is both a technical difficulty and a growing focus of attention. Autonomous UAV landing refers to the process by which a UAV performs positioning, navigation, and final touchdown using its navigation equipment and flight control system. Autonomous landing places high demands on navigation and control precision and reliability, and is an important foundation and key technology for autonomous UAV flight. Traditional navigation technologies for autonomous UAV landing include: inertial navigation systems, GPS navigation systems, and combined inertial/GPS navigation systems.
An inertial navigation system is one that obtains navigation parameters such as the carrier's position and velocity from inertial elements such as gyroscopes and accelerometers. In general, an inertial navigation system contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers measure the object's acceleration along the three axes of the carrier coordinate system, the gyroscopes measure the carrier's angular velocity relative to the navigation coordinate system, and from the angular velocity and acceleration in three-dimensional space the object's attitude is computed. Its disadvantage is that its error diverges over time, so it is difficult for it to operate independently for long periods and it must be used in combination with other navigation methods.
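The divergence described above can be illustrated numerically: double-integrating even a small constant accelerometer bias makes the position error grow roughly quadratically with time, which is why a pure inertial system cannot work alone for long. A minimal sketch (the bias value and step size below are illustrative assumptions, not figures from the patent):

```python
def position_error_from_bias(bias, dt, steps):
    """Double-integrate a constant accelerometer bias; return the position error."""
    v = 0.0  # velocity error accumulated from the bias
    x = 0.0  # position error accumulated from the velocity error
    for _ in range(steps):
        v += bias * dt
        x += v * dt
    return x
```

With a bias of 0.01 m/s² and 1 s steps, the error is 0.55 m after 10 s but 2.1 m after 20 s — roughly quadrupling when the time doubles — so the estimate diverges unless fused with another navigation source.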
A GPS navigation system is a radio navigation and positioning system based on 24 positioning satellites that provides all-weather, worldwide information such as three-dimensional position and three-dimensional velocity. It consists of three parts: the ground control segment, composed of a master control station, ground antennas, monitoring stations, and a communication support system; the space segment, composed of 24 satellites distributed over 6 orbital planes; and the user segment, composed of GPS receivers and satellite antennas. Civilian positioning accuracy can now reach about 10 meters. Because a GPS navigation system depends entirely on navigation satellites, its signal is easily jammed or interfered with, so it is not a fully autonomous navigation technology.
Therefore, how to overcome the problems of conventional navigation and improve the accuracy and intelligence of UAV landing is an issue that those skilled in the art need to solve.
Summary of the invention
An object of the present invention is to provide a UAV landing method and device based on monocular vision, so as to realize autonomous landing of the UAV and make the landing more accurate and intelligent.
To achieve the above object, embodiments of the present invention provide the following technical solution:
A UAV landing method based on monocular vision, including:
when the UAV prepares to return, determining the target flight guide point adjacent to the UAV's flight terminal;
using the positional information of the target flight guide point, controlling the UAV to fly to the target flight guide point;
when the UAV reaches the target flight guide point, taking the target flight guide point as the current flight point, and judging whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information stored for the current flight point in the flight map;
if they do not match, adjusting the position of the UAV and returning to the step of judging whether the UAV's attitude information and monocular image information at the current flight point match those stored for the current flight point in the flight map;
if they match, taking the next flight guide point as the target flight guide point and returning to the step of controlling the UAV to fly to the target flight guide point using its positional information, until the UAV lands on the ground.
Wherein, after the UAV lands on the ground, the method further includes: sending a UAV-landed notification to the ground station.
Wherein, the UAV obtains its monocular image information at each flight guide point through a monocular camera mounted on the UAV's base.
Wherein, before determining the target flight guide point adjacent to the UAV's flight terminal when the UAV prepares to return, the method further includes: during the flight before reaching the terminal, collecting the attitude information and monocular image information of each flight guide point as the UAV passes it, and creating the flight map from the attitude information and monocular image information of each flight guide point.
Wherein, creating the flight map from the attitude information and monocular image information of each flight guide point includes: preprocessing the monocular image information of each flight guide point and extracting environmental features; calculating the UAV's flight position at each flight guide point and converting the data format of the calculated result; performing uncertain-information fusion on the environmental features and the format-converted flight position of each flight guide point, applying extended Kalman filter processing, and building the flight map from the processed results of all flight guide points.
A UAV landing device based on monocular vision, including:
a target flight guide point determining module, configured to determine, when the UAV prepares to return, the target flight guide point adjacent to the UAV's flight terminal;
a UAV flight control module, configured to control the UAV to fly to the target flight guide point using its positional information and, when the UAV reaches the target flight guide point, to take it as the current flight point;
a judging module, configured to judge whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information stored for the current flight point in the flight map;
a correction module, configured to send a position adjustment instruction to the UAV flight control module when the attitude information and monocular image information at the current flight point do not match those stored in the flight map, and to trigger the judging module after the position is adjusted;
a target flight guide point updating module, configured to take the next flight guide point as the target flight guide point when the attitude information and monocular image information at the current flight point match those stored in the flight map, and to trigger the UAV flight control module, until the UAV lands on the ground.
Wherein, the device further includes: a wireless signal transmission module, configured to send a UAV-landed notification to the ground station after the UAV lands on the ground.
Wherein, the UAV obtains its monocular image information at each flight guide point through a monocular camera mounted on the UAV's base.
Wherein, the device further includes: a flight map creation module, configured to collect the attitude information and monocular image information of each flight guide point as the UAV passes it during the flight before reaching the terminal, and to create the flight map from that information.
Wherein, the flight map creation module includes: an environmental feature extraction unit, configured to preprocess the monocular image information of each flight guide point and extract environmental features; a flight position reckoning unit, configured to calculate the UAV's flight position at each flight guide point and convert the data format of the calculated result; and a flight map construction unit, configured to perform uncertain-information fusion on the environmental features and format-converted flight position of each flight guide point, apply extended Kalman filter processing, and build the flight map from the processed results of all flight guide points.
As can be seen from the above solution, the UAV landing method based on monocular vision provided by the embodiments of the present invention includes: when the UAV prepares to return, determining the target flight guide point adjacent to the flight terminal; controlling the UAV to fly to it using its positional information; when the UAV reaches the target flight guide point, taking it as the current flight point and judging whether the UAV's attitude information and monocular image information there match those stored for that point in the flight map; if they do not match, adjusting the UAV's position and repeating the judgment step; if they match, taking the next flight guide point as the target flight guide point and repeating the flight step, until the UAV lands on the ground.
It can be seen that in this solution the UAV can create a flight map from the flight guide points it passes during flight, so that when the UAV returns, its return path can be corrected against the created flight map, ensuring autonomous landing and making the landing of the UAV more accurate and intelligent. The invention also discloses a UAV landing device based on monocular vision, which achieves the same technical effect.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a UAV landing method based on monocular vision disclosed in an embodiment of the present invention;
Fig. 2 is a schematic diagram of a UAV landing system based on monocular vision disclosed in an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the invention.
The embodiments of the present invention disclose a UAV landing method and device based on monocular vision, so as to realize autonomous landing of the UAV and make the landing more accurate and intelligent.
Referring to Fig. 1, a UAV landing method based on monocular vision provided by an embodiment of the present invention includes:
S101: when the UAV prepares to return, determining the target flight guide point adjacent to the UAV's flight terminal;
Specifically, the flight guide points in this embodiment are reference points on the UAV's flight path that guide its takeoff and landing. When setting the flight guide points, their number can be adjusted to suit actual needs. For example, when high landing precision is required, more flight guide points can be set to improve landing precision; when lower precision suffices, fewer flight guide points can be set to improve system efficiency.
S102: using the positional information of the target flight guide point, controlling the UAV to fly to the target flight guide point;
Specifically, in this embodiment, after the UAV flies to the terminal it can notify the ground station; after receiving the arrival notification sent by the UAV, the ground station can issue a return order, and the UAV then returns and lands using the landing method provided by this solution. It should be noted that there are multiple flight guide points on the UAV's return path; in this embodiment the flight guide point nearest the terminal is first determined as the target flight guide point, which can be understood as the next place the UAV will fly to during its return. Once the target flight guide point is determined, the UAV can be controlled to fly to it according to its positional information.
S103: when the UAV reaches the target flight guide point, taking the target flight guide point as the current flight point, and judging whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information stored for the current flight point in the flight map;
S104: if they do not match, adjusting the position of the UAV and repeating the judgment of step S103;
S105: if they match, taking the next flight guide point as the target flight guide point and repeating step S102, until the UAV lands on the ground.
Specifically, after the UAV is controlled to fly from the terminal to the target flight guide point, the reached target flight guide point is taken as the current flight point. At the current flight point the UAV collects monocular image information and attitude information and matches them against the attitude information and monocular image information stored for the current flight point in the flight map, continually adjusting its own attitude and position until they match; it then takes the next flight guide point as the target flight guide point. In this way the UAV's return is controlled and its landing completed.
It should be noted that the flight map in this embodiment can be a map built from information the UAV collects in real time during flight, or a map the user has stored in advance. Real-time collection during flight can be understood as follows: as the UAV flies to the terminal, it collects monocular image information and attitude information at each flight guide point it passes and builds the flight map, which then serves as the reference for its return. A user-prestored flight map can be understood as follows: when a certain path must be flown by UAVs multiple times, the flight map built by the first UAV can be saved, so that the next time a UAV flies that path, the prestored flight map can serve directly as the reference for an accurate return.
Specifically, in this embodiment, when judging whether the UAV's attitude information and monocular image information at the current flight point match those stored in the flight map, the collected UAV attitude information can first be compared with the attitude information in the flight map to judge whether they are identical; if not, the UAV's attitude is adjusted to be consistent with the attitude in the flight map. Once the UAV's attitude is consistent with that in the flight map, it is judged whether the monocular image information collected after the attitude adjustment matches the monocular image information stored in the flight map, where matching can be understood as whether the positions of identified buildings are the same. If they match, the current flight point at which the UAV is located is accurate, i.e. the UAV has not drifted off course.
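The paragraph above describes a two-stage check: attitude is compared (and corrected) first, and only once the attitudes agree is the monocular image compared, here reduced to whether identified landmark positions coincide. A sketch with hypothetical data shapes and illustrative tolerances, not the patent's actual matching criteria:

```python
def two_stage_match(pose, marks, stored_pose, stored_marks,
                    pose_tol=1.0, pix_tol=5.0):
    """Return (pose_ok, image_ok); the image is only compared once the pose agrees."""
    pose_ok = all(abs(a - b) <= pose_tol for a, b in zip(pose, stored_pose))
    if not pose_ok:
        return False, False        # adjust attitude before comparing images
    image_ok = all(
        abs(u - su) <= pix_tol and abs(v - sv) <= pix_tol
        for (u, v), (su, sv) in zip(marks, stored_marks)
    )
    return True, image_ok          # image_ok False -> the UAV has drifted
```

Gating the image comparison on attitude agreement keeps the landmark check meaningful: comparing pixel positions from two different viewing attitudes would flag drift that is not there.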
Based on the above embodiment, after the UAV lands on the ground, the method further includes: sending a UAV-landed notification to the ground station. The UAV obtains its monocular image information at each flight guide point through a monocular camera mounted on its base.
Specifically, in this embodiment, suppose there are 3 flight guide points along the UAV's flight path A-B, where A is the flight starting point and B is the flight terminal; in order of increasing distance from the starting point, the three flight guide points are: guide point m, guide point n, and guide point k. The UAV is placed at point A and prepared for takeoff. In this embodiment a 4K-resolution monocular camera is mounted on the UAV's base, forming the UAV's monocular module, which collects the monocular image information. The ground station sends the takeoff order, and during flight the UAV automatically saves, in sequence, the UAV attitude and monocular images of guide points m, n, and k. When the UAV flies to the terminal, it completes map creation and saves the map. The UAV then notifies the ground station of its arrival, the ground station sends a signal to the UAV, and the UAV is controlled to return. During the return the UAV takes guide points k, n, and m in turn as the target flight guide point, and corrects its position according to its attitude and monocular images, preventing failure of the return and landing. After the UAV completes its landing, it notifies the ground station and is shut down, ending the flight.
Based on the above embodiment, before determining the target flight guide point adjacent to the UAV's flight terminal when the UAV prepares to return, the method further includes: during the flight before reaching the terminal, collecting the attitude information and monocular image information of each flight guide point as the UAV passes it, and creating the flight map from that information.
Wherein, creating the flight map from the attitude information and monocular image information of each flight guide point includes: preprocessing the monocular image information of each flight guide point and extracting environmental features; calculating the UAV's flight position at each flight guide point and converting the data format of the calculated result; performing uncertain-information fusion on the environmental features and the format-converted flight position of each flight guide point, applying extended Kalman filter processing, and building the flight map from the processed results of all flight guide points.
Specifically, the flight map in this embodiment is created from information the UAV collects at each flight guide point it passes during flight. The collected information includes the UAV attitude and monocular image information at the current flight point; for example, the UAV attitude information includes the UAV's flight angle and rotation angle, and the monocular image information is a two-dimensional image of the ground captured by the monocular camera. Creating the flight map as the UAV flies to the terminal can specifically include:
S11: preprocessing the collected monocular image information;
S12: extracting environmental features from the preprocessed monocular image information;
S13: performing flight position reckoning for the UAV and converting the data format of the reckoned result;
S14: performing uncertain-information fusion on the results of steps S12 and S13;
S15: applying extended Kalman filtering to the fusion result, updating or building the map according to the processed result, and updating the UAV's position according to the processed result to localize the UAV in real time.
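Steps S11 to S14 can be sketched as a small pipeline over one guide point's data. Every stage below is a trivial hypothetical stand-in — the patent names the stages but not their algorithms — and S15's extended Kalman filter is omitted; only the stage ordering follows the list above:

```python
def preprocess(img):
    """S11: placeholder preprocessing (undistortion/denoising would go here)."""
    return img

def extract_features(img):
    """S12: placeholder environmental-feature extraction - keep bright pixels."""
    return [px for px in img if px > 128]

def dead_reckon(odometry):
    """S13a: integrate per-step displacements into a flight-position estimate."""
    x = y = 0.0
    for dx, dy in odometry:
        x += dx
        y += dy
    return x, y

def to_map_format(pos):
    """S13b: data-format conversion into the map's storage convention."""
    return {"x": pos[0], "y": pos[1]}

def build_map_entry(raw_image, odometry):
    """S14: fuse the feature and position results into one map record."""
    return {
        "features": extract_features(preprocess(raw_image)),
        "pose": to_map_format(dead_reckon(odometry)),
    }
```

One such record per guide point, refined by the filter of S15, yields the flight map consulted on return.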
The UAV landing device provided by the embodiments of the present invention is introduced below; the UAV landing device described below and the UAV landing method described above may be cross-referenced.
Referring to Fig. 2, a UAV landing device based on monocular vision provided by an embodiment of the present invention includes:
a target flight guide point determining module 100, configured to determine, when the UAV prepares to return, the target flight guide point adjacent to the UAV's flight terminal;
a UAV flight control module 200, configured to control the UAV to fly to the target flight guide point using its positional information and, when the UAV reaches the target flight guide point, to take it as the current flight point;
a judging module 300, configured to judge whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information stored for the current flight point in the flight map;
a correction module 400, configured to send a position adjustment instruction to the UAV flight control module when the attitude information and monocular image information at the current flight point do not match those stored in the flight map, and to trigger the judging module after the position is adjusted;
a target flight guide point updating module 500, configured to take the next flight guide point as the target flight guide point when the attitude information and monocular image information at the current flight point match those stored in the flight map, and to trigger the UAV flight control module, until the UAV lands on the ground.
Based on the above embodiment, the device further includes: a wireless signal transmission module, configured to send a UAV-landed notification to the ground station after the UAV lands on the ground.
Based on the above embodiment, the UAV obtains its monocular image information at each flight guide point through a monocular camera mounted on its base.
Based on the above embodiment, the device further includes: a flight map creation module, configured to collect the attitude information and monocular image information of each flight guide point as the UAV passes it during the flight before reaching the terminal, and to create the flight map from that information.
Based on the above embodiment, the flight map construction module includes:
an environmental feature extraction unit, configured to preprocess the monocular image information of each flight guide point and extract environmental features;
a flight position estimation unit, configured to compute the UAV's flight position at each flight guide point and convert the computed results into a common data format;
a flight map construction unit, configured to fuse the uncertainty information of the environmental features and format-converted flight positions of each flight guide point, apply extended Kalman filter processing, and build the flight map from the processed results of each flight guide point.
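The fusion step above can be illustrated with a minimal Kalman update. This sketch uses a scalar, linear Kalman correction as a stand-in for the extended Kalman filter the scheme calls for; the function names, dictionary keys, and variances are hypothetical.

```python
# Hypothetical sketch of the map-building pipeline: per guide point, fuse a
# predicted flight position with a vision-derived position fix via a scalar
# Kalman update (a simplified stand-in for full EKF processing).

def kalman_fuse(x_pred, p_pred, z_meas, r_meas):
    """Fuse predicted position x_pred (variance p_pred) with a vision
    measurement z_meas (variance r_meas); return fused state and variance."""
    k = p_pred / (p_pred + r_meas)      # Kalman gain
    x = x_pred + k * (z_meas - x_pred)  # corrected position
    p = (1.0 - k) * p_pred              # reduced uncertainty
    return x, p

def build_flight_map(guide_points):
    """guide_points: list of dicts holding a predicted position, a vision fix,
    their variances, and extracted features; return one map entry per point."""
    flight_map = []
    for gp in guide_points:
        x, p = kalman_fuse(gp["x_pred"], gp["p_pred"],
                           gp["z_vision"], gp["r_vision"])
        flight_map.append({"position": x, "variance": p,
                           "features": gp["features"]})
    return flight_map
```

With equal prediction and measurement variances the fused position is their average and the uncertainty halves, which is the "uncertain information fusion" effect the construction unit relies on.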
It should be noted that the monocular-vision UAV landing scheme described here uses simultaneous localization and mapping (SLAM): under conditions where the vehicle's own position is uncertain, a map is created as the vehicle moves, and that same map is simultaneously used for autonomous localization and navigation. The process can be described as follows: starting from an unknown position in an unknown environment, the vehicle localizes itself during movement using position estimates and sensor data, while incrementally building a map. It comprises three main steps:
(1) Localization: the vehicle must determine its own position in the environment.
(2) Mapping: the positions of features in the environment must be recorded (given that the vehicle's own position is known).
(3) SLAM: the robot builds the environment map while localizing itself. Its basic principle applies probabilistic methods: localization is achieved, and position error reduced, by matching multiple features.
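The three steps above can be sketched as a toy SLAM iteration: localize by matching observed features against known landmarks, then add unmatched features to the incremental map. The names and the simple averaging scheme are illustrative only, not the patent's method.

```python
# Toy illustration of steps (1)-(3): localize from matched landmarks, then
# incrementally map any newly observed landmarks relative to the new pose.

def localize(landmarks, observations):
    """landmarks: {name: (x, y)} known map; observations: {name: (dx, dy)}
    measured offsets from the vehicle to each feature. Each matched pair
    implies a vehicle position landmark - offset; averaging the implied
    positions over multiple matched features reduces the position error."""
    implied = [(landmarks[n][0] - dx, landmarks[n][1] - dy)
               for n, (dx, dy) in observations.items() if n in landmarks]
    if not implied:
        raise ValueError("no matched features")
    xs, ys = zip(*implied)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def slam_step(landmarks, observations):
    """One SLAM iteration: localize from matched features (step 1), then map
    unmatched features at their observed offsets from the new pose (step 2),
    so localization and map building proceed together (step 3)."""
    pose = localize(landmarks, observations)
    for name, (dx, dy) in observations.items():
        if name not in landmarks:
            landmarks[name] = (pose[0] + dx, pose[1] + dy)
    return pose, landmarks
```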
By contrast, traditional localization and navigation rely on an established coordinate system. The main principle is as follows: a spatial coordinate system is established in advance, so only the current position information is needed; the position is then computed within that coordinate system to realize navigation and localization.
It can be seen that, compared with traditional localization and navigation, this scheme creates a flight map from the multiple flight guide points the UAV passes during flight, using simultaneous localization and mapping. When the UAV returns, its return path can be corrected against the created flight map, guaranteeing autonomous landing and making the landing more accurate and intelligent. Moreover, the whole process is carried out autonomously by the UAV without human operation, giving a high degree of intelligence and ensuring that the UAV never departs from the preset corridor during flight, return, or descent, so that precise landing is achieved.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to one another.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. A UAV landing method based on monocular vision, characterized by comprising:
when the UAV prepares to return, determining a target flight guide point adjacent to the UAV's flight end point;
controlling the UAV to fly to the target flight guide point using the position information of the target flight guide point;
when the UAV reaches the target flight guide point, taking the target flight guide point as the current flight point, and judging whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information of the current flight point stored in a flight map;
if they do not match, adjusting the position of the UAV, and repeating the step of judging whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information of the current flight point stored in the flight map;
if they match, taking the next flight guide point as the target flight guide point, and repeating the step of controlling the UAV to fly to the target flight guide point using the position information of the target flight guide point, until the UAV lands on the ground.
2. The UAV landing method according to claim 1, characterized by further comprising, after the UAV lands on the ground:
sending a UAV landing notification to the ground station.
3. The UAV landing method according to claim 2, characterized in that the UAV obtains its monocular image information at each flight guide point via a monocular camera mounted on the UAV base.
4. The UAV landing method according to any one of claims 1-3, characterized by further comprising, before determining the target flight guide point adjacent to the UAV's flight end point when the UAV prepares to return:
during a flight in which the UAV has not yet reached its final destination, collecting the attitude information and monocular image information of each flight guide point as the UAV passes it, and creating the flight map from the attitude information and monocular image information of each flight guide point.
5. The UAV landing method according to claim 4, characterized in that creating the flight map from the attitude information and monocular image information of each flight guide point comprises:
preprocessing the monocular image information of each flight guide point and extracting environmental features;
computing the UAV's flight position at each flight guide point and converting the computed results into a common data format;
fusing the uncertainty information of the environmental features and format-converted flight positions of each flight guide point, applying extended Kalman filter processing, and building the flight map from the processed results of each flight guide point.
6. A UAV landing device based on monocular vision, characterized by comprising:
a target flight guide point determination module, configured to determine, when the UAV prepares to return, a target flight guide point adjacent to the UAV's flight end point;
a UAV flight control module, configured to control the UAV to fly to the target flight guide point using the position information of the target flight guide point, and, when the UAV reaches the target flight guide point, take the target flight guide point as the current flight point;
a judging module, configured to judge whether the UAV's attitude information and monocular image information at the current flight point match the attitude information and monocular image information of the current flight point stored in a flight map;
a correction module, configured to, when the attitude information and monocular image information at the current flight point do not match the attitude information and monocular image information of the current flight point stored in the flight map, send a UAV position adjustment instruction to the UAV flight control module and trigger the judging module after the position is adjusted;
a target flight guide point update module, configured to, when the attitude information and monocular image information at the current flight point match the attitude information and monocular image information of the current flight point stored in the flight map, take the next flight guide point as the target flight guide point and trigger the UAV flight control module, until the UAV lands on the ground.
7. The UAV landing device according to claim 6, characterized by further comprising:
a wireless signal transmission module, configured to send a UAV landing notification to the ground station after the UAV lands on the ground.
8. The UAV landing device according to claim 7, characterized in that the UAV obtains its monocular image information at each flight guide point via a monocular camera mounted on the UAV base.
9. The UAV landing device according to any one of claims 6-8, characterized by further comprising:
a flight map construction module, configured to, during a flight in which the UAV has not yet reached its final destination, collect the attitude information and monocular image information of each flight guide point as the UAV passes it, and create the flight map from the attitude information and monocular image information of each flight guide point.
10. The UAV landing device according to claim 9, characterized in that the flight map construction module includes:
an environmental feature extraction unit, configured to preprocess the monocular image information of each flight guide point and extract environmental features;
a flight position estimation unit, configured to compute the UAV's flight position at each flight guide point and convert the computed results into a common data format;
a flight map construction unit, configured to fuse the uncertainty information of the environmental features and format-converted flight positions of each flight guide point, apply extended Kalman filter processing, and build the flight map from the processed results of each flight guide point.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710216117.7A CN106927059A (en) | 2017-04-01 | 2017-04-01 | A kind of unmanned plane landing method and device based on monocular vision |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710216117.7A CN106927059A (en) | 2017-04-01 | 2017-04-01 | A kind of unmanned plane landing method and device based on monocular vision |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN106927059A true CN106927059A (en) | 2017-07-07 |
Family
ID=59426046
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710216117.7A Pending CN106927059A (en) | 2017-04-01 | 2017-04-01 | A kind of unmanned plane landing method and device based on monocular vision |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN106927059A (en) |
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108153334A (en) * | 2017-12-01 | 2018-06-12 | 南京航空航天大学 | No cooperative target formula unmanned helicopter vision is independently maked a return voyage and drop method and system |
| CN108983812A (en) * | 2018-07-25 | 2018-12-11 | 哈尔滨工业大学 | A kind of onboard control system that unmanned plane sea is landed |
| CN109521787A (en) * | 2018-09-28 | 2019-03-26 | 易瓦特科技股份公司 | Method and device for aircraft mark landing |
| CN110362104A (en) * | 2019-06-06 | 2019-10-22 | 武汉易科空间信息技术股份有限公司 | A method and system for improving accuracy during unmanned aerial vehicle navigation |
| WO2020000386A1 (en) * | 2018-06-29 | 2020-01-02 | 深圳市大疆创新科技有限公司 | Flight control method, device and system, and storage medium |
| CN111352434A (en) * | 2018-12-20 | 2020-06-30 | 波音公司 | Apparatus and method for supporting an aircraft approaching an airport runway at an airport |
| CN111615677A (en) * | 2018-11-28 | 2020-09-01 | 深圳市大疆创新科技有限公司 | A safe landing method, device, unmanned aerial vehicle and medium for unmanned aerial vehicle |
| CN112327891A (en) * | 2020-11-16 | 2021-02-05 | 南京邮电大学 | Unmanned aerial vehicle autonomous landing system and method |
| WO2021078167A1 (en) * | 2019-10-21 | 2021-04-29 | 深圳市道通智能航空技术有限公司 | Aerial vehicle return control method and apparatus, aerial vehicle, and storage medium |
| CN114355378A (en) * | 2022-03-08 | 2022-04-15 | 天津云圣智能科技有限责任公司 | Autonomous navigation method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium |
| CN114489140A (en) * | 2022-02-16 | 2022-05-13 | 中国电子科技集团公司第五十四研究所 | A method for precise autonomous take-off and landing of unmanned aerial vehicles in an unmarked environment |
| CN114973780A (en) * | 2022-07-27 | 2022-08-30 | 中国铁塔股份有限公司湖北省分公司 | Unmanned aerial vehicle shutdown data communication method, device, equipment and storage medium |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102435188A (en) * | 2011-09-15 | 2012-05-02 | 南京航空航天大学 | A Monocular Vision/Inertial Fully Autonomous Navigation Method for Indoor Environment |
| CN105607645A (en) * | 2016-01-20 | 2016-05-25 | 杭州米为科技有限公司 | Unmanned aerial vehicle, unmanned aerial vehicle return method, and control terminal |
| CN105678289A (en) * | 2016-03-07 | 2016-06-15 | 谭圆圆 | Control method and device of unmanned aerial vehicle |
| CN106325299A (en) * | 2016-09-13 | 2017-01-11 | 上海顺砾智能科技有限公司 | Unmanned plane return flight landing method based on visual sense |
| TWI571718B (en) * | 2015-09-08 | 2017-02-21 | Nat Chin-Yi Univ Of Tech | Automatic cruise spray cleaning method and system for unmanned aerial vehicles |
| EP2366131B1 (en) * | 2008-12-15 | 2017-03-22 | Saab AB | Method and system for facilitating autonomous landing of aerial vehicles on a surface |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2366131B1 (en) * | 2008-12-15 | 2017-03-22 | Saab AB | Method and system for facilitating autonomous landing of aerial vehicles on a surface |
| CN102435188A (en) * | 2011-09-15 | 2012-05-02 | 南京航空航天大学 | A Monocular Vision/Inertial Fully Autonomous Navigation Method for Indoor Environment |
| TWI571718B (en) * | 2015-09-08 | 2017-02-21 | Nat Chin-Yi Univ Of Tech | Automatic cruise spray cleaning method and system for unmanned aerial vehicles |
| CN105607645A (en) * | 2016-01-20 | 2016-05-25 | 杭州米为科技有限公司 | Unmanned aerial vehicle, unmanned aerial vehicle return method, and control terminal |
| CN105678289A (en) * | 2016-03-07 | 2016-06-15 | 谭圆圆 | Control method and device of unmanned aerial vehicle |
| CN106325299A (en) * | 2016-09-13 | 2017-01-11 | 上海顺砾智能科技有限公司 | Unmanned plane return flight landing method based on visual sense |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108153334B (en) * | 2017-12-01 | 2020-09-25 | 南京航空航天大学 | Visual autonomous return and landing method and system for unmanned helicopter without cooperative target |
| CN108153334A (en) * | 2017-12-01 | 2018-06-12 | 南京航空航天大学 | No cooperative target formula unmanned helicopter vision is independently maked a return voyage and drop method and system |
| WO2020000386A1 (en) * | 2018-06-29 | 2020-01-02 | 深圳市大疆创新科技有限公司 | Flight control method, device and system, and storage medium |
| CN108983812B (en) * | 2018-07-25 | 2021-06-04 | 哈尔滨工业大学 | A shipboard control system for unmanned aerial vehicle landing at sea |
| CN108983812A (en) * | 2018-07-25 | 2018-12-11 | 哈尔滨工业大学 | A kind of onboard control system that unmanned plane sea is landed |
| CN109521787A (en) * | 2018-09-28 | 2019-03-26 | 易瓦特科技股份公司 | Method and device for aircraft mark landing |
| CN111615677A (en) * | 2018-11-28 | 2020-09-01 | 深圳市大疆创新科技有限公司 | A safe landing method, device, unmanned aerial vehicle and medium for unmanned aerial vehicle |
| CN111615677B (en) * | 2018-11-28 | 2024-04-12 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium |
| CN111352434B (en) * | 2018-12-20 | 2023-11-28 | 波音公司 | Apparatus and method for supporting aircraft approaching an airport runway |
| CN111352434A (en) * | 2018-12-20 | 2020-06-30 | 波音公司 | Apparatus and method for supporting an aircraft approaching an airport runway at an airport |
| CN110362104A (en) * | 2019-06-06 | 2019-10-22 | 武汉易科空间信息技术股份有限公司 | A method and system for improving accuracy during unmanned aerial vehicle navigation |
| CN110362104B (en) * | 2019-06-06 | 2022-03-15 | 武汉易科空间信息技术股份有限公司 | Method and system for improving precision in unmanned aerial vehicle navigation process |
| WO2021078167A1 (en) * | 2019-10-21 | 2021-04-29 | 深圳市道通智能航空技术有限公司 | Aerial vehicle return control method and apparatus, aerial vehicle, and storage medium |
| US20220317705A1 (en) * | 2019-10-21 | 2022-10-06 | Autel Robotics Co., Ltd. | Aircraft return control method and device, aircraft and storage medium |
| CN112327891A (en) * | 2020-11-16 | 2021-02-05 | 南京邮电大学 | Unmanned aerial vehicle autonomous landing system and method |
| CN114489140A (en) * | 2022-02-16 | 2022-05-13 | 中国电子科技集团公司第五十四研究所 | A method for precise autonomous take-off and landing of unmanned aerial vehicles in an unmarked environment |
| CN114355378A (en) * | 2022-03-08 | 2022-04-15 | 天津云圣智能科技有限责任公司 | Autonomous navigation method and device for unmanned aerial vehicle, unmanned aerial vehicle and storage medium |
| CN114973780A (en) * | 2022-07-27 | 2022-08-30 | 中国铁塔股份有限公司湖北省分公司 | Unmanned aerial vehicle shutdown data communication method, device, equipment and storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106927059A (en) | A kind of unmanned plane landing method and device based on monocular vision | |
| AU2021202509B2 (en) | Image based localization for unmanned aerial vehicles, and associated systems and methods | |
| CN105022401B (en) | Many four rotor wing unmanned aerial vehicles collaboration SLAM methods of view-based access control model | |
| CN109813311B (en) | A UAV formation collaborative navigation method | |
| US10240930B2 (en) | Sensor fusion | |
| CN106708066B (en) | Autonomous landing method of UAV based on vision/inertial navigation | |
| Xiang et al. | Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV) | |
| US20070093945A1 (en) | System and method for onboard vision processing | |
| US12315268B2 (en) | Vision-based landing system | |
| CN107065925A (en) | A kind of unmanned plane makes a return voyage method and device | |
| CN108255190B (en) | Accurate landing method based on multiple sensors and tethered unmanned aerial vehicle using same | |
| KR20120006160A (en) | Unmanned Vehicle Automatic and Manual Control System Using Smartphone | |
| CN106125761B (en) | UAV Navigation System and air navigation aid | |
| CN108496130A (en) | Flight control method, equipment, control terminal and its control method, unmanned plane | |
| CN106767791A (en) | A kind of inertia/visual combination air navigation aid using the CKF based on particle group optimizing | |
| CN117554990B (en) | Lidar SLAM positioning and navigation method and unmanned aerial vehicle system | |
| CN107144281A (en) | Unmanned plane indoor locating system and localization method based on cooperative target and monocular vision | |
| CN113920186B (en) | Low-altitude unmanned-machine multi-source fusion positioning method | |
| CN112198903A (en) | Modular multifunctional onboard computer system | |
| CN113063401A (en) | Unmanned aerial vehicle aerial survey system | |
| CN108107905A (en) | A kind of scenic spot is taken photo by plane flight system and its control method | |
| KR20250028773A (en) | Dead Reckoning System and Automatic Returning Method for Aircraft including the same | |
| CN115790595B (en) | Unmanned aerial vehicle fault-tolerant navigation method and system based on combination of communication and perception | |
| CN120540327A (en) | Guidance method and system for autonomous landing of unmanned aerial vehicle | |
| CN120803008A (en) | Accurate positioning control method and device for vehicle-mounted unmanned aerial vehicle |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2017-07-07