CN117218564B - A method, system, equipment and medium for generating collaborative decision-making criteria for unmanned aerial vehicles - Google Patents
- Publication number
- CN117218564B (application CN202311324627.8A)
- Authority
- CN
- China
- Prior art keywords
- grid image
- equipment
- unmanned aerial
- aerial vehicle
- sample
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Image Processing (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention discloses a method, a system, equipment and a medium for generating a collaborative decision criterion of an unmanned aerial vehicle, and relates to the technical field of unmanned aerial vehicles. When decision criterion trigger data are received, task data of the unmanned aerial vehicle and corresponding task emergency data are acquired. When the task emergency data is not emergency, opponent equipment identification is performed on the surrounding environment image data corresponding to the unmanned aerial vehicle to obtain the number of opponent equipment and the opponent equipment characteristic data. When the number of opponent equipment is smaller than a preset value, unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by a ground control station are extracted. Image rasterization is performed on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image. The environment situation grid image is input into a multi-collaborative decision criterion classification model, so that a collaborative decision criterion meeting the current environment situation of the unmanned aerial vehicle can be generated scientifically and accurately.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to a method, a system, equipment and a medium for generating a collaborative decision criterion of an unmanned aerial vehicle.
Background
With the rapid development of unmanned aerial vehicle technology, unmanned aerial vehicles can carry out various complicated tasks. During task execution, however, an unmanned aerial vehicle is constrained by factors such as task diversity, environmental complexity and limited computing power, which make autonomous decision-making difficult in a complex and changeable task environment. When an emergency is encountered during task execution, the unmanned aerial vehicle needs to adjust its own tasks according to the real-time situation; when adjusting its own task planning, however, it needs to follow certain decision criteria, such as maximum task completion rate, maximum area coverage and fastest task planning time.
Existing unmanned aerial vehicle decision criterion generation methods mainly comprise rule matching based on a rule base and criterion prediction based on deep reinforcement learning. In the former, a rule base is manually extracted and constructed for subsequent rule matching after massive historical data are analysed and processed; this is time-consuming and labour-intensive and is difficult to adapt to a dynamic situation environment. The latter can match different situations in a dynamic simulation mode and aims to make predictions for various situations that may occur in the future, but it predicts only a single criterion and does not consider the correlation and priority among multiple criteria, so the generated decision criteria have poor applicability.
Disclosure of Invention
The invention provides a method, a system, equipment and a medium for generating a collaborative decision criterion of an unmanned aerial vehicle, which solve the technical problem that existing unmanned aerial vehicle decision criterion generation methods can only predict a single criterion and do not consider the correlation and priority among multiple criteria, so that the generated decision criteria have poor applicability.
The invention provides a method for generating a collaborative decision criterion of an unmanned aerial vehicle, which comprises the following steps:
when the decision criterion triggering data is received, acquiring task data of the unmanned aerial vehicle and corresponding task emergency data;
when the task emergency data is not emergency, carrying out opponent equipment identification on the surrounding environment image data corresponding to the unmanned aerial vehicle to obtain the number of opponent equipment and opponent equipment characteristic data;
when the number of the opponent equipment is smaller than a preset value, extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by a ground control station;
performing image rasterization on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image;
and inputting the environment situation grid image into a multi-collaborative decision criterion classification model to obtain a collaborative decision criterion corresponding to the unmanned aerial vehicle.
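The steps above amount to a gating flow. The sketch below is an illustrative reconstruction only: the threshold name `OPPONENT_LIMIT`, the helper callables and the criterion strings are assumptions, not taken from the patent.

```python
# Hedged sketch of the claimed decision flow. OPPONENT_LIMIT, the helper
# callables and the criterion strings are illustrative assumptions.
OPPONENT_LIMIT = 5  # assumed "preset value" for the number of opponent equipment

def generate_criteria(task_urgent, opponent_count,
                      classify_criteria, build_grid_image,
                      opponent_features, uav_features, ground_data):
    """Return collaborative decision criteria for the unmanned aerial vehicle."""
    if task_urgent:
        # Urgent task: continue the task; criterion is maximum task completion.
        return {"maximum task completion rate"}
    if opponent_count >= OPPONENT_LIMIT:
        # Estimated on-board computation too large: ask the ground station.
        return {"request ground assistance"}
    # Rasterize opponent, UAV and ground data into a situation grid image,
    # then classify it into the matching set of decision criteria.
    grid = build_grid_image(opponent_features, uav_features, ground_data)
    return classify_criteria(grid)
```

In use, `build_grid_image` and `classify_criteria` would wrap the rasterization and classification steps described in the later optional steps.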
Optionally, the step of acquiring task data and corresponding task emergency data of the unmanned aerial vehicle includes:
Acquiring unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle;
and when the unmanned aerial vehicle authorization condition data is authorization, carrying out emergency judgment on the task data of the unmanned aerial vehicle to obtain task emergency data.
Optionally, the step of image rasterizing the opponent equipment feature data, the unmanned aerial vehicle equipment feature data and the ground data to construct an environmental situation grid image includes:
performing feature fusion on the opponent equipment feature data, the unmanned aerial vehicle equipment feature data and the ground data to obtain an initial real situation grid image set;
Rasterizing the initial real situation grid image set to obtain a target real situation grid image set;
According to the target real situation grid image set, an up-sampling algorithm SMOTE is adopted to generate a sample, and a first virtual situation grid image set is obtained;
generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set;
constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set;
and selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set, and constructing an environment situation grid image.
Optionally, the step of generating a sample by using an upsampling algorithm SMOTE according to the target real situation grid image set to obtain a first virtual situation grid image set includes:
respectively taking the equipment in the target real situation grid image set as a first sample generation object to obtain a first grid image sample set;
Selecting a root sample of a synthesized sample from the first grid image sample set according to a preset selection standard;
selecting a plurality of first sample generating objects with Euclidean distances between the first grid image sample set and the root sample within a preset distance range, and obtaining a neighbor auxiliary sample point set;
Selecting auxiliary sample points from the adjacent auxiliary sample point sets according to a preset up-sampling rate to obtain auxiliary sample point sets;
Substituting the auxiliary sample points in the auxiliary sample point set into a preset sample formula respectively to update samples, and generating a virtual sample;
The preset sample formula is:
x_new = x_i + (x'_i - x_i) × rand(0, 1);
wherein x_new is a first virtual situation grid image sample, x_i is the root sample, x'_i is an auxiliary sample point, and rand(0, 1) is a random number between 0 and 1;
And constructing a first virtual situation grid image set by adopting all the virtual samples.
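The SMOTE steps above can be sketched as follows; the neighbour radius, the number of generated samples and the helper names are illustrative assumptions rather than the patent's preset values.

```python
import numpy as np

# Minimal SMOTE-style generator following the listed steps; max_dist plays
# the role of the "preset distance range" and n_new of the up-sampling rate.
def smote_samples(samples: np.ndarray, root_idx: int,
                  max_dist: float, n_new: int, rng=None) -> np.ndarray:
    """samples: (n, d) equipment feature vectors from the real grid images."""
    rng = rng or np.random.default_rng(0)
    root = samples[root_idx]
    # Neighbour auxiliary points: Euclidean distance to the root sample
    # within the preset distance range (excluding the root itself).
    dists = np.linalg.norm(samples - root, axis=1)
    mask = (dists > 0) & (dists <= max_dist)
    neighbours = samples[mask]
    if len(neighbours) == 0:
        return np.empty((0, samples.shape[1]))
    # Pick auxiliary points, then interpolate per the preset sample formula:
    # x_new = x_i + (x'_i - x_i) * rand(0, 1)
    picks = neighbours[rng.integers(0, len(neighbours), size=n_new)]
    r = rng.random((n_new, 1))
    return root + (picks - root) * r
```

Each generated row lies on the segment between the root sample and one of its neighbours, which is exactly what the preset sample formula produces.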
Optionally, the step of generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set includes:
respectively taking the equipment in the target real situation grid image set as a second sample generation object to obtain a second grid image sample set;
Respectively selecting a maximum value and a minimum value corresponding to each equipment characteristic in the second grid image sample set to obtain a characteristic maximum value and a characteristic minimum value;
calculating an average value between the characteristic maximum value and the characteristic minimum value to obtain a characteristic average value;
calculating the number of equipment smaller than the characteristic mean value in the second grid image sample set to obtain a first number of equipment;
Calculating the number of equipment larger than the characteristic mean value in the second grid image sample set to obtain a second number of equipment;
Substituting the first equipment number, the second equipment number and the characteristic mean value into a preset boundary value calculation formula to calculate a boundary value, so as to obtain an upper boundary value and a lower boundary value corresponding to the equipment characteristic;
the preset boundary value calculation formula is as follows:
UB = CL + Skew_U × sqrt(-2 × (s_x^2/N_U) × ln(10^-20));
LB = CL - Skew_L × sqrt(-2 × (s_x^2/N_L) × ln(10^-20));
Skew_U = N_U/(N_L + N_U); Skew_L = N_L/(N_L + N_U); s_x^2 = Σ(x_j - x̄)^2/(n - 1);
wherein UB is the upper boundary value, CL is the characteristic mean value, Skew_U is the right skewness based on the characteristic mean value, s_x^2 is the equipment variance corresponding to the second grid image sample set, N_U is the second equipment number, LB is the lower boundary value, Skew_L is the left skewness based on the characteristic mean value, N_L is the first equipment number, x_j is an observation value, x̄ is the mean of the observation values, and n is the number of equipment corresponding to the second grid image sample set;
Substituting the upper boundary value and the lower boundary value into a preset triangular membership function value calculation formula to calculate so as to obtain a triangular membership function value corresponding to the equipment characteristic;
the preset triangle membership function value calculation formula is as follows:
MF = (x' - LB)/(CL - LB), when LB ≤ x' ≤ CL;
MF = (UB - x')/(UB - CL), when CL < x' ≤ UB;
MF = 0, otherwise;
wherein MF is the triangular membership function value, x' is a virtual sample, LB is the lower boundary value, CL is the characteristic mean value, and UB is the upper boundary value;
Taking the triangular membership function value larger than the preset function value as an equipment virtual sample;
and constructing a second virtual situation grid image set by adopting all the equipment virtual samples.
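The MTD steps above can be sketched as follows. The boundary expressions follow the standard mega-trend-diffusion form matching the variable list in this section (a hedged reconstruction, since the original formula image is not reproduced here), and the function names are illustrative.

```python
import math

# Mega-trend-diffusion sketch for one equipment feature. Assumes at least
# one observation on each side of the characteristic mean value CL.
def mtd_bounds(values):
    """Return (LB, CL, UB) for one equipment feature."""
    cl = (max(values) + min(values)) / 2          # characteristic mean value
    n_l = sum(v < cl for v in values)             # first equipment number
    n_u = sum(v > cl for v in values)             # second equipment number
    skew_l = n_l / (n_l + n_u)                    # left skewness
    skew_u = n_u / (n_l + n_u)                    # right skewness
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    spread = -2 * var * math.log(1e-20)           # -2 * s^2 * ln(10^-20)
    lb = cl - skew_l * math.sqrt(spread / n_l)    # lower boundary value
    ub = cl + skew_u * math.sqrt(spread / n_u)    # upper boundary value
    return lb, cl, ub

def triangular_mf(x, lb, cl, ub):
    """Triangular membership function value of a virtual sample x."""
    if lb <= x <= cl:
        return (x - lb) / (cl - lb)
    if cl < x <= ub:
        return (ub - x) / (ub - cl)
    return 0.0
```

Virtual samples drawn between LB and UB whose membership value exceeds the preset function value would then be kept as equipment virtual samples.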
Optionally, the step of selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set to construct an environment situation grid image includes:
selecting a standard according to a preset equipment type, and selecting the equipment type from the target real situation grid image set to obtain the equipment type;
Selecting a plurality of supplementary samples from the virtual situation grid image set according to the equipment types and the preset selection quantity, obtaining a supplementary sample set corresponding to the equipment types and counting the number of sample supplementation in real time;
When the sample supplementing times are smaller than a preset times threshold, skipping to execute the step of selecting equipment types from the target real situation grid image set according to the preset equipment type selection standard to obtain the equipment types;
When the sample supplementing times are equal to a preset times threshold, all the supplementing sample sets corresponding to the current time are adopted to construct an environment situation grid image.
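The supplement loop above can be sketched as follows, assuming (as an illustration, not from the patent) that the real set yields a set of equipment types and that the virtual set maps each type to its virtual samples; the selection standard and counts are placeholders.

```python
import random

# Illustrative sketch of the sample-supplement loop; the selection standard
# (random choice here), per_round and rounds_threshold are assumptions.
def build_environment_samples(real_types, virtual_set, per_round,
                              rounds_threshold, rng=None):
    rng = rng or random.Random(0)
    supplements = []
    supplement_count = 0
    while supplement_count < rounds_threshold:
        # Pick an equipment type from the real set (preset selection standard).
        equip_type = rng.choice(sorted(real_types))
        # Draw supplementary samples of that type from the virtual set.
        pool = virtual_set.get(equip_type, [])
        supplements.extend(pool[:per_round])
        supplement_count += 1          # counted in real time
    # Threshold reached: all supplement sets build the situation grid image.
    return supplements
```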
Optionally, the step of inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain a collaborative decision criterion corresponding to the unmanned aerial vehicle includes:
expert labeling is carried out on the environment situation grid image according to a preset label, and a category label corresponding to the environment situation grid image is obtained;
taking the environment situation grid image as input, taking the class label as output, and carrying out model training on a preset multi-classification neural network model to obtain a multi-collaborative decision criterion classification model;
And inputting the environmental situation grid image into the multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
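As a shape-level illustration of the training step above: the patent does not disclose the concrete network architecture, so the plain softmax classifier below is a deliberate stand-in, not the patent's model. It only shows how expert-labelled grid images can be mapped to criterion categories.

```python
import numpy as np

# Tiny softmax-classifier stand-in for the "preset multi-classification
# neural network model"; architecture and hyperparameters are assumptions.
def train_criteria_classifier(images, labels, n_classes, epochs=200, lr=0.5):
    """images: (n, h, w) grid images; labels: (n,) expert category labels."""
    x = images.reshape(len(images), -1)              # flatten each grid image
    y = np.eye(n_classes)[labels]                    # one-hot expert labels
    rng = np.random.default_rng(0)
    w = rng.normal(0, 0.01, (x.shape[1], n_classes))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        logits = x @ w + b
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)            # softmax probabilities
        grad = (p - y) / len(x)                      # cross-entropy gradient
        w -= lr * x.T @ grad
        b -= lr * grad.sum(axis=0)
    # Return a predictor mapping one grid image to a category label.
    return lambda img: int(np.argmax(img.reshape(-1) @ w + b))
```

A real system would replace this with the trained multi-collaborative decision criterion classification model and decode the predicted label back into a criterion set.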
The invention also provides a system for generating the collaborative decision criterion of the unmanned aerial vehicle, which comprises the following steps:
The task emergency data acquisition module is used for acquiring task data of the unmanned aerial vehicle and corresponding task emergency data when receiving the decision criterion trigger data;
the opponent equipment quantity and opponent equipment characteristic data obtaining module is used for identifying opponent equipment according to the surrounding environment image data corresponding to the unmanned aerial vehicle when the task emergency data is not emergency, so as to obtain opponent equipment quantity and opponent equipment characteristic data;
The unmanned aerial vehicle equipment characteristic data and ground data extraction module is used for extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by the ground control station when the number of opponent equipment is smaller than a preset value;
The environment situation grid image obtaining module is used for carrying out image rasterization on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image;
the collaborative decision criterion obtaining module is used for inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain a collaborative decision criterion corresponding to the unmanned aerial vehicle.
The invention also provides an electronic device, which comprises a memory and a processor, wherein the memory stores a computer program, and the computer program when executed by the processor causes the processor to execute the steps of the method for generating the collaborative decision criterion of any unmanned plane.
The invention also provides a computer readable storage medium having stored thereon a computer program which when executed implements a method of generating a collaborative decision-making criterion for a drone as described in any of the above.
From the above technical scheme, the invention has the following advantages:
According to the invention, in the flight process of the unmanned aerial vehicle, factors such as the emergency situation of the task being executed and the estimated calculated amount are comprehensively considered, and on the premise that the executed task is not urgent and the estimated calculated amount is small, a collaborative decision criterion conforming to the current environment situation of the unmanned aerial vehicle can be generated scientifically and accurately. Secondly, in the process of generating the collaborative decision criterion, the ground data sent by the ground control station and the data acquired by the unmanned aerial vehicle are fused, effectively combining ground global information with unmanned aerial vehicle local information. Finally, a multi-collaborative decision criterion classification model is adopted to generate decision criteria under different environmental situations, so that one-to-many matching of dynamic environmental situations and decision criteria is realized, predictions are made for various situations that may occur in the future, and the unmanned aerial vehicle can adapt to complex and changeable environments. In addition, in the process of training the multi-collaborative decision criterion classification model, aiming at the problem of insufficient training samples, the invention provides a method for generating a virtual situation grid image set by using the up-sampling algorithm SMOTE and the information diffusion method MTD, thereby effectively alleviating the model overfitting caused by insufficient sample data.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained from these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a flowchart of steps of a method for generating a collaborative decision criterion of an unmanned aerial vehicle according to a first embodiment of the present invention;
fig. 2 is a flowchart of steps of a method for generating a collaborative decision criterion of an unmanned aerial vehicle according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of one-to-many matching of environmental situation data and decision criteria provided in a second embodiment of the present invention;
Fig. 4 is a schematic diagram of an environmental situation grid image provided in the second embodiment of the present invention;
fig. 5 is a schematic diagram of a model structure of a preset multi-classification neural network model according to a second embodiment of the present invention;
FIG. 6 is a flow chart of the construction of a classification model with multiple collaborative decision criteria according to a second embodiment of the present invention;
fig. 7 is an image schematic diagram of a first virtual situation grid image set generated by an up-sampling algorithm according to a second embodiment of the present invention;
Fig. 8 is an image schematic diagram of a second virtual situation grid image set generated by the information diffusion method provided in the second embodiment of the present invention;
FIG. 9 is a flowchart of a collaborative decision criterion generating method according to a second embodiment of the present invention;
fig. 10 is a structural block diagram of an unmanned aerial vehicle collaborative decision criterion generating system according to a third embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a method, a system, equipment and a medium for generating a collaborative decision criterion of an unmanned aerial vehicle, which are used for solving the technical problem that existing unmanned aerial vehicle decision criterion generation methods can only predict a single criterion and do not consider the correlation and priority among multiple criteria, so that the generated decision criteria have poor applicability.
In order to make the objects, features and advantages of the present invention more comprehensible, the technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings, and it is apparent that the embodiments described below are only some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating steps of a method for generating a collaborative decision criterion for an unmanned aerial vehicle according to an embodiment of the present invention.
The first embodiment of the invention provides a method for generating a collaborative decision criterion of an unmanned aerial vehicle, which comprises the following steps:
And 101, acquiring task data of the unmanned aerial vehicle and corresponding task emergency data when the decision criterion trigger data are received.
In the embodiment of the invention, the decision criterion triggering data is generated when an emergency is received, and can also be produced by a timed call. Specifically, a timed-task calling time interval T or a time point T is set, and according to the time interval T or the time point T, the unmanned aerial vehicle collaborative decision criterion generating method in the embodiment of the invention is automatically called every T time or at the appointed time point T.
The authorization condition of the unmanned aerial vehicle is checked. Before the unmanned aerial vehicle flies, decision criteria are preset in an intelligent processing chip loaded in the unmanned aerial vehicle, so that the unmanned aerial vehicle makes decisions according to the criteria when encountering an emergency. If the unmanned aerial vehicle is not authorized before flying, it executes tasks according to ground instructions even if emergency conditions are encountered during the flight. Therefore, the unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle needs to be acquired, and when the unmanned aerial vehicle authorization condition data is authorization, emergency judgment is performed on the task data of the unmanned aerial vehicle to obtain the task emergency data.
And 102, when the task emergency data is not emergency, carrying out opponent equipment identification on surrounding environment image data corresponding to the unmanned aerial vehicle to obtain the number of opponent equipment and opponent equipment characteristic data.
In the embodiment of the invention, if the task emergency data of the execution task is urgent, the unmanned aerial vehicle continues to complete the task, and the decision criterion at the moment is that the task coincidence rate is maximum. If the task emergency data of executing the task is not urgent, the unmanned aerial vehicle identifies opponent equipment according to the surrounding environment image data corresponding to the unmanned aerial vehicle, so as to obtain the number of opponent equipment and the characteristic data of the opponent equipment, wherein the opponent equipment data refer to equipment, articles, animals and the like which do not belong to task planning, and the types of the opponent equipment are different according to different technical fields. Opponent equipment characteristic data typically includes speed, location, altitude, and type characteristics of the equipment. According to the collected surrounding environment image data, the number of opponent equipment and the characteristic data of the opponent equipment are calculated, so that the calculated amount of a decision criterion generated subsequently can be evaluated, and whether the ground assistance is requested or not is determined according to the calculated amount.
And 103, extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by a ground control station when the number of the opponent equipment is smaller than a preset value.
In the embodiment of the invention, if the estimated calculation amount is too large and the calculation capacity of the intelligent processing chip is limited, the unmanned aerial vehicle requests the ground server to perform the calculation, and the ground server returns the decision criterion to the unmanned aerial vehicle as the final decision criterion, namely the collaborative decision criterion. If the estimated calculation amount is small, the unmanned aerial vehicle autonomously generates the collaborative decision criterion: according to the ground data acquired by the ground control station, the speed, position, height and type characteristic information of the equipment of both parties is obtained, and the unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle is extracted.
And 104, performing image rasterization on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image.
In the embodiment of the invention, the opponent equipment characteristic data calculated by the unmanned aerial vehicle, the unmanned aerial vehicle equipment characteristic data and the ground data acquired by the ground control station are subjected to characteristic fusion to obtain an initial real situation grid image set. And rasterizing the initial real situation grid image set to obtain a target real situation grid image set. And generating a sample by adopting an up-sampling algorithm SMOTE according to the target real situation grid image set to obtain a first virtual situation grid image set. And generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set. And constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set. And selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set, and constructing an environment situation grid image.
And 105, inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
In the embodiment of the invention, expert labeling is carried out on the environment situation grid image according to a preset label, and a category label corresponding to the environment situation grid image is constructed. And taking the environmental situation grid image as input, taking the class label as output, and carrying out model training on a preset multi-class neural network model to obtain a multi-collaborative decision criterion class model. And inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
In the embodiment of the invention, when the decision criterion triggering data is received, the task data of the unmanned aerial vehicle and the corresponding task emergency data are acquired. When the task emergency data is not emergency, opponent equipment identification is performed on the surrounding environment image data corresponding to the unmanned aerial vehicle to obtain the number of opponent equipment and the opponent equipment characteristic data. When the number of opponent equipment is smaller than a preset value, the unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and the ground data acquired by the ground control station are extracted. Image rasterization is performed on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image. The environment situation grid image is input into the multi-collaborative decision criterion classification model to generate a collaborative decision criterion conforming to the current situation. By fusing the ground transmission data, the data acquired by the unmanned aerial vehicle and the unmanned aerial vehicle's own tasks, a collaborative decision criterion corresponding to the unmanned aerial vehicle and the ground control station is intelligently generated in a complex situation environment. This solves the technical problem that the existing unmanned aerial vehicle decision criterion generating method can only predict a single criterion and does not consider the correlation and priority among multiple criteria, so that the generated decision criteria have poor applicability.
Factors such as emergency situations of tasks executed by the unmanned aerial vehicle, the number of estimated opponent equipment and the like are comprehensively considered, and under the premise that the tasks executed are not urgent and the estimated number of opponent equipment is small, a collaborative decision criterion conforming to the unmanned aerial vehicle under the current environment situation can be scientifically and accurately generated.
Referring to fig. 2, fig. 2 is a flowchart illustrating steps of a method for generating a collaborative decision criterion for an unmanned aerial vehicle according to a second embodiment of the present invention.
A second embodiment of the invention provides a method for generating a collaborative decision criterion of an unmanned aerial vehicle, which comprises the following steps:
step 201, when receiving the decision criterion triggering data, acquiring task data of the unmanned aerial vehicle and corresponding task emergency data.
Further, step 201 may include the following sub-steps S11-S12:
s11, acquiring unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle.
And S12, when the unmanned aerial vehicle authorization condition data is authorization, carrying out emergency judgment on the task data of the unmanned aerial vehicle to obtain task emergency data.
In the embodiment of the invention, the unmanned aerial vehicle authorization condition is checked. Before flying, decision criteria are preset in the intelligent processing chip loaded in the unmanned aerial vehicle, so that when the unmanned aerial vehicle encounters an emergency it makes decisions according to the criteria; if the unmanned aerial vehicle is authorized before flying, it needs to communicate with the ground control station in real time and make collaborative decisions during the flight. Before flying, the unmanned aerial vehicle stores, in a task management module, the task sequences to be executed during the flight (such as penetration, reconnaissance, fly-around and strike) and the emergency situation of each corresponding task (such as emergency and non-emergency). Therefore, when the decision criterion triggering data is received, it is first judged whether the unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle is authorization; if so, it is further judged whether the task emergency data corresponding to the unmanned aerial vehicle is not emergency.
Step 202, when the task emergency data is non-emergency, carrying out opponent equipment identification on the surrounding environment image data corresponding to the unmanned aerial vehicle to obtain the number of opponent equipment and the opponent equipment characteristic data.
In the embodiment of the invention, when the task management module is invoked during flight, the emergency status of the task currently executed by the unmanned aerial vehicle is evaluated. If the task is not urgent, opponent equipment identification is performed on the surrounding environment image data acquired by the unmanned aerial vehicle to obtain the number of opponent equipment and the opponent equipment characteristic data, and the unmanned aerial vehicle comprehensively evaluates the environment situation and the computational load to generate decision criteria. That is, from the data acquired by the unmanned aerial vehicle, the speed, position and altitude characteristic information of the equipment of both opposing parties is calculated; the ground data and the equipment characteristic information are then fused to compute an environment situation grid image representing the environment; finally, the grid image is input into the multi-collaborative decision criterion classification model to generate multi-collaborative decision criteria conforming to the current environment situation. If the task is urgent, the previously scheduled task continues to be performed.
Step 203, when the number of opponent equipment is smaller than a preset value, extracting the unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and the ground data acquired by the ground control station.
In the embodiment of the invention, the unmanned aerial vehicle calculates the types and number of equipment from the acquired surrounding environment image data. If the number of surrounding opponent equipment is not large, that is, the number of opponent equipment is smaller than the preset value, the unmanned aerial vehicle can autonomously generate a decision criterion: the unmanned aerial vehicle equipment characteristic data acquired by the unmanned aerial vehicle itself and the ground data acquired by the ground control station are extracted to autonomously generate the multi-collaborative decision criterion. If the calculated number of surrounding opponent equipment is too large and the computing capacity of the intelligent processing chip is limited, the unmanned aerial vehicle requests a ground server of the ground control station to perform the calculation, and the decision criterion returned by the ground server to the unmanned aerial vehicle is taken as the final collaborative decision criterion.
Step 204, carrying out feature fusion on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to obtain an initial real situation grid image set.
In the embodiment of the invention, the speed, position, altitude and type characteristic information of the equipment of both opposing parties is acquired, namely the opponent equipment characteristic data and the unmanned aerial vehicle equipment characteristic data; the speed, position, altitude and type characteristic information of equipment is also acquired from the ground transmission data, namely the ground data. The equipment characteristic information calculated by the unmanned aerial vehicle and the equipment characteristic information acquired from the ground data are then fused to obtain the initial real situation grid image set.
As shown in fig. 3, the environment situation corresponding to the initial real situation grid image set is composed of the characteristic information of each piece of equipment in the surrounding environment of the unmanned aerial vehicle, where the equipment information comes from the unmanned aerial vehicle acquisition data and the ground transmission data, and the characteristic information is typically a feature vector of an equipment attribute set, such as attribute set = {belligerent side, equipment type, speed, x coordinate, y coordinate, altitude}. The decision criteria may be maximum cost-effectiveness ratio, maximum task completion rate, maximum area coverage rate, maximum task planning time, maximum damage ratio between the two parties, maximum target damage rate, maximum self survival rate, and the like. The environment situation at a given moment may correspond to a plurality of decision criteria.
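As an illustration of the attribute-set feature vector described above, the following minimal Python sketch builds such a record; the field names and sample values are assumptions for illustration, not fixed by the embodiment:

```python
# Illustrative sketch of the attribute-set feature vector
# {belligerent side, equipment type, speed, x coordinate, y coordinate, altitude}.
# Field names and sample values are assumptions, not from the embodiment.
def make_equipment(side, etype, speed, x, y, altitude):
    return {"side": side, "type": etype, "speed": speed,
            "x": x, "y": y, "altitude": altitude}

own_uav = make_equipment("own", "uav", 35.0, 12, 40, 800.0)
opponent = make_equipment("opponent", "radar", 0.0, 30, 22, 0.0)
```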
Step 205, rasterizing the initial real situation grid image set to obtain a target real situation grid image set.
In the embodiment of the invention, as shown in fig. 4, the map near the unmanned aerial vehicle is gridded according to the calculated equipment characteristic information around the unmanned aerial vehicle, so that at most one piece of equipment occupies each cell. To display the targets of both parties and the unmanned aerial vehicle vividly, own equipment is marked black in its cell, opponent equipment is marked gray, the unmanned aerial vehicle is represented by a black triangle, and cells containing no target are marked white; the scattered black and gray marks are random targets. Unlike the RGB values stored in an ordinary image, the grid data stores all attribute states of the targets, such as speed, x coordinate and y coordinate. The final environment situation information is saved in gridded image form as the target real situation grid image set. The unmanned aerial vehicle detection data reflects local information while the ground data reflects global information; fusing the two and representing them with a rasterized image means that the target real situation grid image set combines the ground global information and the unmanned aerial vehicle local information, so the environment situation can be reflected more accurately.
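The gridding step can be sketched as follows; the grid dimensions, cell size and record layout are illustrative assumptions, and each cell stores the full attribute state of at most one piece of equipment rather than an RGB value:

```python
def rasterize(equipment_list, grid_size, cell_size):
    """Map each equipment record into a cell of a grid_size x grid_size grid;
    at most one piece of equipment per cell, and the cell keeps the full
    attribute state of the target instead of an RGB value."""
    grid = [[None for _ in range(grid_size)] for _ in range(grid_size)]
    for eq in equipment_list:
        col = int(eq["x"] // cell_size)
        row = int(eq["y"] // cell_size)
        if 0 <= row < grid_size and 0 <= col < grid_size and grid[row][col] is None:
            grid[row][col] = eq
    return grid

# One target at map position (25, 75) lands in cell (row 7, col 2).
grid = rasterize([{"x": 25, "y": 75, "speed": 30.0, "type": "uav"}],
                 grid_size=10, cell_size=10)
```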
Step 206, generating samples by adopting the up-sampling algorithm SMOTE according to the target real situation grid image set to obtain a first virtual situation grid image set.
Further, step 206 may include the following substeps S21-S26:
S21, respectively taking equipment in the target real situation grid image set as a first sample generation object to obtain a first grid image sample set.
S22, selecting a root sample of the synthesized sample from the first grid image sample set according to a preset selection standard.
S23, selecting a plurality of first sample generating objects with Euclidean distances between the first grid image sample set and the root sample within a preset distance range, and obtaining a neighbor auxiliary sample point set.
S24, selecting auxiliary sample points from the adjacent auxiliary sample point sets according to a preset up-sampling rate to obtain the auxiliary sample point sets.
S25, substituting the auxiliary sample points in the auxiliary sample point set into a preset sample formula respectively to update samples, and generating a virtual sample.
The preset sample formula is:
x_new = x_i + (x'_i − x_i) × rand(0, 1);
Wherein x_new is the first virtual situation grid image, x_i is the root sample, x'_i is the auxiliary sample point, and rand(0, 1) is a random function between 0 and 1.
S26, constructing a first virtual situation grid image set by adopting all the virtual samples.
The preset selection standard refers to a selection method for selecting a root sample, which is set in advance based on actual needs.
The preset distance range refers to a selection range constructed based on the preset number k of neighbour auxiliary sample points: after the Euclidean distance between each first sample generation object and the root sample is calculated, all first sample generation objects are sorted by that distance from small to large, and the first k of them are selected to construct the neighbour auxiliary sample point set. The number k of neighbour auxiliary sample points is typically an odd number, e.g. k = 5.
The preset up-sampling rate N typically takes values such as N = 100%, 200% or 300%.
In the embodiment of the invention, each piece of equipment in the target real situation grid image set is taken as a first sample generation object, where each first sample generation object comprises equipment characteristic data, giving the equipment set E = {x_1, x_2, …, x_n}, namely the first grid image sample set. A sample x_i is selected from the first grid image sample set E = {x_1, x_2, …, x_n} as the root sample of the synthesized sample, and the k (generally an odd number, such as k = 5) nearest neighbour auxiliary sample points of x_i are calculated in the first grid image sample set based on Euclidean distance to obtain the neighbour auxiliary sample point set.
A preset up-sampling rate N is set, and the corresponding number of auxiliary sample points x'_i is randomly selected from the k neighbour auxiliary sample points according to the up-sampling rate N to obtain the auxiliary sample point set. For each randomly selected auxiliary sample point, a new sample point is generated by adopting the preset sample formula, and all real equipment data are traversed to obtain the first virtual situation grid image set.
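Sub-steps S21-S26 can be sketched as follows; the deterministic `rng` hook and the tuple feature vectors are assumptions added to make the example self-contained and reproducible:

```python
import math

def smote_generate(samples, root_idx, k, n_new, rng):
    """SMOTE per sub-steps S21-S26: take x_i as the root sample, find its k
    nearest neighbours by Euclidean distance, then synthesize
    x_new = x_i + (x'_i - x_i) * rand(0, 1)."""
    root = samples[root_idx]
    others = [s for i, s in enumerate(samples) if i != root_idx]
    neighbours = sorted(others, key=lambda s: math.dist(s, root))[:k]
    virtual = []
    for _ in range(n_new):
        # draw an auxiliary sample point x'_i from the neighbour set
        aux = neighbours[int(rng() * len(neighbours)) % len(neighbours)]
        r = rng()  # rand(0, 1) in the preset sample formula
        virtual.append(tuple(x + (a - x) * r for x, a in zip(root, aux)))
    return virtual

# With a fixed rng, the synthesized point lies on the root-neighbour segment.
points = smote_generate([(0.0, 0.0), (1.0, 0.0), (10.0, 10.0)],
                        root_idx=0, k=1, n_new=1, rng=lambda: 0.5)
```

Note that, as illustrated later by fig. 7, every synthesized point lies on the line segment connecting the root sample and its chosen neighbour.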
Step 207, generating samples by adopting the information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set.
Further, step 207 may include the following substeps S31-S39:
S31, respectively taking equipment in the target real situation grid image set as a second sample generation object to obtain a second grid image sample set.
S32, respectively selecting a maximum value and a minimum value corresponding to each equipment characteristic in the second grid image sample set, and obtaining a characteristic maximum value and a characteristic minimum value.
S33, calculating an average value between the feature maximum value and the feature minimum value to obtain a feature average value.
S34, calculating the number of equipment smaller than the characteristic mean value in the second grid image sample set to obtain the first number of equipment.
S35, calculating the number of equipment larger than the characteristic mean value in the second grid image sample set to obtain the second number of equipment.
S36, substituting the first equipment number, the second equipment number and the characteristic mean value into a preset boundary value calculation formula to calculate the boundary value, and obtaining an upper boundary value and a lower boundary value corresponding to the equipment characteristic.
The preset boundary value calculation formula is as follows:
UB = CL + Skew_U × sqrt(−2 × (ŝ²_x / N_U) × ln(10⁻²⁰));
LB = CL − Skew_L × sqrt(−2 × (ŝ²_x / N_L) × ln(10⁻²⁰));
Skew_U = N_U / (N_L + N_U); Skew_L = N_L / (N_L + N_U);
ŝ²_x = Σ_{j=1..N} (x_j − x̄)² / (N − 1);
Wherein UB is the upper boundary value, CL is the characteristic mean value, Skew_U is the right skewness based on the characteristic mean value, ŝ²_x is the equipment variance corresponding to the second grid image sample set, N_U is the second equipment number, LB is the lower boundary value, Skew_L is the left skewness based on the characteristic mean value, N_L is the first equipment number, x_j is an observation value, x̄ is the mean value, and N is the number of equipment corresponding to the second grid image sample set.
S37, substituting the upper boundary value and the lower boundary value into a preset triangular membership function value calculation formula to calculate, and obtaining the triangular membership function value corresponding to the equipment characteristic.
The preset triangular membership function value calculation formula is as follows:
MF(x′) = (x′ − LB) / (CL − LB), when LB ≤ x′ ≤ CL;
MF(x′) = (UB − x′) / (UB − CL), when CL < x′ ≤ UB;
MF(x′) = 0, otherwise;
Wherein MF is the triangular membership function value, x′ is the virtual sample, LB is the lower boundary value, CL is the characteristic mean value, and UB is the upper boundary value.
S38, taking the virtual sample whose triangular membership function value is larger than the preset function value as an equipment virtual sample.
S39, constructing a second virtual situation grid image set by adopting all equipment virtual samples.
In the embodiment of the invention, each piece of equipment in the target real situation grid image set is taken as a second sample generation object to obtain the second grid image sample set. For each equipment characteristic in the second grid image sample set, the corresponding maximum and minimum values are selected to obtain the characteristic maximum value max and the characteristic minimum value min. The average of the characteristic maximum and minimum values is calculated to obtain the characteristic mean value CL = (min + max) / 2. The number of equipment whose value for that characteristic is smaller than CL is counted to obtain the first equipment number N_L, and the number of equipment whose value is greater than CL is counted to obtain the second equipment number N_U. The first equipment number, the second equipment number and the characteristic mean value are substituted into the preset boundary value calculation formula to obtain the upper and lower boundary values corresponding to the equipment characteristic. A virtual sample is then generated according to the upper and lower boundary values: they are substituted into the preset triangular membership function value calculation formula to obtain the triangular membership function value MF corresponding to the equipment characteristic, which is used to accept or reject the generated virtual sample.
A value r, the preset function value, is generated randomly and uniformly on [0, 1]; if MF(x′) > r, the generated sample is qualified and an equipment virtual sample is obtained, otherwise the sample is discarded. After the equipment virtual samples corresponding to all equipment characteristics are calculated, all equipment virtual samples corresponding to the current moment are adopted to construct the second virtual situation grid image set.
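Sub-steps S32-S38 can be sketched as follows. Since the boundary value formula appears only as an image in the source, the standard mega-trend diffusion form is assumed here, including its conventional 10⁻²⁰ diffusion constant:

```python
import math

def mtd_bounds(values):
    """Sub-steps S32-S36 in the standard mega-trend diffusion form (assumed,
    as the embodiment's formula image is not reproduced in the text)."""
    lo, hi = min(values), max(values)
    cl = (lo + hi) / 2.0                      # characteristic mean value CL
    n_l = sum(1 for v in values if v < cl)    # first equipment number N_L
    n_u = sum(1 for v in values if v > cl)    # second equipment number N_U
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    skew_u = n_u / (n_l + n_u)                # right skewness about CL
    skew_l = n_l / (n_l + n_u)                # left skewness about CL
    ub = cl + skew_u * math.sqrt(-2.0 * (var / n_u) * math.log(1e-20))
    lb = cl - skew_l * math.sqrt(-2.0 * (var / n_l) * math.log(1e-20))
    return lb, cl, ub

def triangular_mf(x, lb, cl, ub):
    """Sub-step S37: triangular membership value of a candidate virtual sample."""
    if lb <= x <= cl:
        return (x - lb) / (cl - lb)
    if cl < x <= ub:
        return (ub - x) / (ub - cl)
    return 0.0
```

A candidate x′ drawn from [LB, UB] is then kept when `triangular_mf(x', lb, cl, ub)` exceeds the uniform random value r, as in sub-step S38.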
And step 208, constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set.
In the embodiment of the invention, the up-sampling algorithm SMOTE, applied without considering sample labels, and the information diffusion method MTD, likewise applied without considering sample labels, are adopted to obtain the virtual situation grid images corresponding to the target real situation grid image set, and all virtual situation grid images, namely the first virtual situation grid image set and the second virtual situation grid image set, are adopted to construct the virtual situation grid image set.
Step 209, selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set, and constructing an environment situation grid image.
Further, step 209 may include the following substeps S41-S44:
S41, selecting the equipment types from the target real situation grid image set according to the preset equipment type selection standard to obtain the equipment types.
S42, selecting a plurality of supplementary samples from the virtual situation grid image set according to the equipment types and the preset selection quantity, obtaining supplementary sample sets corresponding to the equipment types, and counting the number of sample supplementary times in real time.
And S43, when the sample supplementing times are smaller than a preset times threshold, jumping to execute the step of selecting equipment types from the target real situation grid image set according to the preset equipment type selection standard to obtain the equipment types.
And S44, when the sample supplementing times are equal to a preset times threshold, all the corresponding supplementing sample sets at the current time are adopted to construct an environment situation grid image.
The preset equipment type selection standard refers to randomly selecting equipment types in the target real situation grid image set.
The preset selection number n refers to the number of supplementary samples drawn in each selection.
The preset frequency threshold m refers to a critical value corresponding to the selected frequency.
In the embodiment of the invention, an equipment type E_i is randomly selected from the equipment set E = {E_1, E_2, …, E_j} corresponding to the target real situation grid image set, and n virtual samples are randomly selected, according to the equipment type E_i, from the virtual samples generated by SMOTE and MTD to serve as supplementary samples for equipment E_i. This step is repeated m times, selecting a different equipment type each time, to obtain a supplementary sample set containing n × m pieces of equipment characteristic information, with the number of sample supplement operations counted in real time. When the sample supplement count is smaller than the preset count threshold, execution jumps back to the step of selecting an equipment type from the target real situation grid image set according to the preset equipment type selection standard. When the sample supplement count equals the preset count threshold, all supplementary sample sets corresponding to the current moment are adopted to construct the environment situation grid image.
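Sub-steps S41-S44 can be sketched as follows; the pool layout keyed by equipment type is an illustrative assumption, and unlike the text this sketch does not enforce that a different type is drawn on every repetition:

```python
import random

def supplement(real_types, virtual_pool, n, m, rng):
    """Sub-steps S41-S44: repeat m times; each time randomly pick an equipment
    type from the real set and draw n virtual samples of that type from the
    SMOTE/MTD pool, yielding n x m supplementary feature records."""
    supplement_set = []
    for _ in range(m):  # the sample-supplement counter runs up to the threshold m
        etype = rng.choice(real_types)
        supplement_set.extend(rng.sample(virtual_pool[etype], n))
    return supplement_set

pool = {"uav": list(range(10)), "radar": list(range(10, 20))}
chosen = supplement(["uav", "radar"], pool, n=2, m=3, rng=random.Random(0))
```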
And 210, inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
Further, step 210 may include the following substeps S51-S53:
S51, expert labeling is carried out on the environment situation grid image according to a preset label, and a category label corresponding to the environment situation grid image is constructed.
S52, taking the environmental situation grid image as input, taking the class label as output, and carrying out model training on a preset multi-class neural network model to obtain a multi-collaborative decision criterion classification model.
S53, inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
The preset labels may be one or more of maximum cost-effectiveness ratio, maximum task completion rate, maximum area coverage rate, maximum task planning time, maximum damage ratio between the two parties, maximum target damage rate and maximum self survival rate.
In the embodiment of the invention, expert labeling is carried out on the real and virtual environment situation grid image sample data sets to obtain the category labels of the environment situation grid images; that is, the environment situation grid images are expert-labeled according to the preset labels to construct the category labels corresponding to the environment situation grid images. The environment situation grid images are then taken as input and the category labels as output, and model training is carried out on the preset multi-classification neural network model to obtain the multi-collaborative decision criterion classification model.
Specifically, as shown in fig. 6, the speed, position and altitude characteristic information of the equipment of both parties in the data acquired by the unmanned aerial vehicle, together with the ground data sent by the ground control station, is first acquired, fused and calculated to obtain grid images representing the environment situation, giving the target real situation grid image set. Because real environment situation data is very limited, virtual environment situation grid images, namely the virtual situation grid image set, are generated from the small amount of real environment situation data by adopting the SMOTE method and the MTD method, neither of which considers sample labels. Expert labeling is then carried out on the environment situation grid images to obtain their category labels. After the environment situation grid images and category labels are obtained, a preset multi-classification neural network model is established; the real and virtual environment situation grid images are taken as the input of the neural network model and the category labels as the output, and the neural network model is trained to obtain the multi-collaborative decision criterion classification model. The environment situation grid image is then input into the multi-collaborative decision criterion classification model to obtain the collaborative decision criteria corresponding to the unmanned aerial vehicle.
In the model training and testing process, the real and virtual environment situation grid images, namely the environment situation grid images, are divided into a training set, a verification set and a test set. The training set is used to train the neural network model, and the verification set data is used to verify the classification precision of the multiple collaborative decision criteria: when the classification precision reaches the set value, the trained model is taken as the multi-collaborative decision criterion classification model; otherwise, the model architecture, parameters and the like are adjusted and training continues until the set precision is reached. After the final multi-collaborative decision criterion classification model is obtained, the generalization capability and precision of the model are evaluated using the test set data.
In the model training process, the loss function is the cross entropy, and the optimizer adopts Adam for back propagation to update the network model parameters; by setting different mini-batch sizes, epochs, learning rates and other parameters, a model with higher accuracy is obtained through continuous optimization.
Preferably, the model is evaluated with a multi-label evaluation index for the samples, namely the F1 value, as shown in the following formula:
F1 = 2 × Precision × Recall / (Precision + Recall);
Precision = (1/p) × Σ_{i=1..p} |Y_i ∩ h(x_i)| / |h(x_i)|;
Recall = (1/p) × Σ_{i=1..p} |Y_i ∩ h(x_i)| / |Y_i|;
Wherein F1 is the index value, Precision is the precision rate, Recall is the recall rate, p is the number of samples, Y_i is the true label set associated with sample x_i, and h(x_i) is the label set returned for x_i by the multi-label classifier.
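A common example-based multi-label reading of this F1 evaluation can be sketched as follows; it is an assumed formulation, since the formula itself appears only as an image in the source:

```python
def multilabel_f1(true_sets, pred_sets):
    """Example-based multi-label precision, recall and F1 over p samples,
    where Y_i is the true label set and h(x_i) the predicted label set."""
    p = len(true_sets)
    precision = sum(len(y & h) / len(h) for y, h in zip(true_sets, pred_sets)) / p
    recall = sum(len(y & h) / len(y) for y, h in zip(true_sets, pred_sets)) / p
    return 2 * precision * recall / (precision + recall)
```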
Fig. 7 is a schematic image of the first virtual situation grid image set generated by the up-sampling algorithm according to the second embodiment of the present invention. As shown in fig. 7, the black dots are real situation data samples and the gray triangles are generated virtual situation data samples; it can be seen that the samples generated with SMOTE all lie on the line connecting two known samples.
Fig. 8 is a schematic image of the second virtual situation grid image set generated by the information diffusion method according to the second embodiment of the present invention. As shown in fig. 8, the black dots are real situation data samples and the gray triangles are generated virtual situation data samples; it can be seen that the distribution of samples generated with MTD does not necessarily lie between two samples.
As shown in fig. 5, the preset multi-classification neural network model includes an input layer, convolution layers, pooling layers, fully connected layers and the like. The input layer receives the environment situation grid image; the convolution layers read local information of the image-like environment situation data, each containing a plurality of convolution kernels of the same size; a maximum pooling layer follows each convolution layer to reduce parameters and prevent overfitting; a linear layer is added before the fully connected layers to reduce the data dimension; finally, the fully connected layers map the extracted features onto physical features. The last fully connected layer uses the activation function softmax to output multiple collaborative decision criteria, such as maximum task completion rate, maximum area coverage rate and maximum cost-effectiveness ratio, and the number of output decision criteria is determined by comparing the predicted probability of each decision criterion with a set threshold value.
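The thresholded softmax output stage described above can be sketched as follows; the logit values, label names and threshold are illustrative assumptions:

```python
import math

def select_criteria(logits, labels, threshold):
    """Softmax over the last fully connected layer's outputs, then keep every
    decision criterion whose predicted probability exceeds the set threshold,
    so several criteria can be output for one environment situation."""
    exps = [math.exp(v) for v in logits]
    total = sum(exps)
    return [lab for lab, e in zip(labels, exps) if e / total > threshold]

criteria = select_criteria(
    [2.0, 2.0, -5.0],
    ["maximum task completion rate", "maximum area coverage rate",
     "maximum cost-effectiveness ratio"],
    threshold=0.3)
```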
In the embodiment of the invention, as shown in fig. 9, during the flight of the unmanned aerial vehicle the decision criterion generation program is invoked at timed intervals according to the time parameters set in the timing task calling module. First, whether the unmanned aerial vehicle is authorized is checked. Before flight, decision criteria are preset in the intelligent processing chip loaded in the unmanned aerial vehicle, so that the unmanned aerial vehicle can make a decision according to these criteria when it encounters an emergency. If the unmanned aerial vehicle is not authorized before flight, it does not communicate with the ground even when an emergency is encountered during flight, and continues to execute its tasks according to the preset decision criteria. If the unmanned aerial vehicle is authorized before flight, it must communicate with the ground in real time and make collaborative decisions during flight. Next, the emergency status of the task being executed is checked: if it is an emergency, the unmanned aerial vehicle continues to complete the task, and the decision criterion at this moment is the maximum task completion rate; it should be noted that the unmanned aerial vehicle can still use the collaborative decision criterion generation method when performing task allocation. If the emergency status of the task being executed is non-emergency, the unmanned aerial vehicle generates a decision criterion after comprehensively evaluating the environment situation.
When the above conditions are satisfied, the unmanned aerial vehicle can generate a collaborative decision criterion together with the ground. First, the unmanned aerial vehicle calculates the types and numbers of surrounding equipment from the collected surrounding image information. If the calculated number of surrounding opponent equipment is too large and the computing capacity of the intelligent processing chip is limited, the unmanned aerial vehicle requests the ground server to perform the calculation, and the ground server returns the decision criterion to the unmanned aerial vehicle as the final decision criterion. If the calculated number of surrounding opponent equipment is not large, the unmanned aerial vehicle autonomously generates the decision criterion: it calculates the speed, position, altitude and type characteristic information of the equipment of both opposing parties from the collected environment image data, acquires the speed, position, altitude and type characteristic information of equipment from the ground transmission data, fuses the equipment characteristic information calculated by the unmanned aerial vehicle with that acquired from the ground data to obtain the final environment situation information, and stores it as a grid image in gridded image form. Finally, the calculated grid image is input into the constructed multi-collaborative decision criterion classification model to generate multi-collaborative decision criteria conforming to the current environment situation.
Referring to fig. 10, fig. 10 is a block diagram illustrating a collaborative decision criterion generating system for an unmanned aerial vehicle according to a third embodiment of the present invention.
The third embodiment of the invention provides an unmanned aerial vehicle collaborative decision criterion generation system, which comprises:
the task emergency data obtaining module 1001 is configured to obtain task data of the unmanned aerial vehicle and corresponding task emergency data when receiving the decision criterion trigger data.
And the opponent equipment number and opponent equipment characteristic data obtaining module 1002 is configured to carry out opponent equipment identification on the surrounding environment image data corresponding to the unmanned aerial vehicle when the task emergency data is non-emergency, so as to obtain the number of opponent equipment and the opponent equipment characteristic data.
And the unmanned aerial vehicle equipment characteristic data and ground data extraction module 1003 is used for extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by the ground control station when the number of opponent equipment is smaller than a preset value.
The environmental situation grid image obtaining module 1004 is configured to image-rasterize opponent equipment feature data, unmanned aerial vehicle equipment feature data and ground data to construct an environmental situation grid image.
The collaborative decision criterion obtaining module 1005 is configured to input the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
Optionally, the task emergency data obtaining module 1001 includes:
and the unmanned aerial vehicle authorization condition data acquisition module is used for acquiring unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle.
And the task emergency data acquisition sub-module is used for carrying out emergency judgment on the task data of the unmanned aerial vehicle when the unmanned aerial vehicle authorization condition data is authorization, so as to obtain the task emergency data.
Optionally, the environmental situation grid image obtaining module 1004 includes:
the initial real situation grid image set obtaining module is used for carrying out feature fusion on opponent equipment feature data, unmanned aerial vehicle equipment feature data and ground data to obtain an initial real situation grid image set.
The target real situation grid image set obtaining module is used for rasterizing the initial real situation grid image set to obtain a target real situation grid image set.
The first virtual situation grid image set obtaining module is used for generating a sample by adopting an upsampling algorithm SMOTE according to the target real situation grid image set to obtain the first virtual situation grid image set.
The second virtual situation grid image set obtaining module is used for generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain the second virtual situation grid image set.
The virtual situation grid image set construction module is used for constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set.
The environment situation grid image obtaining sub-module is used for selecting grid image samples from the target real situation grid image set and the virtual situation grid image set and constructing an environment situation grid image.
Optionally, the first virtual situation grid image set obtaining module may perform the following steps:
respectively taking each piece of equipment in the target real situation grid image set as a first sample generation object to obtain a first grid image sample set;
selecting a root sample for sample synthesis from the first grid image sample set according to a preset selection standard;
selecting, from the first grid image sample set, a plurality of first sample generation objects whose Euclidean distances to the root sample fall within a preset distance range, to obtain a neighbor auxiliary sample point set;
selecting auxiliary sample points from the neighbor auxiliary sample point set according to a preset up-sampling rate to obtain an auxiliary sample point set;
substituting each auxiliary sample point in the auxiliary sample point set into a preset sample formula to update samples, thereby generating virtual samples;
The preset sample formula is:
x new = x i + (x′ i − x i) × rand(0, 1);
Wherein x new is a first virtual situation grid image, x i is a root sample, x′ i is an auxiliary sample point, and rand(0, 1) is a random number drawn uniformly between 0 and 1;
and constructing a first virtual situation grid image set by adopting all the virtual samples.
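The interpolation step above can be sketched in Python as follows. The function name, list-based feature layout, and the way neighbors are passed in are illustrative assumptions; the patent only specifies the interpolation formula x new = x i + (x′ i − x i) × rand(0, 1).

```python
import random

def generate_virtual_samples(root, neighbors, n_samples):
    """SMOTE-style interpolation: x_new = x_i + (x'_i - x_i) * rand(0, 1).

    `root` is the root sample; `neighbors` are the pre-selected auxiliary
    sample points whose Euclidean distance to the root lies in the preset range.
    """
    samples = []
    for _ in range(n_samples):
        aux = random.choice(neighbors)      # auxiliary sample point x'_i
        r = random.random()                 # rand(0, 1)
        samples.append([xi + (xa - xi) * r for xi, xa in zip(root, aux)])
    return samples
```

Each generated feature lies on the segment between the root sample and the chosen auxiliary point, so the virtual samples stay inside the convex hull of the originals.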
Optionally, the second virtual situation grid image set obtaining module may perform the steps of:
Respectively taking the equipment in the target real situation grid image set as a second sample generation object to obtain a second grid image sample set;
Respectively selecting a maximum value and a minimum value corresponding to each equipment characteristic in the second grid image sample set to obtain a characteristic maximum value and a characteristic minimum value;
Calculating an average value between the characteristic maximum value and the characteristic minimum value to obtain a characteristic average value;
calculating the number of pieces of equipment whose characteristic value is smaller than the characteristic mean value in the second grid image sample set to obtain the first equipment number;
calculating the number of pieces of equipment whose characteristic value is larger than the characteristic mean value in the second grid image sample set to obtain the second equipment number;
substituting the first equipment number, the second equipment number and the characteristic mean value into a preset boundary value calculation formula to calculate a boundary value, so as to obtain an upper boundary value and a lower boundary value corresponding to the equipment characteristic;
The preset boundary value calculation formula is as follows:
UB = CL + Skew U × √(−2 × (s²/N U) × ln(10⁻²⁰));
LB = CL − Skew L × √(−2 × (s²/N L) × ln(10⁻²⁰));
Skew U = N U/(N L + N U); Skew L = N L/(N L + N U); s² = Σ j (x j − x̄)²/(n − 1);
Wherein UB is the upper boundary value, CL is the characteristic mean value, Skew U is the right skewness based on the characteristic mean value, s² is the equipment variance corresponding to the second grid image sample set, N U is the second equipment number, LB is the lower boundary value, Skew L is the left skewness based on the characteristic mean value, N L is the first equipment number, x j is an observation value, x̄ is the mean value, and n is the number of equipment corresponding to the second grid image sample set;
Substituting the upper boundary value and the lower boundary value into a preset triangular membership function value calculation formula to obtain the triangular membership function value corresponding to the equipment characteristic, wherein the preset triangular membership function value calculation formula is as follows:
MF = (x′ − LB)/(CL − LB), when LB ≤ x′ ≤ CL;
MF = (UB − x′)/(UB − CL), when CL < x′ ≤ UB;
MF = 0, otherwise;
Wherein MF is the triangular membership function value, x′ is a virtual sample, LB is the lower boundary value, CL is the characteristic mean value, and UB is the upper boundary value;
Taking virtual samples whose triangular membership function value is larger than the preset function value as equipment virtual samples;
and constructing a second virtual situation grid image set by adopting all the equipment virtual samples.
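Assuming the MTD step follows the standard mega-trend-diffusion construction (the boundary constant ln(10⁻²⁰) and the exact boundary formula are assumptions drawn from that literature, since the patent's formula images were not preserved), the boundary values and triangular membership can be sketched as:

```python
import math

def mtd_boundaries(values):
    """Mega-trend-diffusion boundaries around the mid-range value CL (illustrative)."""
    cl = (max(values) + min(values)) / 2.0          # characteristic mean value
    n_l = sum(1 for v in values if v < cl)          # first equipment number
    n_u = sum(1 for v in values if v > cl)          # second equipment number
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)  # s^2
    skew_u = n_u / (n_l + n_u)                      # right skewness
    skew_l = n_l / (n_l + n_u)                      # left skewness
    ub = cl + skew_u * math.sqrt(-2.0 * (var / n_u) * math.log(1e-20))
    lb = cl - skew_l * math.sqrt(-2.0 * (var / n_l) * math.log(1e-20))
    return lb, cl, ub

def triangular_mf(x, lb, cl, ub):
    """Triangular membership value of a candidate virtual sample x."""
    if lb <= x <= cl:
        return (x - lb) / (cl - lb)
    if cl < x <= ub:
        return (ub - x) / (ub - cl)
    return 0.0
```

Candidate virtual samples whose membership exceeds the preset function value would then be kept as equipment virtual samples.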
Optionally, the environmental situation grid image obtaining sub-module may perform the following steps:
selecting an equipment type from the target real situation grid image set according to a preset equipment type selection standard to obtain the equipment type;
selecting a plurality of supplementary samples from the virtual situation grid image set according to the equipment type and a preset selection quantity, obtaining a supplementary sample set corresponding to the equipment type, and counting the sample supplementing times in real time;
when the sample supplementing times are smaller than a preset times threshold, jumping back to the step of selecting an equipment type from the target real situation grid image set according to the preset equipment type selection standard;
when the sample supplementing times equal the preset times threshold, constructing the environment situation grid image by adopting all the supplementary sample sets obtained so far.
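The supplement loop above can be sketched as follows; `pick_type`, the dictionary-based sample layout, and the round bookkeeping are illustrative assumptions, not details from the patent:

```python
def build_environment_grid_image(real_set, virtual_set, pick_type, n_per_round, max_rounds):
    """Loop structure of the sample-supplement step: each round selects an
    equipment type, draws supplementary samples for it, and counts rounds
    until the preset times threshold is reached."""
    supplements = []
    rounds = 0
    while rounds < max_rounds:              # sample supplementing times < threshold
        equip_type = pick_type(real_set)    # preset equipment type selection standard
        chosen = [s for s in virtual_set if s["type"] == equip_type][:n_per_round]
        supplements.append((equip_type, chosen))
        rounds += 1
    return supplements                      # all supplementary sample sets so far
```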
Optionally, the collaborative decision criterion derivation module 1005 includes:
the category label obtaining module is used for performing expert labeling on the environmental situation grid image according to preset labels and constructing the category label corresponding to the environmental situation grid image.
The multi-collaborative decision criterion classification model obtaining module is used for taking the environmental situation grid image as input, taking the class label as output, and carrying out model training on a preset multi-classification neural network model to obtain the multi-collaborative decision criterion classification model.
The collaborative decision criterion obtaining submodule is used for inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain collaborative decision criterion corresponding to the unmanned aerial vehicle.
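As a minimal stand-in for the preset multi-classification neural network model, a single-layer softmax classifier trained on flattened grid images with expert category labels illustrates the input/label pairing described above; all function names, hyperparameters, and the model choice itself are assumptions, not the patent's implementation:

```python
import math

def softmax(z):
    """Numerically stable softmax over a list of scores."""
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def train_classifier(images, labels, n_classes, epochs=200, lr=0.5):
    """Train a single-layer softmax model: flattened grid image -> class label."""
    dim = len(images[0])
    w = [[0.0] * dim for _ in range(n_classes)]
    for _ in range(epochs):
        for x, y in zip(images, labels):
            p = softmax([sum(wi * xi for wi, xi in zip(row, x)) for row in w])
            for c in range(n_classes):
                grad = p[c] - (1.0 if c == y else 0.0)   # cross-entropy gradient
                for j in range(dim):
                    w[c][j] -= lr * grad * x[j]
    return w

def predict(w, x):
    """Return the class index with the highest score for image x."""
    scores = [sum(wi * xi for wi, xi in zip(row, x)) for row in w]
    return max(range(len(scores)), key=lambda c: scores[c])
```

At inference time, `predict` plays the role of mapping a new environmental situation grid image to its collaborative decision criterion class.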
The embodiment of the invention also provides the electronic equipment, which comprises a memory and a processor, wherein the memory stores a computer program, and when the computer program is executed by the processor, the processor executes the unmanned aerial vehicle collaborative decision criterion generating method according to any embodiment.
The memory may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. The memory has memory space for program code to perform any of the method steps described above. For example, the memory space for the program code may include individual program code for implementing the various steps in the above method, respectively. The program code can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card or a floppy disk. The program code may be compressed, for example, in a suitable form. The codes, when executed by a computing processing device, cause the computing processing device to perform the steps in the unmanned aerial vehicle collaborative decision criterion generation method described above.
The embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the unmanned aerial vehicle collaborative decision-making criterion generating method according to any of the above embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of units is merely a logical functional division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via some interfaces, devices or units, and may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer apparatus (which may be a personal computer, a server, or a network apparatus, etc.) to perform all or part of the steps of the method of the embodiments of the present invention. The storage medium includes a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing embodiments are merely for illustrating the technical solution of the present invention, but not for limiting the same, and although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that modifications may be made to the technical solution described in the foregoing embodiments or equivalents may be substituted for parts of the technical features thereof, and that such modifications or substitutions do not depart from the spirit and scope of the technical solution of the embodiments of the present invention in essence.
Claims (9)
1. The unmanned aerial vehicle collaborative decision criterion generation method is characterized by comprising the following steps of:
when the decision criterion triggering data is received, acquiring task data of the unmanned aerial vehicle and corresponding task emergency data;
When the task emergency data is not emergency, carrying out opponent equipment identification on the surrounding environment image data corresponding to the unmanned aerial vehicle to obtain opponent equipment quantity and opponent equipment characteristic data;
When the number of the opponent equipment is smaller than a preset value, extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by a ground control station;
performing image rasterization on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image;
Inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain a collaborative decision criterion corresponding to the unmanned aerial vehicle;
The step of performing image rasterization on the opponent equipment feature data, the unmanned aerial vehicle equipment feature data and the ground data to construct an environment situation grid image comprises the following steps:
performing feature fusion on the opponent equipment feature data, the unmanned aerial vehicle equipment feature data and the ground data to obtain an initial real situation grid image set;
Rasterizing the initial real situation grid image set to obtain a target real situation grid image set;
According to the target real situation grid image set, an up-sampling algorithm SMOTE is adopted to generate a sample, and a first virtual situation grid image set is obtained;
generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set;
constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set;
and selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set, and constructing an environment situation grid image.
2. The unmanned aerial vehicle collaborative decision-making criterion generating method according to claim 1, wherein the step of acquiring task data and corresponding task emergency data of an unmanned aerial vehicle comprises:
Acquiring unmanned aerial vehicle authorization condition data corresponding to the unmanned aerial vehicle;
and when the unmanned aerial vehicle authorization condition data is authorization, carrying out emergency judgment on the task data of the unmanned aerial vehicle to obtain task emergency data.
3. The method for generating the collaborative decision-making criterion of the unmanned aerial vehicle according to claim 1, wherein the step of generating the sample by using an upsampling algorithm SMOTE according to the target real situation grid image set to obtain the first virtual situation grid image set comprises the steps of:
respectively taking the equipment in the target real situation grid image set as a first sample generation object to obtain a first grid image sample set;
Selecting a root sample of a synthesized sample from the first grid image sample set according to a preset selection standard;
selecting, from the first grid image sample set, a plurality of first sample generation objects whose Euclidean distances to the root sample fall within a preset distance range, to obtain a neighbor auxiliary sample point set;
Selecting auxiliary sample points from the adjacent auxiliary sample point sets according to a preset up-sampling rate to obtain auxiliary sample point sets;
Substituting the auxiliary sample points in the auxiliary sample point set into a preset sample formula respectively to update samples, and generating a virtual sample;
The preset sample formula is:
x new = x i + (x′ i − x i) × rand(0, 1);
Wherein x new is a first virtual situation grid image, x i is a root sample, x' i is an auxiliary sample point, and rand (0, 1) is a random function between 0 and 1;
And constructing a first virtual situation grid image set by adopting all the virtual samples.
4. The method for generating the collaborative decision-making criterion of the unmanned aerial vehicle according to claim 1, wherein the step of generating the sample by adopting the MTD according to the target real situation grid image set to obtain the second virtual situation grid image set comprises the following steps:
respectively taking the equipment in the target real situation grid image set as a second sample generation object to obtain a second grid image sample set;
Respectively selecting a maximum value and a minimum value corresponding to each equipment characteristic in the second grid image sample set to obtain a characteristic maximum value and a characteristic minimum value;
calculating an average value between the characteristic maximum value and the characteristic minimum value to obtain a characteristic average value;
calculating the number of pieces of equipment whose characteristic value is smaller than the characteristic mean value in the second grid image sample set to obtain a first equipment number;
calculating the number of pieces of equipment whose characteristic value is larger than the characteristic mean value in the second grid image sample set to obtain a second equipment number;
Substituting the first equipment number, the second equipment number and the characteristic mean value into a preset boundary value calculation formula to calculate a boundary value, so as to obtain an upper boundary value and a lower boundary value corresponding to the equipment characteristic;
the preset boundary value calculation formula is as follows:
UB = CL + Skew U × √(−2 × (s²/N U) × ln(10⁻²⁰));
LB = CL − Skew L × √(−2 × (s²/N L) × ln(10⁻²⁰));
Skew U = N U/(N L + N U); Skew L = N L/(N L + N U); s² = Σ j (x j − x̄)²/(n − 1);
Wherein UB is an upper boundary value, CL is a characteristic mean value, Skew U is a right skewness based on the characteristic mean value, s² is the equipment variance corresponding to the second grid image sample set, N U is the second equipment number, LB is a lower boundary value, Skew L is a left skewness based on the characteristic mean value, N L is the first equipment number, x j is an observation value, x̄ is the mean value, and n is the number of equipment corresponding to the second grid image sample set;
Substituting the upper boundary value and the lower boundary value into a preset triangular membership function value calculation formula to obtain a triangular membership function value corresponding to the equipment characteristic;
the preset triangular membership function value calculation formula is as follows:
MF = (x′ − LB)/(CL − LB), when LB ≤ x′ ≤ CL;
MF = (UB − x′)/(UB − CL), when CL < x′ ≤ UB;
MF = 0, otherwise;
wherein MF is the triangular membership function value, x′ is a virtual sample, LB is the lower boundary value, CL is the characteristic mean value, and UB is the upper boundary value;
taking virtual samples whose triangular membership function value is larger than the preset function value as equipment virtual samples;
and constructing a second virtual situation grid image set by adopting all the equipment virtual samples.
5. The unmanned aerial vehicle collaborative decision-making criterion generating method according to claim 1, wherein the step of selecting a grid image sample from the target real situation grid image set and the virtual situation grid image set to construct an environmental situation grid image comprises:
selecting an equipment type from the target real situation grid image set according to a preset equipment type selection standard to obtain the equipment type;
selecting a plurality of supplementary samples from the virtual situation grid image set according to the equipment type and a preset selection quantity, obtaining a supplementary sample set corresponding to the equipment type, and counting the sample supplementing times in real time;
when the sample supplementing times are smaller than a preset times threshold, jumping back to the step of selecting an equipment type from the target real situation grid image set according to the preset equipment type selection standard;
when the sample supplementing times equal the preset times threshold, constructing the environment situation grid image by adopting all the supplementary sample sets obtained so far.
6. The method for generating a collaborative decision-making criterion for an unmanned aerial vehicle according to claim 1, wherein the step of inputting the environmental situation grid image into a multi-collaborative decision-making criterion classification model to obtain the collaborative decision-making criterion corresponding to the unmanned aerial vehicle comprises the steps of:
expert labeling is carried out on the environment situation grid image according to a preset label, and a category label corresponding to the environment situation grid image is obtained;
taking the environment situation grid image as input, taking the class label as output, and carrying out model training on a preset multi-classification neural network model to obtain a multi-collaborative decision criterion classification model;
And inputting the environmental situation grid image into the multi-collaborative decision criterion classification model to obtain collaborative decision criteria corresponding to the unmanned aerial vehicle.
7. An unmanned aerial vehicle collaborative decision-making criterion generation system, comprising:
The task emergency data acquisition module is used for acquiring task data of the unmanned aerial vehicle and corresponding task emergency data when receiving the decision criterion trigger data;
the opponent equipment quantity and opponent equipment characteristic data obtaining module is used for identifying opponent equipment according to the surrounding environment image data corresponding to the unmanned aerial vehicle when the task emergency data is not emergency, so as to obtain opponent equipment quantity and opponent equipment characteristic data;
The unmanned aerial vehicle equipment characteristic data and ground data extraction module is used for extracting unmanned aerial vehicle equipment characteristic data corresponding to the unmanned aerial vehicle and ground data acquired by the ground control station when the number of opponent equipment is smaller than a preset value;
The environment situation grid image obtaining module is used for carrying out image rasterization on the opponent equipment characteristic data, the unmanned aerial vehicle equipment characteristic data and the ground data to construct an environment situation grid image;
the collaborative decision criterion obtaining module is used for inputting the environmental situation grid image into a multi-collaborative decision criterion classification model to obtain a collaborative decision criterion corresponding to the unmanned aerial vehicle;
The environment situation grid image obtaining module comprises:
the initial real situation grid image set obtaining module is used for carrying out feature fusion on opponent equipment feature data, unmanned aerial vehicle equipment feature data and ground data to obtain an initial real situation grid image set;
The target real situation grid image set obtaining module is used for rasterizing the initial real situation grid image set to obtain a target real situation grid image set;
The first virtual situation grid image set obtaining module is used for generating a sample by adopting an upsampling algorithm SMOTE according to the target real situation grid image set to obtain a first virtual situation grid image set;
The second virtual situation grid image set obtaining module is used for generating a sample by adopting an information diffusion method MTD according to the target real situation grid image set to obtain a second virtual situation grid image set;
the virtual situation grid image set construction module is used for constructing a virtual situation grid image set by adopting the first virtual situation grid image set and the second virtual situation grid image set;
the environment situation grid image obtaining sub-module is used for selecting grid image samples from the target real situation grid image set and the virtual situation grid image set and constructing an environment situation grid image.
8. An electronic device comprising a memory and a processor, wherein the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of the unmanned aerial vehicle collaborative decision-criterion generation method of any of claims 1 to 6.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed, implements the unmanned aerial vehicle collaborative decision-making criterion generating method of any of claims 1 to 6.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311324627.8A CN117218564B (en) | 2023-10-12 | 2023-10-12 | A method, system, equipment and medium for generating collaborative decision-making criteria for unmanned aerial vehicles |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN117218564A CN117218564A (en) | 2023-12-12 |
| CN117218564B true CN117218564B (en) | 2025-03-28 |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119717843A (en) * | 2024-11-21 | 2025-03-28 | 杭州智元研究院有限公司 | Global situation information sharing method and system for isomorphic unmanned aerial vehicle |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110488872A (en) * | 2019-09-04 | 2019-11-22 | 中国人民解放军国防科技大学 | A kind of unmanned plane real-time route planing method based on deeply study |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9715235B2 (en) * | 2015-06-05 | 2017-07-25 | The Boeing Company | Autonomous unmanned aerial vehicle decision-making |
| CN111221352B (en) * | 2020-03-03 | 2021-01-29 | 中国科学院自动化研究所 | Control system based on cooperative game countermeasure of multiple unmanned aerial vehicles |
| CN111708355B (en) * | 2020-06-19 | 2023-04-18 | 中国人民解放军国防科技大学 | Multi-unmanned aerial vehicle action decision method and device based on reinforcement learning |
| CN112420135B (en) * | 2020-11-20 | 2024-09-13 | 北京化工大学 | Virtual sample generation method based on sample method and quantile regression |
| CN113469125B (en) * | 2021-07-20 | 2022-07-19 | 中国人民解放军国防科技大学 | Multi-unmanned aerial vehicle cooperative signal identification method and identification system |
| CN113595622B (en) * | 2021-09-29 | 2022-01-18 | 南京航空航天大学 | Digital twin-based cluster collaborative search virtual-real combined verification method |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |