CN117692819A - AI identification-based inspection system
- Publication number: CN117692819A
- Application number: CN202211031832.0A
- Authority: CN (China)
- Prior art keywords: inspection, fixed monitoring, data, management control, control platform
- Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04Q9/00 — Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
- G06N3/08 — Computing arrangements based on biological models; neural networks; learning methods
- G06V10/764 — Image or video recognition or understanding using pattern recognition or machine learning; using classification, e.g. of video objects
- G06V10/806 — Fusion, i.e. combining data from various sources at the sensor, preprocessing, feature extraction or classification level; of extracted features
- G06V10/82 — Image or video recognition or understanding using pattern recognition or machine learning; using neural networks
- G06V20/52 — Scenes; scene-specific elements; surveillance or monitoring of activities, e.g. for recognising suspicious objects
- H04W84/22 — Self-organising networks, e.g. ad-hoc networks or sensor networks, with access to wired networks
- H04Q2209/30 — Arrangements in telecontrol or telemetry systems using a wired architecture
- H04Q2209/40 — Arrangements in telecontrol or telemetry systems using a wireless architecture
- H04Q2209/50 — Arrangements in telecontrol or telemetry systems using a mobile data collecting device, e.g. walk by or drive by
Abstract
The invention provides an AI-identification inspection system comprising a management control platform, fixed monitoring equipment, and mobile inspection robots. The mesh network formed by the fixed monitoring equipment and the mobile inspection robots achieves full network coverage of the inspection site. The fixed monitoring equipment provides fixed-angle monitoring of fixed hidden-danger identification points, while the mobile inspection robots provide supplementary multi-angle monitoring, so that the inspection site can be monitored without blind spots. The management control platform can flexibly generate inspection paths and inspection tasks, and inspection schemes can be designed flexibly according to the functional-block division of the inspection site so as to acquire comprehensive monitoring data. Hidden dangers are identified automatically by a hidden-danger identification model deployed on the management control platform, which, compared with manual inspection, greatly improves the accuracy and efficiency of hidden-danger identification.
Description
Technical Field
The invention relates to the technical field of equipment inspection, in particular to an AI-recognition-based inspection system.
Background
In order to prevent accidents and ensure safety, certain areas need to be inspected repeatedly, such as production workshops, factory grounds, power stations and gas stations.
The traditional inspection mode is mainly periodic manual inspection: a security inspector collects environment and equipment information at each security inspection point, reports and records abnormal conditions, writes inspection reports, and uploads them to a back-office system. This manual mode is inefficient, relies on the subjective judgment of security personnel, and has poor reliability. In addition, the time at which a fault occurs cannot be predicted, so faults are difficult to discover promptly under a periodic inspection regime.
Disclosure of Invention
The invention aims to: in order to overcome the defects of the prior art, the invention provides an AI-recognition-based inspection system that can automatically inspect an inspection site in real time from multiple angles, automatically recognize potential safety hazards, and improve inspection efficiency.
In order to achieve the above purpose, the present invention proposes the following technical solutions:
an AI-recognition-based inspection system, comprising: a management control platform, fixed monitoring equipment and a mobile inspection robot; the fixed monitoring equipment is arranged at hidden-danger identification points in the inspection site, forms a mesh network with the other fixed monitoring equipment and the mobile inspection robot, and communicates with the management control platform through the mesh network;
the management control platform is configured to generate a grid map of the inspection site and an inspection path and issue them to the mobile inspection robot; to generate fixed monitoring tasks and inspection tasks and issue them to the fixed monitoring equipment; and to receive the fixed monitoring data and inspection data uploaded by the fixed monitoring equipment and identify fault types through a pre-constructed hidden-danger identification model;
the mobile inspection robot is configured to travel along the inspection path, interact with the fixed monitoring equipment along that path, execute the inspection tasks issued by the fixed monitoring equipment, and return inspection data to the fixed monitoring equipment;
the fixed monitoring equipment is configured to execute the fixed monitoring tasks and upload fixed monitoring data to the management control platform; and to interact with the mobile inspection robot, issue the inspection tasks, receive the inspection data, and upload the inspection data to the management control platform.
As an optional implementation manner of the embodiment of the present disclosure, the management control platform is further configured to adjust, by configuring network parameters, the network access states of the fixed monitoring equipment and the mobile inspection robot in the inspection site, including joining and exiting the mesh network; and to control the working state of the fixed monitoring equipment by configuring its working parameters, the working parameters including data acquisition frequency, data upload frequency, start-up time point, standby duration and data acquisition angle.
As an optional implementation manner of the embodiment of the disclosure, the fixed monitoring device has a unique ID, where the ID corresponds to and is associated with an address of the fixed monitoring device that is pre-stored in the management control platform.
As an optional implementation manner of the embodiment of the present disclosure, the ID of the fixed monitoring device is calculated by the management control platform from the position coordinates of the fixed monitoring device using a hash function, the calculation formula being ID = H(x, y), where (x, y) represents the position coordinates of the fixed monitoring device on the grid map, and H represents the hash function.
As an optional implementation manner of the embodiment of the present disclosure, before the fixed monitoring device uploads data to the management control platform, the ID and the timestamp are written in a header of a data packet to be transmitted.
As an optional implementation manner of the embodiment of the present disclosure, before the fixed monitoring device uploads data to the management control platform, a data type identifier is further written in a header of a data packet to be transmitted, where the data type identifier is used to distinguish the patrol data from the fixed monitoring data.
As an optional implementation manner of the embodiment of the disclosure, the inspection task includes collecting position coordinates, collecting angles and collecting time.
As an optional implementation manner of the embodiment of the disclosure, the mobile inspection robot is configured with a radio frequency tag storing the robot's identity code, and the fixed monitoring equipment is provided with a radio frequency reader-writer. When the mobile inspection robot moves into the communication range of the radio frequency reader-writer, the reader-writer reads the identity code in the radio frequency tag and matches it against the identity code carried in the inspection task issued by the management control platform; if the match succeeds, the inspection task is sent to the corresponding mobile inspection robot; otherwise, the inspection task is not sent.
As an optional implementation manner of the embodiment of the disclosure, the mobile inspection robot is configured with a UWB tag, the fixed monitoring equipment serves as UWB positioning base stations, and a positioning solution server is deployed on the management control platform. The fixed monitoring equipment measures the distance to the mobile inspection robot by a TOF (time-of-flight) ranging method and uploads the ranging information to the positioning solution server, which solves the position coordinates of the mobile inspection robot from the position coordinates of the fixed monitoring equipment, thereby positioning the mobile inspection robot.
As an optional implementation manner of the embodiment of the disclosure, the hidden-danger identification model is divided into two layers: the upper layer is a target detection model, YOLOv5, which completes target detection and crops detection-target images from the original image to generate an image set to be classified; the lower layer uses a long short-term memory neural network (LSTM) to identify and classify hidden dangers in the detection-target images.
The AI-identification inspection system provided by the invention constructs a mesh network consisting of the fixed monitoring equipment and the mobile inspection robots, achieving full network coverage of the inspection site. The fixed monitoring equipment provides fixed-angle monitoring of the fixed hidden-danger identification points, and the mobile inspection robots provide supplementary multi-angle monitoring, so the inspection site can be monitored without blind spots.
The invention can flexibly generate inspection paths and inspection tasks through the management control platform, and inspection schemes can be designed flexibly according to the functional-block division of the inspection site, so as to acquire comprehensive monitoring data.
Hidden dangers are identified automatically by the hidden-danger identification model deployed on the management control platform; compared with manual inspection, this greatly improves the accuracy and efficiency of hidden-danger identification.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1 is a block diagram of a wireless mesh network schematically provided in an embodiment of the present disclosure;
FIG. 2 is a diagram of a YOLOv5 network model architecture, according to an embodiment of the present disclosure;
fig. 3 is a structural diagram of an LSTM network according to an embodiment of the present disclosure.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the particular embodiments described herein are illustrative of the invention only and are not intended to limit the invention.
Those of skill in the art will appreciate that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" means based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments."
The terms "first," "second," and the like in this disclosure are used solely to distinguish one from another device, module, or unit, and are not intended to limit the order or interdependence of functions performed by such devices, modules, or units.
It should be noted that, as used herein, the singular forms "a", "an", and "the" include plural referents unless the content clearly dictates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be noted that the terms "coupled" and "connected" are to be construed broadly, and may mean, for example, directly connected, indirectly connected through an intermediary, internal communication between two devices, or an interaction relationship between two devices, unless explicitly stated or defined otherwise. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
It is noted that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The embodiment of the disclosure provides an AI (artificial intelligence)-identification-based inspection system, which mainly comprises a management control platform, fixed monitoring equipment and a mobile inspection robot. The functional principles of the management control platform, the fixed monitoring equipment and the mobile inspection robot are described in turn below.
(I) Fixed monitoring equipment
The fixed monitoring equipment may be placed at locations where hidden dangers are likely to occur; the embodiment of the present disclosure records such a location as a hidden-danger identification point. The fixed monitoring equipment is in communication connection with the management control platform and continuously collects environment data at the hidden-danger identification points and uploads it to the management control platform. To let the management control platform identify the data-acquisition position, each fixed monitoring device is assigned a unique ID, which corresponds one-to-one with and is associated with the device address prestored in the management control platform; the fixed monitoring device writes the ID and a timestamp T into the header of the data packet each time it transmits environment data.
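The patent specifies that the ID and timestamp T go into the packet header but not the byte layout. A minimal sketch of one possible layout, assuming a 4-byte ID, an 8-byte millisecond timestamp, and a 1-byte data-type flag (the flag reflects the optional claim that distinguishes inspection data from fixed monitoring data; all field sizes are illustrative):

```python
import struct

# Assumed header layout: uint32 device ID, uint64 epoch-ms timestamp,
# uint8 data-type flag (0 = fixed monitoring data, 1 = inspection data).
HEADER_FMT = ">IQB"  # big-endian, 13 bytes total

def pack_header(device_id: int, timestamp_ms: int, data_type: int) -> bytes:
    """Build the header prepended to every uploaded data packet."""
    return struct.pack(HEADER_FMT, device_id, timestamp_ms, data_type)

def unpack_header(packet: bytes):
    """Recover (ID, timestamp T, data type) from the start of a packet."""
    size = struct.calcsize(HEADER_FMT)
    return struct.unpack(HEADER_FMT, packet[:size])

packet = pack_header(0x1A2B, 1_693_000_000_000, 1) + b"<environment data>"
```

The platform can then route the payload by ID (i.e., by the device's prestored address) and order it by timestamp.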
The fixed monitoring equipment also communicates with the mobile inspection robot. In the embodiment of the disclosure, each fixed monitoring device, the remaining fixed monitoring devices and the mobile inspection robots together form a wireless mesh network. Fig. 1 shows an example structure of the wireless mesh network. As shown in fig. 1, at least one fixed monitoring device may be selected as a fixed mesh router, which is connected to a gateway device through an optical fiber; the fixed mesh router exchanges data with the mobile inspection robots and the other fixed monitoring devices over wireless mesh links, and the remaining fixed monitoring devices and mobile inspection robots likewise exchange data over wireless mesh links, so that full network coverage of the inspection site is realized.
Through the wireless mesh network, the fixed monitoring equipment can receive inspection tasks sent by the management control platform. When the corresponding mobile inspection robot enters the communication range of a fixed monitoring device, that device sends the inspection task to the robot, receives the inspection data returned after the robot executes the task, then packs the inspection data together with the environment data awaiting transmission into monitoring data (together with the fixed monitoring device's ID and timestamp T), and uploads it all to the management control platform.
Because a fixed monitoring device has a single monitoring angle, positions it cannot cover are captured by the mobile inspection robot as a supplement. When inspection data is transmitted, it must be matched with position information so that the management control platform knows which position the received inspection data belongs to. If the inspection robots uploaded the data directly, each mobile inspection robot would need a positioning device such as GPS and storage hardware for the inspection data, adding considerable cost. Therefore, in the embodiment of the disclosure, the data-upload task is assigned to the fixed monitoring equipment: the mobile inspection robot only needs to travel along the planned path, and on reaching a hidden-danger identification point (the position of a fixed monitoring device) on the path, it receives and executes the inspection task and then returns the inspection data to the fixed monitoring device. The mobile inspection robot thus needs neither GPS nor large-capacity data storage, saving substantial cost. The inspection task includes information such as acquisition position coordinates, acquisition angle and acquisition time.
Path planning for a mobile inspection robot requires knowing the robot's position. The embodiment of the disclosure adopts a UWB indoor positioning system: the mobile inspection robot carries a UWB tag, the fixed monitoring equipment serves as UWB positioning base stations, and a positioning solution server is deployed on the management control platform. The distance between the mobile inspection robot and each fixed monitoring device is measured by a TOF ranging method, and the positioning solution server solves the position coordinates of the mobile inspection robot from the position coordinates of the fixed monitoring devices, thereby positioning the robot.
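The patent names TOF ranging and a position solve but gives no algorithm. A minimal sketch of one standard approach, assuming two-way TOF and a 2-D least-squares trilateration against three or more base stations (the linearisation and the exact solver are illustrative choices, not from the patent text):

```python
def tof_distance(round_trip_s: float) -> float:
    """Two-way TOF ranging: distance = c * t_round_trip / 2."""
    c = 299_792_458.0  # speed of light, m/s
    return c * round_trip_s / 2.0

def trilaterate(anchors, dists):
    """Solve the 2-D tag position from >= 3 base-station coordinates and
    measured ranges: linearise each range equation against the first
    anchor, then solve the resulting 2x2 normal equations."""
    (x0, y0), d0 = anchors[0], dists[0]
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        ax, ay = 2.0 * (xi - x0), 2.0 * (yi - y0)
        rhs = d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2
        a11 += ax * ax
        a12 += ax * ay
        a22 += ay * ay
        b1 += ax * rhs
        b2 += ay * rhs
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

For example, base stations at (0, 0), (4, 0) and (0, 3) with ranges measured from a tag at (1, 1) recover that position; the positioning solution server would run the same solve on the ranging information uploaded by the fixed monitoring equipment.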
In some embodiments, the ID of the fixed monitoring device is calculated by the management control platform from the position coordinates of the fixed monitoring device using a hash function, i.e. ID = H(x, y), where (x, y) represents the position coordinates of the fixed monitoring device and H represents the hash function.
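The patent does not specify the hash function H. A minimal sketch assuming a truncated SHA-256 digest (an illustrative choice only): the same grid cell always yields the same ID, and distinct cells collide only with negligible probability.

```python
import hashlib

def device_id(x: int, y: int, digest_bits: int = 32) -> int:
    """ID = H(x, y): derive a fixed-size device ID from the device's
    grid-map coordinates. SHA-256 truncated to 32 bits stands in for
    the unspecified hash function H."""
    digest = hashlib.sha256(f"{x},{y}".encode()).digest()
    return int.from_bytes(digest[: digest_bits // 8], "big")
```

Deriving the ID from coordinates lets the platform recompute a device's address from its map position without a separate lookup table.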
In some embodiments, identity recognition between the fixed monitoring device and the mobile inspection robot can be performed through RFID technology: the mobile inspection robot carries a radio frequency tag with its identity code written therein, and the fixed monitoring device is provided with a radio frequency reader-writer. When the mobile inspection robot moves within the communication range of the reader-writer in the fixed monitoring device, the reader-writer reads the identity code in the robot's radio frequency tag and matches it against the identity code in the inspection task issued by the management control platform; if the match succeeds, the inspection task is sent to the mobile inspection robot.
Through this identity-matching mechanism, different mobile inspection robots can execute different tasks. On the one hand, this facilitates scheduling of the mobile inspection robots; on the other hand, since the mobile inspection robots are not necessarily of the same model, the management control platform can assign inspection tasks according to the capabilities of robots of different models.
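The matching mechanism above can be sketched as follows. The task fields (position coordinates, angle, time) come from the patent text; the field names, task container and dispatch function are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class InspectionTask:
    robot_code: str                 # identity code the task is bound to
    position: Tuple[float, float]   # acquisition position coordinates
    angle_deg: float                # acquisition angle
    capture_time: str               # acquisition time

def match_and_issue(read_code: str,
                    pending: List[InspectionTask]) -> Optional[InspectionTask]:
    """Fixed-device side: issue a task only when the RFID identity code
    just read matches the code carried in a platform-issued task;
    otherwise nothing is sent to the robot."""
    for task in pending:
        if task.robot_code == read_code:
            return task
    return None
```

A robot of the wrong model (or one not scheduled for this point) simply never receives a task, which is how the platform steers tasks to specific robots.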
(II) Mobile inspection robot
The mobile inspection robot mainly comprises a walking module, a processor, image acquisition equipment, a communication module, a UWB tag and a radio frequency tag.
The communication module implements the mesh networking function; once the mobile inspection robot joins the mesh network formed by the fixed monitoring equipment and other mobile inspection robots, it can communicate throughout the inspection site. The mobile inspection robot can download the grid map of the inspection site over the network and receive the inspection path issued by the management control platform and the inspection tasks issued by the fixed monitoring equipment.
The processor generates driving instructions from the received inspection path and drives the walking module along that path. The processor also receives the inspection tasks transmitted by the fixed monitoring equipment, adjusts the robot's pose accordingly, collects image data at the angle specified by the task through the image acquisition equipment, and transmits the collected image data back to the fixed monitoring equipment. The inspection task here includes acquisition position coordinates, acquisition angle, acquisition frequency, etc.
The radio frequency tag is pre-written with an identity code of the mobile inspection robot. The radio frequency tag can be an active tag or a passive tag.
(III) management control platform
The management control platform is mainly configured to realize the following functions:
(1) Providing human-computer interaction function
Staff can input path control instructions through the human-machine interface, and the management control platform generates an inspection path according to the instructions and sends it to the mobile inspection robot.

The management control platform can generate a grid map of the inspection site from externally imported site parameters, and staff can mark hidden-danger identification points, obstacle ranges and the like on the grid map to form the site map.

The management control platform can also generate inspection tasks and fixed monitoring tasks according to task configuration information input by a manager and send them to the fixed monitoring equipment.
(2) Configuring network parameters and operating parameters of a fixed monitoring device
The management control platform can adjust the access states of the fixed monitoring equipment and the mobile inspection robot in the inspection site by configuring network parameters, including the joining and the exiting of network nodes.
The working parameters control the working state of the fixed monitoring equipment, such as its data acquisition frequency, data upload frequency, start-up time point, standby duration, data acquisition angle, network configuration parameters, and so on.
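An illustrative working-parameter payload the platform might push to one fixed monitoring device; only the parameter kinds come from the text, while the field names, units and values are assumptions:

```python
# Hypothetical per-device configuration pushed by the management control
# platform; every key name and value here is illustrative.
working_params = {
    "device_id": 0x1A2B,
    "sample_interval_s": 5,         # data acquisition frequency
    "upload_interval_s": 60,        # data upload frequency
    "start_time": "06:00",          # start-up time point
    "standby_minutes": 120,         # standby duration
    "pan_tilt_deg": (45.0, -10.0),  # data acquisition angle
}
```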
(3) Inspection robot position identification
The management control platform is deployed with a positioning resolving server, and the position of the mobile inspection robot in the inspection site is resolved by acquiring the distance between the fixed monitoring equipment and the mobile inspection robot, which is measured by TOF ranging, so that the position tracking and track correction of the mobile inspection robot can be realized.
(4) Data processing
The management control platform acquires the monitoring data uploaded by the fixed monitoring equipment and identifies fault types through the pre-constructed hidden-danger identification model. The model takes an image or part of an image as input and predicts the content it contains; the output is a class label corresponding to a fault type.
According to the actual requirements of the site environment, the hidden-danger identification model can adopt different network structures so as to identify different potential safety hazards, which may include: not wearing a safety helmet, abnormal smoke, abnormal fire, sleeping while off post, illegal intrusion, illegal operation, and the like.
The hidden-danger identification model adopted by the embodiment of the disclosure is divided into two layers: the upper layer is a target detection model, YOLOv5, which completes target detection and crops detection-target images from the original image to generate an image set to be classified; the lower layer uses a long short-term memory neural network (LSTM) to identify and classify hidden dangers in the cropped detection-target images.
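The two-layer pipeline above can be sketched framework-free by injecting the detector and classifier as functions (the crop convention and the nested-list image representation are illustrative stand-ins for the real YOLOv5 and LSTM models):

```python
def crop(image, box):
    """Cut a detection target out of the original image
    (image as row-major nested lists, box as (x1, y1, x2, y2))."""
    x1, y1, x2, y2 = box
    return [row[x1:x2] for row in image[y1:y2]]

def two_stage_identify(image, detect, classify):
    """Upper layer: `detect` (YOLOv5 in the patent) returns bounding
    boxes, whose crops form the image set to be classified.
    Lower layer: `classify` (LSTM in the patent) maps each crop to a
    hidden-danger class label."""
    boxes = detect(image)
    crops = [crop(image, b) for b in boxes]   # image set to be classified
    return [(b, classify(c)) for b, c in zip(boxes, crops)]
```

Splitting detection from classification lets each layer be retrained independently, e.g. adding a new hidden-danger category touches only the classifier's label set.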
The network structure of YOLOv5 is shown in fig. 2; it mainly comprises an input end, a Backbone network module, a PANet module and an output end.
The input end mainly performs Mosaic data augmentation (including random scaling, random cropping, random arrangement and the like, which benefits small targets), adaptive anchor-box calculation and adaptive image scaling.
The Backbone is the trunk of the network; it aggregates image features at different fine granularities into a convolutional neural network, and mainly comprises the Focus, Conv, BottleneckCSP and SPP modules.
Focus: the input is copied and divided into four slices by a slicing operation, and the four slices are spliced through a concat layer. The splicing merges the channel dimensions: the number of feature channels increases while the information under each feature is unchanged. The result then passes through a CBL layer: a convolution layer (Conv) extracts different features from the input so that specific local image features can be found; a BatchNorm layer keeps each batch's gradient distribution near the origin to normalize the results, so that batch-to-batch deviation is not too large; finally, a Leaky ReLU activation function passes the result to the next convolution layer. According to the original statements of its developers, the Focus module is designed to reduce computation and the number of layers so as to increase speed, not mAP.
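The Focus slicing can be demonstrated on a plain nested-list feature map (a framework-free sketch of the operation, not YOLOv5's actual tensor code): each slice takes every other pixel, so spatial resolution halves, the channel count quadruples, and no pixel is lost.

```python
def focus_slice(img):
    """YOLOv5 Focus slicing on one HxW map: four half-resolution slices
    (even/odd row x even/odd column), stacked as channels."""
    return [
        [row[0::2] for row in img[0::2]],  # even rows, even cols
        [row[0::2] for row in img[1::2]],  # odd rows, even cols
        [row[1::2] for row in img[0::2]],  # even rows, odd cols
        [row[1::2] for row in img[1::2]],  # odd rows, odd cols
    ]
```

On a 4x4 input, each slice is 2x2 and the four slices together contain exactly the original 16 values, which is the "feature number increases, information unchanged" property described above.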
BottleneckCSP: comprises two parts, Bottleneck and CSP. The Bottleneck is a classical residual structure using 1x1 convolution layers, which reduces the computational load well. The CSP divides the input into two parts: one part first undergoes n Bottleneck operations and then a convolution, while the other part directly undergoes a convolution whose purpose is to halve the number of channels; the two parts are then spliced by concat and output. BottleneckCSP not only reduces computation but also improves the model's learning ability.
SPP: spatial pyramid pooling, consisting essentially of three parts: Conv, Maxpooling and Concat. The module first performs a Conv to extract features, then downsamples through three maximum pooling layers with different kernel_size values, splices and fuses their outputs together with the initial features, and finally restores the output through a Conv so that it is consistent with the initial input.
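The pooling-and-concat core can be sketched with stride-1 "same"-padded max pooling; the Conv layers before and after are omitted, and the kernel sizes (5, 9, 13) are the values commonly used in YOLOv5, which the patent does not specify:

```python
import numpy as np

def max_pool_same(x, k):
    """Stride-1 max pooling with 'same' padding on a 2-D feature map."""
    p = k // 2
    xp = np.pad(x, p, mode="constant", constant_values=-np.inf)
    h, w = x.shape
    return np.array([[xp[i:i + k, j:j + k].max() for j in range(w)]
                     for i in range(h)])

def spp(x, kernels=(5, 9, 13)):
    """SPP core: splice the input with its max-pooled versions at several
    kernel sizes along a new channel axis (the surrounding Convs are
    omitted in this sketch)."""
    return np.stack([x] + [max_pool_same(x, k) for k in kernels], axis=-1)

x = np.random.rand(16, 16)
print(spp(x).shape)  # (16, 16, 4): the input plus three pooling scales
```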
PANet: the feature-fusion part of the network, which mixes and combines features and passes them to the prediction layer. It adopts the top-down FPN structure that conveys strong semantic features so as to improve the propagation of low-level features, combined with a bottom-up feature pyramid containing two PAN structures, thereby enhancing the network's feature-fusion capability.
In the output section, YOLOv5 uses GIoU as the loss function and screens target boxes through non-maximum suppression (NMS).
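Greedy NMS, as used in the output stage, can be written compactly in NumPy; this is a generic sketch of the algorithm, not YOLOv5's internal implementation:

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, format [x1, y1, x2, y2]."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter)

def nms(boxes, scores, iou_thr=0.5):
    """Greedy non-maximum suppression: repeatedly keep the highest-scoring
    remaining box and drop every box overlapping it above the threshold."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        order = rest[iou(boxes[i], boxes[rest]) <= iou_thr]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # [0, 2]: the near-duplicate of box 0 is suppressed
```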
The training process of the YOLOv5 network model is as follows:
S1, constructing sample image sets for hidden dangers of different types, where each sample set comprises hidden-danger scene images and non-hidden-danger scene images;
S2, marking hidden-danger scene images with label 1 and non-hidden-danger scene images with label 0;
and S3, training a YOLOv5 network model by adopting the constructed sample image set.
The LSTM network, shown in Fig. 3, comprises an input gate 1, a memory gate 2 and an output gate 3. Manual labels representing different hidden-danger categories are added to each sample set (for example, label 1 represents not wearing a safety helmet, label 2 abnormal smoke, label 3 abnormal fire, label 4 sleeping while off duty, label 5 illegal intrusion, and label 6 illegal operation) to obtain a sample set for the LSTM network, which is then used to train the LSTM network.
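The manual labelling described above amounts to a fixed category table attached to each detected-target sample; a minimal sketch (the category names are paraphrased from the text, and the record layout is an assumption for illustration):

```python
# Label table matching the manual labels described in the text.
HAZARD_LABELS = {
    1: "not wearing a safety helmet",
    2: "abnormal smoke",
    3: "abnormal fire",
    4: "sleeping while off duty",
    5: "illegal intrusion",
    6: "illegal operation",
}

def annotate(sample_id, label):
    """Attach a hazard-category label to a detected-target image sample;
    the returned record shape is a hypothetical example."""
    if label not in HAZARD_LABELS:
        raise ValueError(f"unknown hazard label: {label}")
    return {"sample": sample_id, "label": label,
            "category": HAZARD_LABELS[label]}

print(annotate("img_0001", 2))
# {'sample': 'img_0001', 'label': 2, 'category': 'abnormal smoke'}
```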
In the embodiment of the disclosure, the management control platform performs fault recognition on the picture and video information uploaded by the fixed monitoring equipment through the hidden danger recognition model. If the recognition result is that no fault exists, the monitoring data is automatically saved to the background and a monitoring record is generated; if a fault is recognized, alarm information is generated.
(5) And updating the hidden danger identification model according to the collected fault information.
By continuously collecting different fault data, the training dataset of the hidden danger recognition model can be updated. Retraining the model with the new dataset updates its parameters, increasing the number of fault types the model can recognize while also improving its recognition accuracy.
In the description of the present invention, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention, and are intended to be included within the scope of the appended claims and description.
Claims (10)
1. An AI-recognition-based inspection system, comprising: the system comprises a management control platform, fixed monitoring equipment and a mobile inspection robot; the fixed monitoring equipment is arranged at hidden danger identification points in the inspection site, forms a mesh network with other fixed monitoring equipment and the mobile inspection robot, and realizes communication with the management control platform through the mesh;
the management control platform is configured to generate a patrol field grid map and a patrol path and issue the patrol field grid map and the patrol path to the mobile patrol robot; the system is used for generating a fixed monitoring task and a patrol task and issuing the fixed monitoring task and the patrol task to the fixed monitoring equipment; the system is also used for receiving the fixed monitoring data and the patrol data uploaded by the fixed monitoring equipment, and identifying the fault type through a pre-constructed hidden danger identification model;
the mobile inspection robot is used for running based on the inspection path, interacting with the fixed monitoring equipment in the inspection path, executing the inspection task issued by the fixed monitoring equipment, and returning inspection data to the fixed monitoring equipment;
the fixed monitoring equipment is configured to execute the fixed monitoring task and upload fixed monitoring data to the management control platform; and the mobile inspection robot is used for interacting with the mobile inspection robot, issuing the inspection task, receiving the inspection data and uploading the inspection data to the management control platform.
2. The inspection system of claim 1, wherein the management control platform is further configured to adjust the network access states of the fixed monitoring device and the mobile inspection robot in the inspection site by configuring network parameters, the network access states including joining and exiting the mesh network; and is further configured to control the working state of the fixed monitoring device by configuring its working parameters, wherein the working parameters include data acquisition frequency, data uploading frequency, start time point, standby duration, and data acquisition angle.
3. The inspection system according to claim 1 or 2, wherein the fixed monitoring device has a unique ID, and the ID corresponds to and correlates with an address of the fixed monitoring device stored in advance in the management control platform.
4. A patrol system according to claim 3, wherein the ID of the fixed monitoring device is calculated by the management control platform according to the position coordinates of the fixed monitoring device using a hash function, where the calculation formula is id=h (x, y), where (x, y) represents the position coordinates of the fixed monitoring device on the grid map, and H represents the hash function.
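The formula ID = H(x, y) in claim 4 can be illustrated with a standard hash; the patent does not specify the hash function, so SHA-256 and the 8-hex-digit truncation below are assumptions for the sketch:

```python
import hashlib

def device_id(x, y):
    """A minimal sketch of ID = H(x, y): hash the device's grid-map
    coordinates and keep 8 hex digits as a compact, deterministic ID.
    SHA-256 and the truncation length are illustrative assumptions."""
    digest = hashlib.sha256(f"{x},{y}".encode()).hexdigest()
    return digest[:8]

print(device_id(12, 34))  # same coordinates always yield the same ID
```

Because the ID is a pure function of the coordinates, the management control platform can associate each ID with the device's pre-stored address, as claim 3 requires.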
5. The inspection system of claim 4, wherein the ID and timestamp are written in a header of a data packet to be transmitted before the fixed monitoring device uploads data to the management control platform.
6. The inspection system according to any one of claims 1 to 5, wherein before the fixed monitoring device uploads data to the management control platform, a data type identifier is also written in a header of a data packet to be transmitted, and the data type identifier is used for distinguishing the inspection data from the fixed monitoring data.
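The headers described in claims 5 and 6 (device ID, timestamp, and a data-type identifier distinguishing inspection data from fixed monitoring data) can be sketched with a fixed binary layout; the exact field widths and flag values are not specified in the patent and are illustrative assumptions:

```python
import struct
import time

# Hypothetical header layout: 8-byte device ID, 8-byte Unix timestamp,
# 1-byte data-type flag (0 = fixed monitoring data, 1 = inspection data).
HEADER = struct.Struct(">8sQB")

def pack_header(device_id: bytes, is_inspection: bool) -> bytes:
    """Write ID, timestamp, and data-type identifier into a packet header."""
    return HEADER.pack(device_id.ljust(8, b"\0"),
                       int(time.time()),
                       1 if is_inspection else 0)

def unpack_header(header: bytes):
    """Recover the three header fields from a received packet."""
    dev, ts, kind = HEADER.unpack(header)
    return dev.rstrip(b"\0"), ts, "inspection" if kind else "fixed"

hdr = pack_header(b"a1b2c3", is_inspection=True)
print(unpack_header(hdr))
```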
7. The inspection system of claim 6, wherein the inspection task comprises an acquisition position coordinate, an acquisition angle, and an acquisition time.
8. The inspection system of claim 6, wherein the mobile inspection robot is configured with a radio frequency tag, and an identity code of the mobile inspection robot is stored in the radio frequency tag; the fixed monitoring equipment is provided with a radio frequency reader-writer; when the mobile inspection robot moves to the communication range of the radio frequency reader-writer, the radio frequency reader-writer reads the identity code in the radio frequency tag, matches the identity code with the identity code carried in the inspection task issued by the management control platform, and sends the inspection task to the corresponding mobile inspection robot if the matching is successful; otherwise, the patrol task is not sent.
9. The inspection system of claim 6, wherein the mobile inspection robot is configured with UWB tags, the fixed monitoring device acts as a UWB positioning base station, and the management control platform is deployed with a positioning solution server; the fixed monitoring equipment measures the distance between the fixed monitoring equipment and the mobile inspection robot through a TOF ranging method, and uploads ranging information to the positioning resolving server, and the positioning resolving server resolves the position coordinates of the mobile inspection robot according to the position coordinates of the fixed monitoring equipment to achieve positioning of the mobile inspection robot.
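The position solve in claim 9 — recovering the robot's coordinates from TOF ranges to fixed base stations at known positions — is commonly done by linearizing the range equations and solving a least-squares system; the patent does not name the solver, so the lateration sketch below is one standard approach:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares position solve from TOF ranges to fixed base
    stations: subtracting the first anchor's range equation from the
    others linearizes |p - x_i|^2 = d_i^2 into A p = b."""
    anchors = np.asarray(anchors, float)
    d = np.asarray(distances, float)
    x0, d0 = anchors[0], d[0]
    A = 2 * (anchors[1:] - x0)
    b = (d0**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(x0**2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

anchors = [(0, 0), (10, 0), (0, 10)]   # fixed monitoring devices (UWB bases)
tag = np.array([3.0, 4.0])             # true robot position (for the demo)
dists = [np.linalg.norm(np.array(a) - tag) for a in anchors]
print(trilaterate(anchors, dists))     # recovers approximately [3. 4.]
```

With more than three base stations in range, the same least-squares form averages out TOF ranging noise.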
10. The inspection system according to any one of claims 1 to 9, wherein the hidden danger identification model is divided into two layers: the upper layer is a target detection model YOLOv5 for completing target detection and cropping detected-target images from the original image to generate an image set to be classified; the lower layer uses a long short-term memory neural network LSTM for identifying and classifying hidden dangers in the detected-target images.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202211031832.0A CN117692819A (en) | 2022-08-26 | 2022-08-26 | AI identification-based inspection system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202211031832.0A CN117692819A (en) | 2022-08-26 | 2022-08-26 | AI identification-based inspection system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117692819A true CN117692819A (en) | 2024-03-12 |
Family
ID=90137672
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202211031832.0A Pending CN117692819A (en) | 2022-08-26 | 2022-08-26 | AI identification-based inspection system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117692819A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120318049A (en) * | 2025-06-16 | 2025-07-15 | 江西省公路工程检测中心 | A highway infrastructure intelligent inspection management system and method |
- 2022-08-26 CN CN202211031832.0A patent/CN117692819A/en active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109980781B (en) | A substation intelligent monitoring system | |
| KR102554662B1 (en) | Safety management system using unmanned detector | |
| CN117319609A (en) | Internet of things big data intelligent video monitoring system and method | |
| CN110908370A (en) | Unmanned inspection task planning method and system for thermal power plant | |
| CN110968941A (en) | UAV control platform and control method based on airspace security assessment | |
| CN112288984A (en) | Three-dimensional visual unattended substation intelligent linkage system based on video fusion | |
| CN118799125A (en) | An intelligent monitoring system for construction sites based on artificial intelligence | |
| CN105187790B (en) | Method, device and system for monitoring working state of vehicle-mounted terminal | |
| CN116165981B (en) | Intelligent monitoring system for industrial industry safety production | |
| CN103235562A (en) | Patrol-robot-based comprehensive parameter detection system and method for substations | |
| CN104079874A (en) | Security and protection integrated system and method based on Internet of Things technology | |
| CN109544870A (en) | Alarm decision method and intelligent monitor system for intelligent monitor system | |
| CN108733073A (en) | Unmanned plane managing and control system, method and readable medium in a kind of region | |
| JP7368582B1 (en) | Certification management system and method for power plant patrol equipment | |
| CN117558074A (en) | Inspection methods, devices, systems, electronic equipment and storage media | |
| CN112671104A (en) | Transformer substation multidimensional scene control platform facing complex scene | |
| CN120178917A (en) | A distributed drone protection network dynamic generation optimization method and system | |
| CN120279488A (en) | Intelligent inspection method and system for special operation real object examination room | |
| CN120544267A (en) | An intelligent identification and evaluation method for power operation behavior in multi-machine collaborative mode | |
| CN118781537A (en) | Photovoltaic power station intelligent inspection method and system | |
| CN117692819A (en) | AI identification-based inspection system | |
| CN118634460A (en) | A fire point tracking method and system for fire fighting robots based on thermal imaging | |
| CN108875857A (en) | Method for inspecting, apparatus and system | |
| CN115880631A (en) | Power distribution station fault identification system, method and medium | |
| CN114741255A (en) | Fault self-healing technology based on automatic execution of service scene |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||