Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
To illustrate the technical solutions of the present invention, the following description proceeds by way of specific examples.
Rapid movement refers to the behavior of a target object moving quickly. Many incidents in public places, such as robbery, harassment, and brawls, are associated with the rapid movement of a target object, so detecting whether a target object is moving rapidly is of significant importance. At present, an image acquisition device is often used to capture images of the target object, and whether the target object moves rapidly is determined from the pixel distance the object travels per unit time. However, when the image acquisition device shoots at an oblique angle, the captured image exhibits a near-far (perspective) effect: a unit pixel in the near field represents a smaller actual distance than a unit pixel in the far field. As a result, rapid movement is easily flagged in the near field but difficult to detect accurately in the far field, which degrades the accuracy of the overall detection result.
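The pixel-distance approach described above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the function names, frame interval, and threshold are hypothetical, chosen only to show why a fixed pixel-speed threshold over-triggers in the near field and under-triggers in the far field.

```python
def pixel_speed(prev_center, curr_center, dt):
    """Pixel displacement per second between the target's centers in two frames."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    return ((dx * dx + dy * dy) ** 0.5) / dt

def naive_fast_move(prev_center, curr_center, dt, threshold_px_per_s):
    """Flags fast movement by raw pixel speed. With an oblique camera, the same
    real-world speed yields a large pixel speed near the camera and a small one
    far away, so a single fixed threshold cannot serve both regions."""
    return pixel_speed(prev_center, curr_center, dt) > threshold_px_per_s
```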
In this embodiment, the various possible acquisition states of the image acquisition device are fully considered: rapid movement of the target object is judged using a detection frame and an expansion frame, which eliminates the near-far effect that may exist in images captured by the image acquisition device and effectively improves the accuracy of the detection result.
Referring to fig. 1, a first aspect of this embodiment provides a fast-movement detection method, which includes the following steps:
Step S1: determining, in a historical frame image acquired by the image acquisition device before the current frame image, the coordinate position of a first detection frame corresponding to the target object.
An image acquisition device is used for image acquisition and can acquire a series of images, including a current frame image and a historical frame image. The current frame image is the image acquired at the current moment; it may be, for example, the latest image captured by the image acquisition device, or the latest image obtained when an existing video is decomposed into frames. The historical frame image is the image one unit duration, or a preset number of frames, before the current frame image. For example, the historical frame image may be the frame immediately preceding the current frame image, or the image 10 frames before it; the unit duration may be 3 s, 5 s, 10 s, or the like, i.e., the historical frame image may be the image 3 s, 5 s, or 10 s before the current frame image. It can be understood that, as the image acquisition device continuously captures images, or as a video acquired by it is processed, the current frame image and the historical frame image are continuously updated: when the device captures the next frame at the preset frequency, that frame becomes the new current frame image and the former current frame image becomes a historical frame image, so the method provided in this embodiment runs continuously in real time. Target detection is performed in the historical frame image to generate a first detection frame corresponding to the target object, and the first detection frame coordinate position is determined; this is the pixel coordinate position of the detection frame of the target object in the historical frame image.
The target object mentioned in this embodiment may be a person, a vehicle, or an animal, and the image acquisition device may be a camera or a video camera. When the image acquisition device captures video, the video is decomposed into frames so that a series of images can likewise be acquired.
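The pairing of a current frame with a historical frame a fixed number of frames earlier, as described above, can be sketched with a small buffer. This is an illustrative Python sketch, not part of the claimed invention; the class name and lag value are hypothetical.

```python
from collections import deque

class FrameHistory:
    """Keeps the last `lag` + 1 frames so that the historical frame is always
    `lag` frames (or, at a known frame rate, one unit duration) before the
    current frame, and the pair updates as each new frame arrives."""
    def __init__(self, lag):
        self.lag = lag
        self.buf = deque(maxlen=lag + 1)

    def push(self, frame):
        self.buf.append(frame)

    def pair(self):
        """Returns (historical_frame, current_frame) once enough frames exist."""
        if len(self.buf) <= self.lag:
            return None
        return self.buf[0], self.buf[-1]
```

As frames keep arriving, the oldest buffered frame rolls forward, so the method runs continuously in real time exactly as the embodiment describes.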
Step S2: determining the expansion frame coordinate position corresponding to the target object in the historical frame image based on the first detection frame coordinate position and a preset precision.
The preset precision is a preset parameter that the user can set according to the actual application scene of the image acquisition device. In this embodiment, the expansion frame coordinate position is derived from the first detection frame coordinate position, and the size of the expansion frame is determined by the size of the first detection frame. When the image acquisition device shoots at an oblique angle, the size of the target object in the captured image varies with distance: when the target object is near, its first detection frame is large and the expansion frame determined from that frame is correspondingly large; when the target object is far, its first detection frame is small and the corresponding expansion frame is small. It can be understood that the method provided by this embodiment is applicable to an obliquely mounted image acquisition device as well as to a top-down one. After the expansion frame coordinate position in the historical frame image is obtained, it is saved.
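The proportional scaling described above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the box representation (x_min, y_min, x_max, y_max) and the precision value are assumptions made for the example.

```python
def expand_box(box, precision):
    """Builds an expansion frame by scaling the detection frame about its
    center by the preset precision. Because the expansion is proportional to
    the detection frame's own size, a near (large) target gets a large
    expansion frame and a far (small) target a small one, which removes the
    near-far effect from the comparison."""
    x_min, y_min, x_max, y_max = box
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    half_w = (x_max - x_min) / 2 * precision
    half_h = (y_max - y_min) / 2 * precision
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```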
Step S3: determining the coordinate position of a second detection frame corresponding to the target object in the current frame image.
The second detection frame coordinate position is the pixel coordinate position of the detection frame of the target object in the current frame image. It can be understood that, when the image acquisition device is used for real-time monitoring, the historical frame image always precedes the current frame image in the acquisition sequence. The method provided by this embodiment therefore processes images in their acquisition order, which ensures the real-time performance of the detection result.
Step S4: determining whether the target object moves rapidly based on the second detection frame coordinate position and the expansion frame coordinate position.
Whether the target object moves rapidly is judged from the second detection frame coordinate position and the expansion frame coordinate position. Because the image acquisition device that captures the images is fixedly mounted, the expansion frame coordinate position in the historical frame image can be compared directly with the second detection frame coordinate position.
The fast-movement detection method provided by this embodiment has the following beneficial effects. Target detection is performed in a historical frame image to determine the first detection frame coordinate position corresponding to the target object; the expansion frame coordinate position of the target object in the historical frame image is then determined from the first detection frame coordinate position and the preset precision; the second detection frame coordinate position of the target object is determined in the current frame image; and whether the target object moves rapidly is determined from the second detection frame coordinate position and the expansion frame coordinate position. Whether the target object moves rapidly can thus be determined from the expansion frame and the second detection frame without calculating the object's moving speed, and because the size of the expansion frame is adjusted according to the size of the first detection frame, the influence of the near-far effect in the image is avoided and rapid movement can be determined accurately.
Referring to fig. 2, further, the step S2 of determining the expansion frame coordinate position corresponding to the target object in the historical frame image based on the first detection frame coordinate position and the preset precision includes:
Step S21: determining the center point coordinate position of the first detection frame based on the first detection frame coordinate position;
Step S22: determining the expansion frame coordinate position corresponding to the target object in the historical frame image according to the product of the preset precision and the distance between the first detection frame center point coordinate position and the first detection frame coordinate position.
In the above embodiment, the center point coordinate position of the first detection frame may be determined from the first detection frame coordinate position; the distance between the center point coordinate position and the first detection frame coordinate position is then determined, and the product of this distance and the preset precision is used to determine the expansion frame coordinate position corresponding to the target object in the historical frame image.
In a possible implementation manner, referring to fig. 3, the step S22 of determining, according to the product of the preset precision and the distance between the center point coordinate position of the first detection frame and the first detection frame coordinate position, the expansion frame coordinate position corresponding to the target object in the historical frame image includes:
Step S221: determining the distances from the center point coordinate position of the first detection frame to each side of the first detection frame.
Step S222: determining the distances from the center point coordinate of the first detection frame to each side of the expansion frame according to the product of the preset precision and the distances from the center point coordinate of the first detection frame to each side of the first detection frame.
Step S223: determining the expansion frame coordinate position corresponding to the target object in the historical frame image according to the shape of the first detection frame and the distances from the center point coordinate of the first detection frame to each side of the expansion frame.
In the above embodiment, to determine the expansion frame coordinate position, the distances from the center point coordinate position of the first detection frame to each side of the first detection frame are determined. For example, when the detection frame is rectangular, the distances from the center point of the rectangular detection frame to its top, bottom, left, and right sides are determined, yielding y1, y2, x1, and x2. Multiplying y1, y2, x1, and x2 by the preset precision gives y1', y2', x1', and x2', which are the distances from the center point coordinate of the first detection frame to each side of the expansion frame. An expansion frame with the same shape as the first detection frame is then generated from y1', y2', x1', and x2', thereby determining the expansion frame coordinate position.
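The side-distance construction above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the box representation and variable names (y1, y2, x1, x2) follow the example in the text.

```python
def expand_by_sides(box, precision):
    """Builds the expansion frame from the distances between the detection
    frame's center and its four sides, each scaled by the preset precision.
    box = (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = box
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    y1, y2 = cy - y_min, y_max - cy      # center to top side, bottom side
    x1, x2 = cx - x_min, x_max - cx      # center to left side, right side
    # scaled distances y1', y2', x1', x2' define the expansion frame's sides
    y1p, y2p, x1p, x2p = (d * precision for d in (y1, y2, x1, x2))
    return (cx - x1p, cy - y1p, cx + x2p, cy + y2p)
```

For a rectangle centered on its own center point this reduces to a uniform scaling about the center, matching the general construction of step S2.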
In another possible implementation manner, referring to fig. 4, further, the step S22 of determining, according to the product of the preset precision and the distance between the center point coordinate position of the first detection frame and the first detection frame coordinate position, the expansion frame coordinate position corresponding to the target object in the historical frame image includes:
Step S224: determining the distance between the center point coordinate position of the first detection frame and each vertex coordinate position of the first detection frame.
Step S225: determining each vertex coordinate position of the expansion frame corresponding to the target object according to the product of the preset precision and the distance between the center point coordinate position of the first detection frame and each vertex coordinate position of the first detection frame.
Step S226: connecting the vertex coordinate positions of the expansion frame to determine the expansion frame coordinate position corresponding to the target object in the historical frame image.
In the above embodiment, the distances between the center point coordinate position of the first detection frame and the vertex coordinate positions of the first detection frame are determined; for example, the distances between the center point and the 4 vertex coordinate positions of a rectangular detection frame. Each distance is then multiplied by the preset precision to obtain the distance from the center point of the first detection frame to the corresponding vertex of the expansion frame, which yields the vertex coordinate positions of the expansion frame; connecting these vertex coordinate positions determines the expansion frame coordinate position corresponding to the target object. Alternatively, an expansion frame with the same shape as the first detection frame may be generated directly from the distances between the center point of the first detection frame and the vertices of the expansion frame, and its coordinate position determined accordingly.
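The vertex-based construction can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; it uses the vertex centroid as the center point, which coincides with the detection frame center for a rectangle but is an assumption for other polygons.

```python
def expand_by_vertices(vertices, precision):
    """Scales each vertex of the detection frame away from the frame's center
    by the preset precision; connecting the scaled vertices in order yields
    an expansion frame with the same shape as the detection frame."""
    n = len(vertices)
    cx = sum(x for x, _ in vertices) / n
    cy = sum(y for _, y in vertices) / n
    # each expansion-frame vertex lies along the center-to-vertex direction,
    # at `precision` times the original center-to-vertex distance
    return [(cx + (x - cx) * precision, cy + (y - cy) * precision)
            for x, y in vertices]
```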
It can be understood that the shape of the first detection frame can be configured as desired: besides the common rectangle, it may be set to an arbitrary polygon, or to a circle. When the detection frame is circular, the distance between the center point coordinate position of the first detection frame and any point on the circle (i.e., the radius) is determined; the product of this distance and the preset precision gives the distance from the expansion frame to the center point coordinate position of the first detection frame. A circle is then generated as the expansion frame, with the center point of the first detection frame as its center and this distance as its radius, thereby determining the expansion frame coordinate position.
Referring to fig. 5, further, the step S4 of determining whether the target object moves rapidly based on the second detection frame coordinate position and the expansion frame coordinate position includes:
Step S41: determining the center point coordinate position of the second detection frame based on the second detection frame coordinate position.
Step S42: judging whether the center point coordinate position of the second detection frame is located within the expansion frame coordinate position.
Step S43: if the center point coordinate position of the second detection frame is located within the expansion frame coordinate position, the target object is not moving rapidly.
Step S44: if the center point coordinate position of the second detection frame is not located within the expansion frame coordinate position, the target object is moving rapidly.
In the above embodiment, the expansion frame coordinate position is, in effect, a prediction, based on the position of the target object in the historical frame image, of where the target object will appear in the current frame image if it moves normally; therefore, whether the target object has moved rapidly can be determined by whether the second detection frame center point coordinate position falls within the expansion frame coordinate position. If the center point coordinate position of the second detection frame is located within the expansion frame coordinate position, the target object is moving normally, not rapidly; if it is not located within the expansion frame coordinate position, the target object is moving rapidly. It should be noted that, when the center point coordinate of the second detection frame lies exactly on the boundary of the expansion frame, it is regarded as being located within the expansion frame coordinate position.
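Steps S41-S44 can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the rectangular box representation is an assumption carried over from the earlier examples, and boundary points count as inside, as noted above.

```python
def center_in_expansion(center, exp_box):
    """True when the center point lies inside the expansion frame; a point on
    the boundary counts as inside, so boundary cases are treated as normal
    (not rapid) movement."""
    x, y = center
    x_min, y_min, x_max, y_max = exp_box
    return x_min <= x <= x_max and y_min <= y <= y_max

def is_fast_move(second_box, exp_box):
    """Steps S41-S44: compute the second detection frame's center and report
    rapid movement when it falls outside the expansion frame."""
    cx = (second_box[0] + second_box[2]) / 2
    cy = (second_box[1] + second_box[3]) / 2
    return not center_in_expansion((cx, cy), exp_box)
```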
Referring to fig. 6, further, after it is determined in step S44 that the center point coordinate position of the second detection frame is not located within the expansion frame coordinate position and the target object is therefore moving rapidly, the method further includes:
Step S441: tracking the rapidly moving target object in subsequent frame images acquired by the image acquisition device, and determining whether the tracked target object continues to move rapidly.
Step S442: if the tracked target object continues to move rapidly, issuing warning information.
When the target object moves rapidly, it can be discovered immediately by using the historical frame image and the current frame image. Because a chance event may cause a target object to move rapidly only briefly, the target object is tracked after rapid movement is detected, and warning information is issued when the target object is found to continue moving rapidly. Specifically, whether the target object continues to move rapidly is determined from subsequent frame images. A subsequent frame image is an image acquired one unit duration, or a preset number of frames, after the current frame image; for example, it may be the next frame acquired after the current frame image, or the image 10 frames after it, and the unit duration may be 3 s, 5 s, 10 s, or the like, i.e., the subsequent frame image may be the image 3 s, 5 s, or 10 s after the current frame image. The unit duration or preset number of frames between the subsequent frame image and the current frame image is the same as that between the current frame image and the historical frame image.
When the target object is determined to be moving rapidly from the current frame image and the historical frame image, the target object is marked. The expansion frame coordinate position of the target object in the current frame image is then determined from the second detection frame coordinate position corresponding to the target object in the current frame image and the preset precision, and a third detection frame coordinate position of the target object is determined in the subsequent frame image. Whether the third detection frame center point coordinate position lies within the expansion frame coordinate position determined from the current frame image is then checked; if it does not, the target object is continuing to move rapidly, and warning information is issued. It should be noted that, in one possible case, the warning information may instead be generated immediately once the target object moves rapidly.
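The tracking loop described above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; it assumes rectangular boxes and reuses the same proportional expansion rule, with hypothetical names.

```python
def track_fast_mover(boxes, precision):
    """boxes: successive detection-frame coordinates (x_min, y_min, x_max,
    y_max) for the marked target, starting with its box in the current frame.
    At each step an expansion frame is built from the previous box; if the
    next frame's center falls outside it, the rapid movement continues."""
    for prev, curr in zip(boxes, boxes[1:]):
        # expansion frame scaled about the previous box's center
        cx, cy = (prev[0] + prev[2]) / 2, (prev[1] + prev[3]) / 2
        hw = (prev[2] - prev[0]) / 2 * precision
        hh = (prev[3] - prev[1]) / 2 * precision
        # center of the next frame's detection box
        nx, ny = (curr[0] + curr[2]) / 2, (curr[1] + curr[3]) / 2
        if cx - hw <= nx <= cx + hw and cy - hh <= ny <= cy + hh:
            return False   # target has returned to normal movement
    return True            # rapid movement persisted through all checked frames
```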
Referring to fig. 7, further, the step S442 of issuing warning information if the tracked target object continues to move rapidly includes:
Step S4421: acquiring the duration of the tracked target object's continued rapid movement.
Step S4422: if the duration of the tracked target object's continued rapid movement is greater than a preset time threshold, issuing warning information.
Because the image acquisition device captures images at a high frequency, i.e., the interval between different frame images is short, the duration of the target object's continued rapid movement can be acquired in order to screen out chance events more accurately. In this case the subsequent frame images are a plurality of frame images acquired after the current frame image. When the target object is judged, from the historical frame image, the current frame image, and the plurality of subsequent frame images, to be moving rapidly, the duration of its continued rapid movement is recorded with the time of the historical frame image as the starting point and compared with the preset time threshold; once this duration exceeds the preset time threshold, warning information is issued. The preset time threshold can be set according to the actual scene.
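The duration check in steps S4421-S4422 can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the function name and the representation of timestamps (seconds, starting from the historical frame) are assumptions.

```python
def should_warn(fast_timestamps, time_threshold):
    """fast_timestamps: times (in seconds) of the consecutive frames in which
    the target was judged to be moving rapidly, with the historical frame's
    time first. A warning fires once the elapsed duration of continued rapid
    movement exceeds the preset time threshold."""
    if len(fast_timestamps) < 2:
        return False
    return fast_timestamps[-1] - fast_timestamps[0] > time_threshold
```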
After it is determined that the target object moves rapidly, warning information may be issued under a variety of conditions. For example, the number of rapidly moving target objects in the current frame image may be determined, and if this number is greater than or equal to 2, warning information is issued. Alternatively, when the target object moves rapidly, at least one frame of image containing the target object before the historical frame image may be acquired and the action of the target object in that image recognized; if the action matches a preset action, i.e., the target object may be involved in a robbery, a brawl, or the like, warning information is issued. Of course, warning information may be issued in other ways, and this embodiment is not particularly limited in this respect.
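The count-based condition in the example above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed invention; the function name and default count are hypothetical.

```python
def warn_by_count(fast_flags, min_count=2):
    """fast_flags: one boolean per target in the current frame indicating
    rapid movement. A warning is issued when at least `min_count` targets
    (2 in the example above) are moving rapidly at once."""
    return sum(fast_flags) >= min_count
```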
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Referring to fig. 8, based on the same inventive concept, an embodiment of the present invention further provides a fast-movement detection apparatus, including:
the first position determining module 81 is configured to determine a first detection frame coordinate position corresponding to the target object in a historical frame image before the current frame image is acquired by the image acquisition device;
a second position determining module 82, configured to determine, based on the first detection frame coordinate position and a preset precision, a corresponding extended frame coordinate position of the target object in the historical frame image;
a third position determining module 83, configured to determine, in the current frame image, a coordinate position of a second detection frame corresponding to the target object;
a detection processing module 84, configured to determine whether the target object moves fast based on the second detection frame coordinate position and the expansion frame coordinate position.
Of course, in other embodiments, each module of the fast movement detection apparatus may further include one or more units for implementing corresponding functions, which are not described herein again.
Fig. 9 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 9, the terminal device 6 of this embodiment includes: a processor 60, a memory 61, and a computer program 62, such as a fast-movement detection program, stored in the memory 61 and executable on the processor 60. The processor 60, when executing the computer program 62, implements the steps in the various embodiments of the fast-movement detection method described above, such as steps S1-S4 shown in fig. 1. Alternatively, the processor 60, when executing the computer program 62, implements the functions of the modules/units in the above-described apparatus embodiments, such as the functions of the modules 81 to 84 shown in fig. 8.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 62 in the terminal device 6.
The terminal device 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 60, a memory 61. Those skilled in the art will appreciate that fig. 9 is merely an example of a terminal device 6 and does not constitute a limitation of terminal device 6 and may include more or fewer components than shown, or some components may be combined, or different components, for example, the terminal device may also include input output devices, network access devices, buses, etc.
The Processor 60 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing the computer programs and other programs and data required by the terminal device. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.