CN115290921B - Moving speed measurement method, system, electronic device and readable storage medium - Google Patents
- Publication number
- CN115290921B CN115290921B CN202210518817.2A CN202210518817A CN115290921B CN 115290921 B CN115290921 B CN 115290921B CN 202210518817 A CN202210518817 A CN 202210518817A CN 115290921 B CN115290921 B CN 115290921B
- Authority
- CN
- China
- Prior art keywords
- area
- target
- coordinate
- image
- matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds
- G01P3/64—Devices characterised by the determination of the time taken to traverse a fixed distance
- G01P3/68—Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/02—Affine transformations
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Abstract
The invention relates to the technical field of video processing, and discloses a moving speed measurement method, system, electronic device and readable storage medium. The method obtains a perspective transformation matrix from a geographic coordinate matrix and an image coordinate matrix, and converts the image coordinate points of the object to be measured in the image into real-world geographic coordinate points through the perspective transformation matrix, so that the real moving speed of the object to be measured is determined. Compared with estimating the real moving speed from the pixel moving distance, this reduces the influence of image deformation caused by the perspective principle and improves the accuracy of moving speed measurement.
Description
Technical Field
The present invention relates to the field of video processing technologies, and in particular, to a method and a system for measuring a moving speed, an electronic device, and a readable storage medium.
Background
At present, with the development of video technology, the speed of moving objects such as automobiles, bicycles and motorcycles in a target area can be measured through image monitoring and computer image processing technologies in the security field. Moving objects whose speed is too high can then be supervised and warned, so that potential safety hazards are avoided.
Existing moving speed measurement methods generally estimate the actual moving speed from the pixel moving distance of a moving object in an image. However, the monitoring device serves as the viewpoint and is inclined at an angle to the ground, so, due to the perspective principle, objects at different orientations and distances appear with changed size and shape. The image pixels therefore cannot accurately reflect the movement of the target object, and the accuracy of the acquired moving speed is low.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, and is intended to neither identify key/critical elements nor delineate the scope of such embodiments, but is intended as a prelude to the more detailed description that follows.
In view of the above-mentioned shortcomings of the prior art, the present invention discloses a method, a system, an electronic device and a readable storage medium for measuring a moving speed, so as to improve accuracy of moving speed acquisition.
The invention discloses a moving speed measurement method, which comprises: obtaining a geographic coordinate matrix corresponding to a target area and an area image stream corresponding to the target area; establishing an image coordinate matrix corresponding to the target area based on the area image stream, and determining a perspective transformation matrix that maps the image coordinate matrix to the geographic coordinate matrix; determining, in the area image stream, the image coordinate points corresponding to a plurality of recording time stamps of a target to be measured, and determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix; determining the moving distance of the target to be measured according to the geographic coordinate points corresponding to at least a part of the recording time stamps; and determining the moving speed of the target to be measured according to the moving distance and the recording time stamps corresponding to the moving distance.
Optionally, determining the image coordinate points of the target to be measured corresponding to a plurality of recording time stamps in the area image stream comprises: performing target recognition for the target to be measured on the area image stream; if the target to be measured is recognized in the area image stream, acquiring the minimum bounding frame of the target to be measured in the area image stream, and taking any point in the minimum bounding frame as the target reference point; acquiring the current time stamp as a recording time stamp every preset number of image frames; and determining the position of the target reference point at the current time stamp as the image coordinate point corresponding to that recording time stamp.
Optionally, before determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix, the method further comprises: dividing the target area in the area image stream to obtain a plurality of sub-areas; determining the scaling corresponding to each sub-area according to the geographic coordinate matrix and the image coordinate matrix corresponding to that sub-area; taking the sub-area with the largest pixel area as the main area according to the pixel area of each sub-area; if the pixel-area difference between a sub-area and the main area is greater than a preset threshold, determining that sub-area as an area to be corrected; and determining a correction parameter based on the scalings corresponding to the main area and the area to be corrected. After the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, perspective correction is performed according to the correction parameter on the geographic coordinate points corresponding to the coordinate points to be corrected, wherein a coordinate point to be corrected is an image coordinate point located in an area to be corrected.
Optionally, determining the moving distance of the target to be measured according to the geographic coordinate points corresponding to at least a part of the recording time stamps, and determining the moving speed of the target to be measured according to the moving distance and the corresponding recording time stamps, comprises: taking any two recording time stamps as a first time stamp and a second time stamp respectively; determining the moving time of the target to be measured according to the first time stamp and the second time stamp; determining the moving distance of the target to be measured according to the geographic coordinate point corresponding to the first time stamp and the geographic coordinate point corresponding to the second time stamp; and determining the moving speed of the target to be measured based on the moving distance and the moving time.
Optionally, the method further comprises: storing the perspective transformation matrix in a preset storage space; determining one monitoring device from the preset monitoring devices as the target device; acquiring the area image stream corresponding to the target area through the target device; determining, in the area image stream, the image coordinate points of the target to be measured at a plurality of recording time stamps; extracting the perspective transformation matrix from the preset storage space through the target device; and determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix.
Optionally, acquiring the geographic coordinate matrix corresponding to the target area comprises: acquiring the area size information of the target area based on a preset length unit; establishing a Cartesian coordinate system in the target area, wherein any point of the target area is determined as the coordinate origin of the Cartesian coordinate system, and any two straight lines which are perpendicular to each other and intersect at the coordinate origin are taken as the x axis and the y axis respectively; and determining the geographic coordinate matrix corresponding to the target area based on the area size information and the Cartesian coordinate system.
Optionally, when the target area is a rectangular area, acquiring the geographic coordinate matrix corresponding to the target area comprises: collecting the area size information of the rectangular area based on a preset length unit; establishing a Cartesian coordinate system on the plane where the rectangular area is located, wherein the vertex of any corner of the rectangular area is determined as the coordinate origin, and the two sides of the rectangular area intersecting at the coordinate origin are taken as the x axis and the y axis respectively; determining the vertex geographic coordinates of the four vertices of the target area according to the area size information and the Cartesian coordinate system; and determining the geographic coordinate matrix corresponding to the target area based on the vertex geographic coordinates.
Optionally, establishing the image coordinate matrix corresponding to the target area based on the area image stream comprises: establishing a pixel coordinate system in the area image stream; extracting, based on the pixel coordinate system, the vertex pixel coordinates corresponding to the four vertices of the target area in the area image stream; and determining the image coordinate matrix corresponding to the target area based on the vertex pixel coordinates.
The invention discloses a moving speed measurement system which comprises an acquisition module, a matrix determination module, a coordinate determination module and a calculation module. The acquisition module is used for acquiring a geographic coordinate matrix corresponding to a target area and an area image stream corresponding to the target area. The matrix determination module is used for establishing an image coordinate matrix corresponding to the target area based on the area image stream, and determining a perspective transformation matrix that maps the image coordinate matrix to the geographic coordinate matrix. The coordinate determination module is used for determining, in the area image stream, the image coordinate points corresponding to a plurality of recording time stamps of a target to be measured, and determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix. The calculation module is used for determining the moving distance of the target to be measured according to the geographic coordinate points corresponding to at least a part of the recording time stamps, and determining the moving speed of the target to be measured according to the moving distance and the corresponding recording time stamps.
The invention discloses an electronic device, which comprises a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for executing the computer program stored in the memory so as to enable the electronic device to execute the method.
The present invention discloses a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the above-mentioned method.
The invention has the beneficial effects that:
The method comprises: obtaining a geographic coordinate matrix corresponding to a target area and an area image stream corresponding to the target area; establishing an image coordinate matrix corresponding to the target area based on the area image stream, and determining a perspective transformation matrix that maps the image coordinate matrix to the geographic coordinate matrix; determining, in the area image stream, the image coordinate points corresponding to a plurality of recording time stamps of the target to be measured, and determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix; determining the moving distance of the target to be measured according to the geographic coordinate points corresponding to at least a part of the recording time stamps; and determining the moving speed of the target to be measured according to the moving distance and the corresponding recording time stamps. In this way, a perspective transformation matrix is obtained from the geographic coordinate matrix and the image coordinate matrix, and the image coordinate points of the object to be measured in the image are converted into real-world geographic coordinate points through the perspective transformation matrix, so that the real moving speed of the object to be measured is determined.
Drawings
FIG. 1 is a flow chart of a method for measuring moving speed according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of four vertices of a target region in a regional image stream in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of another method for measuring moving speed according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a system for measuring moving speed according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Other advantages and effects of the present invention will become apparent to those skilled in the art from the following disclosure, which describes the embodiments of the present invention with reference to specific examples. The invention is capable of other and different embodiments, and its several details are capable of modification and of various other uses and applications in various respects, all without departing from the spirit of the present invention. It should be noted that the following embodiments and the features in the embodiments may be combined with each other without conflict.
It should be noted that the illustrations provided in the following embodiments merely illustrate the basic concept of the present invention by way of illustration, and only the components related to the present invention are shown in the drawings and are not drawn according to the number, shape and size of the components in actual implementation, and the form, number and proportion of each component in actual implementation may be arbitrarily changed, and the layout of the components may be more complex.
In the following description, numerous details are set forth in order to provide a more thorough explanation of embodiments of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the present invention may be practiced without these specific details. In other embodiments, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the embodiments of the present invention.
The terms first, second and the like in the description and in the claims of the embodiments of the disclosure and in the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe embodiments of the present disclosure. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion.
The term "plurality" means two or more, unless otherwise indicated.
In the embodiment of the present disclosure, the character "/" indicates that the front and rear objects are an or relationship. For example, A/B represents A or B.
The term "and/or" describes an association relationship between objects and covers three cases. For example, "A and/or B" means: A alone, B alone, or both A and B.
As shown in fig. 1, an embodiment of the present disclosure provides a method for measuring a moving speed, including:
Step S101, a geographic coordinate matrix corresponding to a target area and an area image stream corresponding to the target area are obtained;
step S102, an image coordinate matrix corresponding to a target area is established based on the area image stream, and a perspective transformation matrix of the image coordinate matrix mapped to the geographic coordinate matrix is determined;
step S103, determining image coordinate points corresponding to the object to be detected in a plurality of recording time stamps in the regional image stream, and determining geographic coordinate points corresponding to the image coordinate points based on the perspective transformation matrix;
Step S104, determining the moving distance of the object to be measured according to at least a part of the geographical coordinate points corresponding to the recording time stamps, and determining the moving speed of the object to be measured according to the moving distance and the recording time stamp corresponding to the moving distance.
By adopting the moving speed measurement method provided by the embodiment of the disclosure: the geographic coordinate matrix corresponding to the target area and the area image stream corresponding to the target area are obtained; the image coordinate matrix corresponding to the target area is established based on the area image stream, and the perspective transformation matrix (i.e., a projective transformation matrix) that maps the image coordinate matrix to the geographic coordinate matrix is determined; the image coordinate points corresponding to the target to be measured at a plurality of recording time stamps are determined in the area image stream, and the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix; the moving distance of the target to be measured is determined according to the geographic coordinate points corresponding to at least a part of the recording time stamps, and the moving speed of the target to be measured is determined according to the moving distance and the corresponding recording time stamps. In this way, the perspective transformation matrix is obtained from the geographic coordinate matrix and the image coordinate matrix, and the image coordinate points of the object to be measured in the image are converted into real-world geographic coordinate points through the perspective transformation matrix, so that the real moving speed of the object to be measured is determined.
Optionally, the geographic coordinate matrix corresponding to the target area is obtained by: acquiring the area size information of the target area based on a preset length unit; establishing a Cartesian coordinate system in the target area, wherein any point of the target area is determined as the coordinate origin of the Cartesian coordinate system, and any two straight lines which are perpendicular to each other and intersect at the coordinate origin are taken as the x axis and the y axis respectively; and determining the geographic coordinate matrix corresponding to the target area based on the area size information and the Cartesian coordinate system.
In some embodiments, the preset length unit is one of centimeters, inches, meters, miles, kilometers, and the like, for example, meters.
Optionally, when the target area is a rectangular area, obtaining the geographic coordinate matrix corresponding to the target area comprises: collecting the area size information of the rectangular area based on a preset length unit; establishing a Cartesian coordinate system on the plane where the rectangular area is located, wherein the vertex of any corner of the rectangular area is determined as the coordinate origin, and the two sides of the rectangular area intersecting at the coordinate origin are taken as the x axis and the y axis respectively; determining the vertex geographic coordinates of the four vertices of the target area according to the area size information and the Cartesian coordinate system; and determining the geographic coordinate matrix corresponding to the target area based on the vertex geographic coordinates.
In some embodiments, a rectangular plane area with a length of H meters and a width of W meters is arbitrarily selected as the target area, and a Cartesian coordinate system is established on the plane of the rectangular area. According to the area size information and the Cartesian coordinate system, the vertex geographic coordinates of the four vertices of the target area are determined as (0, 0), (0, H), (W, 0) and (W, H), and the geographic coordinate matrix S is established from these four vertex geographic coordinates: S = [(0, 0), (0, H), (W, 0), (W, H)].
Optionally, establishing an image coordinate matrix corresponding to the target area based on the area image stream comprises establishing a pixel coordinate system in the area image stream, extracting vertex pixel coordinates corresponding to four vertexes in the target area based on the pixel coordinate system, and determining the image coordinate matrix corresponding to the target area based on the vertex pixel coordinates.
In some embodiments, the target area is a planar rectangular area with a length of H meters and a width of W meters. A pixel coordinate system is established in the area image stream, and the vertex pixel coordinates corresponding to the four vertices are determined in the area image stream. As shown in FIG. 2, the four vertices of the target area in the area image stream are (x1, y1), (x2, y2), (x3, y3) and (x4, y4), and the image coordinate matrix P is established from these four vertex pixel coordinates: P = [(x1, y1), (x2, y2), (x3, y3), (x4, y4)].
Optionally, the perspective transformation matrix is determined by the following formula:
S=P*M;
Wherein S is a geographic coordinate matrix, P is an image coordinate matrix, and M is a perspective transformation matrix.
In some embodiments, determining the perspective transformation matrix that maps the image coordinate matrix to the geographic coordinate matrix comprises computing the perspective transformation matrix of P→S, where the perspective transformation matrix transforms any point of the target area in the area image stream into its position in the Cartesian coordinate system.
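As a concrete illustration of computing the P→S mapping, the sketch below solves the 3x3 perspective (homography) matrix from the four vertex correspondences and maps an image point into the ground-plane Cartesian coordinate system. This is a minimal pure-Python sketch, not the patented implementation: the function names, the normalization h33 = 1, and the sample coordinates are illustrative assumptions (in practice a library routine such as OpenCV's getPerspectiveTransform would typically be used).

```python
def solve_homography(src, dst):
    """Solve the 3x3 perspective matrix M mapping four src points to
    four dst points, with M[2][2] normalized to 1. Each correspondence
    (x, y) -> (u, v) contributes two rows of an 8x8 linear system,
    solved here by Gaussian elimination with partial pivoting."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][c] * h[c] for c in range(r + 1, n))
        h[r] = s / A[r][r]
    return [h[0:3], h[3:6], [h[6], h[7], 1.0]]


def apply_homography(M, point):
    """Map an image pixel through M to geographic coordinates."""
    x, y = point
    w = M[2][0] * x + M[2][1] * y + M[2][2]
    return ((M[0][0] * x + M[0][1] * y + M[0][2]) / w,
            (M[1][0] * x + M[1][1] * y + M[1][2]) / w)
```

Mapping each calibration vertex back through the solved matrix should reproduce its geographic coordinates, which gives a quick sanity check on the calibration.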
Optionally, determining the image coordinate points of the object to be measured corresponding to a plurality of recording time stamps in the area image stream comprises: performing target recognition for the object to be measured on the area image stream; if the object to be measured is recognized in the area image stream, acquiring the minimum bounding frame of the object to be measured in the area image stream, and taking any point in the minimum bounding frame as the target reference point; acquiring the current time stamp as a recording time stamp every preset number of image frames; and determining the position of the target reference point at the current time stamp as the image coordinate point corresponding to that recording time stamp.
Optionally, the minimum bounding frame comprises a minimum bounding rectangle, and the center of the bottom edge of the minimum bounding rectangle is used as the target reference point.
In this way, any point of the object to be measured within its minimum bounding frame is used as the target reference point, the image coordinate point corresponding to each recording time stamp is determined through the target reference point, and the corresponding geographic coordinate point is then determined through the perspective transformation matrix. This eliminates the influence of the transformation between the three-dimensional world and the two-dimensional image, the height and width of the object to be measured do not need to be acquired, and the accuracy of moving speed measurement is improved.
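The bottom-edge-center choice of reference point can be sketched in a few lines. The (x, y, width, height) box format, with y increasing downward as in image coordinates, is an assumption for illustration.

```python
def reference_point(bbox):
    """Center of the bottom edge of a bounding box given as
    (x, y, w, h) in pixel coordinates, with y increasing downward.
    This point normally touches the ground plane, so it maps well
    through the ground-plane perspective transform."""
    x, y, w, h = bbox
    return (x + w / 2.0, y + h)
```

For example, a 40x80-pixel box whose top-left corner is at (100, 50) gives the reference point (120.0, 130.0).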
Optionally, the method further comprises: dividing the target area in the area image stream to obtain a plurality of sub-areas; determining the scaling corresponding to each sub-area according to the geographic coordinate matrix and the image coordinate matrix corresponding to that sub-area; taking the sub-area with the largest pixel area as the main area according to the pixel area of each sub-area; if the pixel-area difference between a sub-area and the main area is greater than a preset threshold, determining that sub-area as an area to be corrected; and determining a correction parameter based on the scalings corresponding to the main area and the area to be corrected. After the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, perspective distortion correction is performed according to the correction parameter on the geographic coordinate points corresponding to the coordinate points to be corrected.
In this way, the target area is divided into a plurality of sub-areas, and the sub-area with the largest pixel area is used as the main area, since a larger sub-area contains more information. Perspective distortion correction is then carried out on each area to be corrected with reference to the main area, which mitigates not only the global perspective distortion of the image but also the local perspective distortion, resolves the deformation (such as transverse and radial stretching) that differs across regions of the image, improves the accuracy of the geographic coordinate points, and further improves the accuracy of moving speed measurement.
In some embodiments, the target area in the area image stream is divided according to lane lines to obtain a plurality of sub-areas, and the scaling corresponding to each sub-area is determined according to the geographic coordinate matrix and the image coordinate matrix corresponding to that sub-area. According to the pixel area of each sub-area, the sub-area with the largest pixel area is taken as the main area; if the pixel-area difference between a sub-area and the main area is greater than a preset threshold, that sub-area is determined as an area to be corrected, and a correction parameter is determined based on the scalings corresponding to the main area and the area to be corrected. After the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, perspective distortion correction is performed according to the correction parameter on the geographic coordinate points corresponding to the coordinate points to be corrected.
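One plausible reading of the sub-region correction step is sketched below, assuming each sub-area is summarized by its real-world area and its pixel area. The dictionary keys, the scale definition (square meters per pixel), and the ratio-based correction parameter are assumptions for illustration, since the text does not spell out the exact formula.

```python
def correction_parameters(subregions, threshold):
    """Hypothetical reading of the correction step: each sub-region is
    a dict with 'geo_area' (real-world area, m^2) and 'pixel_area'
    (image area, px). The sub-region with the largest pixel area is the
    main region; any region whose pixel-area difference from the main
    region exceeds `threshold` receives a correction parameter equal to
    the ratio of its scale (m^2 per pixel) to the main region's scale."""
    scales = [r["geo_area"] / r["pixel_area"] for r in subregions]
    main = max(range(len(subregions)),
               key=lambda i: subregions[i]["pixel_area"])
    params = {}
    for i, r in enumerate(subregions):
        if i == main:
            continue
        if abs(subregions[main]["pixel_area"] - r["pixel_area"]) > threshold:
            params[i] = scales[i] / scales[main]
    return params
```

A sub-region covering the same real-world area as the main region but rendered in a third of the pixels would get a correction parameter of about 3, reflecting how much more ground each of its pixels spans.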
Optionally, determining the moving distance of the target to be measured according to the geographic coordinate points corresponding to at least a part of the recording time stamps, and determining the moving speed of the target to be measured according to the moving distance and the corresponding recording time stamps, comprises: taking any two recording time stamps as a first time stamp and a second time stamp respectively; determining the moving time of the target to be measured according to the first time stamp and the second time stamp; determining the moving distance of the target to be measured according to the geographic coordinate point corresponding to the first time stamp and the geographic coordinate point corresponding to the second time stamp; and determining the moving speed of the target to be measured based on the moving distance and the moving time.
Optionally, the moving distance of the object to be measured is determined by the following formula:

s = sqrt((a2 - a1)^2 + (b2 - b1)^2)

where s is the moving distance, (a1, b1) are the coordinates of the geographic coordinate point corresponding to the first time stamp, and (a2, b2) are the coordinates of the geographic coordinate point corresponding to the second time stamp.
Optionally, the moving speed of the object to be measured is determined by the following formula:

v = s / (t2 - t1)

where v is the moving speed of the object to be measured, s is the moving distance, t1 is the time point of the first time stamp, and t2 is the time point of the second time stamp.
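The distance and speed formulas combine into a few lines of code. This is a direct transcription, with the hedging assumption that geographic coordinates are in meters and timestamps in seconds, so the result is in meters per second.

```python
import math


def moving_speed(p1, t1, p2, t2):
    """Speed between two geographic points p1, p2 (meters) recorded at
    timestamps t1, t2 (seconds): Euclidean distance over elapsed time."""
    s = math.hypot(p2[0] - p1[0], p2[1] - p1[1])  # moving distance
    return s / abs(t2 - t1)                       # moving speed
```

A target moving from (0, 0) to (3, 4) meters over 2 seconds yields 2.5 m/s (multiply by 3.6 for km/h).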
Optionally, the method further comprises: storing the perspective transformation matrix in a preset storage space; determining one monitoring device from the preset monitoring devices as the target device; acquiring the area image stream corresponding to the target area through the target device; determining, in the area image stream, the image coordinate points of the target to be measured at a plurality of recording time stamps; extracting the perspective transformation matrix from the preset storage space through the target device; and determining the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix.
Optionally, the preset storage space comprises one or more of a database end, a server end, and the like, and the monitoring device comprises one or more of an IPC (IP camera, i.e., a network camera), an NVR (Network Video Recorder), and the like.
In some embodiments, a monitoring device is designated as the target device through a position calibration method, and a moving speed measurement task is created on the target device. The target device performs target recognition on the area image stream based on an artificial intelligence algorithm to obtain the target to be measured, and determines the image coordinate points of the target to be measured corresponding to a plurality of recording time stamps. The perspective transformation matrix is extracted from the preset storage space, the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, and the moving speed of the target to be measured is then determined.
As shown in fig. 3, an embodiment of the present disclosure provides a method for measuring a moving speed, including:
Step S301, the target device acquires a geographic coordinate matrix and an image coordinate matrix of the target area;
Step S302, the target device determines the perspective transformation matrix mapping the image coordinate matrix to the geographic coordinate matrix;
Step S303, the target device sends the perspective transformation matrix to the database end;
Step S304, the database end stores the perspective transformation matrix;
Step S305, the target device identifies the target to be measured in the area image stream corresponding to the target area;
Step S306, if the target to be measured is identified, the target device determines the image coordinate points of the target to be measured at a plurality of recording timestamps;
Step S307, the database end sends the stored perspective transformation matrix to the target device;
Step S308, the target device determines the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix;
Step S309, the target device determines the moving speed of the target to be measured based on the geographic coordinate points corresponding to the recording timestamps.
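Steps S301 to S309 can be sketched end to end. The code below is a minimal illustration under stated assumptions, not the patented implementation: it solves for the 3×3 perspective (homography) matrix from four image-to-geographic point correspondences by the standard direct linear transform (OpenCV's `cv2.getPerspectiveTransform` computes the same matrix), maps image points through it, and derives the speed. The function names and sample correspondences are hypothetical.

```python
import numpy as np

def perspective_matrix(img_pts, geo_pts):
    """Solve for the 3x3 perspective matrix H that maps four image
    points to four geographic points (direct linear transform)."""
    A, b = [], []
    for (x, y), (u, v) in zip(img_pts, geo_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)  # fix H[2][2] = 1

def map_point(H, pt):
    """Map one image coordinate point to a geographic coordinate point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w  # divide by the homogeneous coordinate

def speed(H, p1, t1, p2, t2):
    """Geographic moving speed between two timestamped image points."""
    a1, b1 = map_point(H, p1)
    a2, b2 = map_point(H, p2)
    s = ((a2 - a1) ** 2 + (b2 - b1) ** 2) ** 0.5
    return s / (t2 - t1)
```

With four calibration correspondences for the target area, `perspective_matrix` plays the role of steps S301 and S302, `map_point` of step S308, and `speed` of step S309.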
The moving speed measurement method provided by the embodiments of the present disclosure obtains the geographic coordinate matrix corresponding to the target area and the area image stream corresponding to the target area, establishes the image coordinate matrix corresponding to the target area based on the area image stream, determines the perspective transformation matrix mapping the image coordinate matrix to the geographic coordinate matrix, determines the image coordinate points of the target to be measured at a plurality of recording timestamps in the area image stream, determines the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix, determines the moving distance of the target to be measured from at least part of the geographic coordinate points corresponding to the recording timestamps, and determines the moving speed of the target to be measured from the moving distance and the corresponding recording timestamps. The method has the following advantages:
Firstly, the perspective transformation matrix is obtained from the geographic coordinate matrix and the image coordinate matrix, the image coordinate points of the target to be measured in the image are converted into geographic coordinate points in the real world through the perspective transformation matrix, and the real moving speed of the target to be measured is then determined. Compared with estimating the real moving speed from the pixel moving distance, this reduces the image deformation caused by the perspective principle and improves the accuracy of the moving speed measurement;
Secondly, the target area is divided into a plurality of sub-areas, and the sub-area with the largest area is used as the main area, since a larger sub-area contains more information. Perspective distortion correction is performed on each area to be corrected through the main area, which alleviates not only the global perspective distortion of the image but also the local perspective distortion, resolves the deformation phenomena that differ across regions of the image (such as transverse stretching and radial stretching), improves the accuracy of the geographic coordinate points, and thus improves the accuracy of the moving speed measurement;
Thirdly, any point of the target to be measured within its minimum bounding box is used as the target reference point, the image coordinate point corresponding to each recording timestamp is determined through the target reference point, and the geographic coordinate point corresponding to each image coordinate point is then determined through the perspective transformation matrix. This eliminates the influence of the transformation between the three-dimensional world and the two-dimensional image without requiring the height and width of the target to be measured, and improves the accuracy of the moving speed measurement.
In some embodiments, the geographic coordinate matrix of the target area and the image coordinate matrix of the target area are obtained, and the perspective transformation matrix M mapping the image coordinate matrix to the geographic coordinate matrix is determined. For the target to be measured, a first timestamp t1 = 5044 and a second timestamp t2 = 5050 are determined, with image coordinate point (53, 480) corresponding to the first timestamp and image coordinate point (897, 277) corresponding to the second timestamp. Based on the perspective transformation matrix M, the geographic coordinate point corresponding to the first timestamp is (2.50, 5.71) and the geographic coordinate point corresponding to the second timestamp is (8.29, 13.29). The moving distance of the target to be measured is thus determined to be s = 9.54 m over a moving time of 6 s, giving a moving speed of 1.59 m/s.
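The arithmetic of this embodiment can be checked directly from the geographic coordinate points and timestamps given above (the matrix values themselves appear only as figures in the original and are not reproduced here):

```python
from math import hypot

# Geographic coordinate points of the two timestamps, from the embodiment above
a1, b1 = 2.50, 5.71    # at t1 = 5044
a2, b2 = 8.29, 13.29   # at t2 = 5050

s = hypot(a2 - a1, b2 - b1)   # moving distance in metres
v = s / (5050 - 5044)         # moving speed in m/s

print(round(s, 2), round(v, 2))  # prints: 9.54 1.59
```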
Referring to fig. 4, an embodiment of the disclosure provides a moving speed measurement system, which includes an obtaining module 401, a matrix determining module 402, a coordinate determining module 403 and a calculating module 404. The obtaining module 401 is configured to obtain the geographic coordinate matrix corresponding to the target area and the area image stream corresponding to the target area; the matrix determining module 402 establishes the image coordinate matrix corresponding to the target area based on the area image stream and determines the perspective transformation matrix mapping the image coordinate matrix to the geographic coordinate matrix; the coordinate determining module 403 determines the image coordinate points of the target to be measured at a plurality of recording timestamps in the area image stream and determines the geographic coordinate point corresponding to each image coordinate point based on the perspective transformation matrix; and the calculating module 404 determines the moving distance of the target to be measured from at least part of the geographic coordinate points corresponding to the recording timestamps and determines the moving speed of the target to be measured from the moving distance and the corresponding recording timestamps.
By adopting the moving speed measurement system provided by the embodiments of the present disclosure, the geographic coordinate matrix corresponding to the target area and the area image stream corresponding to the target area are obtained, the image coordinate matrix corresponding to the target area is established based on the area image stream, the perspective transformation matrix mapping the image coordinate matrix to the geographic coordinate matrix is determined, the image coordinate points of the target to be measured at a plurality of recording timestamps are determined in the area image stream, the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, the moving distance of the target to be measured is determined from at least part of the geographic coordinate points corresponding to the recording timestamps, and the moving speed of the target to be measured is determined from the moving distance and the corresponding recording timestamps. In this way, the perspective transformation matrix is obtained from the geographic coordinate matrix and the image coordinate matrix, the image coordinate points of the target to be measured in the image are converted into geographic coordinate points in the real world through the perspective transformation matrix, and the real moving speed of the target to be measured is then determined.
As shown in fig. 5, an embodiment of the present disclosure provides an electronic device, including a processor (processor) 500 and a memory (memory) 501, where the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, so that the terminal performs any one of the methods in the embodiment. Optionally, the electronic device may also include a communication interface (Communication Interface) 502 and a bus 503. The processor 500, the communication interface 502, and the memory 501 may communicate with each other via the bus 503. The communication interface 502 may be used for information transfer. The processor 500 may invoke logic instructions in the memory 501 to perform the methods of the embodiments described above.
Further, the logic instructions in the memory 501 may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a stand-alone product.
The memory 501 is a computer readable storage medium that may be used to store a software program, a computer executable program, and program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 500 performs functional applications as well as data processing, i.e. implements the methods of the embodiments described above, by running program instructions/modules stored in the memory 501.
The memory 501 may include a program storage area, which may store an operating system and at least one application program required for functions, and a data storage area, which may store data created according to the use of the terminal device, etc. In addition, the memory 501 may include a high-speed random access memory, and may also include a nonvolatile memory.
By adopting the electronic device provided by the embodiments of the present disclosure, the geographic coordinate matrix corresponding to the target area and the area image stream corresponding to the target area are obtained, the image coordinate matrix corresponding to the target area is established based on the area image stream, the perspective transformation matrix mapping the image coordinate matrix to the geographic coordinate matrix is determined, the image coordinate points of the target to be measured at a plurality of recording timestamps are determined in the area image stream, the geographic coordinate point corresponding to each image coordinate point is determined based on the perspective transformation matrix, the moving distance of the target to be measured is determined from at least part of the geographic coordinate points corresponding to the recording timestamps, and the moving speed of the target to be measured is determined from the moving distance and the corresponding recording timestamps. In this way, the perspective transformation matrix is obtained from the geographic coordinate matrix and the image coordinate matrix, and the image coordinate points of the target to be measured in the image are converted into geographic coordinate points in the real world through the perspective transformation matrix, so that the real moving speed of the target to be measured is determined. Compared with estimating the real moving speed from the pixel moving distance, this reduces the image deformation caused by the perspective principle and improves the accuracy of the moving speed measurement.
The present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the methods of the present embodiments.
Regarding the computer-readable storage medium in the embodiments of the present disclosure, those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware related to a computer program. The aforementioned computer program may be stored in a computer-readable storage medium. When executed, the program performs the steps of the above method embodiments, and the aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
The electronic device disclosed in this embodiment includes a processor, a memory, a transceiver, and a communication interface, where the memory and the communication interface are connected to the processor and the transceiver and perform communication therebetween, the memory is used to store a computer program, the communication interface is used to perform communication, and the processor and the transceiver are used to run the computer program, so that the electronic device performs each step of the above method.
In this embodiment, the memory may include a random access memory (Random Access Memory, abbreviated as RAM), and may further include a non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor may be a general-purpose processor, including a central processing unit (CPU), a graphics processing unit (GPU), a network processor (NP), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The above description and the drawings illustrate embodiments of the disclosure sufficiently to enable those skilled in the art to practice them. Other embodiments may involve structural, logical, electrical, process, and other changes. The embodiments represent only possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for portions and features of other embodiments. Moreover, the terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure is meant to encompass any and all possible combinations of one or more of the associated listed items. In addition, when used in this disclosure, the terms "comprises," "comprising," and/or variations thereof mean the presence of the stated feature, integer, step, operation, element, and/or component, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in a process, method, or apparatus that includes said element. In this context, each embodiment may be described with emphasis on its differences from the other embodiments, and for the same or similar parts of the various embodiments, reference may be made to one another. For the methods, products, and the like disclosed in the embodiments, if they correspond to the method portions disclosed in the embodiments, reference may be made to the description of the method portions where relevant.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. The skilled artisan may use different methods for each particular application to achieve the described functionality, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the embodiments disclosed herein, the disclosed methods and articles of manufacture (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative: the division of the units may be merely a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other form. The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of this embodiment. In addition, the functional units in the embodiments of the present disclosure may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than that disclosed in the description, and sometimes no specific order exists between different operations or steps. For example, two consecutive operations or steps may actually be performed substantially in parallel, they may sometimes be performed in reverse order, which may be dependent on the functions involved. Each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210518817.2A CN115290921B (en) | 2022-05-12 | 2022-05-12 | Moving speed measurement method, system, electronic device and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210518817.2A CN115290921B (en) | 2022-05-12 | 2022-05-12 | Moving speed measurement method, system, electronic device and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115290921A CN115290921A (en) | 2022-11-04 |
CN115290921B true CN115290921B (en) | 2025-06-27 |
Family
ID=83820281
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210518817.2A Active CN115290921B (en) | 2022-05-12 | 2022-05-12 | Moving speed measurement method, system, electronic device and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115290921B (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2674830A1 (en) * | 2007-01-05 | 2008-07-17 | Nestor, Inc. | Video speed detection system |
JP2009059009A (en) * | 2007-08-30 | 2009-03-19 | Dainippon Printing Co Ltd | Color-corrected image image creation method and color-corrected image creation device |
CN110060200B (en) * | 2019-03-18 | 2023-05-30 | 创新先进技术有限公司 | Image perspective transformation method, device and equipment |
CN110427944A (en) * | 2019-09-06 | 2019-11-08 | 重庆紫光华山智安科技有限公司 | Acquisition methods, device, equipment and the storage medium of car plate detection data |
CN113470374B (en) * | 2021-06-30 | 2022-09-16 | 平安科技(深圳)有限公司 | Vehicle overspeed monitoring method and device, computer equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
Analysis and implementation of an algorithm for improving video-based vehicle speed detection accuracy; Sun Ning; Zhang Zhongde; Journal of Hefei University of Technology (Natural Science Edition); 2014-12-28 (12); 60-65+125 *
Also Published As
Publication number | Publication date |
---|---|
CN115290921A (en) | 2022-11-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||