
HK1114229B - Line monitoring system and method


Info

Publication number
HK1114229B
Authority
HK
Hong Kong
Prior art keywords
line
objects
data
analyzing
new
Prior art date
2004-11-02
Application number
HK08104156.4A
Other languages
Chinese (zh)
Other versions
HK1114229A1 (en)
Inventor
戴维.M.赛尔斯都
Original Assignee
ADT Services LLC
Priority date
2004-11-02
Filing date
2005-11-01
Publication date
2013-11-08
Application filed by ADT Services LLC
Priority claimed from PCT/US2005/039487 (published as WO2006052545A2)
Publication of HK1114229A1
Publication of HK1114229B


Description

Line monitoring system and method
Cross Reference to Related Applications
This application claims the benefit of the November 2, 2004 filing date of U.S. provisional application No. 60/624,430, the teachings of which are incorporated herein by reference.
Technical Field
The present invention relates to a line monitoring system and method that may be used to monitor objects in a line.
Background
Lines may form in various places for various reasons. People may form a line, for example, at a point-of-sale or other customer service location in a retail store. People may also form lines at other facilities, such as outdoor entertainment areas, while waiting to pay for admission or waiting for the area to admit additional people. Other objects, such as vehicles, may also form a line, for example, at toll booths, gas stations, and other facilities. Waiting in a line is generally considered undesirable, and a facility may want to manage the line, for example, to improve the customer's experience.
Obtaining information such as the number of people or objects in line, the average wait time in line, or the amount of people or objects moving through a line may be useful in managing the flow of people or other objects through a line. Observing a line is one way to determine the number of people or other objects in line at a given time. One disadvantage of this approach is that it consumes personnel time and resources to collect line count data. Observation alone is also not sufficient to provide additional line information, such as average wait time and/or the amount of people or objects moving through the line.
Drawings
Features and advantages of embodiments of the claimed subject matter will become apparent as the following detailed description proceeds, and upon reference to the drawings (wherein like numerals depict like parts), in which:
FIG. 1 is a block diagram of a line monitoring system consistent with an embodiment of the present invention;
FIGS. 2-5 are images illustrating an object extraction method that may be used to provide object data in line monitoring systems and methods consistent with an embodiment of the present invention;
FIG. 6 is a flow chart illustrating a line monitoring method consistent with an embodiment of the present invention;
FIGS. 7-14 are schematic diagrams illustrating patterns of behavior that may be used to determine whether an object is in line, consistent with embodiments of the invention;
FIG. 15 is a flow diagram illustrating one example of an object analysis method for determining objects in line, consistent with an embodiment of the invention;
FIG. 16 is a flowchart illustrating an exemplary method for processing a first new object in the object analysis method shown in FIG. 15; and
FIG. 17 is a flowchart illustrating an example method for handling the addition of new objects in the object analysis method shown in FIG. 15.
While the following detailed description refers to enumerated embodiments, many alternatives, modifications, and variations thereof will be apparent to those skilled in the art. Accordingly, it is intended that the claimed subject matter be viewed broadly.
Detailed Description
Referring to FIG. 1, a line monitoring system 100 consistent with one embodiment of the invention may be used to monitor a line formed by objects 102a-102e in a surveillance area 104. The objects 102a-102e may include any object capable of forming a line, including but not limited to people and vehicles. Line monitoring system 100 may be used at any facility or location where objects may form a line, including but not limited to retail stores, banks, amusement parks, casinos, sporting venues, ticket outlets, gas stations, toll booths, and car washes. The surveillance area 104 may include a line start point and any area at a facility or location into which the line may extend. For example, in a retail store, the surveillance area 104 may include a point-of-sale location where a line generally begins and an area extending from the point-of-sale location. Although the exemplary embodiment is described in the context of a single line, the line monitoring system 100 may be used to monitor any number of lines.
One embodiment of the line monitoring system 100 may include an object identification and location system 120 that identifies and locates objects 102a-102e in the surveillance area 104 and an object analysis system 130 that analyzes object behavior and determines whether the objects form a line. The object identification and location system 120 may generate object data including, but not limited to, object identification data (e.g., an ID number) and object location data (e.g., coordinates). The object analysis system 130 may receive the object data and analyze the position and motion of the object to determine whether the object exhibits behavior indicating that the object should be designated as in line, as will be described in more detail below. As shown, objects 102a, 102b may be designated as being in line, while objects 102c-102e may not be designated as being in line.
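To make the object data exchanged between the two subsystems concrete, the following is a minimal sketch of a per-object record in Python. The patent specifies only identification data (an ID number) and location data (coordinates), so the type names, the three-state designation, and the field layout here are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto

class LineState(Enum):
    """Designations used by the object analysis system (assumed naming)."""
    NOT_IN_LINE = auto()
    LIKELY_IN_LINE = auto()
    IN_LINE = auto()

@dataclass
class TrackedObject:
    object_id: int    # persistent identifier assigned by the extraction system
    x: float          # location coordinates within the surveillance area
    y: float
    state: LineState = LineState.NOT_IN_LINE
```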
The object analysis system 130 may also determine one or more line statistics, such as a count of objects in a line, a wait time for objects in a line, an average time to service customers (e.g., in multiple lines), and/or an amount of objects passing through a line over a given time period. The line monitoring system 100 may display the line statistics on the display 140 and may further analyze the line statistics, for example, by comparing the line statistics to a threshold (e.g., a line count threshold, an average wait time threshold, etc.). The line monitoring system 100 may also provide the line statistics to another computer system 142 for further analysis. The line monitoring system 100 and/or the computer system 142 may also communicate with a notification device 144 (e.g., a handheld wireless device) to provide notifications based on line statistics. If the line count exceeds or falls below the line count threshold, a notification may be provided, for example, to indicate that another line should be activated or that one line should be deactivated. The line monitoring system 100 may also include a user input device 146 to allow a user to provide input to, for example, select a surveillance area, select desired line statistics, set desired notification thresholds, and configure line behavior pattern parameters, as described below.
Thus, the line monitoring system 100 may facilitate a number of line management applications. For example, if there are an excessive number of people in the line at a point-of-sale location in a retail store, the line monitoring system 100 may trigger an alarm (e.g., via the notification device 144) to alert appropriate store personnel of the situation regardless of their location in the retail store. In response, store personnel may open other point-of-sale locations to relieve congestion.
Another application may be to determine whether the service providers at a retail store are relatively consistent in terms of traffic flow through a particular area. Such an application can be used to identify relatively slow service providers, who can then be trained in more efficient service techniques. Yet another application may calculate an average wait time for the entire line, an average volume of transactions through a particular area over a particular period of time, and an average time to service a single customer. Store personnel can use the results of these additional applications to improve line management and customer service.
One embodiment of the object identification and location system 120 may include one or more cameras 122 that capture one or more images of the surveillance area, and an object extraction system 124 that extracts objects from the captured images and determines the location of the objects within the surveillance area. The camera 122 may generate one or more image signals representative of a captured image of the surveillance area 104. The camera 122 may comprise a camera known to those skilled in the art, such as a digital still or video camera.
A camera 122 may be arranged to focus on the surveillance area 104. Although not shown in the block diagram of FIG. 1, the camera 122 may be located above the surveillance area 104. Such a top view of the surveillance area 104 by the overhead camera 122 facilitates visually separating the objects 102a-102e so that one object can be best distinguished from another (e.g., one person from another). For indoor applications, such as retail stores, the camera 122 may be mounted on the ceiling above the center of the surveillance area 104. For outdoor applications, the cameras may be mounted on utility poles, buildings, or other structures as appropriate to provide a global overhead view of the surveillance area 104. Although an angled camera view is possible, tracking and distinguishing objects can be difficult if the angled view causes one object in line to occlude another object in line.
As the line grows, the field of view of the cameras 122 may be increased to enlarge the surveillance area 104 and capture a desired number of in-line objects. To increase the field of view, for example, the vertical height of the camera 122 may be increased above the surveillance area 104, a wider angle camera lens may be used, and/or multiple cameras may be used to provide adjacent views of the surveillance area 104. The use of multiple cameras 122 may allow each camera to be mounted lower or closer to the surveillance area 104, thereby facilitating the tracking and differentiation of the objects 102a-102e by the object extraction system 124. When multiple cameras are used, the cameras may be coordinated using techniques known to those skilled in the art to track objects moving from the range of one camera to the range of another camera.
In one embodiment, the object extraction system 124 and the object analysis system 130 may be implemented as one or more computer programs or applications running, for example, on a computer system. The object extraction system 124 and the object analysis system 130 may be separate applications or may be components of a single integrated line monitoring application. The object extraction system 124 and the object analysis system 130 may also be applications running on separate computer systems connected, for example, by a network connection, a serial connection, or other connections. The computer programs or applications may be stored on any of a variety of machine-readable media (e.g., hard disk, CD-ROM, system memory, etc.) and executed by a processor, thereby causing the processor to perform the functions described herein as performed by the object extraction system 124 and the object analysis system 130. Those skilled in the art will recognize that the object extraction system 124 and the object analysis system 130 may be implemented using any combination of hardware, software, and firmware to provide these functions.
The camera 122 may be connected to the object extraction system 124 via a path 126, for example using a wireless or wired connection to a computer system incorporating the object extraction system 124. The camera 122 may provide image signals (e.g., a video feed of the surveillance area 104) to an object extraction system 124 via a path 126. The object extraction system 124 may analyze pixels in the image represented by the image signal and may combine the moving pixels together to form an image object corresponding to the actual objects 102a-102e in the surveillance area 104. The object extraction system 124 may also identify each object in the image of the surveillance area 104 and provide coordinates specifying the location of each object.
Referring to FIGS. 2-5, one example of a method for identifying and locating objects using the object extraction system 124 is illustrated in more detail. As shown in FIG. 2, an image 200 of the surveillance area 104 may be generated from an image signal provided from the camera 122 to the object extraction system 124. The object extraction system 124 may analyze pixels from the image 200 to extract image objects. Although the image 200 is displayed as a single static image, the object extraction system 124 may receive an image signal representing a changing or moving image (or series of static images) in which the objects in the surveillance area 104 are moving.
In one embodiment, where the monitored objects in the surveillance area 104 are people, the object extraction system 124 may be configured to identify objects that are people. To accurately identify people, the object extraction system 124 may filter out lighting, shadows, reflections, and other anomalies that might otherwise be erroneously identified as a person. The object extraction system 124 may use adjustment parameters to increase the accuracy of object extraction, as is known to those skilled in the art. The adjustment parameters may include an illumination threshold, an edge detection threshold, and/or combination criteria. Thus, the object extraction system 124 may provide correctly identified human objects to the object analysis system 130, thereby avoiding false images or "ghosts" that may confuse the object analysis system 130. Although the object extraction system 124 may provide most of the filtering to identify people as objects, the object analysis system 130 may also provide object filtering to distinguish people from other objects, for example, based on the motion or behavior of the objects.
As shown in FIG. 3, the moving pixels in the image 200 may be combined to form pixel groups 202a-202e corresponding to moving objects (e.g., people) in the surveillance area 104. Bounding regions may be formed around the pixel groups 202a-202e. In the illustrated example, the pixel groups 202a-202e are shown with rectangular bounding regions, but this is not to be considered a limitation. As shown in FIG. 4, center points 204a-204e of the regions (e.g., rectangular regions) bounding the pixel groups 202a-202e may be determined. The coordinates of the center points 204a-204e may be determined to identify the coordinates of the corresponding objects (e.g., people) in the surveillance area 104.
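The patent does not prescribe a particular pixel-grouping algorithm. As a rough sketch of this step, the following assumes OpenCV 4.x background subtraction (MOG2); the area threshold, the shadow handling, and the function name are illustrative assumptions, not the patent's method.

```python
import cv2

# Background subtractor that flags pixels which change frame over frame
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

def extract_centroids(frame, min_area=500):
    """Group moving pixels and return the center point of each group."""
    mask = subtractor.apply(frame)
    # MOG2 marks shadow pixels as 127; keep only confident foreground (255)
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:  # filter small anomalies
            continue
        x, y, w, h = cv2.boundingRect(contour)   # region bounding the pixel group
        centroids.append((x + w / 2.0, y + h / 2.0))
    return centroids
```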
The object extraction system 124 may provide persistence of objects so that objects may be consistently identified as they move through the image 200 of the surveillance area 104. To accomplish this, the object extraction system 124 may provide an identifier (e.g., an ID number) for each object in the image 200 to associate the image object at that coordinate in the image 200 with a particular corresponding object in the surveillance area 104. The object extraction system 124 may maintain the identifier while the image object is in motion.
As shown in FIG. 5, the object data provided from the object extraction system 124 to the object analysis system 130 may include identification data (e.g., ID numbers) of the image objects 206a-206e extracted from the image 200 and location data (e.g., defined by the coordinates of the center points 204a-204e) of the image objects 206a-206e. The object data may be provided from the object extraction system 124 to the object analysis system 130 continuously through various paths, including, for example, over a network, over a serial connection, via a hardware device, or via a software mechanism such as shared memory or some other software buffering mechanism. Depending at least in part on the ability of the object extraction system 124 to generate and transmit such data, the object data may be provided at varying data rates. In general, faster data rates may improve the accuracy of the object analysis system 130 in analyzing the position and motion of objects within the surveillance area. Although the object extraction system 124 uses graphical information to obtain the object data, as shown in FIGS. 2-5, the graphical information need not be sent to the object analysis system 130. However, such graphical information may be used in the line monitoring system 100 to facilitate monitoring of a line.
In addition to providing object identification data and object location data for the image objects 206a-206e extracted from the surveillance area image 200, the object extraction system 124 may also provide additional parameters or object data to the object analysis system 130. Such object data may include object size, object velocity, and a timestamp of the current location of each object. These additional parameters may be useful in some situations, but are not required.
Although the exemplary embodiment uses the object extraction system 124 to obtain object identification and location data, those skilled in the art will recognize that the object identification and location system 120 may also include other systems capable of generating object identification data (e.g., ID numbers) and object location data (e.g., coordinates). Examples of such systems include Radio Frequency Identification (RFID) tracking systems and other tracking systems known to those skilled in the art.
Referring to FIG. 6, one method of monitoring a line using the object analysis system 130 is illustrated. The object analysis system 130 may receive 302 object data, including object identification data and object location data related to objects in the surveillance area. To determine whether an object should be designated as in a line in the surveillance area, the object analysis system 130 may analyze 304 the object data with reference to one or more line behavior pattern parameters that represent the behavior of the objects in the line. The object analysis system 130 may also determine 306 one or more line statistics, such as the number of objects in line, the wait time, and the amount of objects passing through the line.
A number of behavioral patterns representing objects in a line may be abstracted into various parameters and enumerated as values. The object analysis system 130 may assign default values to each line behavior pattern parameter representing a behavior pattern. The operator of the object analysis system 130 may also use the user input device 146 to adjust the default values of the parameters to "tune" the object analysis system 130 for various conditions.
Referring to FIGS. 7-14, the different behavior patterns and associated line behavior pattern parameters are illustrated in more detail. In general, the line behavior pattern parameters may be based on object positions and/or object motions representative of objects in a line. Using the line behavior pattern parameters, objects may be designated as "in line" or "likely in line," or may be removed from a line.
Objects generally form a line in a designated area extending from an origin (e.g., a point-of-sale location). As shown in FIG. 7, parameters may define a reference area 400 within the surveillance area 104 where objects may be in line. The reference area 400 may include where the line should start and may also include where the line should end. In one embodiment, the reference area 400 may be defined using values representing one or more pairs of parallel rows and columns. An operator of the object analysis system 130 may input values for the parameters defining the reference area 400, or default values may be provided. The object location data may be compared to the reference area parameters to determine whether an object has entered the reference area 400 and should be designated as "in line" or "likely in line."
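As a minimal sketch, assuming the reference area reduces to a single axis-aligned rectangle bounded by one pair of rows and one pair of columns (the parameter names are illustrative assumptions):

```python
def in_reference_area(x, y, area):
    """area: (left, right, top, bottom) bounds of the reference area 400."""
    left, right, top, bottom = area
    return left <= x <= right and top <= y <= bottom
```

An object whose coordinates begin to satisfy this test could be marked "likely in line"; an object whose coordinates stop satisfying it would lose that designation, as described next.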
When an object enters the reference area 400, the object is initially designated only as "likely in line," because the object may merely be passing through the reference area 400. Thus, the object analysis system 130 may designate the object 404a as "likely in line" until the object analysis system 130 determines that the object is actually in line, e.g., using the other parameters described below. For example, as shown in FIG. 8, a first object 404a that has entered the reference area 400 (e.g., crossed one of the lines defining the reference area 400) may be designated as "likely in line." As shown in FIG. 9, the first object 404a has left the reference area 400 (e.g., crossed back over one of the lines) and is therefore not actually in line. Once the object leaves the reference area 400, the object analysis system 130 may remove the "likely in line" designation from the object.
Other parameters may define the motion of objects to determine whether an object designated as "likely in line" should be designated as "in line." Examples of such parameters include a "still" parameter and/or a "jitter" parameter. Objects (e.g., people) entering a line typically stop moving for at least a short period of time. The still parameter may be defined using one or more values representing a period of stillness. If the object location data for an object 404a that has entered the reference area 400 indicates that the location of the object has not changed during the still period, for example, the object analysis system 130 may designate the object as "in line" rather than "likely in line." The still period may be adjusted or tuned by an operator of the object analysis system 130 in view of different circumstances.
Objects in line may move within a limited space and thus may not be completely still. The jitter parameter may be defined using one or more values representing a limited jitter space in which objects may move while in line. As shown in FIG. 10, for example, a boundary 410 may define a jitter space around the object 404b. If the object location data indicates that the object 404b in the reference area 400 is moving only within the defined jitter space, the object analysis system 130 may designate the object as "in line" rather than "likely in line." The jitter parameter may also be adjusted to accommodate different environments. The size of the jitter space may be adjusted, for example, based on the position in the line (e.g., more jitter at the tail than at the front), the amount of space around the line, and other factors. In one embodiment, the jitter space may be defined by a circle around the object coordinates, with the adjustable parameter being the radius of the circle. Once an object is designated as "in line," the still and jitter parameters are no longer analyzed for that object unless that particular object leaves the line and returns.
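A sketch of these two motion tests over an object's recent position history follows; the history representation, the observation window for the jitter test, and all names are assumptions for illustration.

```python
import math

def is_still(history, still_period):
    """history: list of (timestamp, (x, y)) samples, newest last.
    True if the newest position has been held for at least still_period seconds."""
    if not history:
        return False
    t_new, p_new = history[-1]
    t_hold = t_new
    for t, p in reversed(history):
        if p != p_new:
            break                      # position changed; unbroken run ends here
        t_hold = t
    return t_new - t_hold >= still_period

def is_jittering(history, jitter_radius, jitter_period):
    """True if, for at least jitter_period seconds, the object has moved only
    within a circle of jitter_radius around its newest position."""
    if not history:
        return False
    t_new, (xn, yn) = history[-1]
    if t_new - history[0][0] < jitter_period:
        return False                   # not enough history observed yet
    return all(math.hypot(x - xn, y - yn) <= jitter_radius
               for t, (x, y) in history if t_new - t <= jitter_period)
```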
When no object is designated as "in line," the reference area parameter, the still parameter, and the jitter parameter may be used to determine when the first new object should be designated as "in line." When at least one object is designated as "in line," additional objects may be designated as "in line" or "likely in line." Other parameters may define the location of additional objects relative to the objects already in line, thereby determining whether the additional objects should be designated as "in line" or "likely in line." These parameters may include a proximity parameter, a behind parameter, and an insertion distance parameter, as described below.
Typically, additional objects will be added to the line at the end. The proximity parameter may be defined using one or more values representing a proximity distance from the last object currently in line. If the object location data indicates that an additional object is within the proximity distance of the last object, the object analysis system 130 may designate the object as "in line" or "likely in line." As shown in FIG. 11, for example, the proximity distance may be defined by the radius of a circular region 412 surrounding the last object 404c currently in line, and the additional object 404d is within the proximity distance of the last object 404c currently in line. The proximity parameter, like the other parameters, may be adjusted by an operator of the object analysis system 130.
An additional object that enters the line ahead of the last object 404c currently in line (e.g., within the proximity distance) may be doing something that brings it to that location temporarily without actually attempting to join the line. The behind parameter may be defined using one or more values representing positions behind the last object currently in line. If the object location data for an additional object indicates that the additional object is actually "behind" the last object currently in line, the object analysis system 130 may designate the additional object as "in line" or "likely in line." As shown in FIG. 12, the behind parameter may be defined by the angle 414 between lines 416, 418 originating from the coordinates of the last object 404d currently in line. Thus, the object analysis system 130 may determine that the additional object 404e is both within the proximity distance and behind the last object currently in line. The behind parameter may be adjusted by an operator of the object analysis system 130.
If an object attempts to "cut" into a line, the object may join the line ahead of the last object currently in line. The insertion distance parameter may be defined using one or more values representing a distance from a line connecting the coordinates of two objects currently in line. If the object location data indicates that an additional object has moved within the insertion distance, the additional object may be designated as "in line" or "likely in line." As shown in FIG. 13, the insertion distance 420 may be measured relative to a straight line 422 formed between the objects 404b, 404c currently in line, and the object 404f is within the insertion distance 420. The insertion distance parameter may be adjusted by an operator of the object analysis system 130.
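The following geometry sketches illustrate the three positional tests. The wedge interpretation of angle 414, the direction convention, and all parameter names are assumptions; the patent defines these parameters only in general terms.

```python
import math

def within_proximity(obj, last_in_line, proximity_radius):
    """Proximity test: inside the circular region 412 around the last object."""
    return math.dist(obj, last_in_line) <= proximity_radius

def is_behind(obj, last_in_line, line_direction, half_angle_deg):
    """Behind test: inside a wedge opening rearward from the last object.
    line_direction is a unit vector pointing toward the front of the line."""
    vx, vy = obj[0] - last_in_line[0], obj[1] - last_in_line[1]
    norm = math.hypot(vx, vy)
    if norm == 0.0:
        return False
    # Cosine of the angle between the rearward direction and the candidate
    cos_angle = -(vx * line_direction[0] + vy * line_direction[1]) / norm
    return cos_angle >= math.cos(math.radians(half_angle_deg))

def within_insertion_distance(obj, a, b, insertion_distance):
    """Insertion test: within insertion_distance of the segment joining two
    adjacent objects a and b currently in line."""
    (ax, ay), (bx, by), (px, py) = a, b, obj
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0.0 else max(0.0, min(1.0,
        ((px - ax) * abx + (py - ay) * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby   # closest point on the segment
    return math.hypot(px - cx, py - cy) <= insertion_distance
```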
Even though an additional object may be close to the line (e.g., within the proximity or insertion distance), the additional object may not actually be in line, for example, if the object is simply passing by the line. Thus, the proximity parameter, the behind parameter, and the insertion distance parameter may be used to indicate that an additional object is "likely in line," and the still parameter and/or the jitter parameter described above may then be analyzed to determine whether the additional object designated as "likely in line" should be designated as "in line."
Once an object joins a line, the object may leave the line at any time. The object analysis system 130 may use a deviation distance parameter to determine whether an object that has been designated as "in line" should be removed from the line. The deviation distance parameter may be defined using one or more values representing the distance an object must move away from the line before the object is removed from the line. If the object location data indicates that the object has moved a distance greater than the deviation distance from the line, the object analysis system 130 may remove the object that was previously designated as "in line."
As shown in FIG. 14, the deviation distance may be defined differently for the first object currently in line, the last object currently in line, and the objects between the first and last objects. For objects between the first object 404a and the last object 404f, the deviation distance may be defined as a distance 432 from a line 430 connecting the adjacent objects 404c, 404e in line. For example, the current position of object 404d (previously midway along line 430 between objects 404c, 404e) has deviated from line 430 by at least the deviation distance 432, and the object may therefore be removed from the line.
For the first object 404a currently in line, the deviation distance may be defined as the distance from the line 440 between the last "still" position (shown in phantom) of the first object 404a and the next object 404b in line. The last "still" position of the first object 404a may be the position at which the first object last satisfied the still parameter or the jitter parameter. For example, the current position of the first object 404a (previously first in line) has deviated from line 440 by at least the deviation distance 442, and the object may therefore be removed from the line.
For the last object 404f currently in line, the deviation distance may be defined as the distance from the last "still" position (shown in phantom) of the last object 404f. The last "still" position of the last object 404f may be the position at which the last object 404f last satisfied the still parameter or the jitter parameter. The deviation distance parameter, like the other parameters, may be adjusted by an operator of the object analysis system 130. The deviation distance parameter may be adjusted individually for the first object currently in line, the last object currently in line, and the objects between the first and last objects currently in line.
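A sketch of the three per-position deviation tests from FIG. 14 follows; the segment-distance helper repeats the projection used in the insertion sketch above, and all names are assumptions.

```python
import math

def dist_to_segment(p, a, b):
    """Distance from point p to the segment joining points a and b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0.0 else max(0.0, min(1.0,
        ((px - ax) * abx + (py - ay) * aby) / denom))
    return math.hypot(px - (ax + t * abx), py - (ay + t * aby))

def deviates_from_line(pos, prev_pos, next_pos, last_still_pos, deviation):
    """prev_pos / next_pos are the neighbors currently in line (None at the ends)."""
    if prev_pos is None and next_pos is not None:       # first object in line
        return dist_to_segment(pos, last_still_pos, next_pos) > deviation
    if next_pos is None:                                # last object in line
        return math.dist(pos, last_still_pos) > deviation
    return dist_to_segment(pos, prev_pos, next_pos) > deviation
```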
Referring to FIGS. 15-17, a method 500 for analyzing object data with reference to line behavior pattern parameters is illustrated in greater detail. After a start 502 of the method, the object analysis system 130 may receive 504 object data, including object identification data and object location data. Based on the object data (e.g., object identification data), the object analysis system 130 may determine 506 whether there are any new objects in the monitored area relative to previously identified objects.
If there are no new objects, the object analysis system may update 514 the location of all objects based on the received object location data. The object analysis system may then determine 516 whether any objects designated as "in line" are outside of their deviation distance. If the object is outside of the deviation distance, the object analysis system may remove 520 the object from the line.
If there are new objects, the object analysis system may determine 508 how many objects are currently in line. If no objects are currently in line, so that the new object may be the first object in line, the object analysis system processes 510 the object data analysis for a first new object, as will be described in more detail below. If there is at least one object currently in line, so that the new object may be an additional object in line, the object analysis system processes 512 the object data analysis for an additional object, as will be described in more detail below. When the object data analysis for the first new object or additional objects is complete, the object analysis system may update 514 the locations of all objects and may determine 516 whether any objects have moved beyond their deviation distance.
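Schematically, one pass of the FIG. 15 loop might be arranged as below. This is an assumed reading of the flow chart, not the patent's literal implementation: qualifies_as_first and qualifies_as_additional correspond to FIGS. 16 and 17 and are sketched after those figures, deviates_from_line_at stands for assumed plumbing around the FIG. 14 deviation sketch above, and the container choices, history bookkeeping, and params object are illustrative.

```python
import time

def analyze_frame(frame_objects, tracked, line, history, params):
    """frame_objects: {object_id: (x, y)} received from the extraction system.
    tracked: {object_id: (x, y)} positions from earlier passes.
    line: ordered list of object_ids currently designated "in line"."""
    positions = {**tracked, **frame_objects}              # best-known locations
    now = time.time()
    for oid, pos in frame_objects.items():                # step 504: receive data
        history.setdefault(oid, []).append((now, pos))
    for oid in set(frame_objects) - set(tracked):         # step 506: new objects?
        pos = frame_objects[oid]
        if not line:                                      # step 508: empty line
            if qualifies_as_first(pos, history[oid], params):             # 510
                line.append(oid)
        else:
            line_positions = [positions[o] for o in line]
            if qualifies_as_additional(pos, history[oid], line_positions,
                                       params):                           # 512
                line.append(oid)
    tracked.update(frame_objects)                         # step 514: update all
    departed = [oid for oid in line                       # step 516: deviation?
                if deviates_from_line_at(oid, line, positions, params)]
    for oid in departed:
        line.remove(oid)                                  # step 520: remove
```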
FIG. 16 illustrates one method of processing 510 the object data analysis for a first object in the case where no objects are currently designated as in line. The object analysis system may determine 602 whether a reference area is defined and, if so, may determine 604 whether the object is within the reference area. If the object is within the reference area, the object analysis system may determine 606 whether the object is still for the specified period of time. If the object in the reference area is determined to be still, the object analysis system may add 610 the object as the first object in the line. If the object is not determined to be still, the object analysis system may determine 608 whether the object is jittering within its jitter space. If the object in the reference area is determined to be jittering, the object analysis system may add 610 the object as the first object in the line. If the object is not in the reference area, or is neither still nor jittering, the object is not added as the first object in line.
FIG. 17 shows a method of processing 512 the object data analysis for additional objects when at least one object has already been designated as in line. The object analysis system may determine 702 whether the new object is within the distance defined by the insertion distance parameter. If the additional object is not within the insertion distance, the object analysis system may determine 704 whether the additional object is within the proximity distance of the last object currently in line. If the object is within the proximity distance, the object analysis system may also determine 706 whether the additional object is behind the last object currently in line.
If the additional object is determined to be within the insertion distance, or within the proximity distance and behind the last object currently in line, the object analysis system may determine 708 whether the additional object is still. If the additional object is determined to be still, the object analysis system may add 712 the additional object to the line. If the object cannot be determined to be still, the object analysis system may determine 710 whether the additional object is jittering within its jitter space. If the object is jittering, the object analysis system may add 712 the additional object to the line. If the additional object does not satisfy any of these parameters, the additional object is not added to the line.
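The two decision paths might be sketched as follows, reusing the reference-area, still, jitter, proximity, behind, and insertion predicates sketched earlier in this section; the params object and its attribute names are assumptions.

```python
def qualifies_as_first(pos, history, params):
    """FIG. 16: may a new object become the first object in line?"""
    if params.reference_area is not None and not in_reference_area(
            pos[0], pos[1], params.reference_area):       # steps 602-604
        return False
    return (is_still(history, params.still_period)        # step 606
            or is_jittering(history, params.jitter_radius,
                            params.jitter_period))        # step 608

def qualifies_as_additional(pos, history, line_positions, params):
    """FIG. 17: may a new object join the existing line?"""
    near = any(within_insertion_distance(pos, a, b,
                                         params.insertion_distance)   # step 702
               for a, b in zip(line_positions, line_positions[1:]))
    if not near:
        last = line_positions[-1]
        near = (within_proximity(pos, last, params.proximity_radius)  # step 704
                and is_behind(pos, last, params.line_direction,
                              params.behind_half_angle))              # step 706
    if not near:
        return False
    return (is_still(history, params.still_period)        # step 708
            or is_jittering(history, params.jitter_radius,
                            params.jitter_period))        # step 710
```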
Various embodiments of the object analysis system and method may use one or more of the defined line behavior pattern parameters, depending on the actual implementation environment. Other line behavior pattern parameters may also be implemented in the object analysis system. The line behavior pattern parameters may also be analyzed in a different order than described herein.
Line statistics may be calculated as objects are added to and removed from the line by the object analysis system. For example, a line count may be determined by counting the number of objects designated as "in line" at any time. An average wait time may be determined by calculating the average time period for which each object is designated as "in line." The amount of objects moving through a line may be determined by counting the number of objects that were designated as "in line" over a period of time. The line statistics may then be displayed and/or used to provide notifications or alerts, as described above.
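A minimal sketch of these three statistics, assuming the analysis system logs a wait duration and a departure time when an object leaves the line; all names are illustrative.

```python
def line_count(line):
    """Number of objects currently designated "in line"."""
    return len(line)

def average_wait(completed_waits):
    """completed_waits: seconds each departed object spent designated "in line"."""
    return sum(completed_waits) / len(completed_waits) if completed_waits else 0.0

def throughput(departure_times, window_seconds, now):
    """Objects that moved through the line within the trailing window."""
    return sum(1 for t in departure_times if now - t <= window_seconds)
```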
Consistent with embodiments of the present invention, a line monitoring method and system may be used to monitor objects in a line. The line monitoring method may include receiving object data associated with objects in a surveillance area. The object data includes at least object identification data and object location data. The method may further include analyzing the object data with reference to at least one line behavior pattern parameter representative of at least one behavior pattern indicative of objects in a line to determine whether at least one object should be designated as being in a line in the surveillance area. The method may also include determining at least one line statistic related to objects designated as in a line.
The line monitoring system may include an object identification and locating system configured to identify and locate objects in the surveillance area and to generate object data including at least object identification data and object location data. The line monitoring system may also include an object analysis system configured to receive the object data, analyze the object data to determine whether at least one object should be designated as in a line within the surveillance area, and determine at least one line statistic related to the line.
The terms and expressions which have been employed herein are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described (or portions thereof), and it is recognized that various modifications are possible within the scope of the claims. Other modifications, variations, and changes are also possible. Accordingly, the claims are intended to cover all such equivalents.

Claims (16)

1. A line monitoring method for monitoring a plurality of objects in an area, comprising:
receiving object data relating to a plurality of objects in a surveillance area, the object data including at least object identification data and object location data;
analyzing said object data with reference to at least one line behavior pattern parameter representative of at least one behavior pattern representative of a plurality of objects in a line to determine whether at least one of said objects should be designated as being in a line in said surveillance area; and
determining at least one line statistic related to the plurality of objects designated as being in the line.
2. The method of claim 1, further comprising determining whether at least one of the objects is a new object within the surveillance area, and if the at least one of the objects is a new object, analyzing the object data for the new object to determine whether the new object should be designated as being in the line.
3. The method of claim 2, wherein analyzing the object data for the new object comprises determining whether the object is a first new object or an additional new object, and wherein the object data is analyzed based on whether the new object is a first new object or an additional new object.
4. The method of claim 3, wherein analyzing the object data for the first new object comprises: analyzing the object position data with reference to parameters defining a reference area in which objects form the line.
5. The method of claim 4, wherein analyzing the object data for the first new object comprises: the object position data is analyzed with reference to parameters defining a motion representative of the object in line.
6. The method of claim 3, wherein analyzing the object data for the additional new objects comprises: analyzing the object position data with reference to parameters defining a position of the object relative to other objects in the line.
7. The method of claim 6, wherein analyzing the object data for the additional new objects comprises: the object position data is analyzed with reference to parameters defining a motion representative of the object in line.
8. The method of claim 1, wherein analyzing the object data comprises: the object data is analyzed with reference to parameters defining a position of an object relative to the line to determine whether the object should be designated for removal from the line.
9. The method of claim 1, wherein determining the at least one line statistic comprises: determining a number of objects in the line.
10. The method of claim 1, wherein determining the at least one line statistic comprises: an average wait time for the objects in the line, or an amount of objects moving through the line over a period of time, is determined.
11. A line monitoring system for monitoring a plurality of objects in an area, comprising:
an object identification and location system configured to identify and locate a plurality of objects in a surveillance area and to generate object data including at least object identification data and object location data; and
an object analysis system configured to receive the object data, analyze the object data with reference to at least one line behavior pattern parameter representative of at least one behavior pattern representative of a plurality of objects in a line, to determine whether at least one of the objects should be designated as being in a line within the surveillance area, and to determine at least one line statistic related to the line.
12. The line monitoring system of claim 11 wherein the object identification and location system comprises:
at least one camera configured to generate an image signal representative of at least one image of the surveillance area; and
an object extraction system configured to receive the image signal, extract an object from the at least one image represented by the image signal, and generate the object data.
13. The line monitoring system of claim 11 wherein the at least one line behavior pattern parameter includes a parameter defining a reference area in which the objects form the line and a parameter defining a motion representative of the objects in the line within the reference area.
14. The line monitoring system of claim 13 wherein the at least one line behavior pattern parameter includes a parameter that defines a position of an object relative to other objects in the line.
15. The line monitoring system of claim 14 wherein the at least one line statistic includes a number of objects in the line.
16. The line monitoring system of claim 11 wherein the object identification and location system and the object analysis system include at least one computer system.
HK08104156.4A (priority date 2004-11-02, filing date 2005-11-01): Line monitoring system and method, published as HK1114229B

Applications Claiming Priority (3)

US62443004P: priority date 2004-11-02, filing date 2004-11-02
US60/624,430: priority date 2004-11-02
PCT/US2005/039487 (WO2006052545A2): priority date 2004-11-02, filing date 2005-11-01, Line monitoring system and method

Publications (2)

HK1114229A1: published 2008-10-24
HK1114229B: published 2013-11-08

