US20170316033A1 - Streaming representation of moving objects and shapes in a geographic information service - Google Patents
- Publication number
- US20170316033A1 (U.S. application Ser. No. 15/652,993)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F17/30241
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- Certain embodiments provide support for requests to the GIS server 110 from the client that include functions for opening historic movement trajectories, getting live movement trajectories from the server, getting a video player (or a view of a player) to play videos captured on a given trajectory, and getting movement trajectory inside a search window that a user is interested in.
- the GIS services API involves a set of request messages available to a client 105 (or server) along with a definition of the structure of response messages sent to the client (or server).
- the response messages from the GIS server 110 to the client 105 may be in a markup language such as Extensible Markup Language (XML) or JavaScript Object Notation (JSON).
- the GIS services provided by GIS server 110 can interact programmatically over the network through industry-standard Web protocols, such as, but not limited to, XML, JSON, Hypertext Transfer Protocol (HTTP), Representational State Transfer (REST), and Simple Object Access Protocol (SOAP).
- API functions that may be called by the client 105 include:
- the search window is described by a series of points in latitudes and longitudes bounding the area that the user is interested in.
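As a concrete illustration of the search-window request described above, the sketch below builds a JSON message carrying a polygon of latitude/longitude points. The message and field names (`openSearchWindowWithPolygon` aside, which the disclosure does name) are assumptions; the patent does not specify an exact schema.

```python
import json

def build_search_window_request(points):
    """Build a JSON request for trajectories inside a polygonal search window.

    `points` is a sequence of (latitude, longitude) pairs bounding the area
    the user is interested in. The "polygon"/"lat"/"lon" field names are
    illustrative assumptions, not taken from the patent.
    """
    if len(points) < 3:
        raise ValueError("a polygon needs at least three points")
    return json.dumps({
        "request": "openSearchWindowWithPolygon",
        "polygon": [{"lat": lat, "lon": lon} for lat, lon in points],
    })

# Example: a triangular search window over a small area.
msg = build_search_window_request([(25.76, -80.19), (25.77, -80.20), (25.75, -80.21)])
```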
- FIG. 2 is a flowchart illustrating an example operation performed by a GIS server according to an embodiment of the invention.
- An incoming message from the client may be received by an API server of the GIS server.
- the incoming message may be received as a SOAP protocol message.
- the API/GIS server determines whether the incoming message includes a historic traces request “getHistoricTraces” ( 202 ). If the incoming message includes the historic traces request (“Yes” of 202 ), the API/GIS server may get some or all historic traces and send a response back to the client ( 204 ). The server may get this information from a database.
- the API/GIS server also determines whether the incoming message includes a live traces request “getLiveTraces” ( 206 ). If the incoming message includes the live traces request (“Yes” of 206 ), the API/GIS server may get some or all live traces and send a response back to the client ( 208 ). The server may get this information from at least one sensor having a camera.
- the API/GIS server also determines whether the incoming message includes a video player request “openPlayer” ( 210 ). If the incoming message includes the open player request (“Yes” of 210 ), the API/GIS server may open the video player in the portal (e.g., user interface of client browser) and may stream video to client ( 212 ).
- the API/GIS server also determines whether the incoming message includes a get movement trajectory request “openSearchWindowWithPolygon” ( 214 ). If the incoming message includes the get movement trajectory request (“Yes” of 214 ), the API/GIS server may send the traces located within the search window back to the client ( 216 ). Information about the specific trajectory may be obtained from at least one sensor having a camera within a geographic region associated with the search window, or from information associated with the search window that is stored in a database.
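The decision flow above can be sketched as a small dispatcher that routes each incoming message to a handler for one of the four named API functions. The transport layer (SOAP/REST) described in the disclosure is omitted here, and the dict-based message shape is an assumption.

```python
def dispatch(message, handlers):
    """Route an incoming API message as in the FIG. 2 flow.

    `message` is a dict whose "request" field names one of the four API
    functions; `handlers` maps request names to callables. This is a
    simplified sketch -- SOAP/REST transport and response formatting
    are omitted.
    """
    request = message.get("request")
    if request in ("getHistoricTraces", "getLiveTraces",
                   "openPlayer", "openSearchWindowWithPolygon"):
        return handlers[request](message)
    raise ValueError(f"unknown request: {request}")

# Example handlers standing in for database / sensor / player back-ends.
handlers = {
    "getHistoricTraces": lambda m: "traces-from-database",
    "getLiveTraces": lambda m: "traces-from-sensors",
    "openPlayer": lambda m: "video-stream",
    "openSearchWindowWithPolygon": lambda m: "traces-in-window",
}
```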
- a user interface can be provided in which a map is displayed as part of a geographical visualization view of a region.
- the user interface can request the video from the GIS server using, for example, the openPlayer request.
- the client receives the streaming video captured by the sensor corresponding to the selected object.
- the user interface can present a video player in the portal to show the video stream.
- the user can interact with the video player, resulting in changes to the geographical visualization view of the region. For example, the user may manually shift a time pointer in the video stream being watched in the player.
- the interface can reposition the moving object on the map.
- a polygonal projection of the sensor's view to earth surface can be provided as part of the visualization and synchronized with the playback of the video player.
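Repositioning the moving object on the map when the user shifts the video time pointer amounts to evaluating the object's trajectory at the new playback time. The sketch below uses linear interpolation between time-stamped samples; the interpolation scheme is an assumption, since the patent does not prescribe one.

```python
def position_at_time(trajectory, t):
    """Interpolate an object's (lat, lon) at time t from a sampled trajectory.

    `trajectory` is a list of (timestamp, lat, lon) samples sorted by time.
    Times outside the sampled range clamp to the first or last fix.
    """
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    if t >= trajectory[-1][0]:
        return trajectory[-1][1:]
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))
```

When the user drags the time pointer, the interface would call this with the new video time and move the marker (and the synchronized sensor-view polygon) to the returned coordinates.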
- the client can request “getHistoricTraces” and “getLiveTraces” at regular intervals, for example, every second.
- the server may return an XML-formatted file that contains the names of the movement trajectories and the coordinates of the points of the trajectories.
- the client may draw these trajectories on the map within a user interface and list them in a Trajectory Control panel provided to the client.
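A client receiving such a response would extract each trajectory's name and point list before drawing. The element and attribute names below are illustrative assumptions; the patent only states that the XML file contains trajectory names and point coordinates.

```python
import xml.etree.ElementTree as ET

# Hypothetical response shape; the actual schema is not given in the patent.
SAMPLE = """<trajectories>
  <trajectory name="bus-12">
    <point lat="25.76" lon="-80.19"/>
    <point lat="25.77" lon="-80.20"/>
  </trajectory>
</trajectories>"""

def parse_trajectories(xml_text):
    """Return {trajectory name: [(lat, lon), ...]} from a server response."""
    root = ET.fromstring(xml_text)
    return {
        traj.get("name"): [(float(p.get("lat")), float(p.get("lon")))
                           for p in traj.findall("point")]
        for traj in root.findall("trajectory")
    }
```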
- related streaming videos that lie on the trajectory path can be displayed in response to receiving an input command, such as a right click by a mouse connected to the client device or by a touch or other gesture of a touchscreen of the client device, on the trajectory path itself.
- a pop-up window can be displayed with the option of “Open Player” that, if selected, would proceed to send a request for the streaming video to the server and open a window to view that video.
- trajectory paths may be drawn in a manner that minimizes unnecessary obscuring of other elements on the screen, for example as a thin red line. This, however, can make it more difficult for users to select the trajectory path itself.
- an invisible buffer can be placed around the trajectory path lines so that, when an input indicating a selection is received anywhere within the buffer, it is interpreted as a selection of the trajectory path itself.
- the trajectory paths can be displayed with minimal disruption to the geographic visualization while still maintaining the ability to select the line for a second action.
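The invisible buffer amounts to a hit test: a click counts as selecting the line when it falls within some tolerance of any segment of the drawn polyline. A minimal sketch in screen coordinates (the distance-based approach is an assumption about how such a buffer would be implemented):

```python
import math

def hit_test(path, click, buffer_px):
    """Return True if `click` is within `buffer_px` of any segment of `path`.

    `path` is a list of (x, y) screen points for the drawn trajectory line;
    `buffer_px` is the width of the invisible selection buffer around it.
    """
    def seg_dist(p, a, b):
        # Distance from point p to segment a-b.
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return any(seg_dist(click, a, b) <= buffer_px
               for a, b in zip(path, path[1:]))
```

A click 4 pixels off a line with a 5-pixel buffer selects the trajectory, while the visible line itself can stay one pixel wide.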
- the user interface can include markers to tag the movement trajectories on the map in case the trajectories are too small to be seen from global view.
- the Trajectory Path Control panel can list available movement trajectories returned from the server in response to one of the requests for getting trajectories.
- when a trajectory is selected in the Trajectory Path Control panel, the view of the trajectory path from the perspective of an object along the trajectory path can be provided (e.g., representing a scenario where a user is on board the moving object).
- client-server bandwidth can be optimized.
- the client can periodically consult the server for moving objects instead of using a constant stream of data.
- the rate of checking for moving objects and the number of moving objects will influence the bandwidth required.
- Algorithms that deal with time- and speed-based stream predictions and collision detection can be used to minimize the number of checks to the server and the number of objects consulted.
- objects that are moving slowly in the viewable window (indicating that they are far away), can be checked for updates less often. Objects that are close or moving at a fast pace are checked more frequently. Also, objects that are not travelling on a collision course with the client's viewable window can be ignored altogether. Of course, if a change of direction is detected, the collision course can be re-evaluated.
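The polling policy above can be reduced to a per-object check interval that shrinks for fast or nearby objects and grows for slow, distant ones. The specific scaling below is an illustrative heuristic under assumed units; the patent states only the qualitative policy.

```python
def check_interval(speed_px_per_s, near_viewport,
                   base=1.0, min_interval=0.25, max_interval=8.0):
    """Choose how often (in seconds) to poll the server for one moving object.

    Faster apparent motion -> shorter interval; objects near the viewable
    window are checked twice as often. Objects judged to be on no collision
    course with the viewport would simply not be polled at all.
    """
    interval = base / max(speed_px_per_s, 0.125)  # avoid division by zero
    if near_viewport:
        interval /= 2
    return max(min_interval, min(max_interval, interval))
```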
- the future location of a moving object is predicted for obtaining visual results of moving objects (positions) to optimize client-server bandwidth.
- Location prediction refers to statistical methods that derive patterns or mathematical formulas whose purpose is, given the recent trajectory of a moving object, to predict its future location.
- sensor data streams are queried, wherein each update from a sensor is associated with a function allowing prediction of future values of that sensor. The sensor commits to update its value whenever the difference between the observed value and the value estimated using the prediction function exceeds a certain threshold.
- Location prediction enables selective transfer of moving objects' data from the server to the client. More specifically, moving objects whose locations are predicted to be viewable will be transferred, whereas other moving objects' data will not, to optimize client-server bandwidth.
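The commit-on-deviation protocol can be sketched with a linear (dead-reckoning) prediction function shared by sensor and server: the sensor sends an update only when the observed position deviates from the shared prediction by more than the threshold. One-dimensional positions and the linear model are simplifying assumptions; the patent allows any agreed prediction function.

```python
def predicted(last_fix, t):
    """Extrapolate position at time t from the last reported fix.

    `last_fix` is (t0, pos, velocity); a linear model is assumed here.
    """
    t0, pos, vel = last_fix
    return pos + vel * (t - t0)

def sensor_updates(samples, threshold, velocity):
    """Return only the samples the sensor commits to send.

    `samples` is a list of (t, pos) observations. An update is sent when
    |observed - predicted| exceeds `threshold`; otherwise the server's
    prediction stands in for the value, saving bandwidth.
    """
    t0, p0 = samples[0]
    last = (t0, p0, velocity)
    sent = [samples[0]]
    for t, p in samples[1:]:
        if abs(p - predicted(last, t)) > threshold:
            sent.append((t, p))
            last = (t, p, velocity)
    return sent
```

For an object moving at the predicted speed, only the initial fix crosses the wire; a swerve triggers exactly one correcting update.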
- the subject systems and methods can be used in a wide variety of applications and settings including, but not limited to, weather monitoring, troop dispatching, endangered species tracking, disaster mitigation, general aviation monitoring, fleet management, transportation and highway patrol problems, traffic analysis and visualization, commanding and controlling mobile sensors, and commanding and controlling operations (e.g., homeland security, law enforcement, and disaster response).
- the system and method of the invention enables situational monitoring by law enforcement (e.g., notice is provided to law enforcement regarding a hit and run accident).
- video surveillance recordings, which are used in specific locations, can be accessed in real time and integrated with other forms of critical information (e.g., airborne and vehicle-borne sensors).
- law enforcement officers would be able to use the invention to quickly pinpoint the geographic location, view streaming media of the current location to assess the situation, and, through the use of additional sensors, track the offender's vehicle.
- program modules include routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types.
- Embodiments may be implemented as a computer process, a computing system, or as an article of manufacture, such as a computer program product or computer-readable medium.
- Certain methods and processes described herein can be embodied as code and/or data, which may be stored on one or more computer-readable media.
- Certain embodiments of the invention contemplate the use of a machine in the form of a computer system within which a set of instructions, when executed, can cause the system to perform any one or more of the methodologies discussed above.
- Certain computer program products may be one or more computer-readable storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
- Computer-readable media can be any available computer-readable storage media or communication media that can be accessed by the computer system.
- Communication media include the mechanisms by which a communication signal containing, for example, computer-readable instructions, data structures, program modules, or other data, is transmitted from one system to another system.
- the communication media can include guided transmission media, such as cables and wires (e.g., fiber optic, coaxial, and the like), and wireless (unguided transmission) media, such as acoustic, electromagnetic, RF, microwave and infrared, that can propagate energy waves.
- Communication media, particularly carrier waves and other propagating signals that may contain data usable by a computer system, are not included as computer-readable storage media.
- computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
- a computer-readable storage medium includes, but is not limited to, volatile memory such as random access memories (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only-memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM), and magnetic and optical storage devices (hard drives, magnetic tape, CDs, DVDs); or other media now known or later developed that is capable of storing computer-readable information/data for use by a computer system.
- the methods and processes described herein can be implemented in hardware modules.
- the hardware modules can include, but are not limited to, application-specific integrated circuit (ASIC) chips, field programmable gate arrays (FPGAs), and other programmable logic devices now known or later developed.
- When the hardware modules are activated, they perform the methods and processes included within them.
- any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc. means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention.
- the appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment.
- any elements or limitations of any invention or embodiment thereof disclosed herein can be combined with any and/or all other elements or limitations (individually or in any combination) of any other invention or embodiment thereof disclosed herein, and all such combinations are contemplated within the scope of the invention without limitation thereto.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Data Mining & Analysis (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
- The present application is a divisional application of U.S. application Ser. No. 14/215,484, filed Mar. 17, 2014, which claims the benefit of U.S. Provisional Application Ser. No. 61/792,985, filed Mar. 15, 2013, the disclosures of each of which are hereby incorporated by reference in their entirety, including all figures, tables, and drawings.
- This invention was made with government support under Award Number HRD-0833093 awarded by the National Science Foundation. The government has certain rights in the invention.
- A Geographic Information System (GIS) captures, stores, analyzes, manages, and presents data linked to geographic locations. Example GISs include Google Earth™, ArcGIS® from ESRI, and commercial fleet management services.
- In general, a GIS presents tools for users to query and make decisions based on geographic information. The geographic information may be spatial and temporal; and is often heterogeneous, from divergent sources, and may contain structured and unstructured data. As a result, the processing of this data becomes complex and involves numerous challenges. For example, the use of multiple, disparate tools is often necessary in order to process and analyze geospatial data in real-time. These tools are often expensive and can require specialized skills and training to use. In addition, each tool may require the data to be in different formats, increasing the difficulty in combining heterogeneous types of data.
- Another challenge associated with current commercial systems is that much of the information currently stored in these systems is either historical or static in nature. While this is acceptable for visualizing data such as road-maps, and even handling a single moving object, such as in global positioning system (GPS) navigation where a moving vehicle is the only dynamic object represented, there exists a gap in presenting and handling the dynamic information associated with moving objects in the surrounding environment having different locations, speeds, shapes and trajectories.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
- Certain embodiments provide a GIS system that leverages video streaming sensor nodes and presents the dynamic information associated with moving objects in a manner that can be used across a variety of clients. In addition, techniques are discussed for minimizing bandwidth issues.
- An application programming interface (API) is provided in which clients may access and use dynamic information associated with moving objects in a surrounding environment.
- FIG. 1 shows an operating environment in which an embodiment of the invention may be implemented.
- FIG. 2 is a flowchart illustrating an example operation performed by a GIS server according to an embodiment of the invention.
- An application programming interface (API) for a Geographic Information System (GIS) is disclosed. The API enables clients to perform visual querying and rendering.
- Querying, analysis, and visualization of real-time data pertaining to at least two moving objects in conjunction with relatively static multi-temporal geospatial data can be facilitated on client devices through the presentation of the API.
- A GIS server, which may provide a GIS service including the API, can incorporate data from mobile video sensors and streaming technologies in order to present streaming and/or video data.
- In certain embodiments, the GIS server can perform the step of processing and analyzing geographic, spatial and/or temporal data to provide visual representation of the trajectories of relevant objects, which may be known to the server in advance or transmitted in real time, sampled, and have uncertainty aspects.
- Geographic data exploration can be enhanced through incorporation of moving objects. For example, at the client, moving objects of a specific area of interest may be viewed overlaid on geospatial data. According to embodiments of the invention, dynamic movement of objects within the geographic area and at specific resolutions of interest can be presented. This translates to the user as a real-world experience with objects moving across their screen or “zooming” by them.
- User scenarios that may be supported include, but are not limited to: (1) the user is on board a moving object whose trajectory (e.g., location as a function of time) is known a priori to the server; (2) the user is on board a moving object whose trajectory is generated in real-time and received by the server; or (3) the user is located at a fixed point.
- Likewise, the trajectories of relevant objects may be known to the server in advance or received by the server in real time and sampled. Visual querying and rendering can be provided to the client.
-
FIG. 1 shows an operating environment in which an embodiment of the invention may be implemented. Referring toFIG. 1 , aclient device 105 can communicate with aGIS server 110 over anetwork 115. - The
network 115 can include, but is not limited to, a cellular network (e.g., wireless phone), a point-to-point dial up connection, a satellite network, the Internet, a local area network (LAN), a wide area network (WAN), a WiFi network, an ad hoc network or a combination thereof. Such networks are widely used to connect various types of network elements, such as hubs, bridges, routers, switches, servers, and gateways. Thenetwork 115 may include one or more connected networks (e.g., a multi-network environment) including public networks, such as the Internet, and/or private networks such as a secure enterprise private network. Access to thenetwork 115 may be provided via one or more wired or wireless access networks as will be understood by those skilled in the art. - The
client device 105 may be, but is not limited to, a personal computer (e.g. desktop computer), laptop, personal digital assistant (PDA), mobile phone (or smart phone), tablet, slate, terminal, or set-top box. - The
GIS server 110 may include distributed servers that execute real-time continuous queries to facilitate rendering and collaborating with vehicular ad-hoc networks (VANets) 120 and other video streaming sources. For example, a mobile sensor (e.g., on a drone aircraft or even a smart phone) with geo-positioning and communication capabilities and a camera can capture vehicles and pedestrians that are in line of sight and/or a stationary sensor with communication capabilities and a camera can be installed at geographically distributed locations such as at a traffic light, and these videos and/or images be communicated over a network, and ultimately collected by theGIS server 110. - The
GIS server 110 may include one or more computing devices. For example, theGIS server 110 can include one or more blade server devices, standalone server devices, personal computers, routers, hubs, switches, bridges, firewall devices, intrusion detection devices, mainframe computers, network-attached storage devices, and other types of computing devices. - In embodiments where the
GIS server 110 includes multiple computing devices, the server can include one or more communications networks that facilitate communication among the computing devices. - For example, the one or more communications networks can include a local or wide area network that facilitates communication among the computing devices. One or more direct communication links can be included between the computing devices. In addition, in some cases, the computing devices can be installed at geographically distributed locations. In other cases, the multiple computing devices can be installed at a single geographic location, such as a server farm or an office. Certain embodiments of the invention can be practiced in distributed-computing environments where tasks are performed by remote-processing devices that are linked through a communications network. In a distributed-computing environment, program modules can be located in both local and remote computer-readable storage media.
- The
GIS server 110 can also access multiple geographic information sources 130 for static and/or dynamic geographic information. Once the results from the information sources are received by the GIS server 110, the GIS server can perform steps of filtering and formatting the results, storing the results in a database, transforming the results into a visual representation, and/or transmitting the results in a suitable form to client devices. - The
client device 105 can communicate with the GIS server 110 to obtain visual data and information on relevant moving objects based on criteria submitted by a user of the client device 105. - To allow ergonomic formulation of queries, a user interface can be provided in a web browser of the
client device 105 in which the end user is provided the ability to graphically manipulate objects and navigate examples. The presented data about the object can include the object's existing and computed properties, including location, description, context, and personalization. This query pattern adds information-retrieval and database-selection-like constraints to traditional spatio-temporal queries. Such constraints give users the flexibility to execute free-text searches (information-retrieval style) on unstructured data, or refined attribute-based predicate retrieval (SQL style) on structured data. For example: “visualize the trajectories of large green trucks near the current point” is a query where “green” is a keyword, “truck” is a category, and “large” translates into a numerical predicate. - Certain embodiments provide support for requests to the
GIS server 110 from the client that include functions for opening historic movement trajectories, getting live movement trajectories from the server, getting a video player (or a view of a player) to play videos captured on a given trajectory, and getting the movement trajectory inside a search window that a user is interested in. - An API method for performing these functions is disclosed. The GIS services API involves a set of request messages available to a client 105 (or server) along with a definition of the structure of response messages sent to the client (or server). The response messages from the
GIS server 110 to the client 105 may be in a structured format such as Extensible Markup Language (XML) or JavaScript Object Notation (JSON). The GIS services provided by GIS server 110 can interact programmatically over the network through industry-standard Web protocols, such as, but not limited to, XML, JSON, Hypertext Transfer Protocol (HTTP), Representational State Transfer (REST), and Simple Object Access Protocol (SOAP). - According to certain embodiments of the invention, API functions that may be called by the client 105 include:
-
- “getHistoricTraces”: to get all the historic movement trajectories from the server;
- “getLiveTraces”: to get all the live movement trajectories from the server;
- “openPlayer”: to ask the server to open a player to play the videos captured on the given trajectory; and
- “openSearchWindowWithPolygon”: to get the movement trajectory inside a search window.
- The search window is described by a series of points in latitudes and longitudes bounding the area that the user is interested in.
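As an illustrative sketch (not part of the disclosed embodiments), testing whether a trajectory point falls inside such a latitude/longitude search window can be done with a standard ray-casting point-in-polygon test; the patent does not specify the algorithm, so the following is an assumption:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: `polygon` is a list of (lat, lon) vertices bounding
    the search window; returns True if the point lies inside it."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Does a ray from the point cross edge i in the longitude direction?
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < crossing_lat:
                inside = not inside
    return inside

# Hypothetical rectangular search window over part of Miami
window = [(25.75, -80.20), (25.80, -80.20), (25.80, -80.15), (25.75, -80.15)]
```

The server would apply this test to each trajectory point and return only the trajectories that intersect the window.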
-
FIG. 2 is a flowchart illustrating an example operation performed by a GIS server according to an embodiment of the invention. An incoming message from the client may be received by an API server of the GIS server. The incoming message may be received as a SOAP protocol message. - When the API/GIS server receives the incoming message, the API/GIS server determines whether the incoming message includes a historic traces request “getHistoricTraces” (202). If the incoming message includes the historic traces request (“Yes” of 202), the API/GIS server may get some or all historic traces and send a response back to the client (204). The server may get this information from a database.
- The API/GIS server also determines whether the incoming message includes a live traces request “getLiveTraces” (206). If the incoming message includes the live traces request (“Yes” of 206), the API/GIS server may get some or all live traces and send a response back to the client (208). The server may get this information from at least one sensor having a camera.
- The API/GIS server also determines whether the incoming message includes a video player request “openPlayer” (210). If the incoming message includes the open player request (“Yes” of 210), the API/GIS server may open the video player in the portal (e.g., user interface of client browser) and may stream video to client (212).
- The API/GIS server also determines whether the incoming message includes a get movement trajectory request “openSearchWindowWithPolygon” (214). If the incoming message includes the get movement trajectory request (“Yes” of 214), the API/GIS server may get the traces located within the search window and send them back to the client (216). Information about the specific trajectory may be obtained from at least one sensor having a camera within a geographic region associated with the search window or from information associated with the search window that is stored in a database.
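The dispatch logic of FIG. 2 can be sketched as a handler table keyed by the request name. The handler bodies below are illustrative stubs (the actual server would consult a database or live sensors), and the decoded message shape is an assumption:

```python
# Sketch of the FIG. 2 dispatch: route an incoming API request to a handler.

def get_historic_traces():            # 202/204: read stored traces from a database
    return {"traces": ["trace-A", "trace-B"]}

def get_live_traces():                # 206/208: read traces from camera sensors
    return {"traces": ["live-1"]}

def open_player(channel):             # 210/212: start streaming the given channel
    return {"player": f"streaming {channel}"}

def open_search_window(polygon):      # 214/216: traces inside the search window
    return {"traces": [], "window": polygon}

HANDLERS = {
    "getHistoricTraces": lambda msg: get_historic_traces(),
    "getLiveTraces": lambda msg: get_live_traces(),
    "openPlayer": lambda msg: open_player(msg["arg0"]),
    "openSearchWindowWithPolygon": lambda msg: open_search_window(msg["polygon"]),
}

def dispatch(message):
    """Route one decoded request message to its handler."""
    return HANDLERS[message["method"]](message)

response = dispatch({"method": "openPlayer", "arg0": "Channel Live2"})
```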
- An example of a SOAP request for “getHistoricTraces” is as follows:
- <SOAP-ENV:Envelope xmlns:SOAP-ENV=“ . . . ” xmlns:SOAP-ENC=“ . . . ” xmlns:xsi=“ . . . ” xmlns:xsd=“ . . . ”> <SOAP-ENV:Body> <m:getHistoricTraces xmlns:m=“http://map_proxy.gis.cms.ibm.com/”/> </SOAP-ENV:Body> </SOAP-ENV:Envelope>
- An example of a SOAP request for “openPlayer”, which shows an argument for a streaming video channel, is as follows:
- <SOAP-ENV:Envelope xmlns:SOAP-ENV=“ . . . ” xmlns:SOAP-ENC=“ . . . ” xmlns:xsi=“ . . . ” xmlns:xsd=“ . . . ”> <SOAP-ENV:Body> <m:openPlayer xmlns:m=“http://map_proxy.gis.cms.ibm.com/”> <arg0>Channel Live2</arg0> </m:openPlayer> </SOAP-ENV:Body> </SOAP-ENV:Envelope>
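A client-side sketch of composing such an openPlayer envelope with Python's standard library is shown below. The envelope namespace URI is the standard SOAP 1.1 namespace, filled in here as an assumption since the examples above elide it; actually transmitting the envelope (e.g., via an HTTP POST) is omitted:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # standard SOAP 1.1 namespace
M_NS = "http://map_proxy.gis.cms.ibm.com/"             # service namespace from the example

def build_open_player(channel):
    """Compose the SOAP envelope for an openPlayer request as a byte string."""
    ET.register_namespace("SOAP-ENV", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{M_NS}}}openPlayer")
    ET.SubElement(call, "arg0").text = channel         # streaming video channel argument
    return ET.tostring(env)

envelope = build_open_player("Channel Live2")
# Round-trip check: parse the envelope back and recover the requested channel
parsed = ET.fromstring(envelope)
channel = parsed.find(f".//{{{M_NS}}}openPlayer/arg0").text
```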
- A user interface can be provided in which a map is displayed as part of a geographical visualization view of a region. When a user selects an object (such as one of the moving objects) displayed in the geographical visualization view in order to see its video, the user interface can request the video from the GIS server using, for example, the openPlayer request. Once the client receives the streaming video captured by the sensor corresponding to the selected object, the user interface can present a video player in the portal to show the video stream. While watching the video stream, the user can interact with the video player, resulting in changes to the geographical visualization view of the region. For example, the user may manually shift a time pointer in the video stream being watched in the player. In response to receiving this input, the interface can reposition the moving object on the map. In some embodiments, a polygonal projection of the sensor's view to the earth's surface can be provided as part of the visualization and synchronized with the playback of the video player.
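Repositioning the moving object when the user shifts the time pointer amounts to interpolating along the timestamped trajectory. A minimal sketch, assuming the trajectory is available as (time, latitude, longitude) samples sorted by time:

```python
def position_at(trajectory, t):
    """Linearly interpolate (lat, lon) at video time t from
    (time, lat, lon) samples sorted by time; clamps at the ends."""
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    for (t0, la0, lo0), (t1, la1, lo1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)          # fraction of the way through the segment
            return (la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0))
    return trajectory[-1][1:]

# Hypothetical trajectory: video seconds paired with (lat, lon) fixes
trace = [(0.0, 25.75, -80.20), (10.0, 25.76, -80.19), (20.0, 25.78, -80.19)]
```

As the time pointer moves, the map marker (and any synchronized sensor-view polygon) is redrawn at `position_at(trace, t)`.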
- In some implementations, the client can request “getHistoricTraces” and “getLiveTraces” at regular intervals, for example, every second. For each request, the server may return an XML-formatted file that contains the names of the movement trajectories and the coordinates of the points of the trajectories. The client may draw these trajectories on the map within a user interface and list them in a Trajectory Control panel provided to the client. In one implementation, related streaming videos that lie on the trajectory path can be displayed in response to receiving an input command, such as a right click by a mouse connected to the client device or by a touch or other gesture of a touchscreen of the client device, on the trajectory path itself. A pop-up window can be displayed with the option of “Open Player” that, if selected, would proceed to send a request for the streaming video to the server and open a window to view that video.
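Parsing such a response on the client can be sketched with the standard library; the element and attribute names below are hypothetical, since the patent does not give the XML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical server response carrying trajectory names and point coordinates
SAMPLE = """<traces>
  <trajectory name="truck-17">
    <point lat="25.751" lon="-80.201"/>
    <point lat="25.752" lon="-80.199"/>
  </trajectory>
</traces>"""

def parse_traces(xml_text):
    """Return {trajectory name: [(lat, lon), ...]} from the server response."""
    root = ET.fromstring(xml_text)
    return {
        traj.get("name"): [
            (float(p.get("lat")), float(p.get("lon"))) for p in traj.findall("point")
        ]
        for traj in root.findall("trajectory")
    }

traces = parse_traces(SAMPLE)
```

Each entry would then be drawn on the map and listed in the Trajectory Control panel.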
- In some implementations, trajectory paths may be drawn in a manner to minimize unnecessarily obscuring other elements on the screen, for example as a thin red line. This, however, can make it more difficult for users to be able to select the trajectory path itself. To mitigate this and help users select the trajectory paths, an invisible buffer can be placed around the trajectory path lines so that when an input indicating a selection is received on the invisible buffer, it is treated as a selection of the trajectory path itself. Thus, the trajectory paths can be displayed with minimal disruption to the geographic visualization while still maintaining the ability to select the line for a second action. The user interface can include markers to tag the movement trajectories on the map in case the trajectories are too small to be seen from the global view. The Trajectory Path Control panel can list available movement trajectories returned from the server in response to one of the requests for getting trajectories. When a trajectory is selected in the Trajectory Path Control panel, the view of the trajectory path from the perspective of an object along the trajectory path can be provided (e.g., representing a scenario where a user is on board the moving object).
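The invisible buffer is effectively a distance threshold around the polyline. A sketch of the hit test in screen coordinates, with an assumed 8-pixel buffer width:

```python
def dist_to_segment(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Project the point onto the segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def hits_path(px, py, path, buffer_px=8.0):
    """True if a click lands within buffer_px of any segment of the polyline."""
    return any(
        dist_to_segment(px, py, *a, *b) <= buffer_px for a, b in zip(path, path[1:])
    )

path = [(100, 100), (200, 100), (200, 200)]  # a thin trajectory line, in pixels
```

A click a few pixels off the line still selects it, while the drawn line itself stays thin.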
- In certain embodiments, client-server bandwidth can be optimized. In one embodiment, the client can periodically consult the server for moving objects instead of using a constant stream of data. The rate of checking for moving objects and the number of moving objects will influence the bandwidth required. Algorithms that deal with time- and speed-based stream predictions and collision detection can be used to minimize the number of checks to the server and the number of objects consulted. To optimize client-server bandwidth, objects that are moving slowly in the viewable window (indicating that they are far away), can be checked for updates less often. Objects that are close or moving at a fast pace are checked more frequently. Also, objects that are not travelling on a collision course with the client's viewable window can be ignored altogether. Of course, if a change of direction is detected, the collision course can be re-evaluated.
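The polling policy described above can be sketched as a per-object interval derived from distance and speed; the interval bounds and the urgency formula are illustrative choices, not values from the disclosure:

```python
def poll_interval(distance_m, speed_mps, approaching, base_s=1.0, max_s=30.0):
    """Seconds until the next server check for one object:
    near/fast/approaching objects poll often, far/slow ones rarely,
    and objects not on a course toward the viewable window are skipped."""
    if not approaching:
        return None                              # ignore until a direction change
    urgency = speed_mps / max(distance_m, 1.0)   # fast and close -> urgent
    return max(base_s, min(max_s, base_s / max(urgency, base_s / max_s)))

near_fast = poll_interval(distance_m=50.0, speed_mps=20.0, approaching=True)
far_slow = poll_interval(distance_m=5000.0, speed_mps=2.0, approaching=True)
ignored = poll_interval(distance_m=100.0, speed_mps=10.0, approaching=False)
```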
- In certain embodiments, the future location of a moving object is predicted for obtaining visual results of moving objects (positions) to optimize client-server bandwidth. Location prediction refers to statistical methods that derive patterns or mathematical formulas whose purpose is, given the recent trajectory of a moving object, to predict its future location. In one related embodiment, sensor data streams are queried, wherein each update from a sensor is associated with a function allowing prediction of future values of that sensor. The sensor commits to update its value whenever the difference between the observed value and the value estimated using the prediction function exceeds a certain threshold. Location prediction enables selective transfer of moving objects' data from the server to the client. More specifically, moving objects whose locations are predicted to be viewable will be transferred, whereas other moving objects' data will not, to optimize client-server bandwidth.
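This prediction-with-threshold protocol is essentially dead reckoning: the sensor transmits only when the observed value deviates from the shared prediction by more than a threshold. A one-dimensional sketch with a linear prediction function and an illustrative threshold:

```python
def updates_needed(samples, threshold=5.0):
    """Simulate the sensor-side protocol: transmit an update only when the
    observed value drifts more than `threshold` from the value the server
    would predict by linear extrapolation (last sent position + v * dt).
    `samples` is a list of (t, x) observations along one axis."""
    sent = [samples[0]]                  # the first fix is always transmitted
    velocity = 0.0
    for (t0, x0), (t1, x1) in zip(samples, samples[1:]):
        t_sent, x_sent = sent[-1]
        predicted = x_sent + velocity * (t1 - t_sent)
        if abs(x1 - predicted) > threshold:
            velocity = (x1 - x_sent) / (t1 - t_sent)   # refresh the shared model
            sent.append((t1, x1))
    return sent

# Constant-velocity motion needs one correction; thereafter the prediction
# tracks the object until it accelerates at t=4
samples = [(0, 0.0), (1, 10.0), (2, 20.0), (3, 30.0), (4, 80.0)]
sent = updates_needed(samples)
```

Only the transmitted fixes cross the network; the client reconstructs intermediate positions from the prediction function.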
- The subject systems and methods can be used in a wide variety of applications and settings including, but not limited to, weather monitoring, troop dispatching, endangered species tracking, disaster mitigation, general aviation monitoring, fleet management, transportation and highway patrol problems, traffic analysis and visualization, commanding and controlling mobile sensors, and commanding and controlling operations (e.g., homeland security, law enforcement, and disaster response).
- In one embodiment, the system and method of the invention enables situational monitoring by law enforcement (e.g., notice is provided to law enforcement regarding a hit and run accident). In a specific embodiment, video surveillance recordings, which are used in specific locations, are accessed in real time and integrated with other forms of critical information (e.g., airborne and vehicle-borne sensors). By way of example, law enforcement officers would be able to use the invention to quickly pinpoint the geographic location, view streaming media of the current location to quickly assess the situation, and, through the use of additional sensors, track the offender's vehicle.
- Certain techniques set forth herein may be described or implemented in the general context of computer-executable instructions, such as program modules, executed by one or more computing devices. Generally, program modules include routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types.
- Embodiments may be implemented as a computer process, a computing system, or as an article of manufacture, such as a computer program product or computer-readable medium. Certain methods and processes described herein can be embodied as code and/or data, which may be stored on one or more computer-readable media. Certain embodiments of the invention contemplate the use of a machine in the form of a computer system within which a set of instructions, when executed, can cause the system to perform any one or more of the methodologies discussed above. Certain computer program products may be one or more computer-readable storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
- Computer-readable media can be any available computer-readable storage media or communication media that can be accessed by the computer system.
- Communication media include the mechanisms by which a communication signal containing, for example, computer-readable instructions, data structures, program modules, or other data, is transmitted from one system to another system. The communication media can include guided transmission media, such as cables and wires (e.g., fiber optic, coaxial, and the like), and wireless (unguided transmission) media, such as acoustic, electromagnetic, RF, microwave and infrared, that can propagate energy waves. Communication media, particularly carrier waves and other propagating signals that may contain data usable by a computer system, are not included as computer-readable storage media.
- By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, a computer-readable storage medium includes, but is not limited to, volatile memory such as random access memories (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only-memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM), and magnetic and optical storage devices (hard drives, magnetic tape, CDs, DVDs); or other media now known or later developed that is capable of storing computer-readable information/data for use by a computer system. “Computer-readable storage media” do not consist of carrier waves or propagating signals.
- In addition, the methods and processes described herein can be implemented in hardware modules. For example, the hardware modules can include, but are not limited to, application-specific integrated circuit (ASIC) chips, field programmable gate arrays (FPGAs), and other programmable logic devices now known or later developed. When the hardware modules are activated, the hardware modules perform the methods and processes included within the hardware modules.
- Any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. In addition, any elements or limitations of any invention or embodiment thereof disclosed herein can be combined with any and/or all other elements or limitations (individually or in any combination) or any other invention or embodiment thereof disclosed herein, and all such combinations are contemplated within the scope of the invention without limitation thereto.
- It should be understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application.
Claims (16)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/652,993 US20170316033A1 (en) | 2013-03-15 | 2017-07-18 | Streaming representation of moving objects and shapes in a geographic information service |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201361792985P | 2013-03-15 | 2013-03-15 | |
| US14/215,484 US9734161B2 (en) | 2013-03-15 | 2014-03-17 | Streaming representation of moving objects and shapes in a geographic information service |
| US15/652,993 US20170316033A1 (en) | 2013-03-15 | 2017-07-18 | Streaming representation of moving objects and shapes in a geographic information service |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/215,484 Division US9734161B2 (en) | 2013-03-15 | 2014-03-17 | Streaming representation of moving objects and shapes in a geographic information service |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170316033A1 true US20170316033A1 (en) | 2017-11-02 |
Family
ID=51533259
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/215,484 Active 2035-01-22 US9734161B2 (en) | 2013-03-15 | 2014-03-17 | Streaming representation of moving objects and shapes in a geographic information service |
| US15/652,993 Abandoned US20170316033A1 (en) | 2013-03-15 | 2017-07-18 | Streaming representation of moving objects and shapes in a geographic information service |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/215,484 Active 2035-01-22 US9734161B2 (en) | 2013-03-15 | 2014-03-17 | Streaming representation of moving objects and shapes in a geographic information service |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US9734161B2 (en) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9886190B2 (en) * | 2014-09-26 | 2018-02-06 | The Florida International University Board Of Trustees | Gesture discernment and processing system |
| CN108319716A (en) * | 2018-02-09 | 2018-07-24 | 北京天元创新科技有限公司 | A kind of communication line collection of resources automatically generates the method and system of part of path |
| CN109410573A (en) * | 2018-10-24 | 2019-03-01 | 中电科新型智慧城市研究院有限公司 | Transport services and management system based on road holography perception |
| US10231085B1 (en) | 2017-09-30 | 2019-03-12 | Oracle International Corporation | Scaling out moving objects for geo-fence proximity determination |
| CN112231389A (en) * | 2020-10-16 | 2021-01-15 | 中国民用航空华东地区空中交通管理局 | Track-based visual conflict model construction method and device, electronic equipment and storage medium |
| CN113239112A (en) * | 2021-07-12 | 2021-08-10 | 广州思迈特软件有限公司 | Third-production growth amount visualization method and device based on GIS system |
Families Citing this family (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10740358B2 (en) | 2013-04-11 | 2020-08-11 | Oracle International Corporation | Knowledge-intensive data processing system |
| US10084871B2 (en) | 2013-05-23 | 2018-09-25 | Allied Telesis Holdings Kabushiki Kaisha | Graphical user interface and video frames for a sensor based detection system |
| US9779183B2 (en) | 2014-05-20 | 2017-10-03 | Allied Telesis Holdings Kabushiki Kaisha | Sensor management and sensor analytics system |
| PH12013000136A1 (en) * | 2013-05-23 | 2015-01-21 | De Antoni Ferdinand Evert Karoly | A domain agnostic method and system for the capture, storage, and analysis of sensor readings |
| US20150338447A1 (en) | 2014-05-20 | 2015-11-26 | Allied Telesis Holdings Kabushiki Kaisha | Sensor based detection system |
| US9693386B2 (en) | 2014-05-20 | 2017-06-27 | Allied Telesis Holdings Kabushiki Kaisha | Time chart for sensor based detection system |
| WO2016153790A1 (en) * | 2015-03-23 | 2016-09-29 | Oracle International Corporation | Knowledge-intensive data processing system |
| CN107409064B (en) * | 2015-10-23 | 2020-06-05 | Nec实验室欧洲有限公司 | Method and system for supporting detection of irregularities in a network |
| JP6343316B2 (en) * | 2016-09-16 | 2018-06-13 | パナソニック株式会社 | Terminal device, communication system, and communication control method |
| CN106777271A (en) * | 2016-12-29 | 2017-05-31 | 广东南方数码科技股份有限公司 | It is a kind of that system constituting method is built based on Service Source pond automatically |
| KR20180131789A (en) * | 2017-06-01 | 2018-12-11 | 현대자동차주식회사 | System and method for providing forward traffic information during stop |
| US11648951B2 (en) | 2018-10-29 | 2023-05-16 | Motional Ad Llc | Systems and methods for controlling actuators based on load characteristics and passenger comfort |
| CN109902138B (en) * | 2019-03-07 | 2021-01-08 | 中国水利水电科学研究院 | Urban one-dimensional hydrodynamic simulation basic data topological relation construction and encoding method based on GIS |
| US11472291B2 (en) * | 2019-04-25 | 2022-10-18 | Motional Ad Llc | Graphical user interface for display of autonomous vehicle behaviors |
| GB2588983B (en) * | 2019-04-25 | 2022-05-25 | Motional Ad Llc | Graphical user interface for display of autonomous vehicle behaviors |
| US11615711B2 (en) * | 2019-04-29 | 2023-03-28 | Drover, Inc. | Precision localization and geofencing governance system and method for light electric vehicles |
| CN112948421B (en) * | 2021-03-30 | 2022-03-18 | 重庆市规划和自然资源信息中心 | Mobile query method for planning natural resources |
| CN113868320A (en) * | 2021-09-30 | 2021-12-31 | 广州凡拓数字创意科技股份有限公司 | Video patrol method and system based on GIS space search and linear sequencing |
| CN116644143A (en) * | 2023-04-07 | 2023-08-25 | 中电云数智科技有限公司 | A vehicle management method based on road checkpoint monitoring |
| CN117311563B (en) * | 2023-11-28 | 2024-02-09 | 西安大地测绘股份有限公司 | An AR-based monitoring method and system for illegal highway land use |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020067412A1 (en) * | 1994-11-28 | 2002-06-06 | Tomoaki Kawai | Camera controller |
| US20050073438A1 (en) * | 2003-09-23 | 2005-04-07 | Rodgers Charles E. | System and method for providing pedestrian alerts |
| US20070168090A1 (en) * | 2006-01-19 | 2007-07-19 | Lockheed Martin Corporation | System for maintaining communication between teams of vehicles |
| US20120280087A1 (en) * | 2011-05-03 | 2012-11-08 | Raytheon Company | Unmanned Aerial Vehicle Control Using a Gamepad |
| US20120316782A1 (en) * | 2011-06-09 | 2012-12-13 | Research In Motion Limited | Map Magnifier |
| US20130225180A1 (en) * | 2012-02-29 | 2013-08-29 | Lg Electronics Inc. | Method and Apparatus for Performing Handover Using Path Information in Wireless Communication System |
| US20140068439A1 (en) * | 2012-09-06 | 2014-03-06 | Alberto Daniel Lacaze | Method and System for Visualization Enhancement for Situational Awareness |
| US20140132426A1 (en) * | 2012-11-13 | 2014-05-15 | International Business Machines Corporation | Managing vehicle detection |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6697103B1 (en) * | 1998-03-19 | 2004-02-24 | Dennis Sunga Fernandez | Integrated network for monitoring remote objects |
| US6801850B1 (en) * | 2000-10-30 | 2004-10-05 | University Of Illionis - Chicago | Method and system for tracking moving objects |
| US7892078B2 (en) * | 2005-12-30 | 2011-02-22 | Microsoft Corporation | Racing line optimization |
| US7830276B2 (en) * | 2007-06-18 | 2010-11-09 | Honeywell International Inc. | System and method for displaying required navigational performance corridor on aircraft map display |
| US8869038B2 (en) * | 2010-10-06 | 2014-10-21 | Vistracks, Inc. | Platform and method for analyzing real-time position and movement data |
| US9171079B2 (en) * | 2011-01-28 | 2015-10-27 | Cisco Technology, Inc. | Searching sensor data |
| US9013352B2 (en) * | 2011-04-25 | 2015-04-21 | Saudi Arabian Oil Company | Method, system, and machine to track and anticipate the movement of fluid spills when moving with water flow |
| US9076259B2 (en) * | 2011-09-14 | 2015-07-07 | Imagine Communications Corp | Geospatial multiviewer |
| US8762048B2 (en) * | 2011-10-28 | 2014-06-24 | At&T Mobility Ii Llc | Automatic travel time and routing determinations in a wireless network |
| WO2013184528A2 (en) * | 2012-06-05 | 2013-12-12 | Apple Inc. | Interactive map |
- 2014-03-17 US US14/215,484 patent/US9734161B2/en active Active
- 2017-07-18 US US15/652,993 patent/US20170316033A1/en not_active Abandoned
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020067412A1 (en) * | 1994-11-28 | 2002-06-06 | Tomoaki Kawai | Camera controller |
| US20050073438A1 (en) * | 2003-09-23 | 2005-04-07 | Rodgers Charles E. | System and method for providing pedestrian alerts |
| US20070168090A1 (en) * | 2006-01-19 | 2007-07-19 | Lockheed Martin Corporation | System for maintaining communication between teams of vehicles |
| US20120280087A1 (en) * | 2011-05-03 | 2012-11-08 | Raytheon Company | Unmanned Aerial Vehicle Control Using a Gamepad |
| US20120316782A1 (en) * | 2011-06-09 | 2012-12-13 | Research In Motion Limited | Map Magnifier |
| US20130225180A1 (en) * | 2012-02-29 | 2013-08-29 | Lg Electronics Inc. | Method and Apparatus for Performing Handover Using Path Information in Wireless Communication System |
| US20140068439A1 (en) * | 2012-09-06 | 2014-03-06 | Alberto Daniel Lacaze | Method and System for Visualization Enhancement for Situational Awareness |
| US20140132426A1 (en) * | 2012-11-13 | 2014-05-15 | International Business Machines Corporation | Managing vehicle detection |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9886190B2 (en) * | 2014-09-26 | 2018-02-06 | The Florida International University Board Of Trustees | Gesture discernment and processing system |
| US10231085B1 (en) | 2017-09-30 | 2019-03-12 | Oracle International Corporation | Scaling out moving objects for geo-fence proximity determination |
| US10349210B2 (en) | 2017-09-30 | 2019-07-09 | Oracle International Corporation | Scaling out moving objects for geo-fence proximity determination |
| US11412343B2 (en) | 2017-09-30 | 2022-08-09 | Oracle International Corporation | Geo-hashing for proximity computation in a stream of a distributed system |
| CN108319716A (en) * | 2018-02-09 | 2018-07-24 | 北京天元创新科技有限公司 | A kind of communication line collection of resources automatically generates the method and system of part of path |
| CN109410573A (en) * | 2018-10-24 | 2019-03-01 | 中电科新型智慧城市研究院有限公司 | Transport services and management system based on road holography perception |
| CN112231389A (en) * | 2020-10-16 | 2021-01-15 | 中国民用航空华东地区空中交通管理局 | Track-based visual conflict model construction method and device, electronic equipment and storage medium |
| CN113239112A (en) * | 2021-07-12 | 2021-08-10 | 广州思迈特软件有限公司 | Third-production growth amount visualization method and device based on GIS system |
Also Published As
| Publication number | Publication date |
|---|---|
| US9734161B2 (en) | 2017-08-15 |
| US20140280319A1 (en) | 2014-09-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9734161B2 (en) | Streaming representation of moving objects and shapes in a geographic information service | |
| US20220319183A1 (en) | System for tracking and visualizing objects and a method therefor | |
| US10528316B2 (en) | Methods, systems, and media for presenting requested content on public display devices | |
| US8869038B2 (en) | Platform and method for analyzing real-time position and movement data | |
| CN112367531B (en) | Video stream display method, processing method and related equipment | |
| CA2949353C (en) | Computer-implemented systems and methods of analyzing data in an ad-hoc network for predictive decision-making | |
| US8543917B2 (en) | Method and apparatus for presenting a first-person world view of content | |
| US20150294233A1 (en) | Systems and methods for automatic metadata tagging and cataloging of optimal actionable intelligence | |
| US9798819B2 (en) | Selective map marker aggregation | |
| DE112016005854B4 (en) | Maintaining data protection in location-based processes | |
| Milosavljević et al. | Integration of GIS and video surveillance | |
| US11010641B2 (en) | Low power consumption deep neural network for simultaneous object detection and semantic segmentation in images on a mobile computing device | |
| EP2681651A1 (en) | Method and apparatus for providing an active search user interface element | |
| GB2449754A (en) | Spatio-temporal Graphical User Interface for Collaborative and Secure Information Sharing | |
| US20140358252A1 (en) | Cloud Based Command and Control System | |
| US20250324156A1 (en) | Weather event visualization application for mobile devices | |
| US10148772B2 (en) | System and method for automatically pushing location-specific content to users | |
| CN112287060B (en) | Data processing method and device based on monitoring map and readable storage medium | |
| Blasch et al. | Video observations for cloud activity-based intelligence (VOCABI) | |
| Fan et al. | An on-demand provision model for geospatial multisource information with active self-adaption services | |
| HK40038741A (en) | Displaying method and processing method for video stream, related device | |
| Burchett et al. | Collaborative visualization for layered sensing | |
| Atzl et al. | Online Visualization of Streaming Data | |
| Schneider et al. | Visual Analysis Tool for a BLE Technology based Tracking Data. | |
| Sherrill et al. | Analyzing networks of static and dynamic geospatial entities for urban situational awareness |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: THE FLORIDA INTERNATIONAL UNIVERSITY BOARD OF TRUS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RISHE, NAPHTALI DAVID;REEL/FRAME:043035/0232. Effective date: 20140310 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |