US12483613B2 - Scalable decentralized media distribution
- Publication number
- US12483613B2 (application US 17/925,207)
- Authority
- US
- United States
- Prior art keywords
- media
- network
- signals
- source
- packetized
- Prior art date
- Legal status: Active, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L69/00—Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
- H04L69/16—Implementation or adaptation of Internet protocol [IP], of transmission control protocol [TCP] or of user datagram protocol [UDP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/70—Media network packetisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/185—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast with management of multicast group membership
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
- H04L65/611—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/765—Media network packet handling intermediate
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present disclosure relates to media communications, and, in particular, to decentralized media distribution systems, methods, and devices.
- a scalable media distribution system operable to interface with a plurality of media data components, said media data components comprising at least one media acquisition unit that acquires source media data and outputs corresponding source media signals, and at least one media presentation unit that receives source media signals and processes corresponding source media data
- the system comprising two or more transceiver units, each transceiver unit comprising: at least two source media signal ports, said media signal ports configured to provide a connection to the media components, at least one of said media signal ports for receiving source media signals from one of the media acquisition units and at least one of said media signal ports for transmitting source media signals to one of the media presentation units; a signal converter operable to packetize source media signals received via said source media signal ports from one of the at least one media acquisition units for communication over said packetized communications network, and to convert packetized network media signals to source media signals for communicating to one of the at least one media presentation units via one of said source media signal ports; and a packetized network media data transceiver operable to wirelessly, or via a wired connection, multicast packetized network media signals over said packetized communications network to additional transceiver units and to receive packetized network media signals multicasted over said packetized communications network
- a transceiver unit device for providing scalable media distribution, the transceiver unit device configured to interface with a plurality of media data components, said media data components comprising at least one media acquisition unit for acquiring source media data and providing to said transceiver unit source media signals corresponding to said source media data, and at least one media presentation unit that receives source media signals from said transceiver unit and processes corresponding source media data
- the transceiver unit device comprising: at least two source media signal ports, said source media signal ports configured to provide a connection to the media components, at least one of said media signal ports for receiving source media signals from one of the media acquisition units and at least one of said media signal ports for transmitting source media signals to one of the media presentation units; a packetized network media data transceiver operable to wirelessly, or via a wired connection, multicast packetized network media signals over said packetized communications network to additional transceiver units and to receive packetized network media signals multicasted over said packetized communications network (which may occur wirelessly, via wired connections, or both)
- a scalable media distribution method for interfacing a plurality of media data components, said media data components comprising at least one media acquisition unit that acquires source media data and outputs corresponding source media signals, and at least one media presentation unit that receives source media signals and processes corresponding source media data, the method comprising: receiving as input, at one of a plurality of source media signal ports disposed on a first transceiver unit, the source media signal, at least some of said source media signal ports configured to provide a connection between the at least one media acquisition unit and said first transceiver unit for receiving source media signals; converting the source media signal received by the first transceiver unit to packetized network media signals; multicasting said packetized network media signals over a packetized communications network (which may be multicasted wirelessly, over wired connections, or both); receiving packetized network media signals from any one or more additional transceiver units; converting packetized network media signals to the source media signal; and outputting the source media signal via one of the source media signal ports
- a scalable media distribution system operable to interface with a plurality of media data components, said media data components comprising at least one media acquisition unit that acquires source media data and outputs corresponding source media signals, and at least one media presentation unit that receives source media signals and processes corresponding source media data
- the system comprising: two or more transceiver units, each transceiver unit in turn comprising: at least two source media signal ports, said media signal ports configured to provide a connection to the media components, at least one of said media signal ports for receiving source media signals from one of the media acquisition units and at least one of said media signal ports for transmitting source media signals to one of the media presentation units; a signal converter operable to packetize source media signals received via said source media signal ports from one of the at least one media acquisition units for communication over said packetized communications network, and to convert packetized network media signals to source media signals for communicating to one of the at least one media presentation units via one of said source media signal ports; and a packetized network media data transceiver operable to interface with a plurality of
- transceiver units can be configured to selectively transfer or receive packetized network media signals to a subset of other transceiver units, wherein the selectivity of the subset of transceivers is based on characteristics relating to one or more of the following: the media data components, the source media signal, one or more of said source media signal ports, the source media data, the source media signals, and the transceiver unit.
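The port-to-group selectivity described above can be sketched in a few lines. The 239.0.0.0/8 range, the function name, and the port-to-address mapping scheme below are illustrative assumptions for the sketch, not details taken from the disclosure.

```python
# Hypothetical sketch: map each source media signal port to its own
# multicast group, so a transceiver receives only the streams it
# subscribes to. The administratively scoped 239.0.0.0/8 range and the
# mapping scheme are assumptions made for illustration.

def group_for_stream(source_port: int, base_group: str = "239.192.0.0") -> str:
    """Derive a distinct multicast group address from a source media
    signal port number."""
    a, b, _, _ = (int(x) for x in base_group.split("."))
    return f"{a}.{b}.{(source_port >> 8) & 0xFF}.{source_port & 0xFF}"

# A receiving transceiver would join only the groups for the inputs it
# needs (e.g. via the standard IP_ADD_MEMBERSHIP socket option).
subscriptions = {group_for_stream(p) for p in (1, 2)}
```

Selectivity based on other characteristics (media component type, signal format, and so on) would simply swap the key used by the mapping.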
- Embodiments hereof provide for scaling a virtually unlimited number of connective inputs and outputs for media acquisition and presentation units in a decentralized manner.
- Architectures associated with embodiments hereof leverage Ethernet, or other network data communications technologies and standards (including others that can be classified as provisioning communications up to the “link” or “data link” layers in, respectively, the Internet Protocol suite or the OSI model), as a bus to scale video or other sensory data, and have access to video on any device/transceiver connected to the decentralized network.
- inputs of any one device can be accessed via the outputs of any other device, without the requirement to access a centralized or remote network server or network of servers.
- a sensor/Ethernet interfacing protocol is implemented to facilitate the connection of the sensor device to a transceiver on a network; the transceiver communicates the interfaced sensor information to all the other transceivers, and the receiving transceivers convert that information back using the same interfacing protocol.
- GigE Vision is used as an interfacing protocol, but the disclosure hereof is not intended to be limited to that interfacing protocol (or indeed to Ethernet as the network communication technology).
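As a rough illustration of the packetizing step, the sketch below splits a captured frame into datagram-sized chunks with a small sequencing header. The header layout, constants, and function names are invented for this sketch and are not the actual GVSP wire format.

```python
import struct

# Illustrative packetizer: split one media frame into UDP-sized payloads,
# each prefixed with (stream id, frame id, packet index, packet count).
# This header layout is an assumption for the sketch, not real GVSP.

HEADER = struct.Struct("!HIHH")
MAX_PAYLOAD = 1400  # keep each datagram under a typical Ethernet MTU

def packetize(stream_id: int, frame_id: int, data: bytes) -> list[bytes]:
    chunks = [data[i:i + MAX_PAYLOAD]
              for i in range(0, len(data), MAX_PAYLOAD)] or [b""]
    return [HEADER.pack(stream_id, frame_id, i, len(chunks)) + chunk
            for i, chunk in enumerate(chunks)]

# Each packet would then be multicast with sock.sendto(pkt, (group, port));
# receivers reverse the process using the same header definition.
```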
- FIG. 1 is a table of various attributes of digital video connectivity technologies
- FIG. 2 is a table of a hierarchical Open System Interconnection (OSI) Reference Model for communications and computer networking;
- FIG. 3 is a schematic of GigE Vision operations in the OSI model shown in FIG. 2 , in accordance with various embodiments;
- FIG. 4 is a schematic of Version 2.0 of the GigE Vision standard focused on five high-speed transport technologies, in accordance with various embodiments;
- FIG. 5 is a schematic of a GigE Vision framework encompassing a wide range of network elements, in accordance with various embodiments
- FIG. 6 is a schematic of solutions to leverage the networking flexibility of switched Ethernet architectures, in accordance with various embodiments.
- FIG. 7 is a schematic of a military vehicle retrofit scenario using networked video connectivity elements, in accordance with at least one embodiment
- FIG. 8 A is an image of an exemplary network processing unit
- FIG. 8 B is a table of exemplary specifications related to the unit of FIG. 8 A , in accordance with various embodiments;
- FIG. 9 A is a schematic of an exemplary system of media acquisition and presentation units interfaced with a transceiver unit
- FIG. 9 B is a schematic of a network comprising a plurality of systems such as that of FIG. 9 A , in accordance with various embodiments;
- FIG. 10 A is an image of an exemplary network switch
- FIG. 10 B is a schematic of an exemplary network comprising the system of FIG. 9 A and a system comprising the switch of FIG. 10 A , in accordance with at least one embodiment;
- FIG. 11 is a schematic of a fully integrated network comprising, inter alia, media acquisition and presentation units interfaced with a network switch and transceiver unit, in accordance with various embodiments;
- FIG. 12 is a schematic of a military vehicle housing a media communications system networked via a flexible communications interface, in accordance with various embodiments.
- FIG. 13 is a schematic of a communications network showing an exemplary connectivity between components, in accordance with various embodiments.
- elements may be described as “configured to” perform one or more functions or “configured for” such functions.
- an element that is configured to perform or configured for performing a function is enabled to perform the function, or is suitable for performing the function, or is adapted to perform the function, or is operable to perform the function, or is otherwise capable of performing the function.
- the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise.
- the term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise.
- the meaning of “a,” “an,” and “the” include plural references.
- the meaning of “in” includes “in” and “on.”
- the systems and methods described herein provide, in accordance with different embodiments, different examples of a ruggedised networking platform for sensor-to-presentation systems.
- Various embodiments relate to video and/or audio communication networks in armored vehicles, tanks, and the like, and may be of use in, for instance, reducing cognitive burden and increasing mission-effectiveness for end-users while meeting interoperability and scalability demands in size, weight, power and cost in sensitive real-time military applications.
- the system can be characterized as one or more transceivers that are configured to interface media components, the media components comprising one or more media acquisition units with one or more media presentation units.
- media acquisition units automatically acquire extraneous information, and may include one or more of the following non-limiting examples: cameras, sensors, frame grabbers, or other imaging, vision, or sensing input devices.
- media presentation units, sometimes referred to as media processing units, automatically process the extraneous information acquired by the media acquisition units, and may comprise one or more of the following non-limiting examples: displays and monitors, image and vision analysis devices, and network communication devices (e.g. for display at a remote location for guiding/piloting an unmanned vehicle).
- the inputs of any one transceiver unit can be accessed via the outputs of some or all of the other transceiver units, without the requirement to access a centralized or remote network server or network of servers, or indeed to access any network device other than the transceivers themselves.
- the media presentation unit or media processing unit presents or displays media data, e.g. via a visual display or monitor
- other media data processing is possible. For example, automated machine vision processing; automated analysis for distinguishing specific shapes, colours, heat, material (e.g. organic vs. inorganic material); further communication; detecting movement; detecting non-conforming material; and other processing.
- an interfacing protocol is implemented in association with the connection of the media component to a given transceiver device, so that the signal received from the media component can be communicated over a network (e.g. the Internet, or a local or personal area network) using standard communications technologies and protocols (e.g. Ethernet).
- the transceiver communicates the signal, as a network-enabled communication generated in accordance with the interfacing protocol, to all the other transceiver devices, in some cases by multicasting to the other transceiver devices (wirelessly, via wired connections, or both), and the receiving transceiver devices convert that information back to a media signal, using the same interfacing protocol.
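The receiving side of that conversion can be sketched as reassembling multicast datagrams back into a complete frame before it is converted to a source media signal. The header layout and names below are assumptions for the sketch, not taken from the disclosure.

```python
import struct

# Hypothetical receiver-side reassembly: order datagrams by packet index
# and concatenate payloads into the original frame. The (stream id,
# frame id, index, count) header is an invented sketch format.

HEADER = struct.Struct("!HIHH")

def reassemble(packets: list[bytes]) -> bytes:
    parsed = sorted((HEADER.unpack(p[:HEADER.size]), p[HEADER.size:])
                    for p in packets)
    _, _, _, count = parsed[0][0]
    if len(parsed) != count:
        # In a real transceiver this is where a resend request (or a
        # dropped-frame decision) would be made.
        raise ValueError("frame incomplete")
    return b"".join(payload for _, payload in parsed)

# Joining the multicast group itself would use the standard socket option,
# e.g. sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq).
```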
- a network platform may comprise low latency, multicast Gigabit Ethernet (GigE), and may be a modular, scalable platform for the straightforward design, manufacture, and implementation of, for instance, camera-to-display systems. Furthermore, such a platform may be cost-effectively evolved to fully networked architectures integrating different sensor and display types, switching, processing, and recording units, and may meet various performance requirements for various applications.
- hardware and peripheral equipment may be reduced within, for instance, a vehicle, and simplified, less expensive cabling may be deployed to help meet various cost and/or weight objectives.
- sensor data may be transmitted to, for instance, any combination of mission computers and displays.
- vehicle crew may view required information on a single display to know immediately if something has changed in an environment.
- embodiments comprising a dual Ethernet port and passive bypass design enable reliable performance and protection against single-point-of-failure risks.
- systems, methods, and devices disclosed herein may be implemented in new or existing machine vision systems.
- vision, auditory, sensory, or other environmental data may be obtained by a source device and transmitted via transmitters to a media presentation device for automated analysis by a machine, which may not involve any display (or other broadcasting or presentation) of the collected vision, auditory, sensory, or other environmental data.
- the received data may be analysed by a machine (i.e. a computer or computing device) without re-creating the collected information as, for example, in a display screen or a speaker.
- the data may be analyzed in a machine-analysis system, such as a machine vision or machine-based sensory system.
- this could include factory- or manufacturing-based automated systems, autonomous cars and vehicles, other autonomous systems, medical devices and surgery devices using cameras or other sensory devices, machine-based investigation/diagnostic information systems for human/animal and non-human/non-animal systems (e.g. investigating inaccessible features in a plumbing, electrical, or geological system).
- the presentation unit may not in fact “present” acquired data in a manner that is consumable by a human; in fact, a machine-based or automated analysis system may constitute the presentation unit as it is presented with the underlying digital data that represents that acquired signals.
- Various embodiments may further relate to low-latency networks with user-focused design to help increase intelligence, awareness, and safety while reducing cognitive burden for vehicle crew members through standards-compliant vehicle platforms (e.g. Def Stan 00-082 (VIVOE), STANAG 4697 (PLEVID), MISB ST 1608, STANAG 4754 (NGVA), Def Stan 23-009 (GVA), VICTORY guidelines, and the like) that are rapidly deployable, mission configurable, and cost-effective.
- Various platforms may further be highly scalable, allowing ready implementation of future capabilities that may increase mission effectiveness with minimum integration effort.
- various embodiments may comprise high-performance video networking capabilities that may be combined with powerful GPU resources.
- various embodiments may further comprise image fusion capabilities, 360-degree view stitching, map/terrain overlay, image enhancement, convolutional-neural-network based threat detection and classification, and the like.
- the all-digital image streams generated by advanced sensors can be fed directly into sophisticated in-vehicle digital processing applications for local situation awareness (LSA), improving the precision of tasks such as surveillance and targeting.
- New-generation vision sensors create a substantial opportunity, but also pose a significant challenge. Behind the crisp, high-definition images they produce are millions of pixels of high-speed digital data. To fully leverage the potential of this data in LSA systems, it must be distributed, displayed, and processed in real time with ultra-high reliability.
- Today's in-vehicle systems typically consist of different types of analog and digital cameras and image sensors mounted on the vehicle. They generate a range of video formats operating at a variety of data rates. Mixers are sometimes used to combine analog signals for multi-image viewing by crew members on a single mission computer or smart display inside. More typically, video is streamed directly to the computer or display.
- To overcome these limitations, an improvement that can be made to LSA systems is to deploy a networked connectivity system that handles the throughput of advanced cameras and sensors and brings together, into a common topology, both new equipment and legacy gear, such as analog cameras, which may in some embodiments be very difficult to replace due to cost, integration with other systems, familiarity, or other practical concerns.
- a network framework is required that provides a seamless path from the past to the future.
- a modern in-vehicle video connectivity system must also offer robust, reliable transport that can deliver “glass-to-glass” video in real-time with virtually no delay between what the camera sees and what is displayed on monitors inside the vehicle. Furthermore, modern in-vehicle video connectivity systems must be based on standards of interoperability and cost-effectiveness.
- FIG. 1 compares key attributes of the digital video connectivity candidates of Camera Link®, CoaXPress, and Ethernet that systems manufacturers and integrators may consider for use in vehicular LSA applications.
- Camera Link® is a digital serial interface standard introduced in 2000 by the Automated Imaging Association (AIA). It transports imaging data at high rates (up to ~6.8 Gb/s) over direct links of ~10 m or less. Cable extenders can be used to lengthen the short reach of Camera Link connections, but at significant cost.
- Camera Link is also limited by its dependence on point-to-point topologies. Cameras are essentially tethered to the frame grabbers in PCs, restricting system design options. Many vendors offer frame grabbers that support more than one camera, but the resultant ‘star’ deployments do not offer the flexibility and scalability of a true networked topology.
- CoaXPress is a standard for point-to-point, asymmetrical serial communication that runs over coaxial cable. It was introduced in 2009 by a small industry consortium and was approved by the Japan Industrial Imaging Association (JIIA) in December 2010. It offers longer reach than Camera Link (~40 m at ~6.25 Gb/s, or ~120 m at ~1.25 Gb/s) but is supported by only a small group of vendors and is not widely deployed. Furthermore, the two chips needed to support its implementation are available today from only one vendor, and, like Camera Link, CoaXPress does not support networked video.
- Ethernet on the other hand, is a time-honored standard that is deployed in most of the world's local area networks, including those for high-performance, real-time military and industrial applications. It is supported by a low-cost, well-understood, and widely available infrastructure. It delivers exceptional networking flexibility, supporting almost every conceivable connectivity configuration, including point-to-point, point-to-multipoint, multi-point to multi-point, and multi-channel aggregation.
- Ethernet delivers high bandwidth. Gigabit Ethernet (GigE) delivers 1 Gb/s, and 10 GigE, now ramping quickly in mainstream markets, delivers ~10 Gb/s.
- All Ethernet generations use the same frame format, ensuring backward compatibility and permitting system upgrades without sacrificing the equipment already in place. Ethernet also offers long reach, allowing spans of up to ~100 meters between network nodes over standard, low-cost Cat 5/6 copper cabling, and greater distances with switches or cost-effective fiber extenders. With now-inexpensive fiber cabling, distances of up to ~40 km can be achieved without intervening equipment.
- Ethernet is scalable, supporting meshed network configurations that easily accommodate different data rates and the addition of new processing nodes, displays, and sensors.
- Ethernet ports are built into virtually every laptop and ruggedised notebook, as well as nearly all single-board computers (SBCs) and embedded processing boards, eliminating the need for an available adapter card slot in a PC to house a traditional frame grabber.
- FIG. 2 shows a hierarchical seven-layer Open System Interconnection (OSI) Reference Model for communications and computer networking.
- Ethernet operates at Layer 2 (data link), and standardises the routing of data based on destination information in each data packet known as a Media Access Control (MAC) address. Every element in an Ethernet network, such as switches and NICs (network interface cards/chips), has a unique MAC address.
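For concreteness, the Layer-2 addressing described above can be seen by parsing the first 14 bytes of an Ethernet II frame (destination MAC, source MAC, EtherType); the helper names below are just for illustration.

```python
import struct

# Parse the 14-byte Ethernet II header the text describes: 6-byte
# destination MAC, 6-byte source MAC, and a 2-byte EtherType.

def _fmt_mac(mac: bytes) -> str:
    return ":".join(f"{b:02x}" for b in mac)

def parse_eth_header(frame: bytes) -> tuple[str, str, int]:
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    return _fmt_mac(dst), _fmt_mac(src), ethertype

# A broadcast frame carrying IPv4 (EtherType 0x0800):
frame = bytes.fromhex("ffffffffffff" "001122334455" "0800") + b"payload"
```

Switches forward frames based on the destination MAC alone; everything above it in the frame payload is opaque at Layer 2.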
- Other networking functions such as formatting data into IP packets, overseeing data transport and flow control, managing sessions, and formatting information for the user application, are handled by higher level protocols in the OSI model.
- IP (Internet Protocol) operates at Layer 3, and the Transmission Control Protocol (TCP) operates at Layer 4.
- TCP has a heavy protocol overhead and is optimized for accurate rather than timely data delivery. It guarantees delivery, but latency measured in seconds is common while the protocol waits for out-of-order messages, retransmissions of lost messages, or most commonly, simply waiting for synchronization of packet acknowledgements.
- TCP is thus not recommended for mission-critical vehicle electronics (vetronics) applications for LSA, which depend on the immediate delivery of video data with low, predictable latency.
- a better choice at layer 4 may be User Datagram Protocol (UDP), which is simpler than TCP, with lower protocol overhead. It is better suited for low-latency networked video, with the caveat that it does not guarantee data delivery.
- UDP is a better starting point than TCP.
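One concrete reason UDP is the better starting point is header overhead: a UDP header is a fixed 8 bytes against TCP's minimum of 20, before counting TCP's handshake and acknowledgement traffic. A minimal sketch of the UDP header (RFC 768 layout) illustrates this:

```python
import struct

# Build the 8-byte UDP header: source port, destination port,
# length (header + payload), and checksum (0 = unused over IPv4).

def udp_header(src_port: int, dst_port: int, payload_len: int) -> bytes:
    return struct.pack("!HHHH", src_port, dst_port, 8 + payload_len, 0)

hdr = udp_header(5000, 5004, 1400)
assert len(hdr) == 8  # versus a minimum of 20 bytes for a TCP header
```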
- the reliability, efficiency, and effectiveness of systems that transfer video over Ethernet are still determined primarily by two factors: the protocols used at layers 5 to 7, and the sophistication and quality of the video connectivity solution implemented at these layers.
- FIG. 3 illustrates how the GigE Vision standard fits into the OSI model.
- UDP is used to handle transport at Layer 4, rather than TCP.
- UDP may be selected for its simplicity, low overhead, and multicast support. It is well suited for low-latency networked video, but does not guarantee data delivery.
- the GigE Vision standard includes an optional mechanism that allows video sources to resend undelivered data to video receivers. This mechanism can also be turned off if resending data is not required for the application.
- in a properly designed and provisioned network, packets will rarely, if ever, be dropped.
- This mechanism may allow performance-oriented implementations of the GigE Vision standard to guarantee video transport and achieve low and predictable latency, even during a resend.
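The resend logic the standard allows can be sketched as simple gap detection over sequence numbers; the function name and NACK framing here are illustrative, not the actual GigE Vision packet-resend command.

```python
# Hypothetical sketch of receiver-driven resend: track which sequence
# numbers arrived and NACK only the gaps, so the source retransmits a
# handful of buffered packets instead of the whole frame.

def missing_packets(received_ids: set[int], expected_count: int) -> list[int]:
    """Sequence numbers the receiver should request again."""
    return [i for i in range(expected_count) if i not in received_ids]

# The source keeps recently sent packets buffered; retransmitting only
# these IDs keeps latency low and predictable even during a resend.
```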
- the first two versions of the GigE Vision standard focused primarily on point-to-point connectivity between video sources and receiving software in a host PC.
- Version 1.2 of the standard, ratified in January 2010, includes a range of updates that meet growing demand for application architectures that make better use of Ethernet's powerful networking capabilities.
- Version 1.2 permits a wide range of network-connected elements (basically anything that can be managed by the GigE Vision Control Protocol, GVCP) to be registered as compliant products.
- GigE Vision now supports, for example, video servers, hardware video receivers, video processing units, network-controlled devices, and management entities, as illustrated in FIG. 5 .
- the GigE Vision standard is ideally suited for the high performance, richly featured video networks required for military vetronics systems incorporating today's advanced vision sensors.
- Version 2.0 of the GigE Vision standard, ratified in 2012 by the AIA's GigE Vision Technical Committee, optimises the standard for high-speed transport. The technical work has five key thrusts, as detailed in FIG. 4.
- While Version 2.0 formally includes 10 GigE in the standard text, the standard does not preclude the use of 10 GigE in systems compliant with earlier versions of the standard.
- the Ethernet/GigE Vision platform provides an excellent framework for building high-performance networked video connectivity systems for vetronics LSA. However, it is the quality of the implementation that defines the performance levels of video networks based on the Ethernet and GigE Vision standards. Many performance characteristics that are important to new-generation vetronics systems, such as low and consistent latency, high throughput, guaranteed data delivery, and low CPU usage, vary with the implementation method. Achieving an implementation that meets the stringent performance requirements of LSA systems for military vehicles is time-consuming, expensive, and technically challenging. In part to address this need, various embodiments relate to networked connectivity solutions for, for instance, mission-critical, real-time applications in the military, medical, and manufacturing sectors, whereby networking elements are compliant with the GigE Vision standard. Furthermore, various embodiments enable solutions to support many different network configurations, ranging from traditional point-to-point connections between a camera and mission computer to, for instance, more advanced configurations based on switched Ethernet client/server architectures, as shown in FIG. 6 .
- Various embodiments relate to networked video connectivity solutions compliant with the GigE Vision standard, although other vision interfacing solutions may be used in other embodiments.
- embedded hardware products may allow designers of cameras and other imaging devices to integrate video interfaces with core sensor electronics quickly, with minimal risk via embedded video interfaces.
- frame grabbers may allow manufacturers to integrate a wide range of cameras into a wide range of system types with plug-and-play simplicity.
- Various embodiments relate to frame grabbers that are internal to PCs having a peripheral card slot, or those that are external to PCs and do not require a peripheral card slot. Further embodiments relate to the ability to incorporate tool kits such as the eBUS™ SDK to quickly and easily develop third-party or custom video applications.
- Such tool kits may include sample source code and executables that provide working applications for functions such as device configuration and control, image and data acquisition, and image display and diagnostics. They may further operate under Windows or Linux operating systems, include drivers for, for instance, transferring video data in real time directly to applications, and may optionally not be subject to task demands from an operating system.
- Various embodiments further relate to robust, end-to-end platforms that may be compliant with the GigE Vision standard, or other vision standards, and may be tailored to meet the networked video requirements of LSA programs for both the retrofit of existing vehicles, as well as the design of new ones.
- communications elements such as frame grabbers may be used to convert analog and digital feeds from existing video sources into GigE Vision compliant video streams.
- the streams may then be incorporated into a common, real-time GigE Vision framework that may be all-digital, all-networked, and manageable.
- Such embodiments may salvage the use of legacy cameras and sensors, while delivering a scalable Ethernet backbone that is backward-compatible with older technology and may enable the introduction of advanced digital sensor technologies.
- Embedded Video Interfaces may be built directly into new-generation high-resolution cameras, making them compatible with the desired standard, such as GigE Vision, from the start. Integration may be accomplished by adding an EVI to, for instance, the back end of a camera, or by integrating a core into a camera's FPGA and a digital sensor directly onto a processing board, thus reducing component count and simplifying overall hardware design.
- mission computers can be equipped with a toolkit such as Pleora's eBUS SDK, enabling video from a GigE Vision compliant link to stream in real-time into system memory, without the need for a frame grabber.
- an external frame grabber may be deployed to reduce computer count and optimize the use of valuable in-vehicle real-estate.
- FIG. 8 is a diagram of a possible retrofit implementation, in accordance with at least one embodiment, using iPORT IP engines, vDisplay video receivers, and a processing unit with an eBUS driver from Pleora.
- the vehicle is equipped with a range of analog and digital cameras, which may provide views of its entire perimeter. Video from the cameras may be streamed simultaneously over a multicast network, for example using GigE Vision Ethernet standard protocol, to the driver controlling the vehicle. Some systems may support wireless and/or wired communication for this streaming. In this case, while monitors are set up in front of the driver, image streams may also be distributed through the network to other crew members, who may view the video or use the on-board mission computer to combine images for display elsewhere in the vehicle.
- Troops may therefore decide which video streams they need to see, without changing cabling or software configurations, or use the on-board mission computer to combine images for use by others in the vehicle.
- video from legacy cameras may be converted to an uncompressed Ethernet-ready video stream, for example in accordance with the GigE Vision standard, by an external frame grabber and multicast over an Ethernet network to displays and processing equipment at various points within a vehicle.
- Using the GigE Vision standard for video distribution, designers may meet video performance requirements, such as those outlined in Def Stan 00-082 (VIVOE), STANAG 4697 (PLEVID), and MISB ST 1608, enabling the design of vehicle electronics platforms that comply with STANAG 4754 (NGVA), Def Stan 23-009 (GVA), and VICTORY guidelines.
- video, control data, and power may be transmitted over a single cable, thereby lowering component costs, simplifying installation and maintenance, and reducing cable clutter and weight in a vehicle.
- All computers used for processing and mission control may further connect to the network via their standard Ethernet port, eliminating the need for a computing platform with an available peripheral card slot.
- Designers may thus, in accordance with various embodiments, employ ruggedised laptops, embedded PCs, or single-board computers for image analysis and control to help lower costs, improve reliability, and meet size, weight, and power (SWaP) objectives, while easily adding advanced capabilities to reduce cognitive burden and increase mission effectiveness for end-users.
- a network processing unit, herein interchangeably referred to as “RuggedCONNECT” or “transceiver unit”, that enables a scalable, flexible approach to real-time sensor networking.
- Various embodiments further relate to scalable, flexible, real-time video sensor networking.
- Such a unit may be a highly integrated standalone device that may acquire, process, and display real-time data for, for instance, vehicle-based local situational awareness and/or driver vision enhancer (DVE) applications. It may further comprise a networked video switcher having a number of analog composite inputs (e.g. eight inputs) supporting, for instance, RS-170/NTSC/PAL and/or DVI-D displays (e.g. two independent single link displays).
- Units may support GigE Vision and/or Def Stan 00-082 streaming protocols via, for instance, Ethernet channels (e.g. dual 1 Gb/s ethernet channels), for networked, open standard, interoperable video management systems as demanded by, for instance, GVA, NGVA, and/or VICTORY standards.
- Various embodiments may further relate to a RuggedCONNECT unit that may combine with GPU resources (e.g. NVIDIA Jetson TX2i) for application-specific image processing and graphics overlay, and/or decision-support capabilities to reduce cognitive burden and increase mission effectiveness. Furthermore, such a combination may further support applications such as image fusion, 360-degree stitching, map/terrain overlay, image enhancement, and/or more demanding processes such as convolutional-neural network-based threat detection and classification.
- Such units may have a highly configurable architecture that may host, for instance, multiple mini-PCIe and/or M2 daughter cards, enabling fast development of products to address various sensor and display interfaces, such as HD-SDI, CameraLink, VGA, STANAG-3350, or custom sensor/display requirements.
- Such architecture in accordance with various embodiments, may enable the addition of more interfaces, and/or support a mix of interfaces, additional network interfaces, and/or general communications ports.
- units may comprise a reduced number of interfaces and smaller overall enclosure size. At least one embodiment relates to a video processing unit comprising 8 video inputs (e.g. RS-170/NTSC/PAL), 2 fully independent DVI-D displays, communications components (RS-232/422/485, CANbus, USB 2.0, and GPIO), dual ethernet capability to enable system-level redundancy and effective communications capabilities, bypass channels for select inputs to provide additional redundancy during degraded operating situations, and a scalable technology platform to support multiple sensor and display configurations, including basic sensor, display, and/or network-only processing units.
- Various embodiments further relate to a system enabling plug-in artificial intelligence solutions for, for instance, machine learning-based tank detection or driver assistance, or having a software platform to provide features such as network-based video switching or advanced situational awareness.
- units may comprise an open framework to load custom imaging plugins to perform real-time video analysis.
- At least one embodiment relates to a processing unit that eases design of standards-compliant vetronics imaging platforms.
- various embodiments relate to systems that are GigE Vision- and Def Stan 00-082-compliant, are GVA-, NGVA-, and VICTORY-ready, have MIL-STD-1275E power supplies, and have MIL-STD-810G and MIL-STD-461F conformance for shock, vibration and EMI.
- Various embodiments may further comprise powerful GPU resources, such as those of NVIDIA Jetson TX2i configured as a system-on-module and enabling application-specific capabilities.
- Such resources may comprise, for instance, a 256 core Pascal GPU, a quad-core ARM Cortex-A57 CPU system, a dual-core Denver CPU system, 8 GB LPDDR4, multichannel hardware compression (e.g. H.264/H.265/JPEG), operability to encode and/or decode signals, dual independent displays, and/or OpenCV, OpenGL, CUDA, or Video4Linux.
- FIG. 8 A shows a non-limiting example of a video processing unit 800 , or RuggedCONNECT system, having a plurality and variety of ports 810 and 820 for signal input and/or output.
- FIG. 8 B shows tabulated non-limiting exemplary components, functional specifications, customisation options, and various mechanical, environmental and power specifications of a system such as that of FIG. 8 A .
- ports 810 or 820 of FIG. 8 A may further enable similar units to connect to one another by a wired connection (e.g. units may be daisy-chained together) to, for instance, share information, data, and/or resources.
- units may alternatively be wirelessly connected, such as via an internet protocol, or a connectionless transport protocol such as UDP, UDP-lite, or the like.
- Such embodiments may be of use in, for instance, scaling up a network size (e.g. increasing the number of available ports in a system) without having to design and/or purchase a larger system, such as one having 15 ports that is correspondingly bulky and precludes use of a previously purchased 8-port unit.
- Networks comprising a plurality of units so connected may further preclude the need for an external or centralised server to communicate image, video, audio, or text data between network components.
- Such networks comprising connected units may further communicate via packetised data transfer in embodiments in which a processing unit comprises both transceivers for inbound/outbound data transfer (e.g. transceivers operable to receive and multicast packetised network signals) and a signal converter operable to packetise/de-packetise network signals.
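As an illustrative sketch only, and not part of the disclosed embodiments, the packetise/de-packetise role of such a signal converter can be modelled as splitting a raw source media frame into header-tagged packets sized for an Ethernet MTU and reassembling them on receipt; the header layout, field names, and sizes below are assumptions:

```python
import struct

HEADER_FMT = "!IIH"  # frame id, packet index, payload length (illustrative header)
HEADER_SIZE = struct.calcsize(HEADER_FMT)
MTU_PAYLOAD = 1400   # assumed payload budget under a 1500-byte Ethernet MTU

def packetise(frame_id: int, media: bytes, payload: int = MTU_PAYLOAD):
    """Split a raw source media frame into network packets with a small header."""
    packets = []
    for offset in range(0, len(media), payload):
        chunk = media[offset:offset + payload]
        header = struct.pack(HEADER_FMT, frame_id, offset // payload, len(chunk))
        packets.append(header + chunk)
    return packets

def depacketise(packets):
    """Reassemble a source media frame, tolerating out-of-order arrival."""
    ordered = sorted(packets,
                     key=lambda p: struct.unpack(HEADER_FMT, p[:HEADER_SIZE])[1])
    return b"".join(p[HEADER_SIZE:] for p in ordered)
```

Sorting on the packet index before reassembly reflects that a connectionless transport such as UDP does not guarantee ordering.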
- a network processing unit such as that of FIG. 8 A that may operate as an all-in-one rugged switch for sensor capture, streaming, processing, display, and the like.
- Various embodiments relate to a unit that may transmit real-time, low-latency sensor data across a secure network to processors and/or displays for, for instance, advanced warning and effective mission planning in military operations.
- Embodiments may further comprise a crew-centred design for intuitive systems that may reduce cognitive burden, and may have configurable architecture for standards-compliant platforms (e.g. vehicle platforms compliant with NGVA, GVA, VICTORY, or the like) that may be rapidly deployable, mission-configurable, and cost-effective.
- Units may be employed in, for instance, SWaP-C-optimised imaging platforms with high scalability and flexibility to incorporate future capabilities with minimal integration effort, such as plug-in AI solutions for machine learning-based tank detection and identification, driver assistance (e.g. changing soil conditions, gradients, obstacles, etc.), integration of GPU resources, and the like.
- FIG. 9 A shows a schematic illustrating an exemplary system 910 for providing scalable media acquisition and presentation, in accordance with various embodiments.
- a transceiver unit 920 may serve as an interface between one or more media acquisition units 930 and media presentation units 940 .
- Media acquisition units may comprise, for instance, one or more video cameras, microphones, or the like, or combination thereof
- media presentation units may comprise, for instance, one or more displays, speakers, or the like, or combination thereof.
- This exemplary embodiment shows an in situ camera-to-display system, but embodiments of the instant disclosure include systems where one or more (or indeed all) of the camera(s) (i.e. media acquisition device) and the presentation devices (i.e. screen or accessible data source providing information relating to the acquired media data in a digital form) are not in situ.
- the transceiver unit 920 may comprise a network processing unit as described above (e.g. that of FIG. 8 A ), and as such, may have any number of ports (e.g. 8 ports, such as elements 810 or 820 of FIG. 8 A ) that may be configured to provide connections for one or more of receiving and/or transmitting media signals (e.g. analog or digital source media signals).
- ports operable to accommodate one or more, or combination of, connections, such as Ethernet, general purpose input/output (GPIO), USB, controller area network (CAN) bus, RS-232/422/485, and the like.
- the transceiver unit 920 may further comprise a signal converter operable to packetise media data, such as source media signals, for communication over a network (e.g. a packetised communications network).
- a signal converter may additionally, or alternatively, be operable to convert packetised signals to output corresponding source media signals (e.g. analog or digital source media signals), to, for instance, presentation units 940 via ports, such as those indicated by elements 810 or 820 of FIG. 8 A .
- the signal converter is configured to accept electrical signals communicating video signals and convert them into packetized video transmission for communication over Ethernet-compliant communications networks.
- the transceiver unit may utilize video, image, and/or machine-vision interface standard protocols for interfacing source media signals with packetized communication networks.
- the transceiver unit or the signal converter thereof is configured to accept signal input from any one or more media acquisition devices, or types or classes thereof, which may communicate any number of types of media signals, including analog, digital or electrical signals, as well as packetized or other network communication signals.
- a transceiver may be configured to accept GigE Vision/network cameras (or other types of media acquisition devices that provide a packetized or network-enabled media signal in accordance with another interface, communication, and/or network protocol) into an input thereto or into one of the plurality of inputs thereof.
- a given transceiver unit may be configured to provide the media signal output to any one or more media presentation devices, or types or classes thereof, which may communicate any number of types of media signals, including analog, digital or electrical signals, as well as packetized or other network communication signals.
- a transceiver may be configured to communicate media signals, via an output thereof or via one of the plurality of outputs thereof, to GigE Vision/network displays or analysis devices (or other types of media presentation devices that accept a packetized or network-enabled media signal in accordance with another interface, communication, and/or network protocol).
- the transceiver unit 920 may further comprise a media data transceiver, or a packetised network media data transceiver, operable to send and/or receive media signals (e.g. packetised network media signals, such as those packetised by a signal converter in the same device 910 , or in another network device) over, for instance, a packetised communications network.
- a transceiver unit 920 may be operable to interface acquisition 930 and presentation 940 units in a system 910 through wired connections. While optionally operable to packetise source media signals over a network, a unit 920 may be configured in a system 910 to transfer analog and/or digital source media signals (e.g. audio data, image/video data, text data, etc.) from an acquisition unit 930 to a presentation unit 940 directly without signal conversion. Various embodiments may further relate to a transceiver unit 920 having a port (e.g. elements 810 or 820 of FIG. 8 A ) enabling bidirectional communication (i.e. one or more ports may be operable to both send and receive signals, for instance to communicate via an Ethernet connection).
- the system 910 of FIG. 9 A is operably coupled with a similar system 912 via a network 900 of the transceiver unit 920 and a second transceiver unit 922 having similar properties to that of the first transceiver unit 920 (e.g. a signal converter operable to packetise and/or convert source media signals and packetised network signals, a transceiver operable to send and/or receive packetised signals).
- while this example shows a packetised communications network 900 comprising two transceiver units 920 and 922 , various embodiments relate to highly scalable networks wherein any number of units may be added to the system.
- transceiver units may be added to such packetised communications networks by announcing their presence to the network 900 .
- a transceiver unit such as that of element 920 may have a unique address (e.g. a MAC address).
- Additional units, such as that of element 922 , may announce themselves to the network 900 with a corresponding unique address, and may then be incorporated within the network 900 .
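The announce-and-incorporate step described above can be sketched as follows; the JSON message format, field names, and registry structure are illustrative assumptions and are not taken from the disclosure:

```python
import json

class TransceiverRegistry:
    """Track transceiver units that have announced themselves to the network."""

    def __init__(self):
        self.units = {}  # unique address (e.g. MAC) -> advertised ports

    def handle_announcement(self, message: bytes) -> str:
        """Incorporate an announcing unit into the network and return its address."""
        ann = json.loads(message)
        self.units[ann["addr"]] = ann.get("ports", [])
        return ann["addr"]

def make_announcement(mac: str, ports) -> bytes:
    """Build the announcement message a new unit would send to the network."""
    return json.dumps({"addr": mac, "ports": ports}).encode()
```

Because every unit can keep such a registry itself, the units can discover one another peer-to-peer, consistent with the absence of a centralised server noted above.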
- Such networks, while optionally employing wired connections, may additionally, or alternatively, comprise any means known in the art for conveying signals, or packetised network media signals, such as an internet protocol, and/or a connectionless transport protocol such as UDP, UDP-lite, or the like.
- embodiments may relate to systems comprising transceiver units that may unidirectionally or bidirectionally communicate packetised network media signals with each other over the network 900 , and may further conform to communication that is compliant with, for instance, a GigE Vision standard, and/or a GigE Vision interface standard.
- a new transceiver unit that is accepted on to the network of the existing transceiver unit(s) can be added at any time in connection with any type of media component (i.e. whether or not directly connective or compatible with an existing installed base of media components).
- the local network made up of the transceiver units does not access a centralized server, system, or network; rather, the units communicate amongst themselves directly. Accordingly, the network of transceiver units can be scaled at any time and need not have access to an external network.
- a second transceiver unit 922 may be a component of a respective system 912 having a respective media acquisition unit(s) 932 , and/or may be connected to a respective presentation unit(s) 942 .
- the system 912 may comprise media acquisition units that differ in number or type (e.g. audio/visual), or combination thereof, from those of system 910 .
- Various embodiments may further relate to sharing of source media data between various components of the network 900 via, for instance, communication between transceiver units 920 and 922 .
- source media data acquired by an acquisition unit 930 of system 910 may be received at a port of the transceiver unit 920 as a source media signal.
- the source media signal may then be packetised and sent over the network 900 as a packetised network media signal via, respectively, a signal converter and a packetised network media data transceiver associated with the unit 920 .
- the packetised network media signal may then be received by a packetised network media data transceiver associated with another unit 922 of the network, which may convert the packetised network media signal to a source media signal using a signal converter associated therewith, and transmit the source media signal to a presentation unit 942 of the system 912 .
- media data acquired via acquisition units 932 may be received as source media signals by transceiver unit 922 , packetised and transmitted to unit 920 over the network 900 , converted to a source media data signal, and then presented via a presentation unit 940 .
- various embodiments may relate to systems wherein one or more transceiver units 920 and/or 922 , or additional units in the system, are not operably coupled to one or more of media acquisition units and presentation units.
- In an exemplary scenario wherein each transceiver unit of a system comprises 8 ports, such as those of elements 810 and 820 of FIG. 8 A , and an application requires 12 input video streams (e.g. to obtain a 360-degree view surrounding a tank) to be output to a single presentation unit (e.g. a display screen with 12 views), a system comprising two transceiver units, for example units 920 and 922 in FIG. 9 B , may be employed.
- each transceiver unit 920 and 922 may, for instance, be operably coupled with 6 media acquisition units 930 and 932 , respectively.
- a port on, for instance, unit 920 may then be employed to output 12 video streams to a single display unit 940 , while unit 922 may not output a source media signal.
- output may be shared between ports on both transceiver units 920 and 922 , while only, for instance, unit 922 receives as input a source audio signal.
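The port arithmetic in this 12-camera example can be checked with a short sketch, assuming 8-port units as in FIG. 8 A ; the helper names and the policy of spreading inputs evenly are illustrative assumptions:

```python
import math

PORTS_PER_UNIT = 8  # matching the 8-port units of FIG. 8A

def units_required(n_streams: int, reserved_outputs: int = 1) -> int:
    """Daisy-chained units needed for n input streams plus reserved output ports
    (e.g. one port feeding the single display in the example)."""
    total_ports = n_streams + reserved_outputs
    return math.ceil(total_ports / PORTS_PER_UNIT)

def allocate(n_streams: int, n_units: int):
    """Spread input streams across units as evenly as possible."""
    base, extra = divmod(n_streams, n_units)
    return [base + (1 if i < extra else 0) for i in range(n_units)]
```

With 12 input streams and one reserved display output, two 8-port units suffice, with 6 inputs per unit, matching the scenario above.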
- any limitation, in situ or otherwise, with respect to the number of physical input ports available for installing additional media acquisition devices, is addressed by adding another transceiver unit, which may have one or more input ports.
- a limitation with respect to the number of outputs, in situ or otherwise, available for connecting a media presentation device is addressed by adding another transceiver unit.
- the input of any of the transceiver devices is available at the output of any transceiver unit (including the same or one of the other transceiver devices).
- network components such as transceiver units 920 and 922 , may be situated locally, i.e. in situ, such as on a vehicle (e.g. a tank, armored car, etc.), and may thus comprise a local area network (LAN).
- various embodiments may relate to a network 900 wherein all network components, including media acquisition 930 and 932 and presentation 940 and 942 units, may similarly comprise a network in situ. In other embodiments, the network components may be remotely located.
- systems may comprise camera-to-display systems (e.g. a system 910 comprises a camera 930 which acquires source media data which may be transferred to a display 940 as a source media signal via the transceiver unit 920 coupled to components using wire-based connections), or may comprise complex, fully networked applications integrating different sensors and display types, switching, processing, and recording units.
- Such systems may be employed for, for instance, local situational awareness (LSA), driver vision enhancer (DVE) applications, or the like, or may complementarily be applied as, for instance, an all-in-one solution to route video sources to a display or processing unit without going through a network.
- interfaces can be scaled down from, for instance, that presented in FIG. 8 A , and system components may be adapted to scale down devices.
- custom sensor adapters may acquire sensor streams, apply processing, and send signals to a network.
- a display adaptor may acquire streams from a network, apply processing, and send signals to a display.
- image processors may acquire streams from a network, apply processing, and send processed signals back to a network.
- a transceiver unit may be coupled with a smart video switcher, as shown in FIG. 10 A , as a drop-in solution that provides the benefits of sensor networking that meets video performance requirements outlined in, for instance, Def Stan 00-082 (VIVOE) and/or GigE Vision (GEV).
- manufacturers may design vehicle electronics platforms that comply with STANAG 4754 (NGVA), Def Stan 23-009 (GVA), and/or VICTORY guidelines.
- a video switcher 1000 may comprise a designated number of one or more types of media ports 1010 , 1020 , and 1030 (e.g. RS-170, NTSC, PAL, HD-SDI, CAN bus, USB, Ethernet, etc.), which may be operable to receive as input and/or output media signals (e.g. video, audio, text, etc.).
- media signals may be output to individual displays, and various functionalities may offer system-level redundancy. For instance, dual ethernet capability may enable redundancy and more effective communications capabilities, or various inputs may comprise bypass channels for additional redundancy during, for instance, degraded operating situations.
- FIG. 10 B shows one such exemplary embodiment, wherein the system 910 of FIG. 9 A is integrated with a video switcher 1000 of FIG. 10 A .
- the switch 1000 may be operably coupled, for instance via a GigE connection and/or over a packetised communications network, to the transceiver unit 920 , the latter having any number of media acquisition units 930 and media presentation units 940 coupled thereto.
- the switch may further be coupled to a designated number of additional media acquisition units 1030 and additional presentation units 1040 , for instance via GigE Vision (GEV) or VIVOE standards-compliant connections.
- an acquisition unit may include any device configured to collect external data. While many of the embodiments referred to above relate to cameras or visible and non-visible light sensors, any sensors may be used to acquire external data for communication over a network.
- acquisition units and presentation units may comprise machine-vision components, as well as non-vision-related components. While machine-vision generally refers to imaging-based automatic inspection and analysis for such applications as automatic inspection, process control, and robot guidance, usually in industry, non-vision-related sensors or acquisition units are used in some embodiments.
- any device configured to detect changes in the environment (including but not limited to temperature, proximity, pressure, chemical, biochemical, or monitoring sensors, among others) may be used for an acquisition unit.
- with respect to presentation units, this may refer to a visual image or video display, but it may also refer to an automatic or computer analyser of images or video (or other sensor output).
- presentation unit may refer to a communication endpoint device that communicates the output media signal onto another network or to another presentation unit located elsewhere (wherein such other presentation unit may be on another network or otherwise).
- the media components may require a wired connection into a transceiver unit.
- source media ports on transceiver units may support wireless connections, e.g. Bluetooth.
- the media signal may already be communicated as a packetized network signal (e.g. the acquisition unit may output GigE Vision compliant media signals), in which case the interfacing activities may be minimal, redundant, or not required.
- the transceiver units may comprise, or may comprise access to, media signal processing units that carry out additional functionality that could not otherwise be carried out other than by significant post-processing on custom-programmed computing devices.
- a given media acquisition unit and a given media presentation device may not be physically or datalink compatible; accordingly, a transceiver unit that has the appropriate port for a given, but previously incompatible connection, can be added, thereby rendering the incompatible media component compatible with the other previously installed media components.
- Other examples may include overlaying or combining of media data from different media acquisition units and different types of media acquisition units.
- Another example may include modification of media data to enhance or reduce the relative importance of certain information; this could include automated recognition of certain features or characteristics important in a machine vision context, such as distinguishing between natural material and camouflage material, or recently disturbed ground cover from non-disturbed ground cover (the latter two examples being useful in association with military vehicles).
- the media signal itself could be modified so that a subsequent or legacy media component would display or otherwise recognize or distinguish the same feature or characteristic.
- Embodiments comprising a video switch may provide a simple and easy way to integrate cameras and/or sensors in a real-time network.
- various embodiments further relate to fully networked systems, such as fully networked vetronics system designs with scalable, modular platforms comprising RuggedCONNECT system and switcher architectures integrating different sensor and display types with (dual) ethernet capabilities and in-built system redundancy.
- FIG. 11 shows an exemplary, non-limiting embodiment of such a system 1100 .
- media acquisition 1130 and presentation 1140 systems may be interfaced with a transceiver unit, or RuggedCONNECT system 1120 , which may in turn be operably coupled, such as via a GigE Vision standard connection, to a switch 1110 .
- the switch 1110 may further be coupled with a second transceiver unit 1122 having respective media acquisition units 1132 , as well as having additional sources of input 1135 of, for instance, a plurality of radio inputs and a mission computer.
- the switch 1110 may be further connected, for instance via Ethernet protocols, or via a packetised communications network, to an additional switch 1112 having respective inputs/outputs.
- the switch 1110 in this example may have a variety of outputs.
- processors such as an image processor 1150 may be operable to process media data for a variety of purposes (e.g. terrain analysis, driver assistance, etc.), which may be optionally returned to the switch 1110 for relay to external devices.
- adapters such as display adapters 1141 and 1143 may interface the switch 1110 with respective media presentation units 1142 and 1144 .
- As shown in FIG. 11 , various embodiments of a system 1100 with such scalability may further comprise additional components for increased functionality, number of users, inputs, outputs, and the like, while remaining within the scope of the disclosure.
- communication between transceiver units may be in accordance with a multicast protocol.
- the multicast protocol provides for simultaneous transmission (wirelessly, via wired connection, or both) to a plurality or all of the other transceivers.
- Multicast addressing can be used at the link layer (layer 2 in the OSI model), such as Ethernet multicast, and at the internet layer (layer 3 in the OSI model) for Internet Protocol Version 4 (IPv4) or Version 6 (IPv6) multicast.
- a multicast address may be used, which is a logical identifier for a group, or all, of the transceivers in the network of transceivers that are available to process datagrams or frames intended to be multicast for a designated network service.
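The layer-3 group addressing described above can be sketched with standard UDP sockets: a transceiver joins an IPv4 multicast group, and a single transmission to the group address reaches every member. The group address 239.1.1.1 and port 5004 are illustrative choices (from the administratively scoped range), not values taken from the disclosure.

```python
import socket
import struct

GROUP = "239.1.1.1"  # illustrative administratively scoped IPv4 multicast address
PORT = 5004          # illustrative UDP port

def membership_request(group: str, iface: str = "0.0.0.0") -> bytes:
    """Pack the ip_mreq structure that IP_ADD_MEMBERSHIP expects:
    the multicast group address followed by the local interface address."""
    return struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))

def make_receiver(group: str = GROUP, port: int = PORT) -> socket.socket:
    """Bind a UDP socket and join the multicast group, so that datagrams
    sent to the group are delivered to this transceiver as well."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    membership_request(group))
    return sock

def send_to_group(payload: bytes, group: str = GROUP, port: int = PORT) -> None:
    """One sendto() call reaches all transceivers that joined the group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on-link
    sock.sendto(payload, (group, port))
    sock.close()
```

The same pattern applies per media stream: each packetized media source may be assigned its own group address, so presentation devices subscribe only to the streams they need.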
- communication therebetween may be unicasted to all or a specific subset of the other active transceivers.
- alternatives to multicast and unicast are also available in some embodiments, for example: broadcast (transmission to all available network nodes); anycast (where a particular transceiver or group of transceivers is identified or targeted as the destination for a given communication from a given transceiver or transceiver input); or geocast (in which transceivers communicate to other transceivers that are within a particular geographic location).
- some or all of the transceivers in a group of transceivers may be programmed or configured to communicate packetized network signals to any one, some, or all transceivers based on the communication protocol or characteristics of, for example, the source media signal, the input type or input transceiver for the incoming media source signal, the desired presentation devices that may be associated with a given input type or input port type, or other characteristics.
- a scalable media distribution system operable to interface with a plurality of media data components, the system comprising two or more transceiver units that are operable to transfer packetized network media signals over said packetized communications network in accordance with one or more of a variety of routing schemes, and are operable to receive packetized network media signals transferred over said packetized communications network in accordance with one or more of a variety of said routing schemes.
- the routing schemes may be selected from any one of the following: unicast, multicast, broadcast, anycast, and geocast.
- transceiver units can be configured to selectively transfer or receive packetized network media signals to a subset of other transceiver units, wherein the selectivity of the subset of transceivers is based on characteristics relating to one or more of the following: the media data components, the source media signal, one or more of said source media signal ports, the source media data, the source media signals, and the transceiver unit.
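The selective routing described above can be sketched as a small dispatch: each packetized network media signal carries characteristics (source type, input port, region), and a transceiver derives from them both a routing scheme and the destination subset. All names below (`Signal`, `Transceiver`, `route`, the example source types) are illustrative assumptions, not terms from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Scheme(Enum):
    UNICAST = auto()
    MULTICAST = auto()
    BROADCAST = auto()
    ANYCAST = auto()
    GEOCAST = auto()

@dataclass(frozen=True)
class Transceiver:
    name: str
    accepts: frozenset   # media types this unit can present, e.g. {"video"}
    region: str          # used only for geocast selection

@dataclass(frozen=True)
class Signal:
    source_type: str     # e.g. "camera", "radio", "mission_computer"
    input_port: str
    region: str

def route(sig: Signal, peers: list[Transceiver]) -> tuple[Scheme, list[Transceiver]]:
    """Pick a routing scheme and a destination subset from signal characteristics."""
    if sig.source_type == "camera":
        # Video is multicast only to units that present video.
        return Scheme.MULTICAST, [p for p in peers if "video" in p.accepts]
    if sig.source_type == "mission_computer":
        # Control/status traffic goes to every available node.
        return Scheme.BROADCAST, list(peers)
    # Default in this sketch: geocast to peers in the signal's region.
    return Scheme.GEOCAST, [p for p in peers if p.region == sig.region]
```

A real implementation would map each (scheme, subset) pair onto the corresponding network mechanism, e.g. a multicast group join for `MULTICAST` or per-peer unicast sends for a small subset.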
- the devices and methods disclosed elsewhere herein may also, as an alternative (in the same or different embodiments), use different routing schemes and/or selective transmission and/or receiving of packetized network media signals.
- the routing schemes may, in various embodiments, include wired or wireless communication.
- the system 1200 of FIG. 12 shows an exemplary combination of devices that may be employed in a military vehicle 1220 .
- the system 1200 may employ, for instance, an eBUS-ISR network for ruggedised GigE Vision with VIVOE cameras, sensors, and video equipment.
- such equipment may be provided from multiple vendors, and yet employed within the same system 1200 due to the flexibility of a rugged Ethernet switch 1210, in accordance with at least one embodiment.
- a comprehensive API may support Linux, Windows, or other operating systems, while allowing for various processing applications via an eBUS-ISR, removing the limitation of users being tied to manufacturer-specific SDKs and further enabling development of systems using, for instance, any GigE Vision and/or VIVOE-compliant sensors. Furthermore, by enabling a shared SDK for all transport functions, designers may preserve existing software investments, in accordance with various embodiments.
- FIG. 13 shows a potential network 1300 that highlights the configurability of a RuggedCONNECT's architecture that enables hosting of multiple mini-PCIe and M2 daughter cards to address a scalable range of sensor and display interfaces to meet a variety of application demands, such as those with SWaP-C concerns. Furthermore, such a network 1300 enables ready implementation of future capabilities (e.g. the integration of additional devices) to, for instance, increase mission effectiveness with minimum integration effort.
- the high-performance networking capabilities of a RuggedCONNECT system may allow for combination with powerful GPU resources of, for instance, an NVIDIA Jetson TX2i for application-specific processing and graphics overlay for decision support and/or to reduce cognitive burden and increase mission effectiveness.
- Various further embodiments relate to a scalable method of acquiring and presenting media using any of the abovementioned embodiments of a communications system or network.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Security & Cryptography (AREA)
- Small-Scale Networks (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
Description
Claims (26)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CA3080972 | 2020-05-15 | ||
| CA3080972A CA3080972C (en) | 2020-05-15 | 2020-05-15 | Scalable decentralized media distribution |
| PCT/CA2021/050668 WO2021226723A1 (en) | 2020-05-15 | 2021-05-14 | Scalable decentralized media distribution |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230231896A1 US20230231896A1 (en) | 2023-07-20 |
| US12483613B2 true US12483613B2 (en) | 2025-11-25 |
Family
ID=78525883
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/925,207 Active 2042-03-12 US12483613B2 (en) | 2020-05-15 | 2021-05-14 | Scalable decentralized media distribution |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12483613B2 (en) |
| EP (1) | EP4150860A4 (en) |
| CA (1) | CA3080972C (en) |
| WO (1) | WO2021226723A1 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240095205A1 (en) * | 2022-09-15 | 2024-03-21 | Mellanox Technologies, Ltd. | User-defined peripheral-bus device implementation |
- 2020
  - 2020-05-15 CA CA3080972A patent/CA3080972C/en active Active
- 2021
  - 2021-05-14 EP EP21804992.2A patent/EP4150860A4/en active Pending
  - 2021-05-14 US US17/925,207 patent/US12483613B2/en active Active
  - 2021-05-14 WO PCT/CA2021/050668 patent/WO2021226723A1/en not_active Ceased
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010120707A1 (en) | 2009-04-14 | 2010-10-21 | Bae Systems Information And Electronic Systems Integration Inc. | Vehicle-mountable imaging systems and methods |
| US20100295945A1 (en) * | 2009-04-14 | 2010-11-25 | Danny Plemons | Vehicle-Mountable Imaging Systems and Methods |
| US20110050925A1 (en) * | 2009-08-28 | 2011-03-03 | Canon Kabushiki Kaisha | Control apparatus, control system, command transmission method, and non-transitory computer-readable storage medium |
| US20110141221A1 (en) * | 2009-12-14 | 2011-06-16 | At&T Intellectual Property I, L.P. | Video Conference System and Method Using Multicast and Unicast Transmissions |
| US9106428B2 (en) * | 2012-10-04 | 2015-08-11 | Broadcom Corporation | Multicast switching for distributed devices |
| US20170310936A1 (en) * | 2014-11-07 | 2017-10-26 | BAE Systems Hägglunds Aktiebolag | Situation awareness system and method for situation awareness in a combat vehicle |
| US20160234263A1 (en) * | 2015-02-09 | 2016-08-11 | Shigeru Nakamura | Management system, communication system, management method, and recording medium |
Non-Patent Citations (10)
| Title |
|---|
| International Search Report corresponding to International Patent Application No. PCT/CA2021/050668 dated Aug. 26, 2021. |
| John Phillips—"Choosing the Right Video Interface for Military Vision Systems", Proceedings of SPIE, IEEE, US, vol. 9481, May 13, 2015, pp. 948111-1-948111-8. |
| Pleora Technologies, Local Situational Awareness Design and Military and Machine Vision Standards, Retrieved from https://www.pleora.com/resources/whitepapers/local-situational-awareness-design-and-military-and-machine-vision-standards/ (Year: 2017). * |
| RuggedCONNECT Smart Video Switcher and Plug-in AI for Local Situational Awareness, Rev 1.0, Pleora Technologies Inc., May 16, 2019 (May 16, 2019), Retrieved from <http://pleora.bentech-taiwan.com/RuggedCONNECT%20Smart%20Video%20Switcher.pdf>. |
| Written Opinion corresponding to International Patent Application No. PCT/CA2021/050668 dated Aug. 26, 2021. |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4150860A1 (en) | 2023-03-22 |
| US20230231896A1 (en) | 2023-07-20 |
| WO2021226723A1 (en) | 2021-11-18 |
| EP4150860A4 (en) | 2024-05-22 |
| CA3080972A1 (en) | 2021-11-15 |
| CA3080972C (en) | 2025-05-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20140249695A1 (en) | Low latency data link system and method | |
| US10306689B2 (en) | Systems and methods for shared mixed reality experiences using digital, physical, temporal or spatial discovery services | |
| CN108700889B (en) | Control method, remote monitoring equipment, remote controller, server and streaming media server | |
| CN110391975B (en) | Information interaction system based on vehicle-mounted Ethernet and vehicle | |
| US9955115B2 (en) | Facilitating wide view video conferencing through a drone network | |
| US20140232616A1 (en) | Proximity-based multi-display configuration | |
| JP2010273125A (en) | Monitoring system, imaging device, analysis device, and monitoring method | |
| US7149660B2 (en) | Sensor application integration framework (SAIF) | |
| CN106982345A (en) | Facilitating wide-angle view video conferencing over UAV networks | |
| US20200213195A1 (en) | Method and device for configuring identical network components, and transportation vehicle | |
| US12483613B2 (en) | Scalable decentralized media distribution | |
| CN107005673A (en) | Head | |
| WO2020155037A1 (en) | Multi-load multi-path image transmission method, control system and terminal, and unmanned aerial vehicle and server | |
| US20180173647A1 (en) | Modular device, system, and method for reconfigurable data distribution | |
| GB2512184A (en) | Systems and methods for video distribution | |
| US20230010445A1 (en) | Methods and systems for generating access instructions based on vehicle seat occupancy status | |
| CN103200056A (en) | Security check system based on network and method of security check system for achieving cooperative work | |
| CN114205759B (en) | Display control method of vehicle-mounted Ethernet display screen | |
| CN117565805A (en) | TBox electronic system integrating around-view monitoring and V2X communication | |
| CN116266820A (en) | Apparatus, system and method for routing multiple diagnostic pathways | |
| CN206117885U (en) | Cradle head | |
| Phillips | Choosing the right video interface for military vision systems | |
| US20050028214A1 (en) | Visual monitoring system and method for use with in-flight air telephone on a mobile platform | |
| CN111131761B (en) | Distributed tiled display system and data transmission method | |
| CN205142277U (en) | A double -screen display system that is used for carrying out information transfer with cloud platform |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | APPLICATION UNDERGOING PREEXAM PROCESSING |
| | AS | Assignment | Owner name: PLEORA TECHNOLOGIES INC., CANADA. ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WARKENTIN, CHRIS ERIC;HOU, JONATHAN CHAPMAN;TURZO, ROBERT;AND OTHERS;SIGNING DATES FROM 20211103 TO 20211124;REEL/FRAME:061948/0029 |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | PATENTED CASE |