GB2613459A - A method of managing display data - Google Patents
- Publication number
- GB2613459A (application GB2217132.6A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- image data
- frame
- metadata
- compressed image
- compressed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G06F3/1462—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay with means for detecting differences between the image stored in the host and the images displayed on the remote displays
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/02—Handling of images in compressed format, e.g. JPEG, MPEG
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2350/00—Solving problems of bandwidth in display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/20—Details of the management of multiple sources of image data
Abstract
Images are displayed on a shared display (22) by receiving frames (21) of image data from devices (11A, 11B, 11C), each frame including compressed image data portion(s) (32) and respective portion metadata (34) (e.g. indicating size, compression parameters) associated with each of the compressed portion(s). The metadata associated with each portion indicates a location of the portion in relation to the frame (33). The portion metadata for each portion is amended, without decompressing the compressed portions, to indicate a location of the compressed portion in a composited image frame, the composited frame being a composite of the received frames (e.g. forming a grid of non-overlapping portions). The compressed portion(s) included in the frames, and the amended portion metadata associated with each compressed portion, are transmitted to the shared display. The rate at which the compressed image data portions are transmitted to the shared display may be independent of any rates at which the compressed image data portions are received from the devices.
Description
A Method of Managing Display Data
Background
Display data is often generated at one device before being transmitted, for example wirelessly, to another device where it is displayed to a user. Especially as video quality improves and the volume of a frame of display data therefore increases, it is becoming increasingly desirable to compress display data for transmission from the device where it is generated to the device where it is displayed. At the same time, it is desirable in a collaborative setting such as a meeting room system to be able to show display input from multiple connected computing devices on a single display panel, which means that the input from the connected computing devices must be composited into a single frame or series of single frames. This introduces a number of problems, one of which is exacerbated by the compression of the display data.
Conventionally, in order to carry out composition into a single frame, a compositor must be able to access the raw display data to be composited. This means that in a collaborative system the data must either be transmitted uncompressed and then compressed after composition, if the compositor is to transmit compressed data, or must be decompressed, composited, and recompressed, introducing significant delay and inefficiency. Similar problems arise if data is encrypted, and it may be undesirable to allow an intermediate device such as a compositor to access unencrypted data. This currently makes it impossible to properly use encrypted data in such a system.
The methods and devices of the invention seek to solve or at least mitigate these problems.
Summary
Accordingly, in one aspect, the invention provides a method of managing display data from a plurality of originating devices for display on a shared display, the method comprising: receiving, at a compositor from each of the plurality of originating devices, compressed and/or encrypted image data portions of a frame of image data; receiving, at the compositor from each of the plurality of originating devices, portion metadata for each of the compressed and/or encrypted image data portions indicating a location of the compressed and/or encrypted image data portions in the frame of image data from a particular originating device, a size of the compressed and/or encrypted image data portions, and compression and/or encryption parameters and/or protocols; receiving, at the compositor from each of the plurality of originating devices, frame metadata for the frame of image data indicating a size of the frame of image data and a format of the frame of image data; compositing, by the compositor, the compressed and/or encrypted image data portions without decompressing and/or decrypting the compressed and/or encrypted image data portions, based on the portion and frame metadata, by generating composited frame metadata for the composited image frame indicating a size of the composited image frame and a format of the composited image frame and amending the portion metadata for each of the compressed and/or encrypted image data portions from the plurality of originating devices to indicate a location of the compressed and/or encrypted image data portions in the composited image frame; and transmitting, by the compositor to a display control device, the compressed and/or encrypted image data portions without decompressing and/or decrypting the compressed and/or encrypted image data portions, the composited frame metadata and the amended portion metadata including the location of the compressed and/or encrypted image data portions in the composited image frame, and the compression and/or encryption parameters and/or protocols.
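The core of this aspect - relocating compressed portions by amending their metadata only - can be sketched in Python. The `PortionMeta` structure, the `composite` function, and their fields are illustrative names, not part of the claimed method:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PortionMeta:
    x: int            # location of the portion within its source frame
    y: int
    width: int
    height: int
    codec: str        # compression/encryption parameters or protocol tag
    payload: bytes    # compressed and/or encrypted image data (opaque)

def composite(frames: dict[str, list[PortionMeta]],
              offsets: dict[str, tuple[int, int]]) -> list[PortionMeta]:
    """Relocate each device's portions into the composited frame by amending
    their location metadata only; the payload bytes are never decompressed,
    decrypted, or otherwise touched."""
    out = []
    for device, portions in frames.items():
        dx, dy = offsets[device]
        for p in portions:
            out.append(replace(p, x=p.x + dx, y=p.y + dy))
    return out
```

Note that `composite` never inspects `payload`, so it works identically for compressed and encrypted data.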
In one embodiment, compositing comprises amending the portion metadata to indicate a different size of the compressed and/or encrypted image data portions in the composited image frame.
In embodiments, different image data portions may be compressed and/or encrypted using different compression and/or encryption parameters and/or protocols. The different image data portions may be from different ones of the plurality of originating devices or may be from a same originating device.
According to one embodiment, compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged to maintain the frame of image data from each of the plurality of originating devices separately in the composited image frame.
Preferably, compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged in a grid-like pattern in the composited image frame.
The compositing preferably comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices do not overlap in the composited image frame.
In an embodiment, the method further comprises receiving, by the compositor, instructions indicating where the frames of image data from different ones of the plurality of originating devices are to be arranged in the composited image frame, and wherein compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged in the composited image frame according to the received instructions.
According to a second aspect, the invention provides a compositor configured to perform the method described above.
According to a third aspect, the invention provides a display system comprising: a compositor as described above, and a display control device configured to receive from the compositor the compressed and/or encrypted image data portions, the composited frame metadata and the amended portion metadata, the display control device including a decompressing and/or decryption module configured to decompress and/or decrypt the compressed and/or encrypted image data portions based on the compression and/or encryption parameters and/or protocols, the display control device further being configured to send the decompressed and/or decrypted image data portions to a display device to display the decompressed and/or decrypted image data portions at locations according to the composited frame metadata and the amended portion metadata.
According to a further aspect, the invention provides a method of compressing and transmitting display data from two or more computing devices for display on one or more display panels, comprising:
1. Each computing device generates an initial frame of display data;
2. Each computing device compresses the initial frame of display data;
3. Each computing device transmits its compressed initial frame of display data to a compositor together with compression information;
4. The compositor composites the received initial frames of display data into one or more set(s) of remapped data without decompressing the image data, using metadata;
5. The compositor transmits the set(s) of remapped data to a decompressor, such set(s) of remapped data incorporating metadata and the compression information;
6. The decompressor uses the metadata and the compression information to decompress the image data contained in the remapped data, producing one or more final image(s); and
7. The decompressor sends the final image(s) to the display panel(s) for display.
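The seven steps above can be sketched end-to-end as follows. This is a minimal illustration, not the claimed implementation: `zlib` stands in for whatever per-device compression algorithm is used, and all function and field names are assumptions.

```python
import zlib

def device_send(frame: bytes, level: int):
    """Steps 1-3: generate, compress, and transmit with compression info."""
    return zlib.compress(frame, level), {"algorithm": "zlib", "level": level}

def compositor(received: dict, layout: dict):
    """Steps 4-5: remap without decompressing; forward the compression info.
    Note: no decompression call appears anywhere in this function."""
    return [
        {"device": dev, "position": layout[dev], "info": info, "data": blob}
        for dev, (blob, info) in received.items()
    ]

def decompressor(remapped):
    """Steps 6-7: decompress each region using the forwarded info."""
    return {
        part["device"]: (part["position"], zlib.decompress(part["data"]))
        for part in remapped
    }
```

The key property is that the compressed payload passes through the compositor byte-for-byte unchanged; only the accompanying metadata is rewritten.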
This allows multiple computing devices to transmit compressed display data in order to create a combined display, while avoiding the need to decompress the received data prior to composition and recompress it afterwards, which otherwise adds to the time and computing effort required for composition. It also means that confidential data can be transmitted more securely, as only the computing device that produced it and the decompressor need to be able to decompress it.
Accordingly, for this purpose, "compression" may also or instead mean encryption.
The computing devices may all compress their initial frames using the same compression algorithm, either with the same or different parameters, or may use different compression algorithms as appropriate to the different types of data. The type of compression used and appropriate parameters are then sent alongside the data as part of the compression information. The metadata used by the compositor and the decompressor includes the locations of the different initial frames within the final image(s) in order to assist with composition and decompression. The compositor may have an internal clock rate independent of the computing devices and produce remapped data according to that clock rate. This takes advantage of intelligence within the compositor in order to reduce the need for synchronisation of the clients, as well as further obfuscating the exact activity of the clients.
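The independent clock rate can be illustrated with a deterministic sketch (all names hypothetical): the compositor emits one composited output per tick of its own clock, reusing the most recent frame from each client, so its output rate does not depend on any client's update rate.

```python
def run_compositor(client_updates, compositor_ticks):
    """client_updates: {device: {arrival_tick: frame}} - clients update at
    arbitrary, unsynchronised times.  The compositor emits one output per
    tick of its OWN clock, repeating the latest frame from each client."""
    latest = {dev: None for dev in client_updates}
    outputs = []
    for tick in compositor_ticks:
        for dev, updates in client_updates.items():
            # absorb anything that has arrived up to this tick
            arrived = [t for t in updates if t <= tick]
            if arrived:
                latest[dev] = updates[max(arrived)]
        outputs.append(dict(latest))
    return outputs
```

Because the output is sampled rather than event-driven, an observer of the composited stream cannot infer the exact update times of individual clients.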
Brief Description of the Drawings
Embodiments of the invention will now be more fully described, by way of example, with reference to the drawings, of which:
Figure 1 shows a basic network topography;
Figure 2 shows the conventional transmission of frames of display data;
Figure 3 shows the transmission of frames of display data according to an embodiment of the invention;
Figure 4 shows a simplified block diagram of an example compositor;
Figure 5 shows a conceptual version of remapped data, rendered as the tile layout of an example frame;
Figure 6 shows a second basic network topography;
Figure 7 shows the transmission of frames of display data according to a second embodiment of the invention; and
Figure 8 shows the use of the invention for regulating update rates.
Detailed Description of the Drawings
For the purposes of this description, the following terms will be used:
* Initial frame: The frame of display data produced by each computing device and transmitted to the compositor.
* Remapped data: The display data and additional and modified metadata produced by the compositor and transmitted to the display control device.
* Final image(s): The display data generated from the remapped data, which is/are displayed on the display device(s).
Figure 1 shows the architecture of a basic arrangement such as could be used for an embodiment of the invention. In this embodiment, three computing devices [11] are connected to a central compositor [12] and are able to transmit display data to it. They may also be able to transmit and/or receive other data such as audio data and user interaction data, for example touch information passed to them from a display device [14] if this has a touchscreen. For the purposes of this description, the main purpose of the compositor [12] is to composite frames of display data received from the computing devices [11] into frames for display on the display device [14].
The compositor [12] is in turn connected to a display control device [13], which receives the composited frames from the compositor [12] and prepares them for display on the display device [14], then passes them to the display device [14] for display.
The connections between the devices [11, 12, 13, 14] may be over any appropriate media: wired or wireless, and either local or across a network connection, including the internet. Accordingly, the computing devices [11] may be remotely located compared to the compositor [12] and one another. Some devices [11, 12, 13, 14] may also be co-located such that they share a casing and appear to be a single device. For example, the display control device [13] may be built into the casing of a display device [14].
Figure 2 shows a conventional transmission of frames of display data. In this example, the frames of display data produced by the computing devices [11] are not compressed at any point in the process. This may not be desirable, however, especially where the computing devices [11] are not local to the compositor [12] or are transmitting data across a bandwidth-limited connection such as USB or a wireless connection, when it is preferable to compress the data in order to reduce its volume for transmission. Furthermore, where data is proprietary - for example, a clip from a film - it may be desirable to encrypt the data. Accordingly, for this purpose "compression" may include or be replaced by encryption.
Each computing device [11A, 11B, 11C] produces a frame of display data [21A, 21B, 21C] according to the operation of its internal programming, which may include user applications, an operating system, etc. In this example, the first computing device [11A] produces a frame of display data [21A] showing a heart, the second computing device [11B] produces a frame of display data [21B] showing a star, and the third computing device [11C] produces background data [21C], here shown as a plain field hatched with dots. In some embodiments, the computing device [11C] which produces background data is built into the compositor [12], but it is shown separately here for clarity.
The three frames of display data [21] are transmitted to the compositor [12], which uses them to produce a single frame [22] - the final image - for display in accordance with instructions. This may involve applying transformations to the received data such as scaling the frames, overlapping them such that one frame is partially hidden, etc. For example, here the frame [21C] produced by the third computing device [11C] is partially hidden by the frames [21A, 21B] produced by the other two computing devices [11A, 11B], which have themselves been scaled to fit within the final image [22].
In some embodiments, the initial frames [21] are compressed by the computing devices [11] prior to transmission to the compositor [12]. However, conventionally the compositor [12] must decompress them prior to composition in order to apply transformations and produce the final image [22], which it may then recompress prior to transmission. This repeated decompression and recompression is inefficient and results in wasted time and processing power. It also means that the compositor [12] will have access to the raw display data during the composition process, which in some cases may not be desirable, for example where the data is proprietary and the compositor [12] is not necessarily a trusted device.
Figure 3 shows a similar process of the transmission of frames of display data, but in this case the process is carried out according to an embodiment of the invention.
As described with reference to Figure 2, the three computing devices [11A, 11B, 11C] each produce an initial frame [21A, 21B, 21C] in accordance with the operation of their internal programming. The initial frames [21] are then compressed [31] to produce compressed initial frames [32], and these are sent to the compositor [12] together with compression information [34]. This information may include details of the compression algorithm used, any key parameters such as the level of quantisation applied, etc. The compressed initial frames [32] may also be accompanied by internal metadata [34]. This is especially important where a block-based encoding method has been used and the compressed initial frame [32] is or can be transmitted in parts rather than necessarily being transmitted in a continuous stream, since each part will have its own location within the initial frame [21/32], which will be sent to the compositor [12] as part of the internal metadata [34].
The internal metadata [34] may also include the dimensions of each part of the compressed initial frame [32], and compression information [34] may also be transmitted on a per-part basis. Furthermore, the internal metadata [34] may include the dimensions and/or volume of data of the entire initial frame [21/32], its format, any buffer management in use, any update batching in use, and special settings for the decompressor [13], which may also or instead be included in the compression information [34].
The compressed initial frames [32] are received by the compositor [12], which according to the methods of the invention does not need to decompress them in order to generate remapped data [33/35] and thus ultimately a final image [22], but is able to composite them in their compressed form. The internal workings of an example compositor [12] arranged according to the invention are shown in Figure 4.
Figure 4 shows a simplified block diagram of an example compositor [12]. Compressed initial frames [32] are received from the computing devices [11] along with compression information and possibly other metadata [34]. The compressed initial frames [32] are stored in a frame buffer [41] and the compression information [34] is stored in compression information storage [44]. Internal metadata [34] may also be stored in the compression information storage [44], or it may be stored in the frame buffer [41] with the compressed initial frames [32], for example such that a piece of internal metadata [34] is associated with each piece of compressed display data, or it may be stored in a separate memory not shown here. The frame buffer [41] is connected to a composition engine [42], an engine or processor arranged to amend metadata and display data as appropriate to carry out the methods of the invention. The composition engine [42] is also connected to metadata storage [43], which stores data such as the locations of the initial frame data relative to other initial frame data, as previously mentioned. There may also be conversion factors such as mappings to be used in determining how to convert internal metadata into overall metadata. For example, locations of parts of compressed initial frames [32] relative to the initial frames [21] must be converted into locations relative to the final image [22]. The composition engine [42] uses this information to create remapped data [33], comprising compressed image data [33] and metadata [35], without any need to access the raw data of the initial frames [21]. Naturally, the metadata in the metadata storage [43] can be modified by user interactions, such as instructions to change the locations in a final image [22] where the output from a particular computing device [11] should appear.
The metadata in the metadata storage [43] also includes ordering information. For example, this may indicate that as a general rule the initial frames [21A, 21B] from the first two computing devices [11A, 11B] should never overlap, but both appear on top of the initial frame [21C] from the third computing device [11C], and that if they do temporarily overlap, for example if a change in position is animated and one image is moving past the other, the image [21A] transmitted by the first computing device [11A] should appear "on top". Furthermore, if the metadata in the metadata storage [43] includes mappings between locations in internal metadata [34] and locations in a final image [22], parts of the initial frames [21] may purposefully be relocated relative to one another in the final image [22] such that, for example, the left half of the initial frame [21A] from the first computing device [11A] might be located in the final image [22] on the left of the initial frame [21B] from the second computing device [11B] while the right-hand half is located on the right of the initial frame [21B] from the second computing device [11B].
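The mapping-driven relocation described above can be sketched as a simple lookup. The function and variable names are illustrative; the point is that an arbitrary per-part coordinate map allows layouts such as splitting one frame around another, all without reading the tile payloads.

```python
def apply_mapping(tiles, coord_map):
    """tiles: {internal_coord: payload} for one initial frame, where each
    payload is an opaque compressed/encrypted blob.
    coord_map: internal coord -> final-image coord, held in metadata storage.
    Returns the same payloads keyed by their final-image locations."""
    return {coord_map[c]: data for c, data in tiles.items()}
```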
The composition engine [42] uses the metadata in the metadata storage to amend any internal metadata in order to produce remapped data as described hereinafter in Figure 5. It then transmits the compressed image data [33] from the compressed initial frames [32], the compression information [34/35] received from the computing devices [11], and a copy of its own metadata or the amended versions of the internal metadata (in either case known simply as metadata [35]), together known as remapped data [33/35], to the display control device [13], which includes a decompressor.
This combination will enable the decompressor [13] to identify which parts of the image data in the remapped data [33] are associated with each computing device [11] and therefore decompress them correctly and display them in the correct locations in the final frame [22].
In addition to metadata generated from internal metadata -for example, the locations that were transmitted as part of the internal metadata overwritten by locations relative to the final image [22] -and/or metadata from the compositor [12] alone, depending on the exact embodiment, the metadata [35] may include: * Frame size * Frame format * Information on any update batching used, such as grouping of transmitted tiles and any dependencies between them, which may also be included in the compression information.
* Any re-mapping that will be required of encoder and decoder resources, depending on the exact embodiment. This may also be added to the compression information.
This is not an exhaustive list, however, and other metadata [35] may be included.
In either case, the decompressor [13] can use the metadata [35] to apply the correct decompression to the correct parts of the image data in the remapped data [33] in order to produce the final image [22], which can then be sent to the display device [14] for display.
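As an illustration of the decompressor applying the correct decompression to the correct parts, the following sketch dispatches on a per-region codec tag from the metadata [35]. The two standard-library codecs merely stand in for the algorithms the patent labels x, y, z; all names are assumptions.

```python
import bz2
import zlib

# codec tag (from the compression information) -> decompression function
DECODERS = {"x": zlib.decompress, "z": bz2.decompress}

def decode_regions(remapped):
    """remapped: [(final_pos, codec_tag, payload), ...].  The metadata tells
    the decompressor which algorithm to apply to which region, so regions
    from different originating devices can use different codecs."""
    return {pos: DECODERS[tag](payload) for pos, tag, payload in remapped}
```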
This method works best where the compression algorithms used are block-based, whereby the initial frames [21] are divided into blocks or tiles for compression, as previously mentioned. In any case, the composition must take place along a grid of some sort in order to allow the composition engine [42] to carry out composition with no knowledge of the contents of the initial frames [21] and to allow the decompressor [13] to apply the correct decompression algorithms to the correct parts of the image data in the remapped data [33]. An example of such a grid is shown in Figure 5, which shows remapped data [33/35] conceptualised as a larger version of the final image [22] shown in Figures 2 and 3.
This comprises the initial frame [21A] from the first computing device [11A] on the left, the initial frame [21B] from the second computing device [11B] on the right, and the display data from the initial frame [21C] from the third computing device [11C] used as background, appearing as a border around the other two images [21A, 21B]. This represents a conceptual combination of the image data [33] and location and compression metadata [35] comprising the remapped data [33/35], which in practice may be transmitted piecemeal and only fully assembled when the final image [22] is produced.
The conceptual remapped data [51] is divided into a grid based on the size and shape of the final image [22], and when the composition engine [42] combined the compressed initial frames [32] to generate the remapped data [33] it lined the compressed initial frames [32] up with the grid, as shown in Figure 5. This means that, in this example, the conceptual remapped data [51] has one grid box (columns A and I and rows 1 and 8) between the outside edge of the final image [22] and the edges of the initial frames [21A, 21B] from the first [11A] and second [11B] computing devices and one grid box (column E) between the initial frames [21A, 21B] from the first [11A] and second [11B] computing devices, these grid boxes being where the display data in the initial frame [21C] from the third computing device [11C] is visible.
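The grid layout just described can be computed mechanically. The helper below is hypothetical (not from the patent): it places frames left to right on a grid with a one-box border and a one-box gap, returning each frame's rectangle in the "B2"-style grid references used in Figure 5.

```python
from string import ascii_uppercase

def layout(grid_cols, grid_rows, frames, border=1, gap=1):
    """frames: [(name, width_in_grid_boxes), ...].  Returns each frame's
    rectangle as (top-left, bottom-right) grid references, leaving a border
    around the outside and a gap column between frames for the background."""
    placements = {}
    col = border
    for name, width in frames:
        top_left = ascii_uppercase[col] + str(1 + border)
        bottom_right = ascii_uppercase[col + width - 1] + str(grid_rows - border)
        placements[name] = (top_left, bottom_right)
        col += width + gap
    # rightmost used column plus the right border must fit on the grid
    assert col - gap + border <= grid_cols, "frames do not fit the grid"
    return placements
```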
The blocks in the grid may correspond to blocks or tiles used in compression [31], or they may simply be treated as co-ordinates. Furthermore, while the grid blocks shown in the Figure are relatively large, this is for clarity only; they may be any arbitrary size or shape and may be individual pixels, depending on the size and shape of the final image [22] and the composition and compression algorithms in use.
The grid and the alignment of the compressed initial frames [32] within it will be included in the metadata [35] sent from the compositor [12] to the decompressor [13], and the decompressor [13] is therefore able to determine which chunks of the image data in the remapped data [33] correspond to which initial frame [21] even though the compositor [12] had no knowledge of the contents of the initial frames [21] and the image data in the remapped data [33] will be garbled by the compression applied. Accordingly, since the metadata [35] includes compression information, the decompressor [13] knows which decompression algorithms to use for different parts of the image data in the remapped data [33] and can therefore correctly decompress the entire final image [22]. In Figure 5, this is shown by a letter notation such that x, y, and z indicate different compression algorithms or different parameters used by the same compression algorithm.
In this example, the first computing device [11A] compresses [31A] its initial frame [21A] using algorithm y. It then transmits the compressed initial frame [32A] to the compositor [12], together with "y" as the compression information [34A]. The second computing device [11B] compresses [31B] its initial frame [21B] using algorithm z and transmits it to the compositor [12] together with compression information "z" [34B], and the third computing device [11C] compresses [31C] its initial frame [21C] using algorithm x and transmits it to the compositor [12] together with compression information "x" [34C]. The compressed initial frames [32] and the compression information [34] are stored in the frame buffer [41] and compression information storage [44] as previously described. The composition engine [42] fetches metadata from the metadata storage [43] indicating that the initial frame [21A] from the first computing device [11A] should ultimately occupy a rectangle with the top left corner at B2 in the grid and the bottom right corner at D7 in the grid. It therefore fetches the compressed initial frame [32A] from the frame buffer [41], scales it as appropriate, and amends the metadata associated with each chunk of image data - in this example, a grid square - such that the initial frame [21A] will ultimately appear in that rectangle when the final image [22] is decompressed and displayed: for example, amending a co-ordinate A1 indicating the top-left chunk in the compressed initial frame [32A] from the first computing device [11A] - a co-ordinate relative to that initial frame [21A/32A] only - to B2, a co-ordinate relative to the final image [22]. It does the same for the compressed initial frame [32B] from the second computing device [11B], placing it in a rectangle with the top left corner at F2 and the bottom right corner at H7.
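The metadata amendment just described - rewriting chunk co-ordinates from frame-relative to final-image-relative while leaving the compressed bytes untouched - can be sketched as follows. The dictionary shapes and field names are assumptions for illustration, not the patent's data format.

```python
def remap_chunk_metadata(chunks, origin_col, origin_row):
    """Shift each chunk's frame-relative (col, row) by the origin of the
    rectangle assigned to this frame in the final image, e.g. the chunk at
    a frame's own A1 lands at B2 for an origin offset of one column and
    one row. The compressed payload is never decompressed or modified."""
    remapped = []
    for chunk in chunks:
        remapped.append({
            "data": chunk["data"],                # still compressed
            "compression": chunk["compression"],  # e.g. "y"
            "col": chunk["col"] + origin_col,
            "row": chunk["row"] + origin_row,
        })
    return remapped

# The top-left chunk of device 11A's frame, at its own (col 0, row 0),
# is remapped to (col 1, row 1), i.e. grid square B2, in the final image:
frame_a = [{"data": b"\x8f\x00\x1c", "compression": "y", "col": 0, "row": 0}]
remapped = remap_chunk_metadata(frame_a, 1, 1)
```

The same function with offsets (5, 1) would place a frame's top-left chunk at F2, as done for the second device's frame [32B] above.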
It then fills in columns A, E, and I and rows 1 and 8 with the contents of the compressed initial frame [32C] from the third computing device [11C] and transmits the image data [33] together with the metadata it used, the amended metadata, and the compression information [35] to the decompressor [13].
The decompressor [13] receives the image data [33] and the compression information and metadata [35] from the compositor [12] and is able to determine from the metadata [35] that image data [33] which according to its associated metadata [35] will occupy the rectangle with its top left corner at B2 in the grid and its bottom right corner at D7 is compressed using compression algorithm y; image data [33] which according to its associated metadata [35] will occupy the rectangle with its top left corner at F2 and its bottom right corner at H7 is compressed using compression algorithm z; and the remainder of the image data [33] is compressed with compression algorithm x. It is therefore able to apply the correct decompression to the correct parts of the image data in the remapped data [33] in order to produce the final image [22].
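The decompressor's per-chunk dispatch can be sketched as below. This is a hedged illustration with assumed names and stand-in decompression routines - the point is only that the algorithm choice is driven entirely by the metadata travelling with each chunk, with no knowledge of how the frame was composited.

```python
# Stand-in decompressors for the algorithms labelled x, y and z in Figure 5;
# real implementations would be actual codecs with differing parameters.
def passthrough(data):
    return data

DECOMPRESSORS = {"x": passthrough, "y": passthrough, "z": passthrough}

def decompress_frame(chunks):
    """Each chunk carries its final-image location and its compression tag,
    so the correct algorithm is applied to the correct region of the image."""
    final = {}
    for chunk in chunks:
        algo = DECOMPRESSORS[chunk["compression"]]
        final[(chunk["col"], chunk["row"])] = algo(chunk["data"])
    return final
```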
Figure 6 shows an alternative network topography in which there are two display devices [14] connected to the display control device [13], which is in turn connected to a compositor [12] and thus three computing devices [11] as previously described. This version of the overall system will operate in a similar way to that shown in Figure 1, as described with reference to Figure 7. Figure 7 shows a similar view of the process to that shown in Figure 3, in the network environment shown in Figure 6. In this case, the two display devices [14A, 14B] show different final images [22A, 22B], composited from the same initial frames [21]. In order to achieve this result, the three computing devices [11] generate, compress [31], and transmit the same three initial frames [21], together with compression information and possibly internal metadata [34] as previously described, to the compositor [12]. However, the compositor [12] is arranged to generate two sets of remapped data [33/35] and therefore has two sets of metadata, one for each set of remapped data [33/35] to be generated. Where the display devices [14] have, for example, different sizes or resolutions the grids may also be different, thus requiring a different set of metadata to determine the locations of the different data. However, for simplicity they will be described herein as having the same layouts and grids.
In this example, the first final image [22A] is as described in Figure 5. The second [22B] has the same layout, but the initial frames [21A, 21B] from the first [11A] and second [11B] computing devices are reversed such that the initial frame [21B] from the second computing device [11B] is on the left and the initial frame [21A] from the first computing device [11A] is on the right.
The composition engine [42] will therefore produce the first set of remapped data [33A/35A] as previously described with reference to Figure 5 but will then fetch the second set of metadata for the second set of remapped data [33B/35B]. It will therefore amend the metadata of the compressed initial frame [32B] from the second computing device [11B] such that in the conceptual remapped data [51] shown in Figure 5 it would appear in the rectangle with its top left corner at B2 in the grid and its bottom right corner at D7, while the compressed initial frame [32A] from the first computing device [11A] appears in the rectangle with its top left corner at F2 and its bottom right corner at H7 and the image data from the compressed initial frame [32C] from the third computing device [11C] appears in the remaining spaces. It then transmits the image data [33B] to the display control device [13] with the second set of metadata and the same set of compression information [35], though, naturally, linked to the second set of metadata such that the correct compression information is associated with the correct chunks of image data [33B] in the remapped data [33B/35B]. The metadata [35] in this case will also include an identification of the display device [14] on which each final image [22] is to be displayed.
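The two-display case can be sketched as follows: one stored set of compressed frames yields two different composited outputs simply by applying two metadata sets, here the mirrored layout of the second display. The structures and names are illustrative assumptions only.

```python
# First display: device 11A on the left, 11B on the right (as in Figure 5).
LAYOUT_A = {"11A": ("B2", "D7"), "11B": ("F2", "H7")}
# Second display: the two frames reversed.
LAYOUT_B = {"11B": ("B2", "D7"), "11A": ("F2", "H7")}

def composite(frame_buffer, layout):
    """Pair each stored compressed frame with its target rectangle for one
    display; the compressed frame bytes are reused untouched for every
    layout, only the location metadata differs."""
    return [(layout[dev], frame) for dev, frame in frame_buffer.items()
            if dev in layout]
```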
The decompressor receives both sets of remapped data [33/35]. It then handles each set of remapped data [33/35] as previously described and sends the final images [22] for display on the display device [14] identified in the metadata [35]. The same methods can be used where the final images [22] are not the same size and shape, where they have drastically different layouts rather than identical layouts, and where the final images [22] do not all show the initial images [21] from all available computing devices [11].
Figure 8 shows a further application of the system according to an embodiment of the invention in regulating update rates. Figure 8a shows a conventional system in which the rate at which the compositor [12] generates remapped data [33/35] is reliant on the rate at which the computing devices [11] generate initial frames [21], such that it generates updated remapped data [33/35] each time a computing device [11] generates an initial frame [21], resulting in a rate of 90Hz from three connected computing devices [11] each with a rate of 30Hz. This can result in inconveniently high frame rates and excessive network traffic, since the contents of the initial frames [21] may not have changed appreciably and in any case only one initial frame [21] may have changed at all. Conventionally, the only way to avoid this problem is to synchronise the clocks and therefore output rates of the computing devices [11], which is notoriously difficult.
Figure 8b shows a system according to an embodiment of the invention in which the compositor [12] has no awareness of the contents of the initial frames [21] since they are received in a compressed form [32] and are not decompressed as part of the operation of the compositor [12]. In this system, the computing devices [11] can have different, arbitrary clock rates and output rates (in this example, the first computing device [11A] has an output rate of 90Hz, the second computing device [11B] has an output rate of 30Hz and the third computing device [11C] has an output rate of 10Hz) but the compositor [12] has its own output rate (in this example, 60Hz). Every time it generates remapped data [33/35], it simply consults the appropriate metadata and the stored compressed initial frames [32], regardless of when such initial frames [32] were last updated. This provides a further level of removal between the actual operation of the computing devices [11] and the display data they produce and the operation of the composition engine [42].
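The decoupled update rates of Figure 8b can be sketched as below (the class and method names are illustrative assumptions). Each source overwrites its own slot in the frame buffer at whatever rate it produces frames; the compositor ticks at its own output rate and simply reads whatever compressed frames are currently stored.

```python
class Compositor:
    def __init__(self):
        self.frame_buffer = {}  # device id -> latest stored compressed frame

    def receive(self, device_id, compressed_frame):
        # Called at each device's own rate (90Hz, 30Hz, 10Hz...); simply
        # replaces that device's previously stored frame.
        self.frame_buffer[device_id] = compressed_frame

    def tick(self):
        # Called at the compositor's own output rate (e.g. 60Hz): it uses
        # the stored frames regardless of when they were last updated, so
        # no clock synchronisation between the devices is needed.
        return dict(self.frame_buffer)
```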
The initial frames [21] may not all be compressed [31] prior to composition and in some cases the composition engine [42] may compress the uncompressed data prior to transmitting the remapped data [33/35]. For example, if the third computing device [11C] is internal to the compositor [12] and only produces background data, this data may not be compressed when it is received by the composition engine [42]. Since the composition is independent of whether or how each initial frame [21] is compressed, the uncompressed data can be composited into the appropriate places in the grid as previously described and transmitted with null compression information or some other indication that it is not compressed, or it may be composited into the correct place and then compression applied to those parts of the combined frame [33/35] only. Naturally, the same applies to encryption or lack of encryption.
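Compositing a mix of compressed and uncompressed portions can be sketched as follows: uncompressed data is simply tagged with null compression information, so the decompressor knows to pass it through unchanged. The field names and functions are assumptions for illustration.

```python
def make_portion(data, compression=None):
    """compression=None acts as the 'null compression information' the text
    describes for image data that was never compressed."""
    return {"data": data, "compression": compression}

def restore(portion, decompressors):
    """Return displayable data: pass raw portions through untouched and
    dispatch compressed portions to the tagged decompression algorithm."""
    if portion["compression"] is None:
        return portion["data"]  # already raw pixels
    return decompressors[portion["compression"]](portion["data"])
```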
Although particular embodiments have been described in detail above, it will be appreciated that various changes, modifications and improvements can be made by a person skilled in the art without departing from the scope of the present invention as defined in the claims. For example, hardware aspects may be implemented as software where appropriate and vice versa, and engines/modules which are described as separate may be combined into single engines/modules and vice versa. Functionality of the engines or other modules may be embodied in one or more hardware processing device(s) e.g. processors and/or in one or more software modules, or in any appropriate combination of hardware devices and software modules. Furthermore, software instructions to implement the described methods may be provided on a computer readable medium.
Aspects of the apparatus and methods described herein are further exemplified in the following numbered CLAUSES: CLAUSE 1. A method of managing display data from a plurality of originating devices for display on a shared display, the method comprising: receiving, at a compositor from each of the plurality of originating devices, compressed and/or encrypted image data portions of a frame of image data, receiving, at the compositor from each of the plurality of originating devices, portion metadata for each of the compressed and/or encrypted image data portions indicating a location of the compressed and/or encrypted image data portions in the frame of image data from a particular originating device, a size of the compressed and/or encrypted image data portions, and compression and/or encryption parameters and/or protocols; receiving, at the compositor from each of the plurality of originating devices, frame metadata for the frame of image data indicating a size of the frame of image data and a format of the frame of image data; compositing, by the compositor, the compressed and/or encrypted image data portions without decompressing and/or decrypting the compressed and/or encrypted image data portions, based on the portion and frame metadata, by generating composited frame metadata for the composited image frame indicating a size of the composited image frame and a format of the composited image frame and amending the portion metadata for each of the compressed and/or encrypted image data portions from the plurality of originating devices to indicate a location of the compressed and/or encrypted image data portions in the composited image frame; transmitting, by the compositor to a display control device, the compressed and/or encrypted image data portions without decompressing and/or decrypting the compressed and/or encrypted image data portions, the composited frame metadata and the amended portion metadata, including the location of the compressed and/or encrypted 
image data portions in the composited image frame, and the compression and/or encryption parameters and/or protocols.
CLAUSE 2. A method of managing display data according to clause 1, wherein compositing comprises amending the portion metadata to indicate a different size of the compressed and/or encrypted image data portions in the composited image frame.
CLAUSE 3. A method of managing display data according to either clause 1 or clause 2, wherein different image data portions are compressed and/or encrypted using different compression and/or encryption parameters and/or protocols.
CLAUSE 4. A method of managing display data according to any preceding clause, wherein compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged to maintain the frame of image data from each of the plurality of originating devices separately in the composited image frame.
CLAUSE 5. A method of managing display data according to any preceding clause, wherein compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged in a grid-like pattern in the composited image frame.
CLAUSE 6. A method of managing display data according to any preceding clause, wherein compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices do not overlap in the composited image frame.
CLAUSE 7. A method of managing display data according to any preceding clause, further comprising receiving, by the compositor, instructions indicating where the frames of image data from different ones of the plurality of originating devices are to be arranged in the composited image frame, and wherein compositing comprises generating the composited frame metadata and amending the portion metadata so that the compressed and/or encrypted image data portions from each of the plurality of originating devices are arranged in the composited image frame according to the received instructions.
CLAUSE 8. A method of managing display data according to any preceding clause, wherein the compressed and/or encrypted image data portions are received from each of the plurality of originating devices at independent rates, as they are generated by the plurality of originating devices and the compressed and/or encrypted image data portions are transmitted by the compositor to the display control device at a rate independent of the rates at which the compressed and/or encrypted image data portions are received from each of the plurality of originating devices.
CLAUSE 9. A method of managing display data according to any preceding clause, wherein the compositor further receives uncompressed and/or unencrypted image data portions and composites them together with the compressed and/or encrypted image data portions without decompressing and/or decrypting the compressed and/or encrypted image data portions.
CLAUSE 10. A compositor configured to perform all steps of a method according to any one of the preceding clauses.
CLAUSE 11. A display system comprising: a compositor according to clause 10; a display control device configured to receive from the compositor the compressed and/or encrypted image data portions, the composited frame metadata and the amended portion metadata, the display control device including a decompressing and/or decryption module configured to decompress and/or decrypt the compressed and/or encrypted image data portions based on the compression and/or encryption parameters and/or protocols, the display control device further being configured to send the decompressed and/or decrypted image data portions to a display device to display the decompressed and/or decrypted image data portions according to the composited frame metadata and the amended portion metadata.
Claims (14)
- Claims
- 1. A method for displaying images on a shared display, the method comprising: receiving a plurality of frames of image data from a plurality of originating devices, respectively, each frame of the plurality of frames including one or more compressed image data portions and respective portion metadata associated with each of the compressed image data portions, the portion metadata associated with each compressed image data portion indicating a location of the compressed image data portion in relation to the frame; amending the portion metadata for each of the compressed image data portions, without decompressing the compressed image data portions, to indicate a pixel coordinate location of the compressed image data portion in a composited image frame, the composited image data frame being a composite of the plurality of frames; and transmitting, to the shared display, the one or more compressed image data portions included in the plurality of frames and the amended portion metadata associated with each of the compressed image data portions.
- 2. The method of claim 1, wherein the portion metadata associated with each compressed image portion further indicates a size of the compressed image data portion and the amending the portion metadata associated with each compressed image data portion changes the size of the compressed image data portion indicated by the portion metadata.
- 3. The method of either claim 1 or claim 2, wherein the portion metadata and the amended portion metadata associated with each compressed image data portion further indicate one or more compression parameters or protocols associated with the compressed image data portion.
- 4. The method of claim 3, wherein the portion metadata and the amended portion metadata associated with different compressed image data portions indicate different compression parameters or protocols.
- 5. The method of any preceding claim, wherein the amending of the portion metadata comprises: arranging the compressed image data portions so that each of the plurality of frames is separately represented in the composited image frame.
- 6. The method of any preceding claim, wherein the amending of the portion metadata comprises: arranging the compressed image data portions in a grid-like pattern in the composited image frame.
- 7. The method of any preceding claim, wherein the amending of the portion metadata comprises: arranging the compressed image data portions to not overlap in the composited image frame.
- 8. The method of any preceding claim, wherein the amending of the portion metadata comprises: receiving instructions indicating an arrangement of the plurality of frames relative to the composited image frame; and arranging the compressed image data portions in the composited image frame based on the received instructions.
- 9. The method of any preceding claim, wherein a rate at which the compressed image data portions are transmitted to the shared display is independent of any rates at which the compressed image data portions are received from the plurality of originating devices.
- 10. The method of any preceding claim, further comprising: receiving one or more uncompressed image data portions; and compositing the one or more uncompressed image data portions together with the compressed image data portions without decompressing any of the compressed image data portions.
- 11. The method of any preceding claim, further comprising: generating composite frame metadata indicating a size and format of the composited image frame, and transmitting the composited image frame metadata to the shared display.
- 12. The method of claim 11, wherein each of the plurality of frames further includes frame metadata indicating a size and format of the frame that is different than the size and format of the composited frame.
- 13. An apparatus comprising: one or more processors and a memory storing instructions that, when executed by the one or more processors, cause the apparatus to perform the method according to any one of the preceding claims.
- 14. A display system comprising: a shared display; a plurality of originating devices configured to output a plurality of frames of image data, respectively, for display on the shared display, each frame of the plurality of frames including one or more compressed image data portions and respective portion metadata associated with each of the compressed image data portions, the portion metadata associated with each compressed image data portion indicating a location of the compressed image data portion in relation to the frame; and a compositor comprising an apparatus according to claim 13.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2217132.6A GB2613459B (en) | 2019-01-04 | 2019-01-04 | A method of managing display data |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1900136.1A GB2580368B (en) | 2019-01-04 | 2019-01-04 | A method of managing display data |
| GB2217132.6A GB2613459B (en) | 2019-01-04 | 2019-01-04 | A method of managing display data |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| GB202217132D0 GB202217132D0 (en) | 2022-12-28 |
| GB2613459A true GB2613459A (en) | 2023-06-07 |
| GB2613459B GB2613459B (en) | 2023-09-06 |
Family
ID=86322905
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB2217132.6A Active GB2613459B (en) | 2019-01-04 | 2019-01-04 | A method of managing display data |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2613459B (en) |
Non-Patent Citations (1)
| Title |
|---|
| None * |
Also Published As
| Publication number | Publication date |
|---|---|
| GB2613459B (en) | 2023-09-06 |
| GB202217132D0 (en) | 2022-12-28 |