IL97400A - Method and system for transmitting and receiving a video image - Google Patents
Method and system for transmitting and receiving a video image
- Publication number
- IL97400A (application IL9740091A)
- Authority
- IL
- Israel
- Prior art keywords
- pixel data
- high definition
- area
- definition
- video frame
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 29
- 238000007906 compression Methods 0.000 claims description 8
- 230000006835 compression Effects 0.000 claims description 8
- 230000000875 corresponding effect Effects 0.000 description 17
- 230000005540 biological transmission Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 4
- 239000002131 composite material Substances 0.000 description 4
- 238000013144 data compression Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 2
- 230000005236 sound signal Effects 0.000 description 2
- 230000008094 contradictory effect Effects 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
Landscapes
- Television Systems (AREA)
Description
A method and system for transmitting and receiving a video image
The inventors are: DOV GOSHEN, ELIAHU GOSHEN
ORCOM LTD.
C. 82032
FIELD OF THE INVENTION
The present invention relates to a method and system for transmitting and receiving a video image. In particular, it relates to high definition television.
BACKGROUND OF THE INVENTION
An observer of an image display, such as a television screen, may have somewhat contradictory needs, namely: a wide field of view, on the one hand, and high angular resolution, on the other. A wide field of view serves for general orientation and detection of interesting features in the frame, whilst high angular resolution (i.e. high definition) is required in order to examine interesting features in detail.
One approach to satisfying both of these requirements is so-called high definition television (HDTV), in which an image is photographed using a high resolution television camera so that the resulting image has a relatively high number of pixels compared to conventional television images, and the high resolution video image is then displayed on a suitable television screen having a corresponding number of pixels thereon. Clearly, such an approach requires that a much greater amount of pixel data be transmitted over the broadcast channel, and this can only be done by increasing the bandwidth of the broadcast channel. Clearly, such an approach is not compatible with existing television broadcast standards such as, for example, PAL, SECAM and NTSC.
However, such an approach is not even warranted when the viewer only wishes to discern a relatively small area of the television image at high resolution and is content to view the rest of the scene at a significantly lower resolution. Under these circumstances, a conventional low resolution television screen can be employed for displaying a wide angle view, zooming being used in order to magnify on the screen those details in which the viewer has particular interest. However, such a technique does not permit the viewer to observe both the wide angle view and the high resolution view simultaneously. Alternatively, two low resolution television cameras can be employed, one for producing an image of the wide angle scene and the other for producing a low resolution image having a narrow field of view of selected features of the scene, both images having correlated lines of sight. The low and high resolution images are then transmitted either along separate communication channels or, alternatively, along a single wide band communication channel and are displayed on two TV screens. Alternatively, two images may be displayed simultaneously on the same, preferably high definition, TV screen. Such solutions are both expensive and cumbersome.
SUMMARY OF THE INVENTION
It is an object of the invention to provide a method and system for transmitting and receiving a video image having a generally low definition but including at least one area having a high definition, in which the drawbacks associated with hitherto proposed methods and systems are substantially reduced or eliminated.
According to a broad aspect of the invention there is provided a method for transmitting and receiving a video image having a generally low definition but including at least one area having a high definition, the method comprising the steps of: (1) at a transmitter site: (a) photographing a scene so as to produce a high definition video frame having a plurality of pixels, (b) identifying at least one area of the high definition video frame, (c) compressing the high definition video frame at a first compression ratio so as to produce a relatively low definition video frame, (d) storing pixel data associated with the or each area identified in (b), (e) storing pixel data associated with the compressed low definition video frame, and (f) transmitting the stored low definition pixel data together with the stored high definition pixel data; and (2) at a receiver site: (g) receiving all of the transmitted data, (h) storing boundary data associated with the or each area identified in (b), (j) processing the pixel data received in step (g) in synchronism with the boundary data stored in step (h) so that the low and high definition pixel data are written to respective pixels of a high resolution video frame buffer, and (k) displaying the data stored in the high resolution video frame buffer.
Thus, the method according to the invention permits a relatively low resolution image to be transmitted together with one or more high resolution windows along conventional broadcast channels employing existing television standards and to be displayed on a single, preferably high resolution, television screen.
In a system according to the invention a high resolution television camera is employed for producing a high definition video image frame which is subsequently compressed so as to form a relatively low definition image frame. One or more areas within the high definition video image frame are selected and transmitted together with the low definition image data for subsequent display on a single television screen. All or most of the video image frame created by the television camera is transmitted and displayed with reduced resolution according to the limitations set by the bandwidth or broadcast standard of the communication channel and the display resolution. Resolution can be reduced using any kind of image data compression. Only the selected areas within the image frame are transmitted and displayed at the full (high) resolution of the camera image.
The areas to be transmitted and displayed at high resolution can be selected either at the transmitter site or at the receiver site and may be determined either manually or by mathematical algorithms, such as Feature Analysis or Search Programs.
BRIEF DESCRIPTION OF THE DRAWINGS
For a clearer understanding of the invention and to understand how the same may be carried out in practice, a preferred embodiment will now be described, by way of non-limiting example only, with reference to the accompanying drawings in which:
Figs. 1 and 2 are schematic representations relating to high and low definition pixel storage useful for explaining a method according to the invention;
Fig. 3 shows a high definition TV screen displaying a low resolution image having a high resolution window superimposed thereon in registration therewith;
Fig. 4 shows a low definition TV screen displaying a low resolution image having a high resolution window superimposed thereon not in registration therewith;
Fig. 5 shows a low resolution image having a high resolution window superimposed thereon and out of registration therewith;
Fig. 6 is a block diagram showing schematically a transmitter according to the invention; and
Fig. 7 is a block diagram showing schematically a receiver according to the invention.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
Referring to Fig. 1, there is shown schematically a high definition video frame 10 having a plurality of pixels designated generally as h(i,j), where i and j represent a row and column, respectively. Shown in dotted line is an area 12 corresponding to an area of particular interest which is to be transmitted and subsequently displayed at high resolution.
Fig. 2 shows schematically a relatively low definition video frame 13 comprising a plurality of pixels l(i,j), where i and j have the same notation as explained above. The low definition video frame 13 is derived by compressing the high definition video frame 10 so that each low definition pixel l(i,j) is an average of the four high definition pixels h(2i-1,2j-1), h(2i-1,2j), h(2i,2j-1) and h(2i,2j). Thus, by way of example, the first low definition pixel l(1,1) is an average of the first two high definition pixels in each of the first two rows, i.e. h(1,1), h(1,2), h(2,1) and h(2,2). Also shown is an area 14 corresponding to the area in the low resolution image where the high definition area 12 shown in Fig. 1 is to be accommodated.
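By way of illustration only, the averaging compression just described can be sketched as follows (this sketch is an editorial addition, not part of the original disclosure; Python, NumPy and the function name compress_2x2 are assumptions):

```python
import numpy as np

def compress_2x2(high_def: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of high definition pixels into one low definition pixel."""
    rows, cols = high_def.shape[0], high_def.shape[1]
    assert rows % 2 == 0 and cols % 2 == 0, "frame dimensions must be even"
    blocks = high_def.reshape(rows // 2, 2, cols // 2, 2, -1)
    return blocks.mean(axis=(1, 3))

# Example: a 4x4 luminance frame compresses to 2x2, so that l(1,1) is the
# mean of h(1,1), h(1,2), h(2,1) and h(2,2).
frame = np.arange(16, dtype=float).reshape(4, 4, 1)
low_def = compress_2x2(frame)   # low_def[0, 0, 0] == (0 + 1 + 4 + 5) / 4 == 2.5
```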
In accordance with known television standards such as, for example, PAL which employs a 625 line video image, each frame of image data is built up of 625 lines each containing a predetermined number of pixels. Thus, the total number of pixels in each image frame is fixed according to the broadcast standard and this, together with any other data which must be transmitted with the image data itself, such as synchronisation signals, determines the bandwidth of the broadcast channel.
In video transmitters the necessary video processing to conform the video transmission to a specific broadcast standard is usually performed by a single integrated circuit which, owing to mass production, is relatively inexpensive. Likewise, in television receivers the necessary video decoding which decodes the video data transmitted in accordance with a particular broadcast standard to the displayed image is also performed by a single, relatively inexpensive, integrated circuit. Clearly, video data which is not transmitted according to one of the accepted standards, PAL, SECAM, NTSC etc., is not compatible with existing transmitters and receivers and therefore requires purpose-built equipment which is clearly more expensive.
It is therefore important that the total amount of pixel data corresponding, respectively, to low definition and high definition pixel data, can be accommodated within the bandwidth of existing broadcast channels and conforms exactly to one of the existing broadcast standards. This requirement is met, in practice, by allocating to the low definition pixel data an amount of memory equal to the total amount of pixel data conforming to the broadcast standard less the amount of pixel data corresponding to the high definition pixel data. In other words, the memory is allocated dynamically to the low definition pixel data according to the total size of the selected high definition window or windows.
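As a purely illustrative arithmetic sketch of this allocation (the frame pixel count below is an assumption made for the example, not a figure taken from the specification):

```python
STANDARD_PIXELS_PER_FRAME = 625 * 720   # assumed active-pixel budget of a PAL-like frame

def low_definition_budget(window_pixel_counts):
    """Pixels left for the compressed low definition image once the windows are allotted."""
    high_def_total = sum(window_pixel_counts)
    if high_def_total > STANDARD_PIXELS_PER_FRAME:
        raise ValueError("high definition windows exceed the broadcast budget")
    return STANDARD_PIXELS_PER_FRAME - high_def_total

# A single 160 x 120 window leaves the remainder of the fixed frame budget
# for the low definition pixel data.
remaining = low_definition_budget([160 * 120])
```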
In order that the transmitted pixel data can be reconfigured by a suitably modified television receiver, the boundary of the high definition area 14 shown in Fig. 2 of the drawings must be known at the receiver site. It will be recalled that the location of the area 14 corresponds to the location of the area 12 shown in Fig. 1 only if the high definition area is to be displayed at the receiver site at the same location as its original location in the scene. However, even if the location of the displayed high definition area is different to its location in the original scene, the number of pixels contained within the areas 12 and 14 must be identical. The boundary data may be fixed in which case it need be stored only at the receiver site. Alternatively, it can vary dynamically in which case it can be transmitted to the receiver site as a sideband of the video signal itself or as a Video Text signal together with the synchronisation signals or as a high frequency audio signal beyond the human aural frequency response.
The manner in which the synchronisation signals and the boundary data are transmitted is not a feature of the invention and will be readily understood by a person of ordinary skill in the art. It is therefore not proposed to explain these features in greater detail.
At the receiver site the low and high definition pixel data, l(i,j) and h(i,j) respectively, are received sequentially together with the boundary data relating to the high definition area 12 shown in Fig. 1. The received pixel data are processed in synchronism with the boundary data so that the low definition pixel data are first written to respective pixels of a high resolution video frame buffer, leaving gaps corresponding to the high definition pixel data. The high definition pixel data are then written to the respective pixels of the high resolution frame buffer so as to fill the gaps. The contents of the high resolution frame buffer are then displayed on a high resolution display device.
Although the area marked 14 in Fig. 2 is identical with regard to both size (i.e. the number of pixels) and location to the area 12 shown in Fig. 1, in fact, its location need not correspond to that of the area 12 shown in Fig. 1. Clearly, if the location of the area 14 is different to that shown in Fig. 2, then a different set of low definition pixels will need to be excluded from the low definition image displayed on the display device so as to accommodate therein the high definition area 12 shown in Fig. 1.
Fig. 3 shows a high resolution display device displaying a generally low resolution image 20 having therein an area 21 which is displayed at high resolution. In this particular configuration, the high resolution area 21 is displayed in its correct location within the low resolution image 20, since at the boundary of the high resolution area 21 continuity between the low and high resolution images is preserved.
Fig. 4 shows an alternative method for displaying a high resolution window 23 in a relatively low resolution image 24 for display on a relatively low resolution display device. In such a case, since the high definition area 23 requires more pixels than an equivalent low definition area, it occupies a proportionally greater area of the low resolution display device than in the original high resolution image. Consequently, even if the high resolution window 23 is correctly centered with respect to the low definition image 24, it will overlay the low definition image and continuity at the boundaries will no longer be preserved.
Fig. 5 shows a low definition image 25 displayed on either a low or high resolution display device wherein an area 26 is displayed as a high resolution window 27 at the bottom left corner of the image. In this case, the size data relating to the two high definition areas 26 and 27 are identical but the respective location data corresponding to the location in the original image of the high definition area and its boundary in the displayed image are different. Clearly, in this case also, the displayed high resolution area 27 overlays the low resolution image 25.
Reference is now made to Fig. 6 of the drawings which shows functionally the principal elements in an image processing unit designated generally as 30 according to the invention. The image processing unit 30 includes a high resolution sensor 31 having a plurality of pixels such as shown schematically in Fig. 1 of the drawings. The high resolution sensor 31 may be a CCD, ICCD, or any other suitable high resolution sensor.
Pixel data from the high resolution sensor 31 is input into a video processor 32 which compresses the pixel data such that for each group of four pixels processed by the video processor 32, only one pixel is output therefrom having data equal to the average pixel data of the four pixels whose data is compressed by the video processor 32. The pixel data compressed by the video processor 32 relates to luminance and chrominance associated with each of the pixels.
A selection means (not shown) is coupled to the video processor 32 for selecting one or more areas of the high definition image which are to be displayed at high resolution. Each of the high definition areas comprises a rectangular window whose boundary within the high definition image is input to a logic and control unit 35 which is also coupled to high resolution sensor 31. The location data associated with each window indicates where, for example, the top left-hand corner of the window is located within the original high definition image data. Optionally, it may also specify where in the final image the window is to be located if this is different to its original location. Since from a knowledge of the boundary data associated with each window, the geometry of the window is known, the logic and control unit 35 is able to control which pixels in the compressed low definition image are to be replaced by the high definition pixel data corresponding to the window data.
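A possible representation of the per-window boundary data described here is sketched below (an editorial illustration only; the field names and use of a Python dataclass are assumptions and do not appear in the specification):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class WindowBoundary:
    width: int                                   # window width in high definition pixels
    height: int                                  # window height in high definition pixels
    source_top_left: Tuple[int, int]             # (row, col) of the window in the original frame
    display_top_left: Optional[Tuple[int, int]] = None  # only needed if displayed away from its original location

# Example: a 160 x 120 window taken from (40, 200) and displayed in place.
window = WindowBoundary(width=160, height=120, source_top_left=(40, 200))
```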
Consequently, the video processor 32 operates under control of the logic and control unit 35 for inputting the analog low definition pixel data from a first output 36 thereof to a low definition frame buffer 37 via an analog-to-digital (A/D) converter 38. The A/D converter 38 converts the analog pixel data to digital format for storage in the frame buffer 37.
Likewise, the high definition analog pixel data is fed from a second output 40 of the video processor 32 to an area 42 of the frame buffer 37 containing high definition pixel data via an A/D converter 43.
Thus, the logic and control unit 35 controls the timing of the video processor 32 and synchronises the routing of the low definition and high definition pixel data via the channels 36 and 40, respectively, so that the frame buffer 37 contains low definition pixel data corresponding to the sections of the image which do not require to be displayed at high definition whilst the second area 42 contains high definition pixel data relating to the high definition windows. In practice, the high definition pixel data corresponding to the windows may change every frame whilst the boundary data associated with the windows may well remain constant for several thousand frames or more. In this case, the boundary data associated with each window need be transmitted only when it changes, enabling the logic and control unit 35 to supervise the routing of the low definition and high definition pixel data through the video processor 32 as soon as the pixel data becomes available.
The manner in which the video processor 32 routes the low definition and high definition pixel data through the channels 36 and 40 is as follows. Each group of four high definition pixels is compressed so as to produce a single pixel whose data is the average of the data associated with the four pixels. The compressed pixel data is then input to the frame buffer 37. The high definition pixel data is then appended to the area 42 of the frame buffer 37 via the channel 40. After this process is complete, the frame buffer 37 contains all the low definition pixel data corresponding to the final image whilst the area 42 contains the high definition window data. In fact, the frame buffer also contains the low definition pixel data of those pixels which fall within the high definition windows and which must therefore be excluded from the final image. The process of excluding from the displayed image the low definition pixels which fall within the high definition windows is achieved at the receiver site, as will be described below with reference to Fig. 7 of the drawings.
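The routing just described can be summarised by the following sketch (illustrative only; the (top, left, height, width) window format is an assumption, and compress_2x2 refers to the earlier hypothetical sketch):

```python
import numpy as np

def assemble_frame(high_def: np.ndarray, windows):
    """Return (low definition buffer, high definition window blocks) for one frame.

    `windows` is a list of (top, left, height, width) rectangles given in
    high definition pixel coordinates.
    """
    low_def_buffer = compress_2x2(high_def)                # whole frame compressed into the frame buffer
    high_def_area = [high_def[top:top + h, left:left + w].copy()   # window pixels appended separately
                     for (top, left, h, w) in windows]
    return low_def_buffer, high_def_area
```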
The low definition pixel data within the frame buffer 37 is fed to a video encoder 45 via a digital-to-analog (D/A) converter 46 after which the high definition pixel data corresponding to the windows stored within the area 42 is likewise converted to an analog signal by the D/A converter 46 and passed to the video encoder 45. The video encoder 45 generates a composite video signal by integrating the converted analog signals representing the pixel data with the required synchronisation signals so as to produce a video signal which conforms to the broadcast standard being used.
It should be noted that the total volume of pixel data transmitted along the broadcast channel is determined by the broadcast standard (e.g. PAL, NTSC etc.) being employed. The total contents of the video frame buffer 37 together with the area 42 may well, and generally will, contain more pixel data than can be accommodated by the broadcast standard. In this case, some of the low definition pixel data is eliminated, typically at the edges of the video frame where a slight loss of image is insignificant.
Referring now to Fig. 7 of the drawings there is shown a receiver designated generally as 50 comprising at a front end thereof a video decoder 51 for receiving the composite video signal transmitted by the image processing unit 30. The video decoder 51 decodes the composite video signal whilst a Sync. Stripper 52 associated with the video decoder 51 filters out the synchronisation signals which are then passed together with the boundary data corresponding to the high definition windows to a logic and control unit 53. It will be recalled that the boundary data are transmitted either as a separate side band of the video signal or as a Video Text signal or as a high frequency audio signal and are thus available to the logic and control unit 53 without requiring further decoding.
However, it can also be transmitted together with the frame data itself in which case it will not be available to the logic and control unit 53 until the frame data to which it is appended is first decoded by the video decoder 51. In practice, this simply means that if and when the high definition window data changes, the logic and control unit 53 will not know the updated window information until the first frame relating thereto has been decoded by the video decoder 51 and, therefore, the first frame will effectively be lost. Since the frames are repeatedly updated some thirty times each second, the loss of a single frame in this manner is insignificant.
As an alternative, the boundary data can be transmitted ahead of the pixel data itself, thereby preventing any loss of data at the receiver site.
The output from the video decoder 51 is an analog video signal which is input to a high resolution frame buffer 55 via an analog-to-digital (A/D) converter 56. The manner in which the digitized video signal is fed to the high resolution frame buffer is as follows. The low definition pixel data is processed first and the data corresponding to each pixel is written to a plurality of pixels within the high resolution frame buffer 55 in accordance with the compression algorithm by means of which the original high definition frame was compressed. Thus, to revert to the example described above with reference to Figs. 1 and 2 of the drawings, the high definition image was compressed such that each low definition pixel l(i,j) contains the average data of the four high definition pixels h(2i-1,2j-1), h(2i-1,2j), h(2i,2j-1) and h(2i,2j). Thus, the low definition pixel data corresponding to the pixel l(1,1) is written to the four pixels h(1,1), h(1,2), h(2,1) and h(2,2) within the high resolution frame buffer 55.
This process continues under control of the logic and control unit 53 until a pixel is encountered in the high resolution frame buffer 55 which falls within the boundary of one of the high definition windows. As soon as this happens, the logic and control unit 53 ensures that writing of the low definition pixel data resumes at the next available location within the high resolution frame buffer 55 after the high definition window. This task is undertaken under control of the logic and control unit 53 which has already been apprised of the window boundary data.
At the end of this procedure, the high resolution frame buffer 55 contains all the low definition pixel data, gaps remaining corresponding to the high definition windows whose pixel data is now output through the video decoder 51, digitized by the A/D converter 56 and fed under supervision of the logic and control unit 53 to the appropriate locations of the high resolution frame buffer 55.
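A simplified sketch of this receiver-side reconstruction follows (an editorial illustration, not the patent's implementation; for brevity the low definition data is written everywhere and the window regions are then overwritten, which yields the same final buffer as leaving gaps and filling them):

```python
import numpy as np

def reconstruct(low_def: np.ndarray, windows, window_pixels) -> np.ndarray:
    """Rebuild the high resolution frame buffer from the received pixel data.

    `windows` holds (top, left, height, width) rectangles in high definition
    coordinates and `window_pixels[k]` holds the pixels of window k (both
    formats are assumptions made for this illustration).
    """
    # Each low definition pixel is written to a 2x2 block of the buffer ...
    frame = np.repeat(np.repeat(low_def, 2, axis=0), 2, axis=1)
    # ... and every block falling inside a high definition window is then
    # overwritten with the corresponding window pixel data.
    for (top, left, h, w), pixels in zip(windows, window_pixels):
        frame[top:top + h, left:left + w] = pixels
    return frame
```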
The contents of the high resolution frame buffer 55 now correspond to the desired picture comprising a generally low definition background and one or more high definition windows whose pixel data is stored digitally. The pixel data within the high resolution frame buffer 55 is converted to an analog signal by a D/A converter 57 whence it is fed to a high resolution video encoder 58. The high resolution video encoder 58 produces a composite video signal which is fed to a high resolution display 60 together with appropriate synchronisation signals generated by the logic and control unit 53.
It will be understood that many variations can be made to the preferred embodiment without in any way departing from the spirit of the invention. Thus, for example, whilst it has been assumed that the high definition windows are selected in conjunction with the image processing unit 30 (shown in Fig. 6), they can equally well be selected in conjunction with the image display unit 50 (Fig. 7). In this case a suitable communications link must be established between the image display unit 50 and the image processing unit 30 so that the selected window boundary data can be transmitted to the image processing unit 30 so as to be identified thereby and thus allow it to perform the necessary pixel processing as described above.
Likewise, the boundaries of the high definition windows may be determined either by a human operator or, alternatively, using mathematical algorithms such as Feature Analysis or Search Program. It should also be noted that determination of the boundaries associated with the high definition windows permits the high definition windows to be identified unambiguously. Thus, a knowledge of the boundaries associated with a high definition window specifies precisely the location of the window as well as its geometry and orientation. There may be occasions when only square or rectangular windows are to be transmitted orientated such that their edges are parallel to the edges of the screen. In such cases, a more compact form of boundary data may be employed including only size and location data.
Whilst the high definition areas have been described with reference to rectangular windows, this also is optional and any window geometry can be employed providing only that it is transmitted to the logic and control unit 53 within the image display unit 50 so that the transmitted pixel data can be reconfigured unambiguously.
It should also be understood that although a particular form of data compression has been described whereby each group of four pixels is compressed to a single average pixel, this form of data compression is not a requirement and any other suitable video signal compression technique may be employed.
In the preferred embodiment, all of the low resolution pixel data is stored together with the high resolution pixel data prior to processing at the transmitter site and transmission to the receiver site. In fact, those low resolution pixels which fall within one of the high definition windows need not be stored and can, if desired, be excluded from the frame buffer 37. In this case, the frame buffer 37 contains all the low definition pixel data corresponding to the final image and excluding therefrom the high definition window data whilst the area 42 contains the high definition window data.
It should also be noted that whilst analog data transmission has been employed in the preferred embodiment, digital transmission can equally well be employed. In this case, the video encoder 45 and the digital-to-analog converter 46 are not required at the transmitter site and the video decoder 51, the Sync. Stripper 52 and the analog-to-digital converter 56 are not required at the receiver site.
It should also be noted that video compression techniques can be employed whereby the high resolution areas are compressed at a lower compression ratio than the rest of the video frame.
Finally, it will be understood that whilst in the preferred embodiment the low definition pixel data is processed and transmitted first followed by the high definition pixel data, this also is not a requirement and the low and high definition pixel data can be processed and transmitted in any order. It is only required that the total quantity of pixel data transmitted conforms to the broadcast standard being used.
Claims (20)
1. A method for transmitting and receiving a video image having a generally low definition but including at least one area having a high definition, the method comprising the steps of: (1) at a transmitter site: (a) photographing a scene so as to produce a high definition video frame having a plurality of pixels, (b) identifying at least one area of the high definition video frame, (c) compressing the high definition video frame at a first compression ratio so as to produce a relatively low definition video frame, (d) storing pixel data associated with the or each area identified in (b), (e) storing pixel data associated with the compressed low definition video frame, and (f) transmitting the stored low definition pixel data together with the stored high definition pixel data; and (2) at a receiver site: (g) receiving all of the transmitted data, (h) storing boundary data associated with the or each area identified in (b), (j) processing the pixel data received in step (g) in synchronism with the boundary data stored in step (h) so that the low and high definition pixel data are written to respective pixels of a high resolution video frame buffer, and (k) displaying the data stored in the high resolution video frame buffer.
2. The method according to Claim 1, wherein the low definition and the high definition pixel data occupy a memory size which is fixed in accordance with a predetermined broadcast standard.
3. The method according to Claim 1 or 2, wherein: step (c) compresses a plurality of pixels to a single pixel having an average of the data associated with each of the plurality of pixels, and step (j) writes said average data to each of an equal plurality of low definition pixels.
4. The method according to any one of the preceding claims, wherein the boundary data includes separate boundary data relating to the boundary of the or each area in the photographed and displayed scene, respectively.
5. The method according to any one of the preceding claims, further comprising the steps of: (l) selecting the or each identified area at the transmitter site, and (m) transmitting from the transmitter site to the receiver site the boundary data associated with the or each area selected in (l) for storage at the receiver site; thereby permitting the boundary data associated with the or each high definition area to be altered dynamically.
6. The method according to any one of Claims 1 to 4, further comprising the steps of: (n) selecting the or each identified area at the receiver site, and (o) transmitting from the receiver site to the transmitter site the boundary data associated with the or each area selected in (n) so as to be identified by the transmitter site.
7. The method according to Claim 5 or 6, wherein the boundary data are transmitted as a video signal having a frequency close to a frequency of the pixel data.
8. The method according to Claim 5 or 6, wherein the boundary data are transmitted as an audio frequency signal.
9. The method according to any one of the preceding claims, wherein prior to the step of storing the low definition pixel data at the transmitter site, there is further included the step of: (p) processing the low definition video frame so as to exclude therefrom the pixels associated with the or each high definition area; whereby the processed low definition video frame occupies a lower volume of memory.
10. The method according to any one of the preceding claims, wherein prior to transmitting the pixel data there is further included the step of compressing the or each high definition area at a second compression ratio lower than the first compression ratio.
11. A method for transmitting and receiving a video image substantially as described herein with reference to Figs. 1 to 5 of the drawings.
12. A system for transmitting and receiving a video image having a generally low definition but including at least one area having a high definition, the system comprising an image processing unit and a receiver, wherein: the image processing unit comprises: a first memory for storing therein a high definition video frame having a plurality of pixels, compressing means coupled to the first memory for compressing the high definition video frame so as to form a low definition pixel data, a second memory coupled to the compressing means for storing therein the low definition pixel data, a third memory coupled to the first memory for storing therein high definition pixel data in respect of at least one area of the high definition video frame, and a first transmitting means coupled to the second and third memory means for transmitting the low definition pixel data together with the high definition pixel data; and a receiver comprising: a video decoder responsive to the transmitted pixel data for producing video signals corresponding to the low definition and high definition pixel data, a high resolution video frame buffer for storing therein a high definition video frame, a fourth memory for storing therein boundary data in respect of the or each high definition area, a logic and control unit coupled to the video decoder, the high resolution video frame buffer and to the fourth memory and responsive to the low definition and high definition pixel data and to the boundary data for writing the low definition and the high definition pixel data to the high resolution frame buffer in correct registration, a high resolution video encoder coupled to the high resolution video frame buffer and responsive to the pixel data stored therein for producing an equivalent raster signal, and a display coupled to the high resolution video encoder and responsive to the raster signal for displaying the video image.
13. The system according to Claim 12, wherein: the compressing means compresses a plurality of pixels to a single pixel having an average of the data associated with each of the plurality of pixels, and the logic and control unit writes said average pixel data to each of an equal plurality of pixels in the high resolution video frame buffer.
14. The system according to Claim 12 or 13, wherein the image processing unit further comprises: identifying means coupled to the first memory for identifying at least one area of the high definition video frame, and a fifth memory coupled to the identifying means for storing therein the boundary data in respect of the or each identified area.
15. The system according to Claim 14, wherein: the identifying means includes: a first selection means in the image processing unit coupled to the first memory for selecting the or each area of the high definition video frame; and there is further provided: a second transmitting means coupled to the fifth memory for transmitting the boundary data in respect of the or each identified area; thereby permitting the boundary data associated with the or each high definition area to be altered dynamically.
16. The system according to Claim 14 or 15, wherein the receiver includes: a second selection means coupled to the high resolution video frame buffer for selecting the or each area of the high definition video frame, and a third transmitting means for transmitting the boundary data from the receiver to the image processing unit so as to be available to the identifying means therein.
17. The system according to any one of Claims 12 to 16, wherein the image processing unit further comprises: video processing means coupled to the second memory and to the compressing means for removing from the low definition pixel data all those pixels within the or each high definition area prior to storing the low definition pixel data; thereby permitting the volume of the second memory to be reduced.
18. The system according to any one of Claims 12 to 17, wherein the display is a high resolution display.
19. The system according to any one of Claims 12 to 17, wherein the display is a low resolution display.
20. A system for transmitting and receiving a video image substantially as described herein with reference to the accompanying drawings. For the Applicants DR. REINHOLD COHN AND PARTNERS By:
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL9740091A IL97400A (en) | 1991-03-04 | 1991-03-04 | Method and system for transmitting and receiving a video image |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL9740091A IL97400A (en) | 1991-03-04 | 1991-03-04 | Method and system for transmitting and receiving a video image |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| IL97400A0 (en) | 1993-02-21 |
| IL97400A (en) | 1994-01-25 |
Family
ID=11062162
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| IL9740091A IL97400A (en) | 1991-03-04 | 1991-03-04 | Method and system for transmitting and receiving a video image |
Country Status (1)
| Country | Link |
|---|---|
| IL (1) | IL97400A (en) |
- 1991-03-04: IL application IL9740091A filed (patent IL97400A); status: not active, IP Right Cessation
Also Published As
| Publication number | Publication date |
|---|---|
| IL97400A0 (en) | 1993-02-21 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | RH | Patent void | |