
HK1191786B - Apparatus and method for generating picture-in-picture (pip) image - Google Patents

Apparatus and method for generating picture-in-picture (PIP) image

Info

Publication number
HK1191786B
HK1191786B
Authority
HK
Hong Kong
Prior art keywords
video image
image data
pip
subject
picture
Prior art date
Application number
HK14104910.3A
Other languages
Chinese (zh)
Other versions
HK1191786A (en)
Inventor
邓伟
Original Assignee
豪威科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 豪威科技股份有限公司 filed Critical 豪威科技股份有限公司
Publication of HK1191786A
Publication of HK1191786B

Abstract

According to a picture-in-picture (PIP) system and method, a first image sensor device detects light from a first subject and generates a first signal indicative of image data for the first subject. A second image sensor device detects light from a second subject and generates a second signal indicative of image data for the second subject. Overlay logic combines the first and second signals to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject, wherein the overlay logic is located within the first image sensor device. The first image sensor device generates a synchronization signal which is received by the second image sensor device and triggers the second image sensor device to generate the second signal indicative of image data for the second subject.

Description

Apparatus and method for generating picture-in-picture (PIP) image
Technical Field
The present disclosure relates to generation of video images, and more particularly, to an apparatus and method for generating picture-in-picture video images.
Background
In a conventional picture-in-picture (PIP) system, a first image sensor generates a first image and a second image sensor generates a smaller, lower-resolution second image. Each image sensor provides video image data in the form of a video image data stream for its respective image. Typically, the data of the video image data stream of the second, smaller image is stored in a frame memory. Overlay logic receives the video image data stream from the first image sensor and the video image data stream of the second image from the frame memory, and combines the two images to generate a video image data stream for PIP image output. The PIP video image data stream output from the overlay logic is then used to generate a composite PIP video image.
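The conventional data path described above can be modeled in a few lines of Python. This is an illustrative sketch only, not part of the patent: the frame sizes, the single-character "pixels," and the overlay position are arbitrary example values, and the frame-memory step is modeled as a simple copy of the entire second frame before combining.

```python
def conventional_pip(frame_f, frame_s, top, left):
    """Overlay the small frame "s" onto a copy of the large frame "F"
    at position (top, left), after buffering "s" in a frame memory."""
    frame_memory = [row[:] for row in frame_s]   # buffer the whole "s" frame first
    composite = [row[:] for row in frame_f]      # start from the "F" frame
    for r, s_row in enumerate(frame_memory):     # then overlay line by line
        for c, pixel in enumerate(s_row):
            composite[top + r][left + c] = pixel
    return composite

F = [["F"] * 8 for _ in range(6)]   # 8x6 main image
s = [["s"] * 3 for _ in range(2)]   # 3x2 inset image
pip = conventional_pip(F, s, top=2, left=4)
print("".join(pip[2]))  # FFFFsssF
```

The key point for the Background discussion is the `frame_memory` copy: the entire second frame must be held before combining can proceed, which is what forces the large separate memory in the conventional system.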
The frame memory for these conventional systems is necessarily large because of the large amount of video data that needs to be stored. Since the frame memory occupies a large amount of space on an integrated circuit chip, it cannot be integrated with an image sensor, and thus, is manufactured as a separate single device element occupying a large amount of space in a conventional PIP system. Also, the separate frame memory results in a relatively high overall system power consumption due to the large number of memory access operations required.
Disclosure of Invention
According to a first aspect, a picture-in-picture (PIP) system is provided. The system comprises: a first image sensor device for detecting light from a first subject and generating a first signal indicative of image data of the first subject; and a second image sensor device for detecting light from a second subject and generating a second signal indicative of image data of the second subject. Overlay logic combines the first and second signals to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject. The overlay logic is located within the first image sensor device.
According to another aspect, a picture-in-picture (PIP) system is provided. The system comprises: a first image sensor device for detecting light from a first subject and generating a first signal indicative of image data of the first subject, the first signal comprising a first video image data stream, the first image sensor device generating a synchronization signal; and a second image sensor device for detecting light from a second subject and generating a second signal indicative of image data of the second subject, the second signal comprising a second video image data stream that is triggered in response to the synchronization signal. Overlay logic combines the first and second signals to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject.
According to another aspect, a picture-in-picture (PIP) method is provided. According to the method, light from a first subject is detected using a first image sensor device and a first signal indicative of image data of the first subject is generated. Light from a second subject is detected using a second image sensor device and a second signal indicative of image data of the second subject is generated. The first and second signals are combined to generate a picture-in-picture signal indicative of a combination of the image of the first subject and the image of the second subject, the combining being performed by the first image sensor device.
According to another aspect, a picture-in-picture (PIP) method is provided. According to the method, light from a first subject is detected using a first image sensor device and a first signal indicative of image data of the first subject is generated. The first signal comprises a first video image data stream, and the first image sensor device generates a synchronization signal. Light from a second subject is detected using a second image sensor device and a second signal indicative of image data of the second subject is generated. The second signal comprises a second video image data stream, which is triggered in response to the synchronization signal. The first and second signals are combined to generate a picture-in-picture signal indicative of a combination of the image of the first subject and the image of the second subject.
Drawings
The foregoing and other features and advantages will be apparent from the following, more particular description of preferred embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale. In the drawings, the size of features may be exaggerated for clarity.
Fig. 1 includes a schematic block diagram of a conventional system and method for forming a picture-in-picture (PIP) image.
Fig. 2 includes a schematic functional block diagram of systems and methods for forming a picture-in-picture (PIP) image, according to some example embodiments.
Figs. 3A and 3B include schematic timing diagrams illustrating the timing of the first and second video image data streams and the synchronization signal SYNC, according to some example embodiments.
Fig. 4 includes a schematic detailed block diagram of at least portions of first and second video image sensor devices, according to some example embodiments.
Fig. 5 includes a timing diagram illustrating the generation of a composite PIP video image from a first video image, a second video image, and PIP boundary image data, in accordance with some example embodiments.
Detailed Description
Fig. 1 includes a schematic functional block diagram of a conventional system 100 and method for forming a picture-in-picture (PIP) image. Referring to fig. 1, the system 100 includes a first video image sensor 102 for generating a first video image "F" 103 of a first scene. The system 100 further comprises a second video image sensor 104 for generating a second video image "s" 105 of a second scene. In general, the second video image sensor 104 generates a smaller video image "s" 105 having a lower resolution than the first video image "F" 103 of the first video image sensor 102. In the final composite PIP video image 109, the second video image "s" 105 is combined with the first video image "F" 103, and the second video image "s" 105 is located within the first video image "F" 103.
The system 100 of FIG. 1 also includes overlay logic 106. Overlay logic 106 contains the processing and logic circuitry used to combine the first video image "F" 103 of the first video image sensor 102 and the second video image "s" 105 of the second image sensor 104 to generate the video image data required for the composite PIP video image 109. Overlay logic 106 outputs a PIP video image data stream of composite PIP video image 109 to a device 108, such as a display for displaying composite PIP video image 109 and/or a memory for storing composite PIP video image 109. The composite PIP video image 109 is a combination of the first video image "F" 103 and the second video image "s" 105.
Referring to fig. 1, the overlay logic 106 receives the video image data stream of the first video image "F" 103 directly from the first video image sensor 102, while it receives the video image data stream of the second video image "s" 105 from the frame memory 110. The video image data of the second video image "s" 105 is temporarily stored in the frame memory 110 by the second video image sensor 104. Such temporary storage of video image data is necessary in the conventional PIP system 100 because the first and second video image sensors 102 and 104 and their respective video image data streams are not synchronized. As a result, the video image data of the second video image "s" 105 is temporarily stored in the frame memory 110 to ensure that it is not lost while the overlay logic 106 performs the processing required to combine the first video image "F" 103 and the second video image "s" 105.
The frame memory 110 is typically very large because it must store a large amount of video image data. It therefore cannot be integrated with the first video image sensor 102 or the second video image sensor 104 and is manufactured as a separate device, which significantly increases the size of the system 100. The separate frame memory 110 also consumes a relatively large amount of power.
Fig. 2 includes a schematic functional block diagram of a system 200 and method for forming a picture-in-picture (PIP) image, according to some example embodiments. Referring to fig. 2, according to some exemplary embodiments, a system 200 includes: a first video image sensor 202 for generating a first video image "F" 203 of a first scene; and a second video image sensor 204 for generating a second video image "s" 205 of a second scene. In general, the second video image sensor 204 generates a smaller video image "s" 205 having a lower resolution than the first video image "F" 203 of the first video image sensor 202. In the final composite PIP video image 209, the second video image "s" 205 is combined with the first video image "F" 203 and the second video image "s" 205 is located within the first video image "F" 203.
In some exemplary embodiments, the second video image "s" 205 is surrounded, within the composite PIP video image 209, by a selectable and/or adjustable PIP boundary 207. The PIP boundary 207 may be of any desired thickness and color, and may also be omitted entirely. The PIP boundary 207 includes a top horizontal line, a bottom horizontal line, and left and right vertical lines. These PIP boundary lines are defined by the relevant image data in the composite PIP video image 209, as described in more detail below.
The system 200 of fig. 2 also includes overlay logic 206, the overlay logic 206 containing processing and logic circuitry to combine the first video image "F" 203 of the first video image sensor 202 and the second video image "s" 205 of the second image sensor 204 to generate the video image data required for the composite PIP video image 209. Overlay logic 206 outputs a PIP video image data stream of composite PIP video image 209 to a device 208, such as a display for displaying composite PIP video image 209 and/or a memory for storing composite PIP video image 209. The composite PIP video image 209 is a combination of the first image "F" 203 and the second video image "s" 205.
Referring to fig. 2, the overlay logic 206 receives a first video image data stream of a first video image "F" 203 directly from the first video image sensor 202 and also receives a second video image data stream of a second video image "s" 205 directly from the second video image sensor 204. That is, in the exemplary embodiment shown in fig. 2, the frame memory 110 used in the conventional system 100 described above and illustrated in fig. 1 is not used. In the system 200 shown in fig. 2, frame memory is not necessary because, in an exemplary embodiment, the first video image sensor 202 and the second video image sensor 204 and their respective first and second video image data streams are synchronized. As a result, the video image data of the second video image "s" 205 need not be temporarily stored in the frame memory to ensure that the video image data of the second video image "s" 205 is not lost when the overlay logic 206 performs the processing required to combine the first video image "F" 203 and the second video image "s" 205. In effect, the overlay logic 206 receives the two video image data streams directly from the respective video image sensors 202, 204 and processes them to produce a composite PIP video image data stream without an intervening frame memory.
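The memoryless, line-by-line merging described above can be sketched with Python generators standing in for the two synchronized sensor streams. This is a hedged illustration, not the patent's implementation: the stream model, the `start_line`/`left` parameters, and the single-character pixels are all assumptions introduced for clarity.

```python
def overlay_without_frame_memory(f_stream, s_stream, start_line, left):
    """Merge a synchronized inset stream "s" into the main stream "F"
    on the fly, one line at a time, with no frame memory in between."""
    s_lines = iter(s_stream)
    for n, f_line in enumerate(f_stream):
        # Because the streams are synchronized, the next "s" line is
        # available exactly when the corresponding "F" line arrives.
        s_line = next(s_lines, None) if n >= start_line else None
        if s_line is not None:
            f_line = f_line[:left] + s_line + f_line[left + len(s_line):]
        yield f_line   # composite PIP line goes straight to the output

F = ["F" * 8 for _ in range(6)]   # main stream: six 8-pixel lines
s = ["sss", "sss"]                # inset stream: two 3-pixel lines
composite = list(overlay_without_frame_memory(iter(F), iter(s),
                                              start_line=2, left=4))
print(composite[2])  # FFFFsssF
```

Note that at no point is a whole frame of "s" buffered: each inset line is consumed in the same iteration in which it is produced, which is the property that lets the overlay logic 206 sit inside the first sensor device without a frame memory.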
As shown in fig. 2, in some example embodiments, the first video image sensor 202 may be integrally formed with the overlay logic 206, since in example embodiments no frame memory is included. That is, referring to FIG. 2, the first video image sensor 202 and the overlay logic 206 may form part of the same first video image sensor device 220. The first video image sensor device 220 may be a substrate or chip or other single device configuration. The first video image sensor device 220 may also include additional processing or support circuitry 216 required to perform the operations of the first video image sensor device 220. The additional processing or support circuitry 216 may also be integrated in the same substrate or chip or other single device configuration as the first video image sensor 202 and the overlay logic 206. As further illustrated in fig. 2, the second video image sensor 204 may be fabricated as part of another single second video image sensor device 222 that is separate from the first video image sensor device 220. The second video image sensor device 222 may also include additional processing or support circuitry 218 required to perform the operations of the second video image sensor device 222. The additional processing or support circuitry 218 may be integrated in the same substrate or chip or other single device configuration as the second video image sensor 204. The first video image sensor device 220 and the second video image sensor device 222 may together form a kit or assembly of PIP video image sensors, which may be provided together as a set.
As described above, according to some exemplary embodiments, the first video image data stream generated by the first video image sensor device 220 and the second video image data stream generated by the second video image sensor device 222 are synchronized. To this end, as shown in fig. 2, in some exemplary embodiments, the first video image sensor device 220 generates a synchronization signal SYNC and transmits the signal SYNC to the second video image sensor device 222 on a signal line or bus 224 to synchronize the two video image sensor devices 220 and 222 and the first and second video image data streams, respectively.
In some exemplary embodiments, the synchronization signal SYNC is generated as a single pulse signal. It is received by the second video image sensor device 222 and used by the second video image sensor device 222 to trigger the start of the transmission of the second video image data stream of the second video image "s" 205 to the overlay logic 206 in the first video image sensor device 220. The timing of the transmission of the second video image data stream of the second video image "s" 205 determines the position of the second video image "s" 205 in the first video image "F" 203 and in the composite PIP video image 209. Thus, the timing of the synchronization signal SYNC is selected to position the second video image "s" 205 in the composite PIP video image 209.
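The relationship between SYNC timing and inset position can be captured in a small timing model. All names and numbers here are hypothetical illustrations, not values from the patent: SYNC is treated as a single pulse issued at some line of the "F" frame, and the second sensor is assumed to start its stream a fixed number of lines after receiving it.

```python
def sync_issue_line(desired_top_line, trigger_delay_lines):
    """Line of the "F" frame at which to issue the SYNC pulse so that the
    second stream starts at desired_top_line of the composite image."""
    return desired_top_line - trigger_delay_lines

frame_height = 480  # example frame height

# Fig. 3A case: SYNC at the frame start -> inset "s" at/near the top.
print(sync_issue_line(0, 0))                    # 0
# Fig. 3B case: SYNC near mid-frame -> inset "s" at/near the middle.
print(sync_issue_line(frame_height // 2, 2))    # 238
```

The model makes the document's point concrete: moving the SYNC pulse earlier or later in the "F" frame moves the inset up or down in the composite image, and any fixed trigger delay in the second sensor simply shifts the pulse by the same amount.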
Figs. 3A and 3B include schematic timing diagrams illustrating the timing of the first and second video image data streams transmitted to the overlay logic 206 and of the synchronization signal SYNC transmitted from the first video image sensor device 220 to the second video image sensor device 222, according to some example embodiments. Figs. 3A and 3B illustrate two examples of the relationship between the timing of the synchronization signal SYNC, the relative timing of the two video image data streams, and the position of the second video image "s" 205 within the first video image "F" 203 in the composite PIP video image 209.
Referring to fig. 3A, a case is illustrated in which the synchronization signal SYNC is issued at the start of the first video image data stream. In this particular exemplary illustration, the second video image data stream is initiated within a very short time after the synchronization signal SYNC is issued, so that the first and second video image data streams start at or near the beginning of a frame. As a result, the second video image "s" 205 would be positioned at or near the top of the first video image "F" 203 in the composite PIP video image 209.
Referring to fig. 3B, a case is illustrated in which the synchronization signal SYNC is issued at or near the middle of the first video image data stream. In this particular exemplary illustration, the second video image data stream is initiated within a very short time after the synchronization signal SYNC is issued, so that the second video image data stream starts at or near the middle of a frame of the first video image data stream. As a result, the second video image "s" 205 would be positioned at or near the middle of the first video image "F" 203 in the composite PIP video image 209.
It will be appreciated that figs. 3A and 3B illustrate only exemplary scenarios. For example, the timing of the synchronization signal SYNC may be adjusted as needed to time the initiation of the second video image data stream to any point, so that the second video image "s" 205 may be positioned anywhere within the first video image "F" 203 in the composite PIP video image 209. Likewise, the second video image data stream may be initiated at any time after the synchronization signal SYNC is issued by the first video image sensor device 220 and received by the second video image sensor device 222. The exemplary embodiments illustrated in figs. 3A and 3B show the second video image data stream starting almost immediately after the synchronization signal SYNC is issued. This need not always be the case; according to some exemplary embodiments, the time delay may be set to any desired delay, which may be adjusted as needed.
Fig. 4 includes a schematic detailed block diagram of at least portions of first video image sensor device 220 and second video image sensor device 222, according to some example embodiments. Referring to fig. 4, the first video image sensor device 220 includes a first video image sensor 202 for generating a first video image "F" 203. As described above, the first video image sensor device 220 may also include additional processing or support circuitry 216 as needed to perform the operations of the first video image sensor device 220. This additional processing or support circuitry 216 may include PIP image boundary generator circuitry 231 that generates image data surrounding an optional PIP boundary 207 of the second video image "s" 205 in the composite PIP video image 209. The additional processing or support circuitry 216 may also include data receiver/data buffer circuitry 232, which may be serial data receiver/data buffer circuitry, that receives and may temporarily store data of the second video image data stream of the second video image "s" 205 received from the second video image sensor device 222. The additional processing or support circuitry 216 may also include timing control circuitry 234. The timing control circuit 234 includes the processing and timing circuitry required to generate the synchronization signal SYNC and transmit this signal SYNC to the second video image sensor device 222 to trigger initiation of a second video image data stream of the second video image "s" 205 from the second video image sensor 204. The timing control circuit 234 clocks the synchronization signal SYNC with respect to the data of the first video image sensor 202. To this end, the timing control circuit 234 receives a timing signal from the first video image sensor 202.
The timing control circuit 234 may receive a selection input, which may be provided, for example, by a user, or may be pre-programmed. The selection input defines a desired position of the second video image "s" 205 within the first video image "F" 203 of the composite PIP video image 209. Based on the desired position provided at the selection input, the timing control circuit 234 generates the synchronization signal SYNC with respect to the first video image data stream from the first video image sensor 202 and transmits this signal SYNC on the line or bus 224 to the timing control circuit 238 in the second video image sensor device 222. The signal SYNC initiates the video image data stream of the second video image "s" 205 by the second video image sensor 204 in the second video image sensor device 222, such that, when the first video image "F" 203 and the second video image "s" 205 are combined, the second video image "s" 205 appears at the selected desired position within the first video image "F" 203 in the composite PIP video image 209.
The second video image sensor device 222 includes a video image sensor 204 that generates and outputs a second video image data stream of a second video image "s" 205 to a data transmission circuit 240, which may be, for example, a serial data transmitter, that transmits the second video image data stream to the first video image sensor device 220 via a serial data port. The transmission of the second video image data stream from the second video image sensor 204 to the data transmission circuit 240 is triggered by the timing control circuit 238, the timing control circuit 238 commanding the second video image sensor 204 to initiate the transmission of the second video image data stream by a control signal in response to the synchronization signal SYNC received from the first video image sensor device 220.
As described above, the data receiver/data buffer circuit 232 in the first video image sensor device 220 receives the second video image data stream from the data transmission circuit 240 in the second video image sensor device 222. The second video image data stream is transmitted by the data receiver/data buffer circuit 232 to the PIP image combining logic 236, which may be considered at least a part of the overlay logic 206 described in detail above. The PIP image combining logic 236 also receives the first video image data stream from the first video image sensor 202 and PIP image boundary data from the PIP image boundary generator circuit 231. The PIP image boundary generator circuit 231 also receives information about the second video image "s" 205 from the data receiver/data buffer circuit 232, such that the data of the PIP boundary 207 generated by the PIP image boundary generator circuit 231 includes appropriate position information and the PIP boundary 207 is properly positioned around the second video image "s" 205. The PIP image combining logic 236 combines the data of the first video image "F" 203 in the first video image data stream, the data of the second video image "s" 205 in the second video image data stream, and the PIP image boundary data from the PIP image boundary generator circuit 231 to generate the data of the composite PIP video image 209. The video image data stream of the composite PIP video image 209 is output from the first video image sensor device 220 to the device 208 for display and/or storage of the composite PIP video image 209.
Fig. 5 includes a schematic timing diagram illustrating the generation of a composite PIP video image 209 from a first video image "F" 203, a second video image "s" 205, and PIP boundary image data, in accordance with some example embodiments. Referring to FIG. 5, the uppermost timing diagram illustrates the data valid time periods of the first video image data stream for two typical lines N and N+1 of the first video image "F" 203 from the first video image sensor 202 ("sensor 1"). The second timing diagram illustrates the data valid time periods of the second video image data stream for two typical lines M and M+1 of the second video image "s" 205 from the second video image sensor 204 ("sensor 2"). The third timing diagram illustrates the data valid time periods of the optional boundary 207 for the two typical lines M and M+1 of the second video image "s" 205. It should be noted that the timing diagrams of the boundary data for lines M and M+1 illustrate the case in which the portions of the boundary 207 being used are its left and right vertical lines. For the top and bottom horizontal lines of the boundary 207, the data valid period of the boundary data will be substantially the same as the sensor 2 data valid time periods of lines M and M+1 in the second timing diagram of fig. 5.
The fourth diagram of fig. 5 illustrates the timing of the composite PIP video image data stream after the combination of the first and second video image data streams and the boundary data is complete. Referring to the fourth timing diagram in fig. 5, the combined PIP video image data includes data invalid time periods that occur between lines of the first video image "F" 203. The timing then includes a sensor 1 data period that coincides with the portion of the current line of the composite PIP video image 209 to the left of the second video image "s" 205. Next, the timing includes a first boundary data period, which coincides with the left vertical line of the PIP boundary 207. The timing then includes a sensor 2 data period that coincides with a line of the second video image "s" 205 within the current line of the composite PIP video image 209. The timing then includes a second boundary data period, which coincides with the right vertical line of the PIP boundary 207. The timing then includes another sensor 1 data period that coincides with the portion of the current line of the composite PIP video image 209 to the right of the second video image "s" 205. At the end of the second sensor 1 data period, the end of the current line is indicated by another data invalid time period. The next line, line N+1, begins with the first sensor 1 data time period for that line. This timing is repeated for each line of the composite PIP video image 209 that includes a line of the second video image "s" 205.
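The per-line sequence of the fourth timing diagram (sensor 1 data, left border, sensor 2 data, right border, sensor 1 data) can be sketched as follows. The single-character pixels, the one-pixel border width, and the `left` parameter are illustrative assumptions, not values from the patent.

```python
def combine_line(f_line, s_line, left, border="B"):
    """Compose one scan line in the Fig. 5 order:
    sensor 1 | left border | sensor 2 | right border | sensor 1."""
    return (f_line[:left - 1]            # sensor 1 data, left of the inset
            + border                     # left vertical border pixel
            + s_line                     # sensor 2 data (inset line)
            + border                     # right vertical border pixel
            + f_line[left + len(s_line) + 1:])  # sensor 1 data, right of inset

# Lines of "F" that contain a line of "s" get the five-segment sequence;
# all other lines pass through as pure sensor 1 data.
line = combine_line("FFFFFFFF", "ss", left=3)
print(line)  # FFBssBFF
```

Omitting the optional border corresponds to dropping the two `border` segments, leaving only sensor 1 and sensor 2 data periods, as the following paragraph notes.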
As described above, the timing diagram of fig. 5 presents lines of the composite PIP video image 209 that include the left and right vertical lines of the PIP boundary 207. As noted above, the PIP boundary 207 is optional and need not be included in the composite PIP video image 209. If the PIP boundary 207 is omitted, the PIP boundary data time periods of the third timing diagram of fig. 5 do not occur. As a result, the fourth timing diagram of FIG. 5 would include only data invalid time periods, sensor 1 data time periods, and sensor 2 data time periods.
It should be noted that the timing diagram of fig. 5 is an illustrative example used to provide a clear description. The timing of the respective data valid and data invalid periods depends on the two-dimensional position of the second video image "s" 205 within the first video image "F" 203 of the composite PIP video image 209. In the exemplary illustration of fig. 5, the second video image "s" 205 is horizontally centered in the first video image "F" 203. As indicated above, this positioning may be controlled by controlling the timing of the synchronization signal SYNC. The vertical positioning of the second video image "s" 205 within the first video image "F" 203 of the composite PIP video image 209 is not illustrated in fig. 5, but it may also be controlled by controlling the timing of the synchronization signal SYNC, as described above.
Combinations of features
Various features of the disclosure have been described above in detail. This disclosure covers any and all combinations of the many features described herein, except where the description explicitly excludes combinations of features. The following examples illustrate some of the combinations of features contemplated and disclosed herein in accordance with this disclosure.
In any of the embodiments described in detail and/or claimed herein, the first image sensor device and the overlay logic are formed in a first substrate.
In any of the embodiments described in detail and/or claimed herein, the second image sensor device is formed in the second substrate.
In any of the embodiments described in detail and/or claimed herein, the first signal comprises a first video image data stream and the second signal comprises a second video image data stream.
In any of the embodiments described in detail and/or claimed herein, the first image sensor device generates a synchronization signal in response to which the second video image data stream is triggered.
In any of the embodiments described in detail and/or claimed herein, the synchronization signal triggers the second video image data stream to start at a selected position of the first video image data stream.
In any of the embodiments described in detail and/or claimed herein, the first and second video image data streams have first and second frame rates, respectively.
In any of the embodiments described in detail and/or claimed herein, the first and second frame rates are substantially equal.
While the present disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure.

Claims (19)

1. A picture-in-picture, PIP, system comprising:
a first image sensor device for detecting light from a first subject and generating a first signal indicative of image data of the first subject, the first signal comprising a first stream of video image data;
a second image sensor device for detecting light from a second subject and generating a second signal indicative of image data of the second subject, the second signal comprising a second stream of video image data; and
overlay logic to combine the first signal and second signal to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject, wherein the overlay logic is located within the first image sensor device,
wherein the first image sensor device generates a synchronization signal in response to which the second video image data stream is triggered.
2. The PIP system of claim 1, wherein the first image sensor device and the overlay logic are formed in a first substrate.
3. The PIP system of claim 2, wherein the second image sensor device is formed in a second substrate.
4. The PIP system of claim 1, wherein the synchronization signal triggers the second video image data stream to begin at a selected location of the first video image data stream.
5. The PIP system of claim 1, wherein the first and second video image data streams have first and second frame rates, respectively.
6. The PIP system of claim 5, wherein the first and second frame rates are substantially equal.
7. A picture-in-picture (PIP) system comprising:
a first image sensor device for detecting light from a first subject and generating a first signal indicative of image data of the first subject, the first signal comprising a first video image data stream, the first image sensor device generating a synchronization signal;
a second image sensor device for detecting light from a second subject and generating a second signal indicative of image data of the second subject, the second signal comprising a second video image data stream, the second video image data stream being triggered in response to the synchronization signal; and
overlay logic to combine the first signal and the second signal to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject.
8. The PIP system of claim 7, wherein the overlay logic is located within the first image sensor device.
9. The PIP system of claim 7, wherein the synchronization signal triggers the second video image data stream to begin at a selected location of the first video image data stream.
10. The PIP system of claim 7, wherein the first and second video image data streams have first and second frame rates, respectively.
11. The PIP system of claim 10, wherein the first and second frame rates are substantially equal.
12. A picture-in-picture (PIP) method comprising:
detecting light from a first subject with a first image sensor device and generating a first signal indicative of image data of the first subject, the first signal comprising a first video image data stream;
detecting light from a second subject with a second image sensor device and generating a second signal indicative of image data of the second subject, the second signal comprising a second video image data stream;
combining the first and second signals to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject, the combining being performed by the first image sensor device; and
triggering the second video image data stream in response to a synchronization signal generated by the first image sensor device.
13. The PIP method of claim 12, wherein the synchronization signal triggers the second video image data stream to begin at a selected location of the first video image data stream.
14. The PIP method of claim 12, wherein the first and second video image data streams have first and second frame rates, respectively.
15. The PIP method of claim 14, wherein the first and second frame rates are substantially equal.
16. A picture-in-picture (PIP) method comprising:
detecting light from a first subject with a first image sensor device and generating a first signal indicative of image data of the first subject, the first signal comprising a first video image data stream, the first image sensor device generating a synchronization signal;
detecting light from a second subject with a second image sensor device and generating a second signal indicative of image data of the second subject, the second signal comprising a second video image data stream;
triggering the second video image data stream in response to the synchronization signal; and
combining the first and second signals to generate a picture-in-picture signal indicative of a combination of an image of the first subject and an image of the second subject.
17. The PIP method of claim 16, wherein the synchronization signal triggers the second video image data stream to begin at a selected location of the first video image data stream.
18. The PIP method of claim 16, wherein the first and second video image data streams have first and second frame rates, respectively.
19. The PIP method of claim 18, wherein the first and second frame rates are substantially equal.
HK14104910.3A 2012-07-25 2014-05-26 Apparatus and method for generating picture-in-picture (pip) image HK1191786B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/558,207 2012-07-25

Publications (2)

Publication Number Publication Date
HK1191786A (en) 2014-08-01
HK1191786B (en) 2018-02-15 (granted)

Similar Documents

Publication Publication Date Title
US11240404B2 (en) Systems and methods for synchronizing sensor capture
US10863155B2 (en) Reduction of banding artifacts in image processing
US9626937B2 (en) Driving method and driving system for display panel
EP2442561A2 (en) Device and method for providing a three-dimensional pip image
US20170344330A1 (en) Multi-display device
KR20180056540A (en) Frame grabber, image processing system including the same, and image processing method using the frame grabber
JP5447004B2 (en) Transmission device, display device, shutter glasses, transmission / reception system, display system, and transmission / reception method
US9088750B2 (en) Apparatus and method for generating picture-in-picture (PIP) image
US20130235059A1 (en) Image compositing apparatus
CN108616674A (en) Two-way video-signal timing sequence generating circuit structure with outer synchronizing function
US20150116459A1 (en) Sensing device and signal processing method thereof
US20140176515A1 (en) Display device and method for processing frame thereof
JP2014207492A (en) Stereoscopic image display device
US20200045359A1 (en) Reproduction device, generation device, reproduction system, program, recording medium
JP2009177331A (en) Image signal transfer system, method, and imaging device with the transfer system
KR100824016B1 (en) Systems, methods and recording media for synchronizing output data generated by an asynchronous binocular camera
KR100744371B1 (en) Stereoscopic image display device and stereoscopic image processing method
JP2013219624A (en) Imaging apparatus
KR101989307B1 (en) An apparatus for synchronizing a video signal and a control signal for a vehicle
US20140085497A1 (en) Systems And Methods For Sychronizing Multiple Video Sensors
KR101664403B1 (en) Synchronization Apparatus, Synchronization Method And Synchronization System, and 3D Display Apparatus Using The Same
KR102183906B1 (en) Method and system for image fusion
KR101264108B1 (en) High speed video transition switch and switching method