TWI913008B - Method and device of synchronizing images applied to vehicles - Google Patents
Description
This disclosure relates to image processing techniques, and more particularly to an image synchronization method and device applied to vehicles.
Most vehicles today are equipped with surveillance cameras. The most common arrangement mounts the cameras at the front or rear of the vehicle so that the user can view the captured footage on a display inside the vehicle. Although multiple conventional surveillance cameras can simultaneously capture the road conditions ahead of and behind the vehicle, their images are transmitted and stored separately in the onboard system, and the onboard system typically does not record the time at which each camera captured its images. Moreover, when these cameras transmit images to the onboard system, different transmission delays may exist between each camera and the system. These differing delays cause simultaneously captured images to fall out of sync. How to resolve the image desynchronization caused by differing transmission delays is therefore a problem that those skilled in the art are eager to solve.
The main purpose of this disclosure is to provide an image synchronization method and device applied to vehicles that solves the image desynchronization caused by differing transmission delays.
To achieve the above objective, the image synchronization method applied to a vehicle proposed by this disclosure comprises:
Step a) detecting, by a processor on the vehicle, via a transmission circuit on the vehicle, multiple delay times between multiple camera circuits and the processor;
Step b) calculating, by the processor, a buffer frame count for each of multiple buffer queues in a memory on the vehicle according to the multiple delay times, wherein the multiple buffer queues respectively correspond to the multiple camera circuits;
Step c) controlling, by the processor via the transmission circuit, the multiple camera circuits to respectively capture the vehicle's surroundings in multiple shooting directions to generate multiple image streams, and controlling the transmission circuit to respectively transmit the multiple image streams to the multiple buffer queues in the memory;
Step d) detecting, by the processor, a queued frame count of the image stream temporarily stored in each buffer queue;
Step e) determining, by the processor, according to the buffer frame count and the queued frame count of each buffer queue, whether to retrieve a first frame of the image stream temporarily stored in each of the multiple buffer queues, wherein the first frame is the image frame of each image stream that was stored earliest in the corresponding buffer queue;
Step f) when it is determined to retrieve the first frame of the image stream temporarily stored in each of the multiple buffer queues, merging, by the processor, all the first frames into a merged image; and
Step g) when it is determined not to retrieve the first frame of the image stream temporarily stored in each buffer queue, executing steps c) through e) again.
To achieve the above objective, the image synchronization device applied to a vehicle proposed by this disclosure comprises:
multiple camera circuits disposed on a vehicle and configured to respectively capture the vehicle's surroundings in multiple shooting directions;
a transmission circuit connected to the multiple camera circuits and configured to receive an image stream;
a memory connected to the transmission circuit and configured to store multiple instructions and multiple buffer queues, wherein the multiple buffer queues respectively correspond to the multiple camera circuits; and
a processor connected to the transmission circuit and the memory and configured to access the multiple instructions to perform the following steps:
Step a) detecting, via the transmission circuit, multiple delay times between the multiple camera circuits and the processor;
Step b) calculating a buffer frame count for each of the multiple buffer queues on the vehicle according to the multiple delay times;
Step c) controlling, via the transmission circuit, the multiple camera circuits to respectively capture the vehicle's surroundings in multiple shooting directions to generate multiple image streams, and controlling the transmission circuit to respectively transmit the multiple image streams to the multiple buffer queues;
Step d) detecting a queued frame count of the image stream temporarily stored in each buffer queue;
Step e) determining, according to the buffer frame count and the queued frame count of each buffer queue, whether to retrieve a first frame of the image stream temporarily stored in each of the multiple buffer queues, wherein the first frame is the image frame of each image stream that was stored earliest in the corresponding buffer queue;
Step f) when it is determined to retrieve the first frame of the image stream temporarily stored in each of the multiple buffer queues, merging all the first frames into a merged image; and
Step g) when it is determined not to retrieve the first frame of the image stream temporarily stored in each buffer queue, executing steps c) through e) again.
Compared with the prior art, this disclosure first detects the delay times of the camera circuits that capture the vehicle's surroundings in order to set the buffer frame count of each buffer queue, and then compares each buffer queue's queued frame count with its buffer frame count to determine when to retrieve the first frame from each buffer queue for surround-view image synchronization. In this way, this disclosure solves the image desynchronization caused by the differing transmission delays of the camera circuits.
Referring to FIG. 1, FIG. 1 illustrates a block diagram of an image synchronization device 100 applied to a vehicle according to some embodiments of the present disclosure. As shown in FIG. 1, the image synchronization device 100 applied to a vehicle (hereinafter, the image synchronization device 100) includes multiple camera circuits 111~11n, a transmission circuit 120, a memory 130, and a processor 140, where n is a positive integer greater than 1. The transmission circuit 120 is connected to the camera circuits 111~11n, the memory 130, and the processor 140. The memory 130 is connected to the processor 140. In some embodiments, the camera circuits 111~11n are all wirelessly connected to the transmission circuit 120. Notably, the wireless connections result in multiple transmission delays between the camera circuits 111~11n and the transmission circuit 120, and these transmission delays may be the same or different (for example, 200 milliseconds, 150 milliseconds, and 100 milliseconds).
In other embodiments, a portion of the camera circuits 111~11n are wirelessly connected to the transmission circuit 120, while the remaining camera circuits are connected to the transmission circuit 120 by wire. Notably, the wireless connections introduce several different transmission delays (for example, 200 milliseconds and 100 milliseconds) between that portion of the camera circuits and the transmission circuit 120, while the wired connections leave no transmission delay between the remaining camera circuits and the transmission circuit 120 (i.e., a transmission delay of 0 seconds). Notably, connecting all of the camera circuits 111~11n to the transmission circuit 120 by wire to avoid transmission delays would incur substantial wiring costs. Conversely, connecting all of them wirelessly greatly reduces wiring costs but introduces transmission delays.
In this embodiment, the camera circuits 111~11n are respectively disposed at the same or different positions on the vehicle and are used to respectively capture images of the vehicle's surroundings in multiple shooting directions. In some embodiments, the camera circuits 111~11n respectively correspond to multiple shooting directions. In other words, each camera circuit can shoot from the vehicle in its corresponding shooting direction to generate its own image stream (i.e., multiple consecutive images). In some embodiments, each camera circuit may be disposed at any position on the vehicle from which the surroundings can be captured. In some embodiments, a shooting direction may be any direction facing the vehicle's surroundings. In some embodiments, the vehicle may be an automobile, an electric bicycle, a motorcycle, or a bicycle. In some embodiments, each camera circuit may be implemented by any image capture circuit (for example, an optical camera circuit, an infrared camera circuit, or a three-dimensional camera circuit).
The following uses a concrete example to describe how the camera circuits may be arranged. Referring also to FIG. 2, FIG. 2 illustrates a schematic diagram of the arrangement of camera circuits 111~114 according to some embodiments of the present disclosure. As shown in FIG. 2, assuming the image synchronization device 100 has four camera circuits 111~114, camera circuit 111 may be disposed at the front edge of the vehicle VH, camera circuit 112 on a door pillar on one side of the vehicle VH, camera circuit 113 on a door pillar on the other side of the vehicle VH, and camera circuit 114 at the rear edge of the vehicle VH.
Furthermore, the camera circuits 111~114 respectively have multiple shooting ranges 210~240, and the shooting directions of the camera circuits 111~114 respectively correspond to the shooting ranges 210~240. That is, the shooting direction of camera circuit 111 may directly face its shooting range 210, the shooting direction of camera circuit 112 may directly face its shooting range 220, the shooting direction of camera circuit 113 may directly face its shooting range 230, and the shooting direction of camera circuit 114 may directly face its shooting range 240.
Returning to FIG. 1, in this embodiment, the transmission circuit 120 is used to receive an image stream from each of the camera circuits 111~11n. In some embodiments, the transmission circuit 120 may be implemented by any communication circuit used for communication (for example, a controller area network (CAN) communication circuit, a Wi-Fi communication circuit, or a Bluetooth communication circuit).
In this embodiment, the memory 130 stores multiple buffer queues, which respectively correspond to the camera circuits 111~11n. For example, the memory 130 may store n buffer queues, where the first buffer queue corresponds to camera circuit 111, the second buffer queue corresponds to camera circuit 112, the n-th buffer queue corresponds to camera circuit 11n, and so on. In some embodiments, the buffer queues are all first-in first-out (FIFO) temporary storage blocks in the memory 130. In some embodiments, each buffer queue temporarily stores multiple consecutive frames of the image stream generated by the corresponding camera circuit. In other words, whenever the transmission circuit 120 sequentially receives multiple frames of an image stream from one of the camera circuits, the processor 140 can control the transmission circuit 120 to sequentially transmit these frames to the buffer queue corresponding to that camera circuit. The corresponding buffer queue can thereby store these frames sequentially in a first-in first-out manner.
The following uses a concrete example to describe the buffer queues. Referring also to FIG. 3, FIG. 3 illustrates a schematic diagram of buffer queues 131~13n according to some embodiments of the present disclosure. As shown in FIG. 3, assuming the buffer queues 131~13n respectively correspond to the camera circuits 111~11n, the buffer queues 131~13n can temporarily store, in a first-in first-out manner, multiple frames of the multiple image streams img1~imgn respectively generated by the camera circuits 111~11n.
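The FIFO behavior described above can be sketched in a few lines of Python (a hypothetical illustration; the disclosure does not specify an implementation, and the camera identifiers are placeholders). Frames arriving from each camera circuit are appended at the tail of that camera's queue, and the first frame is always removed from the head:

```python
from collections import deque

# One FIFO buffer queue per camera circuit (three hypothetical cameras).
buffer_queues = {cam_id: deque() for cam_id in ("111", "112", "113")}

def enqueue_frame(cam_id, frame):
    """Store a newly received frame at the tail of the camera's queue."""
    buffer_queues[cam_id].append(frame)

def dequeue_first_frame(cam_id):
    """Retrieve the first (oldest) frame: first in, first out."""
    return buffer_queues[cam_id].popleft()

# Frames are retrieved in the same order they were stored.
enqueue_frame("111", "frame-A")
enqueue_frame("111", "frame-B")
assert dequeue_first_frame("111") == "frame-A"
```

After the retrieval above, only "frame-B" remains queued for camera 111, matching the first-in first-out ordering the text describes.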
Returning to FIG. 1, in some embodiments, the memory 130 further stores multiple instructions, based on which the processor 140 executes the detailed steps described in the following paragraphs. In some embodiments, the memory 130 may be implemented by flash memory, read-only memory, a hard disk, or any equivalent storage component, but is not limited thereto. In some embodiments, the processor 140 may be implemented by a central processing unit (CPU), a micro control unit (MCU), a programmable logic controller (PLC), a system on chip (SoC), or a field programmable gate array (FPGA), but is not limited thereto. In some embodiments, the memory 130 is further used to store the merged image described in the following paragraphs.
In some embodiments, the image synchronization device 100 may further include a display 150, which is connected to the processor 140 and used to display the merged image described in the following paragraphs. In some embodiments, the display 150 may be implemented by any display panel or display circuit (for example, a liquid crystal screen or a touch panel). In some embodiments, the transmission circuit 120, the memory 130, the processor 140, and the display 150 may be disposed in an onboard system (for example, a controller area network) on the vehicle.
Referring also to FIG. 4, FIG. 4 illustrates a flowchart of an image synchronization method applied to a vehicle according to some embodiments of the present disclosure, where the image synchronization method is applicable to the image synchronization device 100 shown in FIG. 1.
As shown in FIG. 4, the image synchronization method applied to a vehicle includes steps S410~S460. First, in step S410, the processor 140 detects multiple delay times between the camera circuits 111~11n and the processor 140 via the transmission circuit 120. In some embodiments, each delay time indicates the time the corresponding camera circuit takes to transmit its image stream to the processor 140.
The detection of the delay times is further described below. Referring also to FIG. 5, FIG. 5 illustrates a flowchart of steps S411~S413 included in step S410 of the image synchronization method applied to a vehicle according to some embodiments of the present disclosure. As shown in FIG. 5, step S410 of FIG. 4 includes steps S411~S413.
In step S411, the processor 140 controls the transmission circuit 120 to transmit a timestamp to the camera circuits 111~11n. In some embodiments, the processor 140 periodically generates a timestamp and transmits it to the camera circuits 111~11n simultaneously, where the timestamp indicates the system time of the onboard system to which the image synchronization device 100 is connected. In other words, the processor 140 can periodically use timestamps to determine the delay time corresponding to each camera circuit 111~11n.
In step S412, the processor 140 controls the transmission circuit 120 to respectively receive multiple feedback messages from the camera circuits 111~11n. In some embodiments, when each camera circuit 111~11n receives the timestamp, it returns the received timestamp to the transmission circuit 120 as a feedback message, so that the processor 140 receives each camera circuit's feedback message through the transmission circuit 120.
In step S413, the processor 140 calculates the multiple time differences between the transmission time of the timestamp and the multiple reception times of the feedback messages as the multiple delay times between the camera circuits 111~11n and the processor. In some embodiments, the processor 140 records the time at which the timestamp was sent as the transmission time. In some embodiments, the processor 140 records the time at which each camera circuit's feedback message was received as that circuit's reception time. In other words, the processor 140 can calculate the time difference between the transmission time and each reception time as the delay time corresponding to each camera circuit.
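Steps S411~S413 amount to a round-trip time measurement per camera: record when the timestamp is sent, record when the echoed feedback message arrives, and take the difference. A minimal sketch follows; the function name, the `camera_echoes` callables, and the use of a monotonic clock are illustrative assumptions, not details from the disclosure.

```python
import time

def measure_delays(camera_echoes):
    """Measure a per-camera delay time as the gap between sending a
    timestamp and receiving it back as a feedback message.

    `camera_echoes` maps a camera id to a callable that stands in for
    the camera returning the timestamp via the transmission circuit."""
    delays = {}
    for cam_id, echo in camera_echoes.items():
        send_time = time.monotonic()   # record transmission time
        echo()                         # camera echoes the timestamp back
        recv_time = time.monotonic()   # record reception time
        delays[cam_id] = recv_time - send_time
    return delays
```

With a simulated camera that takes 20 ms to echo, the measured delay is at least 20 ms; a real implementation would repeat this periodically, as the text notes.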
Returning to FIG. 4, in step S420, the processor 140 calculates a buffer frame count for each of the buffer queues 131~13n in the memory 130 according to the multiple delay times. In some embodiments, the buffer frame count of each buffer queue indicates the number of frames that the buffer queue needs to buffer (i.e., the buffer queue must buffer this many frames to achieve image synchronization).
The calculation of the buffer frame counts is further described below. Referring also to FIG. 6, FIG. 6 illustrates a flowchart of steps S421~S422 included in step S420 of the image synchronization method applied to a vehicle according to some embodiments of the present disclosure. As shown in FIG. 6, step S420 of FIG. 4 includes steps S421~S422.
In step S421, the processor 140 takes the largest of the multiple delay times calculated in step S413 (i.e., the maximum delay time) and respectively calculates the multiple time differences between the multiple delay times and that maximum. In other words, the processor 140 can first obtain the maximum delay time and then calculate the time difference between the maximum delay time and each delay time. For example, the processor 140 calculates a first time difference between the first delay time and the maximum delay time (corresponding to the first camera circuit), a second time difference between the second delay time and the maximum delay time (corresponding to the second camera circuit), an n-th time difference between the n-th delay time and the maximum delay time (corresponding to the n-th camera circuit), and so on.
In step S422, the processor 140 calculates the multiple products of the multiple time differences and the frame rate of the camera circuits 111~11n as the multiple buffer frame counts of the buffer queues 131~13n. In some embodiments, the frame rates of the camera circuits 111~11n may be identical. In other words, the processor 140 can calculate the product of each time difference and the frame rate as the buffer frame count of the buffer queue corresponding to that time difference. In detail, an image frame that enters a buffer queue must wait until a specific number of other image frames have entered the same buffer queue before it can be retrieved from that queue; this specific number is the buffer frame count, and the buffer frame count decreases as the corresponding camera circuit's delay time increases.
For example, consider an image synchronization device 100 with three camera circuits 111~113 corresponding to three buffer queues 131~133. Assuming the delay times of the three camera circuits 111~113 are 200 milliseconds, 100 milliseconds, and 300 milliseconds respectively, and the frame rate of all three camera circuits is 30 frames per second, the processor 140 can first take 300 milliseconds as the maximum delay time and calculate the time differences between the maximum delay time and each delay time as 100 milliseconds (300−200), 200 milliseconds (300−100), and 0 milliseconds (300−300) respectively. Next, the processor 140 calculates the products of the 100-millisecond, 200-millisecond, and 0-millisecond time differences and the 30 frames-per-second frame rate as the respective buffer frame counts of the three buffer queues 131~133 (i.e., 3 frames, 6 frames, and 0 frames).
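The arithmetic of steps S421~S422 applied to this example can be verified with a short script (the function name is illustrative; the formula is the one stated in the text):

```python
def buffer_frame_counts(delays_ms, fps):
    """buffer frames = (max delay - own delay) * frame rate.
    The camera with the largest delay therefore buffers zero frames,
    and faster-arriving streams buffer more."""
    max_delay = max(delays_ms)
    return [int((max_delay - d) / 1000 * fps) for d in delays_ms]

# Example from the text: delays of 200 ms, 100 ms, and 300 ms at 30 fps.
counts = buffer_frame_counts([200, 100, 300], fps=30)
assert counts == [3, 6, 0]
```

This matches the 3-frame, 6-frame, and 0-frame counts computed above: each extra frame of buffering compensates for one frame period (about 33 ms at 30 fps) of head start.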
Returning to FIG. 4, in step S430, the processor 140 controls the camera circuits 111~11n via the transmission circuit 120 to respectively capture the vehicle's surroundings in multiple shooting directions to generate multiple image streams, and controls the transmission circuit 120 to respectively transmit the multiple image streams to the buffer queues 131~13n in the memory 130. In other words, the processor 140 can store all image frames of the image streams captured by the camera circuits 111~11n into the buffer queues 131~13n respectively corresponding to the camera circuits 111~11n.
In step S440, the processor 140 detects the queued frame count of the image stream temporarily stored in each buffer queue. In some embodiments, the processor 140 periodically detects the number of image frames of the image stream currently stored in each buffer queue as that queue's queued frame count.
In step S450, the processor 140 determines, according to the buffer frame count and the queued frame count of each buffer queue, whether to retrieve the first frame of the image stream temporarily stored in each of the buffer queues 131~13n. In this embodiment, the first frame is the image frame of each image stream that was stored earliest in the corresponding buffer queue. When the processor 140 determines to retrieve the first frame of the image stream temporarily stored in each of the buffer queues 131~13n, the processor 140 executes step S460. Conversely, when the processor 140 determines not to retrieve the first frames, the processor 140 executes step S430 again.
The retrieval decision is further described below. Referring also to FIG. 7, FIG. 7 illustrates a flowchart of steps S451~S454 included in step S450 of the image synchronization method applied to a vehicle according to some embodiments of the present disclosure. As shown in FIG. 7, step S450 of FIG. 4 includes steps S451~S454.
In step S451, the processor 140 compares the queued frame count of each buffer queue with that buffer queue's buffer frame count plus one. When the queued frame count of any buffer queue is greater than its buffer frame count plus one, the processor 140 executes step S452. When the queued frame counts of the buffer queues are none greater than, but not all equal to, their respective buffer frame counts plus one, the processor 140 executes step S453. Furthermore, when the queued frame count of every buffer queue equals its buffer frame count plus one, the processor 140 executes step S454.
In step S452, the processor 140 deletes the first frame of the image stream temporarily stored in any buffer queue whose queued frame count is greater than its buffer frame count plus one. In other words, as soon as the processor 140 detects that the number of image frames temporarily stored in a buffer queue is greater than that queue's buffer frame count plus one, it deletes the first frame of the image stream temporarily stored in that queue. The processor 140 then executes step S451 again.
In step S453, the processor 140 determines not to retrieve the first frames of the image streams temporarily stored in the buffer queues 131~13n. In other words, as long as the processor 140 detects that the queued frame counts of all buffer queues 131~13n are none greater than, but not all equal to, their respective buffer frame counts plus one, it does not retrieve the temporarily stored first frames from the buffer queues 131~13n. The processor 140 can then execute step S430 again (i.e., control the buffer queues 131~13n to wait to receive new image streams from the corresponding camera circuits). In some embodiments, when the processor 140 detects that the queued frame count of one of the buffer queues has remained smaller than that queue's buffer frame count plus one after a specific time (for example, 30 minutes), the processor 140 can determine that this buffer queue has malfunctioned and stop comparing its queued frame count with its buffer frame count plus one (i.e., compare only the other buffer queues' queued frame counts with their buffer frame counts plus one).
In step S454, the processor 140 determines to retrieve the first frames of the image streams temporarily stored in the buffer queues 131~13n. In other words, as soon as the processor 140 detects that the queued frame counts of all buffer queues 131~13n equal their respective buffer frame counts plus one, it determines that the temporarily stored first frames can be retrieved from the buffer queues 131~13n and proceeds to step S460.
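The decision logic of steps S451~S454 reduces to comparing each queue's current length against its buffer frame count plus one. Below is a sketch of one pass of that comparison; the function name and the string return values are illustrative conventions, not part of the disclosure.

```python
def sync_decision(queues, buffer_counts):
    """One pass of steps S451~S454; returns 'drop', 'retrieve', or 'wait'.

    - 'drop':     some queue holds more than buffer_count + 1 frames, so
                  its oldest frame is deleted (step S452).
    - 'retrieve': every queue holds exactly buffer_count + 1 frames, so
                  all first frames can be taken and merged (step S454).
    - 'wait':     otherwise, keep receiving new frames (step S453)."""
    for q, b in zip(queues, buffer_counts):
        if len(q) > b + 1:
            q.pop(0)  # delete the first (oldest) frame
            return "drop"
    if all(len(q) == b + 1 for q, b in zip(queues, buffer_counts)):
        return "retrieve"
    return "wait"
```

Calling this repeatedly first drains any over-full queue one frame at a time, then either waits for late queues to fill or signals that a synchronized set of first frames is ready.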
For example, continuing the previous example, assume the buffer frame counts of the buffer queues 131~133 are 3 frames, 6 frames, and 0 frames respectively, and their queued frame counts are 5 frames, 6 frames, and 0 frames respectively. The processor 140 can determine that the queued frame count of buffer queue 131 is greater than its buffer frame count plus one (i.e., 5 frames is greater than 4 frames). Based on this, the processor 140 deletes the first frame of the image stream temporarily stored in buffer queue 131 (leaving 4 frames) and again compares the queued frame count of each of the buffer queues 131~133 with its buffer frame count plus one.
此時，處理器140可判斷緩衝佇列131的佇列幀數等於緩衝佇列131的緩衝幀數加一(即，4幀等於4幀)，判斷緩衝佇列132的佇列幀數小於緩衝佇列132的緩衝幀數加一(即，6幀小於7幀)，以及判斷緩衝佇列133的佇列幀數小於緩衝佇列133的緩衝幀數加一(即，0幀小於1幀)。換言之，處理器140可判斷緩衝佇列131~133各自的佇列幀數皆不大於且未皆等於緩衝佇列131~133各自的緩衝幀數加一。基於此，處理器140控制緩衝佇列131~133等待從攝影電路111~113接收新的影像串流(即，再次執行步驟S430)。At this time, the processor 140 can determine that the queued frame count of the buffer queue 131 is equal to the buffer frame count of the buffer queue 131 plus one (i.e., 4 frames equals 4 frames), that the queued frame count of the buffer queue 132 is less than the buffer frame count of the buffer queue 132 plus one (i.e., 6 frames is less than 7 frames), and that the queued frame count of the buffer queue 133 is less than the buffer frame count of the buffer queue 133 plus one (i.e., 0 frames is less than 1 frame). In other words, the processor 140 can determine that none of the queued frame counts of the buffer queues 131~133 exceeds, and they are not all equal to, the respective buffer frame counts plus one. Based on this, the processor 140 controls the buffer queues 131~133 to wait to receive new image streams from the camera circuits 111~113 (i.e., executes step S430 again).
值得注意的是，緩衝佇列131的佇列幀數大於緩衝佇列131的緩衝幀數的原因在於，攝影電路111開始拍攝的時間早於攝影電路112~113。由於攝影電路111在攝影電路112~113開始進行拍攝之前所拍攝到的影像串流無法與攝影電路112~113所拍攝到的影像同步，處理器140便需要刪除掉提早拍攝到的影像串流。緩衝佇列132的佇列幀數小於緩衝佇列132的緩衝幀數的原因在於，緩衝佇列132的首幀需要再多等待1個新的影像幀進入緩衝佇列132才能被輸出。緩衝佇列133的佇列幀數小於緩衝佇列133的緩衝幀數的原因在於，還沒有影像幀進入緩衝佇列133中。It is worth noting that the queued frame count of the buffer queue 131 exceeds the buffer frame count of the buffer queue 131 because the camera circuit 111 started capturing earlier than the camera circuits 112~113. Since the image stream captured by the camera circuit 111 before the camera circuits 112~113 started capturing cannot be synchronized with the images captured by the camera circuits 112~113, the processor 140 needs to delete the frames captured early. The queued frame count of the buffer queue 132 is smaller than the buffer frame count of the buffer queue 132 because the first frame of the buffer queue 132 must wait for one more new image frame to enter the buffer queue 132 before it can be output. The queued frame count of the buffer queue 133 is smaller than the buffer frame count of the buffer queue 133 because no image frame has entered the buffer queue 133 yet.
再者，當緩衝佇列131~133從攝影電路111~113接收新的影像串流的一個影像幀時，緩衝佇列131~133各自的佇列幀數分別為5幀、7幀以及1幀。處理器140將各緩衝佇列的佇列幀數與各緩衝佇列的緩衝幀數加一進行比較。接著，處理器140可判斷緩衝佇列131的佇列幀數大於緩衝佇列131的緩衝幀數加一(即，5幀大於4幀)。基於此，處理器140刪除緩衝佇列131中暫存的影像串流的首幀(即，剩下4幀)，並再次將緩衝佇列131~133各自的佇列幀數與緩衝佇列131~133各自的緩衝幀數加一進行比較。Furthermore, when the buffer queues 131~133 each receive one image frame of a new image stream from the camera circuits 111~113, their queued frame counts become 5, 7, and 1 frames respectively. The processor 140 compares each buffer queue's queued frame count against that buffer queue's buffer frame count plus one. The processor 140 can then determine that the queued frame count of the buffer queue 131 is greater than the buffer frame count of the buffer queue 131 plus one (i.e., 5 frames is greater than 4 frames). Based on this, the processor 140 deletes the first frame of the image stream temporarily stored in the buffer queue 131 (leaving 4 frames), and again compares the queued frame counts of the buffer queues 131~133 against their respective buffer frame counts plus one.
此時，處理器140可判斷緩衝佇列131的佇列幀數等於緩衝佇列131的緩衝幀數加一(即，4幀等於4幀)，判斷緩衝佇列132的佇列幀數等於緩衝佇列132的緩衝幀數加一(即，7幀等於7幀)，以及判斷緩衝佇列133的佇列幀數等於緩衝佇列133的緩衝幀數加一(即，1幀等於1幀)。換言之，處理器140可判斷緩衝佇列131~133各自的佇列幀數皆等於緩衝佇列131~133各自的緩衝幀數加一。基於此，處理器140判斷已經可以取出緩衝佇列131~133中各自暫存的影像串流的首幀以進行後續的合併(即，步驟S460)。At this time, the processor 140 can determine that the queued frame count of the buffer queue 131 is equal to the buffer frame count of the buffer queue 131 plus one (i.e., 4 frames equals 4 frames), that the queued frame count of the buffer queue 132 is equal to the buffer frame count of the buffer queue 132 plus one (i.e., 7 frames equals 7 frames), and that the queued frame count of the buffer queue 133 is equal to the buffer frame count of the buffer queue 133 plus one (i.e., 1 frame equals 1 frame). In other words, the processor 140 can determine that the queued frame counts of the buffer queues 131~133 are all equal to their respective buffer frame counts plus one. Based on this, the processor 140 determines that the first frames of the image streams temporarily stored in the buffer queues 131~133 can be retrieved for subsequent merging (i.e., step S460).
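The worked example above can be replayed end to end. The following Python sketch is an illustration under the stated assumptions (frames are represented by labels such as `"q0-2"`, and the helper name `try_output` is invented for this example); it is not the claimed implementation.

```python
from collections import deque

# Buffer frame counts for buffer queues 131~133 from the example above.
buffer_counts = [3, 6, 0]
# Initial queued frame counts of 5, 6, and 0; each frame is just a label here.
queues = [deque(f"q{i}-{j}" for j in range(n)) for i, n in enumerate((5, 6, 0))]

def try_output(queues, buffer_counts):
    """One pass of steps S451~S454: drop overfull first frames, then check readiness."""
    for q, b in zip(queues, buffer_counts):
        while len(q) > b + 1:      # S452: queued count exceeds buffer count plus one
            q.popleft()            # delete the queue's first (oldest) frame
    if all(len(q) == b + 1 for q, b in zip(queues, buffer_counts)):
        return [q.popleft() for q in queues]  # S454: retrieve all first frames
    return None                    # S453: wait for new frames (back to S430)

# First pass: [5, 6, 0] -> drop one head -> [4, 6, 0] -> not all ready -> wait.
assert try_output(queues, buffer_counts) is None
# One new frame then arrives in each queue (S430), giving [5, 7, 1].
for i, q in enumerate(queues):
    q.append(f"q{i}-new")
# Second pass: [5, 7, 1] -> drop one head -> [4, 7, 1] -> all ready -> output.
first_frames = try_output(queues, buffer_counts)
```

After the second pass, `first_frames` holds one synchronized frame per queue and the queues retain exactly their buffer frame counts (3, 6, and 0 frames), matching the example.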
回到圖4，於步驟S460中，處理器140將所有首幀合併為合併影像。在一些實施例中，處理器140控制顯示器150顯示此合併影像，或者是將此合併影像直接儲存於記憶體130中以進行後續應用(例如，影像優化或影像辨識等)。換言之，處理器140除了將合併影像顯示於顯示器150，也可僅僅將合併影像儲存於記憶體130中而不顯示於顯示器150上，更可同時將合併影像儲存於記憶體130中以及顯示於顯示器150上。Returning to FIG. 4, in step S460, the processor 140 merges all the first frames into a merged image. In some embodiments, the processor 140 controls the display 150 to display the merged image, or directly stores the merged image in the memory 130 for subsequent applications (e.g., image optimization or image recognition). In other words, besides displaying the merged image on the display 150, the processor 140 may store the merged image in the memory 130 without displaying it on the display 150, or may both store the merged image in the memory 130 and display it on the display 150.
在一些實施例中，合併影像中的多個區域分別包括多個首幀(即，畫面合併)。在另一些實施例中，當攝影電路111~11n各自的拍攝範圍存在重疊時，合併影像可以是一個全景影像。在另一些實施例中，合併影像也可包含具有同樣時間標記(例如，下午3點10分)的與多個拍攝範圍對應的多個首幀(即，時間軸合併)。當使用者選擇與其中一時間標記對應的其中一拍攝範圍時(例如，利用觸控式的顯示器150選擇)，處理器140可控制顯示器150顯示與被選擇的拍攝範圍對應的首幀。值得注意的是，此合併影像可應用於一些應用場景中(例如，立體視覺、全景拼接、高速攝影)，因此，這將達成保持影像的一致性以及對齊的效果。換言之，本揭示利用上述方法將達成對載具用的各不同來源影像進行時間同步或影像拼接的效果。In some embodiments, the multiple regions of the merged image respectively contain the multiple first frames (i.e., frame merging). In other embodiments, when the shooting ranges of the camera circuits 111~11n overlap, the merged image can be a panoramic image. In still other embodiments, the merged image may also contain multiple first frames that carry the same time stamp (e.g., 3:10 PM) and correspond to multiple shooting ranges (i.e., timeline merging). When a user selects the shooting range corresponding to one of the time stamps (e.g., via the touch-enabled display 150), the processor 140 can control the display 150 to display the first frame corresponding to the selected shooting range. It is worth noting that the merged image can be applied in scenarios such as stereoscopic vision, panoramic stitching, and high-speed photography, thereby maintaining the consistency and alignment of the images. In other words, the present disclosure uses the above method to achieve time synchronization or image stitching of the images from the different sources used on the vehicle.
以下以實際例子對合併影像的產生進行說明。一併參照圖8，圖8繪示本揭示在一些實施例中的合併影像800的示意圖。如圖8所示，假設攝影電路及緩衝佇列的數量n等於2，合併影像800包括第一區域810以及第二區域820，第一區域810包含了攝影電路111所產生的首幀(例如，特定時間拍攝的車前影像)，第二區域820包含了攝影電路112所產生的首幀(例如，特定時間拍攝的車後影像)。值得注意的是，在此雖以包含上下顯示的第一區域810以及第二區域820的合併影像800為例，然而，在其他實施例中，也可以採用包含左右顯示的第一區域810以及第二區域820的合併影像800。The generation of the merged image is illustrated below with a concrete example. Referring also to FIG. 8, FIG. 8 is a schematic diagram of a merged image 800 in some embodiments of the present disclosure. As shown in FIG. 8, assuming the number n of camera circuits and buffer queues equals 2, the merged image 800 includes a first region 810 and a second region 820. The first region 810 contains the first frame generated by the camera circuit 111 (e.g., a front-view image captured at a specific time), and the second region 820 contains the first frame generated by the camera circuit 112 (e.g., a rear-view image captured at that time). It is worth noting that although the merged image 800 here arranges the first region 810 and the second region 820 vertically, in other embodiments the merged image 800 may instead arrange the first region 810 and the second region 820 side by side.
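For the n = 2 case of FIG. 8, region-based merging amounts to stacking the two first frames into one image. The NumPy sketch below is a minimal illustration only; the function name `merge_first_frames` and the 480×640 frame size are assumptions, not part of the disclosed embodiment.

```python
import numpy as np

def merge_first_frames(front, rear, layout="vertical"):
    """Place two first frames into the two regions of one merged image.

    `front` and `rear` are H x W x 3 arrays of equal size; the vertical layout
    matches the top/bottom arrangement of regions 810 and 820 in FIG. 8.
    """
    if layout == "vertical":
        return np.vstack([front, rear])  # region 810 above region 820
    return np.hstack([front, rear])      # side-by-side variant

# Hypothetical synchronized first frames from camera circuits 111 and 112.
front = np.zeros((480, 640, 3), dtype=np.uint8)      # front-view frame
rear = np.full((480, 640, 3), 255, dtype=np.uint8)   # rear-view frame
merged = merge_first_frames(front, rear)             # 960 x 640 merged image
```

The side-by-side variant described in the last sentence above is obtained with `merge_first_frames(front, rear, layout="horizontal")`, which yields a 480×1280 image instead.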
綜上所述，本揭示提出的應用於載具的影像同步方法以及裝置利用拍攝載具四周的攝影電路的延遲時間設定各緩衝佇列的緩衝幀數，並利用各緩衝佇列的佇列幀數以及緩衝幀數的比較來判斷什麼時候取出各緩衝佇列的首幀以進行環景的影像同步。藉此，這將可解決由各攝影電路之不同的傳輸延遲所造成的影像不同步的問題。此外，本揭示提出的應用於載具的影像同步方法以及裝置更可以在針對攝影電路採用無線連接的方式時處理無線網路的延遲或攝影電路產生的影像幀到達系統的時間的不穩定性，並避免針對攝影電路採用有線連接的方式所增加的佈線成本。另一方面而言，本揭示提出的應用於載具的影像同步方法以及裝置確保在多個攝影電路在同一時間拍攝同一場景時保持影像的一致性以及對齊。In summary, the image synchronization method and device for vehicles proposed in the present disclosure use the delay times of the camera circuits that capture the vehicle's surroundings to set the buffer frame count of each buffer queue, and compare each buffer queue's queued frame count against its buffer frame count to determine when to retrieve the first frame of each buffer queue for surround-view image synchronization. This solves the problem of image asynchrony caused by the different transmission delays of the camera circuits. Furthermore, when the camera circuits are connected wirelessly, the proposed method and device can handle the latency of the wireless network and the jitter in the arrival times of the image frames generated by the camera circuits, while avoiding the extra wiring cost of connecting the camera circuits by cable. The proposed method and device also ensure the consistency and alignment of the images when multiple camera circuits capture the same scene at the same time.
以上所述僅為本揭示之較佳具體實例，非因此即侷限本揭示之專利範圍，故舉凡運用本揭示內容所為之等效變化，均同理皆包含於本揭示之範圍內，合予陳明。The above description is merely a preferred embodiment of the present disclosure and does not thereby limit the patent scope of the present disclosure; accordingly, all equivalent changes made using the content of the present disclosure are likewise included within the scope of the present disclosure.
100:應用於載具的影像同步裝置 111~11n:攝影電路 120:傳輸電路 130:記憶體 140:處理器 150:顯示器 VH:載具 210~240:拍攝範圍 img1~imgn:影像串流 131~13n:緩衝佇列 S410~S460、S411~S413、S421~S422、S451~S454:步驟 800:合併影像 810:第一區域 820:第二區域100: Image synchronization device applied to a vehicle 111~11n: Camera circuits 120: Transmission circuit 130: Memory 140: Processor 150: Display VH: Vehicle 210~240: Shooting ranges img1~imgn: Image streams 131~13n: Buffer queues S410~S460, S411~S413, S421~S422, S451~S454: Steps 800: Merged image 810: First region 820: Second region
圖1繪示本揭示在一些實施例中的應用於載具的影像同步裝置的方塊圖。FIG. 1 is a block diagram of an image synchronization device applied to a vehicle in some embodiments of the present disclosure.
圖2繪示本揭示在一些實施例中的多個攝影電路的設置方式的示意圖。FIG. 2 is a schematic diagram of the arrangement of multiple camera circuits in some embodiments of the present disclosure.
圖3繪示本揭示在一些實施例中的多個緩衝佇列的示意圖。FIG. 3 is a schematic diagram of multiple buffer queues in some embodiments of the present disclosure.
圖4繪示本揭示在一些實施例中的應用於載具的影像同步方法的流程圖。FIG. 4 is a flowchart of an image synchronization method applied to a vehicle in some embodiments of the present disclosure.
圖5繪示本揭示在一些實施例中的應用於載具的影像同步方法的其中一步驟包括的多個步驟的流程圖。FIG. 5 is a flowchart of the steps included in one step of the image synchronization method applied to a vehicle in some embodiments of the present disclosure.
圖6繪示本揭示在一些實施例中的應用於載具的影像同步方法的其中另一步驟包括的多個步驟的流程圖。FIG. 6 is a flowchart of the steps included in another step of the image synchronization method applied to a vehicle in some embodiments of the present disclosure.
圖7繪示本揭示在一些實施例中的應用於載具的影像同步方法的其中另一步驟包括的多個步驟的流程圖。FIG. 7 is a flowchart of the steps included in yet another step of the image synchronization method applied to a vehicle in some embodiments of the present disclosure.
圖8繪示本揭示在一些實施例中的合併影像的示意圖。FIG. 8 is a schematic diagram of a merged image in some embodiments of the present disclosure.
S410~S460:步驟 S410~S460: Steps
Claims (10)
Publications (1)
| Publication Number | Publication Date |
|---|---|
| TWI913008B true TWI913008B (en) | 2026-01-21 |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117061885A (en) | 2016-09-30 | 2023-11-14 | 高通股份有限公司 | System and method for fusing images |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12200359B2 (en) | Frame synchronization in a dual-aperture camera system | |
| CN103795979B (en) | Method and device for synchronizing distributed image stitching | |
| CN106576160B (en) | Imaging Architecture for Depth Camera Mode with Mode Switching | |
| US6894692B2 (en) | System and method for sychronizing video data streams | |
| US8711207B2 (en) | Method and system for presenting live video from video capture devices on a computer monitor | |
| US9185302B2 (en) | Image processing apparatus and method for previewing still and motion images | |
| CN114071022B (en) | A control method, device, equipment and storage medium for image acquisition equipment | |
| WO2018228353A1 (en) | Control method and apparatus for synchronous exposure of multi-camera system, and terminal device | |
| US20170339329A1 (en) | Intelligence Interface for Interchangable Sensors | |
| KR102803875B1 (en) | Method and Apparatus for Synchronizing Camera Image Based on GM Clock Time Information | |
| JP7418101B2 (en) | Information processing device, information processing method, and program | |
| WO2016139898A1 (en) | Video processing apparatus, video processing system and video processing method | |
| KR102179549B1 (en) | Stereo camera synchronization device, stereo camera and stereo camera synchronization method | |
| TWI913008B (en) | Method and device of synchronizing images applied to vehicles | |
| KR102617898B1 (en) | Synchronization of image capture from multiple sensor devices | |
| US9088750B2 (en) | Apparatus and method for generating picture-in-picture (PIP) image | |
| CN112135007A (en) | Streaming media visual angle switching method and system | |
| US10965841B2 (en) | Receiving device, video recording system and method for reducing video latency in video recording system | |
| TWI691202B (en) | Multi-stream image processing apparatus and method of the same | |
| JP2001148806A (en) | Video synthesis arithmetic processing apparatus, method and system thereof | |
| JP4186673B2 (en) | Dual system video signal synchronization method and synchronization circuit | |
| JP2022155063A (en) | Communication device, control method, and program | |
| KR102183906B1 (en) | Method and system for image fusion |