TWI734116B - Method for spherical camera image stitching - Google Patents
Method for spherical camera image stitching
- Publication number
- TWI734116B (application TW108117418A)
- Authority
- TW
- Taiwan
- Prior art keywords
- image
- images
- fisheye
- south
- similar
- Prior art date
Landscapes
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
The present invention relates to a spherical camera image stitching method, and in particular to the use of a segmented sphere projection method to stitch images with a similar-edge technique and then project the stitched panoramic image into 3-D spherical space.
Please refer to Figure 1, a schematic flow diagram of the existing equirectangular projection (ERP) method. Starting from the upper left, a fisheye camera 1 captures images through two back-to-back fisheye lenses 11, 12 (step 1), producing two fisheye images A, B (step 2). The two fisheye images A, B represent the two hemispheres, but each covers about 190°, so each carries an extra ring 13, 14 of roughly 5° at the bottom (step 3). Flattening the two hemispherical images into rectangles yields the two unrolled plane views at the lower left (step 4). This conversion of a hemispherical image into a rectangular plane is the well-known equirectangular projection (ERP) method.
Each of the two unrolled plane views is divided into left and right halves, denoted A-left, A-right, B-left, and B-right. The left strip 131 of A-left and the right strip 132 of A-right are formed by splitting ring 13, each spanning about 10° (-95° to -85° and 85° to 95°). Likewise, the left strip 141 of B-left and the right strip 142 of B-right are formed by splitting ring 14, each spanning about 10° (-95° to -85° and 85° to 95°).
Strip 132 is stitched to strip 141 and strip 131 is stitched to strip 142, joining the two unrolled plane views into a panoramic image (step 5). Finally, the stitched panoramic image is projected into 3-D spherical space to form a spherical surround-view image (step 6).
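As context for step 4, the sketch below illustrates how one such hemispherical fisheye image can be flattened into an equirectangular strip. It is a minimal, non-authoritative sketch that assumes an ideal equidistant fisheye model (image radius proportional to the angle from the optical axis), a circular fisheye image centered in the frame, and a 190° field of view; the patent does not spell out these parameters. A full prior-art stitch would apply it to both lenses and then join the roughly 5° overlap strips as described above.

```python
import numpy as np
import cv2

def fisheye_to_erp(fisheye: np.ndarray, out_h: int, out_w: int,
                   fov_deg: float = 190.0) -> np.ndarray:
    """Flatten one circular fisheye image (ideal equidistant model, optical
    axis through the image center) into an equirectangular strip covering
    longitudes -fov/2..+fov/2 and latitudes -90..+90 degrees."""
    h, w = fisheye.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    img_radius = min(cx, cy)                      # rim of the fisheye circle
    half_fov = np.radians(fov_deg) / 2.0

    # Longitude/latitude of every output pixel.
    lon = (np.arange(out_w) / (out_w - 1) - 0.5) * np.radians(fov_deg)
    lat = (0.5 - np.arange(out_h) / (out_h - 1)) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Unit viewing direction, camera looking along +z.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant fisheye: distance from the center is proportional to the
    # angle between the viewing ray and the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = theta / half_fov * img_radius
    phi = np.arctan2(y, x)
    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy - r * np.sin(phi)).astype(np.float32)

    return cv2.remap(fisheye, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
```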
Because this equirectangular projection (ERP) method maps the sphere onto a flat rectangle, the north and south polar regions of the sphere suffer considerable image distortion compared with the central equatorial region.
The object of the present invention is to provide a spherical camera image stitching method that greatly reduces the image distortion of the polar regions relative to the central equatorial region. The core idea is Segmented Sphere Projection (SSP): the image is divided into three parts (north pole, equator, and south pole), and each part is processed with a similar-edge method and feather blending.
The present invention uses a fisheye camera with two back-to-back fisheye lenses to capture two hemispherical fisheye images, each covering about 190°. Each fisheye image is divided into three parts: north pole, equator, and south pole, with segment boundaries at latitude +45° N and -45° S, giving a north-pole part above +45° N, an equatorial part between +45° N and -45° S, and a south-pole part below -45° S. The north-pole and south-pole portions of the two hemispherical fisheye images are each converted into semicircles; the two north-pole semicircles and the two south-pole semicircles are each stitched into a full circle with the similar-edge method, with the overlapping regions feather-blended during stitching, and each circle is then mapped onto a spherical surface. The equatorial region is stitched with conventional ERP (again based on the similar-edge method and feather blending). The three segments (north pole, equator, and south pole) are then joined to form a panoramic image, and finally the stitched panoramic image is projected into 3-D spherical space to form a spherical surround-view image.
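To make the three-way split concrete, the sketch below cuts an already-flattened equirectangular panorama into the three SSP bands. The representation is an assumption for illustration only: rows are taken to run linearly from +90° latitude at the top to -90° at the bottom, so the ±45° boundaries fall at one quarter and three quarters of the image height.

```python
import numpy as np

def split_ssp_bands(erp_image: np.ndarray):
    """Split an equirectangular panorama (rows running from +90 deg latitude
    at the top to -90 deg at the bottom) into the three SSP bands
    bounded by +45 and -45 degrees."""
    h = erp_image.shape[0]
    top = h // 4           # row of the +45 deg boundary
    bottom = (3 * h) // 4  # row of the -45 deg boundary
    north = erp_image[:top]           # +90 .. +45 deg
    equator = erp_image[top:bottom]   # +45 .. -45 deg
    south = erp_image[bottom:]        # -45 .. -90 deg
    return north, equator, south
```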
1‧‧‧Fisheye camera
11‧‧‧Fisheye lens
12‧‧‧Fisheye lens
A‧‧‧Fisheye image
B‧‧‧Fisheye image
13‧‧‧Ring
14‧‧‧Ring
131‧‧‧Strip
132‧‧‧Strip
141‧‧‧Strip
142‧‧‧Strip
31‧‧‧Strip
32‧‧‧Strip
33‧‧‧Strip
34‧‧‧Strip
131'‧‧‧Strip
132'‧‧‧Strip
141'‧‧‧Strip
142'‧‧‧Strip
Figure 1 is a schematic flow diagram of the existing equirectangular projection (ERP) method.
Figures 2, 3, and 4 are explanatory diagrams of the Segmented Sphere Projection (SSP) method of the present invention.
Figure 5 is an explanatory diagram of the similar-edge method of the present invention.
Figure 6 is an explanatory diagram of the feather blending method of the present invention.
Figures 2, 3, and 4 are explanatory diagrams of the Segmented Sphere Projection (SSP) method of the present invention. Figure 2 still uses steps 1 to 3 of Figure 1, while steps 4', 51', and 52' of Figure 3 and step 6' of Figure 4 describe the steps of the segmented sphere projection method.
Please refer to Figure 2. Starting from the left, a fisheye camera 1 captures images through two back-to-back fisheye lenses 11, 12 (step 1), producing two fisheye images A, B (step 2). The two fisheye images A, B represent the two hemispheres, but each covers about 190°, so each carries an extra ring 13, 14 of roughly 5° at the bottom (step 3).
Steps 4', 51', and 52' of Figure 3 constitute the segmented sphere projection (SSP) method. The two fisheye images from step 3 of Figure 2 are divided into three parts: north pole, equator, and south pole. The segment boundaries lie at latitude +45° N and -45° S of the spherical image, giving the part above +45° N, the part between +45° N and -45° S, and the part below -45° S.
In step 4' of Figure 3, the equatorial portions of fisheye images A and B are flattened into rectangles with the ERP method, becoming the two unrolled plane views in the middle of Figure 3.
The north-pole portion above +45° N and the south-pole portion below -45° S of fisheye images A and B are each converted into a semicircle, as shown in step 51'.
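One way to picture step 51' in code: because each lens covers roughly 190° of longitude, it sees slightly more than half of each polar cap, so the cap above +45° N (or below -45° S) unrolls onto slightly more than a half-disc centered on the pole. The sketch below is an illustrative assumption rather than the patent's exact transform; it takes a hypothetical `cap_strip` input (the polar cap already flattened into an equirectangular strip) and maps it onto a disc with colatitude as radius and longitude as angle, blanking the part the lens does not see.

```python
import numpy as np
import cv2

def cap_strip_to_half_disc(cap_strip: np.ndarray, radius: int,
                           lon_range_deg: float = 190.0) -> np.ndarray:
    """Map an equirectangular north-cap strip (top row = +90 deg latitude,
    bottom row = +45 deg latitude, columns spanning lon_range_deg of
    longitude centered on 0) onto a half-disc of the given radius."""
    h, w = cap_strip.shape[:2]
    size = 2 * radius
    # Output pixel grid centered on the pole.
    ys, xs = np.mgrid[0:size, 0:size].astype(np.float32)
    dx = xs - radius
    dy = ys - radius
    r = np.sqrt(dx * dx + dy * dy)           # 0 at the pole, radius at 45 deg
    angle = np.degrees(np.arctan2(dy, dx))   # -180..180, used as longitude
    colat = 45.0 * r / radius                # 0 deg at the pole, 45 deg at rim
    # Source coordinates in the strip.
    map_y = (colat / 45.0 * (h - 1)).astype(np.float32)
    map_x = ((angle + lon_range_deg / 2.0) / lon_range_deg
             * (w - 1)).astype(np.float32)
    disc = cv2.remap(cap_strip, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
    # Keep only the part of the disc this lens actually covers.
    valid = (r <= radius) & (np.abs(angle) <= lon_range_deg / 2.0)
    disc[~valid] = 0
    return disc
```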
Stitching above +45° N is then performed with the similar-edge method and feather blending (see the descriptions in paragraphs [0024], [0025], and [0026]); that is, strip 31 and strip 32 are stitched together to form a circle (step 52').
Stitching below -45° S is performed in the same way with the similar-edge method and feather blending (see the descriptions in paragraphs [0024], [0025], and [0026]); that is, strip 33 and strip 34 are stitched together to form a circle (step 52').
Please refer to Figure 4. The equatorial region between +45° N and -45° S is stitched with the similar-edge ERP stitching and feather blending described for the prior art (see paragraphs [0024], [0025], and [0026]), and the upper, middle, and lower segments (north pole, equator, and south pole) are then joined to form a panoramic image (step 6').
Finally, the stitched panoramic image is projected into 3-D spherical space to form a spherical surround-view image.
The similar-edge stitching used for the north-pole, equatorial, and south-pole segments works as follows. The two regions to be stitched are first passed through edge detection, producing a binary result of 1 or 0 for every pixel (1 for an edge, 0 for a non-edge). A pixel of an RGB image consists of three values, the brightnesses of red, green, and blue light, each between 0 and 255; superimposing the three values produces the various colors. A grayscale image has only one value per pixel, ranging from the darkest black through intermediate grays to the brightest white. An RGB image is converted to grayscale by taking a weighted average of its R, G, and B values. Edge detection is then performed: an edge is a region of the image where the brightness changes sharply. After the image has been converted to grayscale, regions with large grayscale changes are marked as edges (1) and all other locations as non-edges (0). This greatly reduces the amount of data while preserving the important structural attributes of the image.
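A minimal sketch of this preprocessing is given below. The luminance weights and the Sobel-plus-threshold edge detector are common choices assumed for illustration; the patent only states that a weighted average and an edge detector are used, without naming them.

```python
import numpy as np
import cv2

def binary_edge_map(rgb: np.ndarray, threshold: float = 60.0) -> np.ndarray:
    """Convert an RGB image to a 0/1 edge map: weighted-average grayscale,
    gradient magnitude via Sobel, then a fixed threshold."""
    # Weighted average of R, G, B (standard luminance weights; an assumption,
    # the patent only says "weighted average").
    gray = (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1]
            + 0.114 * rgb[..., 2]).astype(np.float32)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = np.sqrt(gx * gx + gy * gy)
    return (magnitude > threshold).astype(np.uint8)  # 1 = edge, 0 = non-edge
```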
Please refer to Figure 5. The two regions to be stitched (for example 31 and 32, 33 and 34, 132' and 141', or 131' and 142') are matched left against right. Centered on each region, a range L (the matching template) and a range W (the region being searched) are taken. L is moved across W, the similarity between L and each window L' of equal width in W is computed, the position of the most similar L' is found, and L is stitched to L'. The formula below gives the similarity value: the closer the value is to 1, the more similar the windows are, and a value of exactly 1 means they are identical. Here a and b are the edge values (0 or 1) of the two matched images, h is the height of the matched image, w is its width, and i and j are the row and column coordinates of an image sample point.
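The similarity formula itself appears only as an equation figure in the original and is not reproduced here. The sketch below implements a search that is consistent with the description (a score in [0, 1] that equals 1 only when the two binary edge windows are identical); the per-pixel agreement ratio used as the score is an assumption, not necessarily the patent's exact expression.

```python
import numpy as np

def edge_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of pixels on which two equally sized 0/1 edge maps agree.
    Returns 1.0 only when the maps are identical."""
    return float(np.mean(a == b))

def best_match_offset(L: np.ndarray, W: np.ndarray) -> int:
    """Slide the template edge map L across the wider search edge map W
    and return the column offset of the most similar window L'."""
    w_l = L.shape[1]
    w_w = W.shape[1]
    scores = [edge_similarity(L, W[:, x:x + w_l])
              for x in range(w_w - w_l + 1)]
    return int(np.argmax(scores))
```

Given edge maps produced as in the previous sketch, `best_match_offset` returns the horizontal alignment at which L would be stitched onto W.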
Please refer to Figure 6. The two most similar overlapping regions L and L' are blended with weights. A weight map is built according to the distance from each point of the overlap to the boundary of its own photo: the weight is 1 on the side nearest that photo, 0 on the far side, and 0.5 in the middle, and the weights of the two images at any pixel always sum to 1. L and L' are each multiplied by their weights and superimposed, so the overlap transitions gradually, looks smoother, and shows less of an abrupt seam. This is feather blending.
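A minimal sketch of this blending for a horizontal overlap between two equally sized colour regions is shown below; the linear ramp is an assumed concrete form of the distance-based weights described above (1 at a region's own side, 0 at the far side, 0.5 in the middle, always summing to 1).

```python
import numpy as np

def feather_blend(left_overlap: np.ndarray,
                  right_overlap: np.ndarray) -> np.ndarray:
    """Blend two (h, w, 3) overlap regions with linear weights that run
    from 1 to 0 for the left image and 0 to 1 for the right image, so the
    two weights always sum to 1 (0.5 each in the middle)."""
    h, w = left_overlap.shape[:2]
    ramp = np.linspace(1.0, 0.0, w, dtype=np.float32)    # weight of left image
    w_left = np.tile(ramp, (h, 1))[..., np.newaxis]      # shape (h, w, 1)
    w_right = 1.0 - w_left
    blended = (w_left * left_overlap.astype(np.float32)
               + w_right * right_overlap.astype(np.float32))
    return np.clip(blended, 0, 255).astype(np.uint8)
```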
The spirit and scope of the present invention are defined by the following claims and are not limited to the embodiments described above.
A‧‧‧Fisheye image
B‧‧‧Fisheye image
13‧‧‧Ring
14‧‧‧Ring
131'‧‧‧Strip
132'‧‧‧Strip
141'‧‧‧Strip
142'‧‧‧Strip
31‧‧‧Strip
32‧‧‧Strip
33‧‧‧Strip
34‧‧‧Strip
Claims (1)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW108117418A TWI734116B (en) | 2019-05-21 | 2019-05-21 | Method for spherical camera image stitching |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW108117418A TWI734116B (en) | 2019-05-21 | 2019-05-21 | Method for spherical camera image stitching |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| TW202109462A TW202109462A (en) | 2021-03-01 |
| TWI734116B (en) | 2021-07-21 |
Family
ID=76035625
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW108117418A TWI734116B (en) | 2019-05-21 | 2019-05-21 | Method for spherical camera image stitching |
Country Status (1)
| Country | Link |
|---|---|
| TW (1) | TWI734116B (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101739674A (en) * | 2008-11-19 | 2010-06-16 | 深圳迈瑞生物医疗电子股份有限公司 | Automatic image sequence splicing method and device and splicing system |
| CN102063280A (en) * | 2010-12-24 | 2011-05-18 | 广东威创视讯科技股份有限公司 | Image edge blending processing method and system |
| CN107644394A (en) * | 2016-07-21 | 2018-01-30 | 完美幻境(北京)科技有限公司 | A kind of processing method and processing device of 3D rendering |
| TWI614735B (en) * | 2016-12-14 | 2018-02-11 | 財團法人工業技術研究院 | Panoramic vision system |
| TW201842765A (en) * | 2017-04-27 | 2018-12-01 | 聯發科技股份有限公司 | Method and apparatus for mapping virtual-reality image to a segmented sphere projection format |
| TW201909644A (en) * | 2017-07-19 | 2019-03-01 | 聯發科技股份有限公司 | Method and apparatus for reduction of artifacts at discontinuous boundaries in coded virtual-reality images |
2019
- 2019-05-21 TW TW108117418A patent/TWI734116B/en active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101739674A (en) * | 2008-11-19 | 2010-06-16 | 深圳迈瑞生物医疗电子股份有限公司 | Automatic image sequence splicing method and device and splicing system |
| CN102063280A (en) * | 2010-12-24 | 2011-05-18 | 广东威创视讯科技股份有限公司 | Image edge blending processing method and system |
| CN102063280B (en) | 2010-12-24 | 2013-03-20 | 广东威创视讯科技股份有限公司 | Image edge blending processing method and system |
| CN107644394A (en) * | 2016-07-21 | 2018-01-30 | 完美幻境(北京)科技有限公司 | A kind of processing method and processing device of 3D rendering |
| TWI614735B (en) * | 2016-12-14 | 2018-02-11 | 財團法人工業技術研究院 | Panoramic vision system |
| TW201842765A (en) * | 2017-04-27 | 2018-12-01 | 聯發科技股份有限公司 | Method and apparatus for mapping virtual-reality image to a segmented sphere projection format |
| TW201909644A (en) * | 2017-07-19 | 2019-03-01 | 聯發科技股份有限公司 | Method and apparatus for reduction of artifacts at discontinuous boundaries in coded virtual-reality images |
Non-Patent Citations (2)
| Title |
|---|
| Chao-Yung Hsu; Chih-Ming Chang; Li-Wei Kang; Ru-Hong Fu; Duan-Yu Chen; Ming-Fang Weng, "Fish-Eye Lenses-Based Camera Calibration and Panoramic Image Stitching", 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), 2018. |
| 王詩晴, "球型相機影像自動拼接技術研究" (Research on automatic image stitching for spherical cameras), Master's thesis, National Chiao Tung University, 2019. |
Also Published As
| Publication number | Publication date |
|---|---|
| TW202109462A (en) | 2021-03-01 |
Similar Documents
| Publication | Title |
|---|---|
| CN112085659B (en) | Panorama splicing and fusing method and system based on dome camera and storage medium | |
| CN114945943B (en) | Depth estimation based on iris size | |
| CN103020954B (en) | Irregular surface-orientated self-adaptive projection system | |
| CN103150715B (en) | Image mosaic processing method and device | |
| CN114331835B (en) | A panoramic image stitching method and device based on optimal mapping matrix | |
| CN116681636A (en) | Light infrared and visible light image fusion method based on convolutional neural network | |
| CN106357976A (en) | Omni-directional panoramic image generating method and device | |
| CN110838086B (en) | An Outdoor Image Mosaic Method Based on Correlation Template Matching | |
| US10614553B1 (en) | Method for spherical camera image stitching | |
| CN114331826B (en) | A fast correction method for fisheye images based on distortion stretch factor | |
| CN103295231A (en) | Method for geometrically correcting vertically mapped images of fisheye lenses in fisheye image mosaic | |
| CN107403408A (en) | A kind of double fish eye images spliced panoramic image seam fusion methods | |
| CN104680505A (en) | Panoramic view algorithm for fisheye lens correction | |
| CN111860632B (en) | Multipath image consistency fusion method | |
| CN107689033A (en) | A kind of fish eye images distortion correction method based on ellipse segmentation | |
| CN105657268A (en) | Multi-viewpoint video splicing and fusion algorithm based on multiple resolutions | |
| CN105243653A (en) | Fast mosaic technology of remote sensing image of unmanned aerial vehicle on the basis of dynamic matching | |
| TWI734116B (en) | Method for spherical camera image stitching | |
| CN104317144B (en) | Large-scale orthogonal full-length optical projection system optical radiation fast-compensation method | |
| CN116091314B (en) | Infrared image stitching method based on multi-scale depth homography | |
| CN105957005B (en) | Bridge image splicing method based on characteristic point and structure lines | |
| CN114897706B (en) | A green vegetation enhancement method based on panchromatic and multispectral image fusion | |
| CN106815602A (en) | A kind of runway FOD image detection method and devices based on multi-level features description | |
| KR101513931B1 (en) | Auto-correction method of composition and image apparatus with the same technique | |
| Dong et al. | Infrared image colorization using an edge aware auto encoder decoder with the multi-resolution fusion |