TW201101814A - Method of compensating for backlight image and photograph apparatus with backlight image compensation system - Google Patents
Method of compensating for backlight image and photograph apparatus with backlight image compensation system
- Publication number
- TW201101814A (application TW98121670A)
- Authority
- TW
- Taiwan
- Prior art keywords
- image
- backlight
- compensation
- brightness
- unit
- Prior art date
Landscapes
- Studio Devices (AREA)
- Facsimile Image Signal Circuits (AREA)
- Image Processing (AREA)
Abstract
Description
VI. Description of the Invention

[Technical Field]

The present invention relates to an image repair method and a photographic apparatus with an image repair mechanism, and more particularly to a backlight image compensation method and a photographic apparatus with a backlight image compensation mechanism.

[Prior Art]

With the development of technology, demands on quality of life keep rising: users increasingly expect a single electronic product to integrate many functions for greater convenience. In recent years a built-in camera has become a standard feature of mobile phones and PDAs. Such cameras, however, lack many functions of a dedicated camera, such as image stabilization, fill light, and object focusing. As a result, a backlit photograph taken with a phone or PDA must be uploaded to a computer and repaired with photo-editing software, which is inconvenient and defeats the original purpose of building the camera into the device.

[Summary of the Invention]

One aspect of the present invention therefore provides a photographic apparatus with a backlight image compensation mechanism, so that a backlit image can be repaired directly in the apparatus, without uploading the picture to a computer for software retouching.

According to one embodiment of the invention, a photographic apparatus with a backlight image compensation mechanism comprises an image capture module, a neuro-fuzzy network processing module, and a backlight compensation module. The image capture module records at least one image. The neuro-fuzzy network processing module computes a compensation value, and the backlight compensation module receives the compensation value and repairs the brightness of the image. The neuro-fuzzy network processing module comprises a pre-processing unit, an input unit, an operation unit, and an output unit. The pre-processing unit extracts at least one backlight factor from the image; the input unit receives the backlight factor; the operation unit computes a compensation value from the backlight factor; and the output unit outputs the compensation value. The user can thus repair a backlit image directly in the photographic apparatus, which makes shooting more convenient.

Another aspect of the invention provides a backlight image compensation method applied to a photographic apparatus, so that image brightness repair is performed directly in the apparatus, improving the quality of the captured images. According to another embodiment, the method comprises the following steps: capture an image with the photographic apparatus and cluster its pixels by brightness; extract at least one backlight factor from the brightness clusters; feed the backlight factor into a neuro-fuzzy network to compute a compensation value; convert the compensation value into a compensation curve; and use the compensation curve to compensate the brightness of the image. Applying this method to a camera phone or PDA improves its photographic performance and the convenience of shooting.

[Detailed Description]

Please refer to Fig. 1, a block diagram of a photographic apparatus with a backlight image compensation mechanism according to an embodiment of the invention. The photographic apparatus 100 may be a digital mobile device such as a mobile phone or PDA, and comprises an image capture module 200, a neuro-fuzzy network processing module 300, and a backlight compensation module 400.

The image capture module 200 may be the lens of the photographic apparatus 100 and is used to record and capture an image. The neuro-fuzzy network processing module 300 receives the image captured by the image capture module 200, analyzes the degree of backlighting of the image, and computes a compensation value. The backlight compensation module 400, connected to the neuro-fuzzy network processing module 300, uses the compensation value to apply backlight compensation to the image. This alleviates the backlight problem caused by the relative positions of the photographic apparatus 100 and the light source at shooting time.

Notably, the neuro-fuzzy network processing module 300 may be a processing chip embedded in the photographic apparatus, or an image-processing program executed by an operating system of the photographic apparatus 100. The neuro-fuzzy network processing module 300 comprises a pre-processing unit 310 and a neuro-fuzzy network 320. The pre-processing unit 310 comprises a color conversion layer 311, a brightness clustering layer 312, and a factor calculation layer 313; the neuro-fuzzy network 320 comprises an input unit 321, an operation unit 322, and an output unit 323. The color conversion layer 311 converts the color space of the image; the brightness clustering layer 312 clusters the pixels of the image by brightness; and the factor calculation layer 313 computes two backlight factors from the clustering result. The input unit 321 receives the two backlight factors, the operation unit 322 computes a compensation value from them, and the output unit 323 outputs the compensation value.

The backlight compensation module 400 receives the compensation value and repairs the brightness of the image; it comprises a curve conversion unit 410 and an image repair unit 420. The curve conversion unit 410 converts the compensation value into a compensation curve, and the image repair unit 420 uses the compensation curve to repair the brightness of the image.
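The three-stage data flow of Fig. 1 (image capture module 200, neuro-fuzzy network processing module 300, backlight compensation module 400) amounts to simple function composition. The following sketch is illustrative only; the class and parameter names are not from the patent, and the injected callables stand in for the patent's hardware or firmware modules:

```python
class BacklightCompensationCamera:
    """Sketch of the Fig. 1 architecture: module 200 captures an image,
    module 300 turns it into a compensation value B_c, and module 400
    applies the compensation to the image."""

    def __init__(self, capture, compute_compensation, compensate):
        self.capture = capture                            # image capture module 200
        self.compute_compensation = compute_compensation  # processing module 300
        self.compensate = compensate                      # backlight compensation module 400

    def shoot(self):
        image = self.capture()
        b_c = self.compute_compensation(image)            # compensation value B_c
        return self.compensate(image, b_c)
```

With stub callables plugged in, `shoot()` simply chains the three stages in order.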
The user can thus perform backlight compensation directly on the photographic apparatus. The specific principle and effect of this embodiment are explained as follows.

Captured images are generally color images in which every pixel carries an RGB value. Color images are complex and hard to classify and process directly, so the color conversion layer 311 converts every pixel from the RGB color space to the YIQ color space, that is, it converts the color image into a grayscale image. In YIQ, Y represents luminance while I and Q carry the chrominance information, so Y alone can serve as the gray-level value of each pixel. Although separating luminance from chrominance makes image analysis more convenient, using only Y as the gray-level value may lose some of the image's color information. The color conversion layer 311 therefore uses a feature-weighted transformation algorithm to convert the color image to grayscale, as in formula (1):

F(i,j) = R(i,j)(1 + Amount_R(i,j)/N) + G(i,j)(1 + Amount_G(i,j)/N) + B(i,j)(1 + Amount_B(i,j)/N)  ......(1)

where R(i,j), G(i,j), and B(i,j) are the RGB components of the pixel at position (i,j); Amount_R(i,j), Amount_G(i,j), and Amount_B(i,j) are the numbers of pixels in the whole image sharing the same R, G, or B value as the pixel at (i,j); and N is the total number of pixels in the image. After the value F of every pixel is computed, the values are normalized to the range 0 to 255, yielding the gray-level value of each pixel.

The brightness clustering layer 312 separates the objects in the image from the background. The gray-level value F obtained by the color conversion layer 311 represents the brightness of each pixel, so the brightness clustering layer applies the Fuzzy C-means (FCM) algorithm, which minimizes the clustering objective of formula (2):

J = Σ_{k=1..C} Σ_{(i,j)} u_k(i,j)^q |F(i,j) − v_k|²,  1 ≤ k ≤ C  ......(2)

where u_k(i,j) is the membership of pixel (i,j) in cluster k, v_k is the center of cluster k, C is the number of clusters, and q is the fuzzifier. The memberships are updated from the distances |F(i,j) − v_k|, and each center is updated as v_k = Σ u_k(i,j)^q F(i,j) / Σ u_k(i,j)^q.
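The two pre-processing steps above, the feature-weighted transformation of formula (1) and the Fuzzy C-means clustering of formula (2), can be sketched in NumPy as follows. This is a minimal reading of the formulas, not the patent's implementation: the min-max normalization to 0-255, the deterministic center initialization, and the iteration count are all assumptions.

```python
import numpy as np

def feature_weighted_grayscale(rgb):
    """Formula (1): F = R(1 + Amount_R/N) + G(1 + Amount_G/N) + B(1 + Amount_B/N),
    then normalized to 0..255 (min-max normalization assumed)."""
    h, w, _ = rgb.shape
    n = h * w
    f = np.zeros((h, w), dtype=np.float64)
    for c in range(3):
        chan = rgb[..., c]
        counts = np.bincount(chan.ravel(), minlength=256)  # Amount per channel value
        amount = counts[chan]        # pixels sharing this pixel's channel value
        f += chan * (1.0 + amount / n)
    f = (f - f.min()) / (f.max() - f.min() + 1e-12) * 255.0
    return f

def fuzzy_c_means(values, c=2, q=2.0, iters=50):
    """Formula (2): iteratively minimize sum_k sum_x u_k(x)^q |x - v_k|^2
    over 1-D gray levels (standard FCM updates)."""
    x = np.asarray(values, dtype=np.float64).ravel()
    v = np.linspace(x.min(), x.max(), c)              # initial cluster centers
    u = np.full((c, x.size), 1.0 / c)
    for _ in range(iters):
        d = np.abs(x[None, :] - v[:, None]) + 1e-12   # |F(i,j) - v_k|
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (q - 1.0)), axis=1)
        uq = u ** q
        v = (uq @ x) / uq.sum(axis=1)                 # center update
    order = np.argsort(v)                             # darkest center first
    return v[order], u[order]
```

Applied to a backlit photograph's gray levels with `c=2`, the sorted centers approximate the brightness of the dark (backlit) region and the bright background.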
Attachment 1 is a technical paper on the backlight image compensation method and the photographic apparatus with the backlight image compensation mechanism. Referring to Fig. 3 on page 5 of Attachment 1, the backlight image on the left is processed by the brightness clustering layer 312 to produce the clustering map on the right. The map shows that the object in the middle of the image is the backlit object: its brightness is far lower than that of the background behind and beside it.

The factor calculation layer 313 uses the clustering result of the brightness clustering layer 312 to compute two backlight factors, B_hist 313a and B_fcm 313b. Hist is defined as the proportion of pixels above a given brightness threshold to the total number of pixels; it characterizes the brightness distribution of the picture. B_hist 313a is the average of Hist over all brightness levels of the whole image, and B_fcm 313b represents the degree of backlighting of the image.

Please refer to Fig. 2, a Hist histogram drawn from the brightness clustering result in the embodiment of Fig. 1. In the factor calculation layer 313, the brightness clustering produced by the brightness clustering layer 312 is first used to draw a Hist histogram, whose horizontal axis is brightness and whose vertical axis is the Hist value. To measure the gradient of the brightness histogram between the background and the backlit object (Fig. 4 on page 6 of Attachment 1), the factor calculation layer 313 uses a sliding window (SW). As shown in Fig. 2, among all windows whose accumulated Hist is less than 0.2, the widest one, SW_max, is found. Observation shows that a higher degree of backlighting increases SW_max, so B_hist is defined by formula (3):

B_hist = T_hist(SW_max / 255)  ......(3)

where T_hist is a conversion function that maps B_hist to a fuzzy degree, formula (4):

T_hist(x) = (x − 0.3)/(0.6 − 0.3)  if 0.6 > x > 0.3;  1  if x ≥ 0.6;  0  otherwise  ......(4)

Please refer to Fig. 3, a Hist histogram drawn from the brightness clustering result of another embodiment. Its horizontal axis is brightness, its vertical axis is the Hist value, and B_hist is the average of Hist over all brightness levels of the whole image.

Please refer to Fig. 4, a histogram drawn from the brightness clustering result in the embodiment of Fig. 1. The factor calculation layer 313 also uses the clusters produced by the Fuzzy C-means algorithm in the brightness clustering layer: in a backlight image, the background and the backlit object correspond to a bright region and a dark region, respectively. To separate them, the factor calculation layer 313 sets the number of clusters to 2. The Fuzzy C-means algorithm then yields the cluster centers C1 and C2 of the two darkest brightness clusters 500 and 510 produced by the brightness clustering layer 312. Clearly, there is no significant brightness accumulation between the background and the backlit object; in other words, the lower the accumulated brightness between them, the higher the degree of backlighting. A second backlight factor B_fcm 313b is therefore defined to measure the degree of backlighting of an image, that is, the contrast between the backlit object and the background, formula (5):

B_fcm = T( Σ_{i=C1..C2} p(r_i) / ( p(C1) + p(C2) ) )  ......(5)

where p(r_i) = n_i / n is the percentage of the i-th gray level, n_i is the number of pixels with gray level i, and n is the total number of pixels in the image. T converts B_fcm to a fuzzy degree and is the same conversion function as in formula (4), formula (6):

T(x) = (x − 0.3)/(0.6 − 0.3)  if 0.6 > x > 0.3;  1  if x ≥ 0.6;  0  otherwise  ......(6)

The backlight factor B_fcm 313b represents the accumulated brightness between the two cluster centers C1 and C2 and grows with the degree of backlighting. Using the Fuzzy C-means clustering result, the two darkest brightness clusters 500 and 510 serve as the reference for compensation; accordingly, only the absolute distance between their centers C1 and C2 is taken as B_fcm 313b, normalized to the range 0 to 1 by formula (7):

B_fcm = (C2 − C1) / 255  ......(7)

Please refer to Fig. 5, a schematic of the operation flow of the neuro-fuzzy network 320 of Fig. 1, where x1 denotes the backlight factor B_hist 313a and x2 denotes the backlight factor B_fcm 313b. The two backlight factors computed by the factor calculation layer 313 are fed into the neuro-fuzzy network 320 for processing. The network is a functional-type neuro-fuzzy network (FNFN) trained with an immune-system-based particle swarm optimization algorithm (IPSO), and comprises the input unit 321, the operation unit 322, and the output unit 323; the operation unit 322 comprises a membership function layer 322a, a rule layer 322b, and an inference layer 322c.

The input unit 321 receives the backlight factors B_hist 313a (x1 in Fig. 5) and B_fcm 313b (x2 in Fig. 5). Each node in this unit is called an input term node, and each corresponds to one received backlight factor, formula (8):

u_i^(1) = x_i  ......(8)

which is then transmitted into the neuro-fuzzy network 320.
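The two backlight factors can be sketched as follows. The sliding-window reading of formula (3), taking SW_max as the widest brightness window whose accumulated histogram mass stays below 0.2 and dividing it by 255 before the fuzzy conversion, is an assumption drawn from the description above; formula (7) is taken directly.

```python
import numpy as np

def t_fuzzy(x):
    """Formulas (4)/(6): piecewise conversion to a fuzzy degree in [0, 1]."""
    if x >= 0.6:
        return 1.0
    if x > 0.3:
        return (x - 0.3) / (0.6 - 0.3)
    return 0.0

def backlight_factors(gray, centers, window_threshold=0.2):
    """Compute B_hist (formula (3)) and B_fcm (formula (7)).

    gray: grayscale image with values 0..255.
    centers: the two darkest FCM cluster centers C1 < C2.
    """
    hist = np.bincount(gray.astype(np.uint8).ravel(), minlength=256) / gray.size
    csum = np.concatenate(([0.0], np.cumsum(hist)))
    sw_max, lo = 0, 0
    for hi in range(1, 257):                 # widest window with mass < threshold
        while csum[hi] - csum[lo] >= window_threshold:
            lo += 1
        sw_max = max(sw_max, hi - lo)
    b_hist = t_fuzzy(sw_max / 255.0)
    c1, c2 = sorted(centers)[:2]
    b_fcm = (c2 - c1) / 255.0                # formula (7), normalized to [0, 1]
    return b_hist, b_fcm
```

On a strongly backlit image, whose histogram is concentrated in one dark and one bright peak with little mass between them, both factors come out near 1.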
Each node in the membership function layer 322a is called an input linguistic node. This layer plays the role of the fuzzification interface of a fuzzy controller: a membership value expresses the degree to which an input value (a backlight factor) belongs to a fuzzy set. The Gaussian membership function is adopted, formula (9):

u_ij^(2) = exp( −[ (u_i^(1) − m_ij) / σ_ij ]² )  ......(9)

where m_ij and σ_ij denote the center and width of the Gaussian membership function.

Each node in the rule layer 322b is called a rule node and represents one fuzzy rule. The rule layer performs the precondition matching of the fuzzy logic rules; each node completes a fuzzy AND (product) operation, formula (10):

u_j^(3) = Π_i u_ij^(2)  ......(10)

Each node in the inference layer 322c is called a consequent node. Each consequent node multiplies the output of the rule layer 322b by a linear combination of the function-expanded input terms and the connection weights of the output unit, formula (11):

u_j^(4) = u_j^(3) ( w_0j + Σ_i w_ij φ_i )  ......(11)

In the linear combination process 600, x1 and x2 are first normalized (step 601) and then function-expanded (step 602) to produce the linear combination of the expanded terms φ_i and the connection weights.

Each node in the output unit 323 is an output node; it performs the defuzzification operation, formula (12):

y = Σ_j u_j^(4) / Σ_j u_j^(3)  ......(12)

where y is the compensation value B_c, which is output by the output unit 323. In the operation unit 322, a fuzzy rule with a nonlinear consequent is expressed as formula (13):

Rule-j: IF x1 is A_1j and x2 is A_2j and ... and x_N is A_Nj THEN y' = φ_j  ......(13)

where Rule-j denotes the j-th rule, A_ij is a fuzzy set, x_i is a network input variable, and the consequent φ_j is the local output of the functional-type neuro-fuzzy network, expressing a series of nonlinear combinations.

After inference by the neuro-fuzzy network 320, the compensation value B_c of the image is obtained, and the output unit 323 outputs B_c from the network.

Please refer to Fig. 6, the compensation curve drawn by the curve conversion unit 410 of Fig. 1. The backlight compensation module 400 receives the compensation value B_c and repairs the brightness of the image; it comprises the curve conversion unit 410 and the image repair unit 420.
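The forward pass of formulas (8) through (12) above can be sketched as follows. This is an illustrative simplification: the consequent here is a plain linear function of the inputs with a bias term, standing in for the paper's functional-link expansion, and the IPSO training of the parameters is not shown.

```python
import numpy as np

def fnfn_forward(x, centers, widths, weights):
    """Forward pass of the functional-type neuro-fuzzy network.

    x: the two backlight factors (B_hist, B_fcm).
    centers, widths: Gaussian membership parameters m_ij, sigma_ij,
        shape (rules, inputs).
    weights: consequent coefficients (w_0j, w_1j, w_2j) per rule.
    """
    x = np.asarray(x, dtype=float)
    u2 = np.exp(-((x[None, :] - centers) / widths) ** 2)  # (9) membership layer
    u3 = np.prod(u2, axis=1)                              # (10) rule firing, fuzzy AND
    phi = np.concatenate(([1.0], x))                      # bias + inputs (simplified)
    u4 = u3 * (weights @ phi)                             # (11) consequent layer
    return u4.sum() / (u3.sum() + 1e-12)                  # (12) defuzzification -> B_c
```

With a single rule centered exactly on the input and a constant consequent, the output reduces to that constant, which makes the weighted-average defuzzification of formula (12) easy to check.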
In order to reduce the brightness difference between the backlit object and the background, the curve conversion unit 410 converts the compensation value into a compensation curve so that dark regions are compensated while bright regions are restrained. The compensation curve is given by formula (14):

f(x) = −(b/a²)(x − a)² + b  if x < a;  f(x) = ((255 − b)/(255 − a)²)(x − a)² + b  otherwise  ......(14)

In formula (14), the downward-opening branch compensates the dark parts of the image, while the upward-opening branch restrains the bright parts. Here a and b form the turning point (TP) of the curve, which depends on the center of the darkest brightness cluster and the compensation value B_c, formula (15):

TP(a, b) = (C1, C1 + B_fcm · B_c)  ......(15)

As shown in Fig. 6, curve 412 is a comparative example representing the uncompensated backlight image, and the compensation curve 411 drawn by formula (14) compensates curve 412 to improve the brightness of the backlight image. Finally, the image repair unit 420 maps every pixel of the image through the compensation curve 411 to repair its brightness and obtain the compensated image.

Refer to Fig. 12 on page 13 of Attachment 1, the operation flow of a PDA image operation interface. After the user captures an image with the PDA's camera, the backlight compensation system is selected on the interface and the user indicates whether the image is a backlight image. If the user marks it as not backlit, the image is left unchanged; if the user marks it as backlit, the interface proceeds to the next step, where the user chooses whether to apply backlight compensation to the image. If the user declines, the image is again left unchanged; if the user accepts, the image is processed by the neuro-fuzzy network processing module 300 and the backlight compensation module 400 of Fig. 1 to perform backlight compensation.

Figures 17(a), 18(a), 19(a), and 20(a) on pages 16 to 19 of Attachment 1 are backlight images. These images can be compensated directly in the photographic apparatus through the neuro-fuzzy network 320 described above, and the results are the compensated images of Figures 17(b), 18(b), 19(b), and 20(b) on pages 16 to 19 of Attachment 1. Clearly, after backlight compensation the brightness of the object is better balanced with that of the background, and the object is sharper.

In summary, the photographic apparatus with a backlight image compensation mechanism and its backlight image compensation method of the embodiments have the following advantage: the functional-type neuro-fuzzy network requires little computation while allowing backlit images to be repaired directly in the apparatus. Although the invention has been disclosed by the embodiments above, they are not intended to limit the invention; various changes and refinements may be made without departing from the spirit and scope of the invention, whose protection is defined by the appended claims.

[Brief Description of the Drawings]

To make the above and other objects, features, advantages, and embodiments of the invention more apparent, the accompanying drawings are described as follows:

Fig. 1 is a block diagram of a photographic apparatus with a backlight image compensation mechanism according to an embodiment of the invention.
Fig. 2 is a Hist histogram drawn from the brightness clustering result in the embodiment of Fig. 1.
Fig. 3 is a Hist histogram drawn from the brightness clustering result of another embodiment.
Fig. 4 is a histogram drawn from the brightness clustering result in the embodiment of Fig. 1.
Fig. 5 is a schematic of the operation flow of the neuro-fuzzy network 320 of Fig. 1.
Fig. 6 is the compensation curve drawn by the curve conversion unit 410 of Fig. 1.
Attachment 1 is a technical paper on the backlight image compensation method and the photographic apparatus with the backlight image compensation mechanism.

[Description of Reference Numerals]

100: photographic apparatus; 200: image capture module; 300: neuro-fuzzy network processing module; 310: pre-processing unit; 311: color conversion layer; 312: brightness clustering layer; 313: factor calculation layer; 313a: B_hist; 313b: B_fcm; 320: neuro-fuzzy network; 321: input unit; 322: operation unit; 322a: membership function layer; 322b: rule layer; 322c: inference layer; 323: output unit; 400: backlight compensation module; 410: curve conversion unit; 411: compensation curve; 412: curve; 420: image repair unit; 500, 510: brightness clusters; 600: linear combination process; 601, 602: steps
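The compensation curve of formulas (14) and (15) above can be sketched as follows. The branch coefficients are a plausible reconstruction of the partly garbled formula (14), chosen so that both parabolas meet at the turning point (a, b) and pass through the endpoints (0, 0) and (255, 255); the sketch assumes 0 < a < 255.

```python
import numpy as np

def compensation_curve(c1, b_fcm, b_c):
    """Formulas (14)-(15): piecewise-quadratic tone curve with turning point
    TP(a, b) = (C1, C1 + B_fcm * B_c). The dark side is a downward-opening
    parabola that boosts gray levels below a; the bright side is an
    upward-opening parabola that restrains levels above a."""
    a = float(c1)                          # assumed 0 < a < 255
    b = a + b_fcm * b_c
    x = np.arange(256, dtype=float)
    y = np.empty_like(x)
    lo = x < a
    y[lo] = b - (b / a**2) * (x[lo] - a) ** 2
    y[~lo] = b + (255.0 - b) / (255.0 - a) ** 2 * (x[~lo] - a) ** 2
    return np.clip(y, 0, 255)

def apply_curve(gray, curve):
    """Image repair unit 420: map every pixel through the compensation curve."""
    return curve[gray.astype(np.uint8)]
```

Because b is at least a, the dark branch lies above the identity line, brightening the backlit object, while both branches remain monotone so gray-level ordering is preserved.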
Claims (1)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW98121670A TW201101814A (en) | 2009-06-26 | 2009-06-26 | Method of compensating for backlight image and photograph apparatus with backlight image compensation system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW98121670A TW201101814A (en) | 2009-06-26 | 2009-06-26 | Method of compensating for backlight image and photograph apparatus with backlight image compensation system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| TW201101814A true TW201101814A (en) | 2011-01-01 |
Family
ID=44837136
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW98121670A TW201101814A (en) | 2009-06-26 | 2009-06-26 | Method of compensating for backlight image and photograph apparatus with backlight image compensation system |
Country Status (1)
| Country | Link |
|---|---|
| TW (1) | TW201101814A (en) |
- 2009-06-26: TW application TW98121670A filed; published as TW201101814A (status unknown)
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI454940B (en) * | 2011-01-28 | 2014-10-01 | Univ Nat Taiwan Science Tech | Method for designing fuzzy membership functions of a fuzzy controller for auto focus |
| TWI579593B (en) * | 2011-10-31 | 2017-04-21 | LG Innotek Co., Ltd. | Camera module and method for compensating images of the same |
| TWI635481B (en) * | 2017-10-30 | 2018-09-11 | Qisda Corporation | Display and color correction method |
| TWI649698B (en) * | 2017-12-21 | 2019-02-01 | 財團法人工業技術研究院 | Object detection device, object detection method, and computer readable medium |
| CN109948637A (en) * | 2017-12-21 | 2019-06-28 | Industrial Technology Research Institute | Object detection device, object detection method, and computer-readable medium |
| US10600208B2 (en) | 2017-12-21 | 2020-03-24 | Industrial Technology Research Institute | Object detecting device, object detecting method and non-transitory computer-readable medium |
| CN109948637B (en) * | 2017-12-21 | 2021-12-17 | Industrial Technology Research Institute | Object detection device, object detection method, and computer-readable medium |
| US10748033B2 (en) | 2018-12-11 | 2020-08-18 | Industrial Technology Research Institute | Object detection method using CNN model and object detection apparatus using the same |
| TWI708209B (en) * | 2018-12-11 | 2020-10-21 | 財團法人工業技術研究院 | Object detection method using cnn model and object detection apparatus using the same |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Fan et al. | Multiscale low-light image enhancement network with illumination constraint | |
| Ren et al. | Gated fusion network for single image dehazing | |
| CN115223004B | Image enhancement method using a generative adversarial network based on improved multi-scale fusion | |
| CN111327824B (en) | Method, device, storage medium and electronic device for selecting shooting parameters | |
| KR100900694B1 (en) | Nonlinear Low Light Compensation Apparatus, Method, and Computer-readable Recording Media | |
| CN103973991B | An automatic exposure method for judging backlit scenes based on a BP neural network | |
| CN111292264A (en) | A Deep Learning-Based Image High Dynamic Range Reconstruction Method | |
| CN114066812A (en) | A reference-free image quality assessment method based on spatial attention mechanism | |
| CN111401324A (en) | Image quality evaluation method, device, storage medium and electronic equipment | |
| WO2019061766A1 (en) | Image processing method and device | |
| TW201101814A (en) | Method of compensating for backlight image and photograph apparatus with backlight image compensation system | |
| CN105184754A (en) | Image contrast enhancement method | |
| CN111724447B (en) | Image processing method, system, electronic equipment and storage medium | |
| CN113538223B (en) | Noise image generation method, device, electronic device and storage medium | |
| CN102446347B (en) | White balance method and device for image | |
| CN110211070B (en) | A Low Illumination Color Image Enhancement Method Based on Local Extremum | |
| CN112489144B (en) | Image processing method, image processing device, terminal device and storage medium | |
| KR20210123608A (en) | Image sensing device and operating method of the same | |
| Song et al. | Optimizing nighttime infrared and visible image fusion for long-haul tactile internet | |
| CN116229081A (en) | Unmanned aerial vehicle panoramic image denoising method based on attention mechanism | |
| CN116823674B (en) | Cross-modal fusion underwater image enhancement method | |
| CN118469884A (en) | Multi-degradation low-illumination image enhancement method based on frequency domain sensing and reconciliation | |
| CN101478690B (en) | A Method of Image Illumination Correction Based on Color Gamut Mapping | |
| TWI768282B (en) | Method and system for establishing light source information prediction model | |
| CN105118032B | A wide dynamic range processing method based on a vision system | |