
TWI695295B - Image processing method, device and electronic equipment based on augmented reality - Google Patents

Image processing method, device and electronic equipment based on augmented reality

Info

Publication number
TWI695295B
TWI695295B TW107144780A
Authority
TW
Taiwan
Prior art keywords
yuv
rgba
format
value
component
Prior art date
Application number
TW107144780A
Other languages
Chinese (zh)
Other versions
TW201933046A (en)
Inventor
袁飛虎
張敏琪
Original Assignee
香港商阿里巴巴集團服務有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 香港商阿里巴巴集團服務有限公司
Publication of TW201933046A
Application granted
Publication of TWI695295B

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • H04N9/67 Circuits for processing colour signals for matrixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Color Image Communication Systems (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This specification provides an augmented reality-based image processing method, device and electronic equipment. For an RGBA format image, the RGBA values of the pixels in the image are read in the graphics processor, which can quickly compute the corresponding YUV component values from those RGBA values; the computed YUV component values are then read out to obtain a YUV format image. By exploiting the hardware acceleration capability of the GPU, the embodiments of this specification convert quickly between the RGBA format and the YUV format, so the image processing speed is high and can meet the speed requirements of AR scenarios.

Description

Image processing method, device and electronic equipment based on augmented reality

This specification relates to the field of image processing technology, and in particular to image processing methods, devices and electronic equipment based on augmented reality.

AR is a technology that adds a virtual model to the picture captured by a device's camera module. It superimposes the real environment and the virtual model in the same picture so that both exist at the same time, giving the user a sensory experience that transcends reality. In an AR scenario, the AR material must be processed, rendered and displayed on the screen of the electronic device, so the requirements on processing speed are high.

To overcome the problems in the related art, this specification provides augmented reality-based image processing methods, image processing methods, devices and electronic equipment.

According to a first aspect of the embodiments of this specification, an augmented reality-based image processing method is provided, the method comprising:

acquiring AR material, the AR material comprising an RGBA format image;

reading, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and computing YUV component values from the RGBA values of the pixels;

reading out the YUV component values computed by the graphics processor to obtain a YUV format image, and performing AR processing with the YUV format image.

Optionally, computing the YUV component values from the RGBA values of the pixels comprises:

computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format comprises:

storing the RGBA value of each pixel of the RGBA format image in an original fragment;

configuring a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determining the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtaining the RGBA values from the original fragments, computing the YUV components according to the conversion algorithm, and storing them in the target fragment;

outputting the data stored in each target fragment of the storage area to obtain the YUV format image.

According to a second aspect of the embodiments of this specification, an image processing method is provided, the method comprising:

acquiring an RGBA format image to be converted;

reading, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and computing YUV component values from the RGBA values of the pixels;

reading out the YUV component values computed by the graphics processor to obtain a YUV format image.

Optionally, computing the YUV component values from the RGBA values of the pixels comprises:

computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format comprises:

storing the RGBA value of each pixel of the RGBA format image in an original fragment;

configuring a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determining the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtaining the RGBA values from the original fragments, computing the YUV components according to the conversion algorithm, and storing them in the target fragment;

outputting the data stored in each target fragment of the storage area to obtain the YUV format image.

According to a third aspect of the embodiments of this specification, an augmented reality-based image processing device is provided, the device comprising:

an acquisition module configured to acquire AR material, the AR material comprising an RGBA format image;

a calculation module configured to read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels;

a reading module configured to read out the YUV component values computed by the graphics processor, obtain a YUV format image, and perform AR processing with the YUV format image.

Optionally, the calculation module is specifically configured to:

compute the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; the calculation module is specifically configured to:

store the RGBA value of each pixel of the RGBA format image in an original fragment;

configure a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determine the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtain the RGBA values from the original fragments, compute the YUV components according to the conversion algorithm, and store them in the target fragment;

output the data stored in each target fragment of the storage area to obtain the YUV format image.

According to a fourth aspect of the embodiments of this specification, an image processing device is provided, the device comprising:

an acquisition module configured to acquire an RGBA format image to be converted;

a calculation module configured to read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels;

a reading module configured to read out the YUV component values computed by the graphics processor to obtain a YUV format image.

According to a fifth aspect of the embodiments of this specification, an electronic device is provided, comprising:

a processor; and

memory for storing processor-executable instructions;

wherein the processor is configured to:

acquire AR material, the AR material comprising an RGBA format image;

read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels;

read out the YUV component values computed by the graphics processor, obtain a YUV format image, and perform AR processing with the YUV format image.

The technical solutions provided by the embodiments of this specification may have the following beneficial effects. For an RGBA format image, by reading the RGBA values of the pixels of the RGBA format image in the graphics processing unit (GPU), the graphics processor can quickly compute YUV component values from those RGBA values; the computed YUV component values are then read out to obtain a YUV format image. Exploiting the hardware acceleration capability of the GPU, the embodiments of this specification convert quickly between the RGBA format and the YUV format; the image processing speed is high and can meet the speed requirements of AR scenarios.

It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit this specification.
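To make the claimed conversion concrete, the following is a minimal CPU-side sketch of the per-pixel arithmetic the method above offloads to the GPU, using the conversion formulas given in the detailed description. The function names are illustrative, not part of the specification.

```python
# CPU-side sketch of the per-pixel RGBA -> YUV conversion that the
# method above performs on the GPU. Channel values are normalized to
# [0, 1], as they would be inside a fragment shader. The alpha channel
# is carried along but does not enter the conversion.

def rgba_to_yuv(r, g, b, a=1.0):
    """Convert one normalized RGBA pixel to a (Y, U, V) triple."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.499 * b + 0.5
    v = 0.499 * r - 0.418 * g - 0.0813 * b + 0.5
    return y, u, v

def convert_image(pixels):
    """Apply the conversion to an iterable of (r, g, b, a) pixels."""
    return [rgba_to_yuv(*p) for p in pixels]
```

On the GPU the same arithmetic runs once per fragment; the per-pixel loop above is only for illustration.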

Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this specification; rather, they are merely examples of devices and methods consistent with some aspects of this specification as detailed in the appended claims.

The terminology used in this specification is for the purpose of describing particular embodiments only and is not intended to limit this specification. The singular forms "a", "said" and "the" used in this specification and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items.

It should be understood that although the terms first, second, third and so on may be used in this specification to describe various information, the information should not be limited by these terms; they are only used to distinguish information of the same type from one another. For example, without departing from the scope of this specification, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when" or "in response to determining".

Augmented reality (AR) technology is a new technology that seamlessly integrates real-world information and virtual-world information. It applies virtual information to the real world, superimposing the real environment and virtual objects in the same picture or space in real time so that both exist at the same time.

Fig. 1A is an application scenario diagram of an augmented reality-based image processing method according to an exemplary embodiment of this specification. In Fig. 1A, the smartphone held by the user has a built-in camera module; the user can shoot a picture of the real environment with the smartphone, and the smartphone can superimpose and render AR material on the captured picture.

AR material may include images. The display modules of some electronic devices need to obtain YUV format images for AR processing, yet the image material acquired by the electronic device may be stored in another format, such as RGBA. Therefore, the image format needs to be converted before AR processing is performed.

The requirements on data processing speed in AR scenarios are usually high. The embodiments of this specification therefore provide an augmented reality-based image processing solution: for an RGBA format image, the RGBA values of the pixels are read in the graphics processing unit (GPU), the GPU quickly computes YUV component values from those RGBA values, and the computed YUV component values are then read out to obtain a YUV format image. By exploiting the hardware acceleration capability of the GPU, the RGBA and YUV formats can be converted quickly, which meets the speed requirements of AR scenarios. The embodiments of this specification are described in detail below.

Fig. 1B is a flowchart of an augmented reality-based image processing method according to an exemplary embodiment of this specification, which can be applied to an electronic device and comprises the following steps:

In step 102, AR material is acquired, the AR material comprising an RGBA format image.

In step 104, a graphics processor is used to read the RGBA values of the pixels in the RGBA format image, and YUV component values are computed from the RGBA values of the pixels.

In step 106, the YUV component values computed by the graphics processor are read out to obtain a YUV format image, and AR processing is performed with the YUV format image.

Here, RGBA is a color space representing Red, Green, Blue and Alpha. Fig. 1C is a schematic diagram of an RGBA format image according to an exemplary embodiment of this specification. The image has length w and width h, and the pixel value of each pixel occupies 4 bytes, namely the R, G, B and A component values of that pixel.

A YUV format image comprises three components: Y, U and V. "Y" denotes luminance (luma), that is, the grayscale value, while "U" and "V" denote chrominance (chroma), which describes the color and saturation of the image and is used to specify the color of a pixel.

The graphics processor can use a conversion algorithm between the RGBA format and the YUV format to convert the RGBA values of the pixels into YUV component values. In the conversion, the RGBA format image is input to the graphics processor and the image data is stored in the graphics processor's memory; the graphics processor reads the image data from memory and performs the format conversion.

For an RGBA format image, the four component values R, G, B and A of each pixel are stored contiguously. In a YUV format image, by contrast, the Y component is separated from the U and V components. YUV format images are further subdivided into many types according to how the image is stored, for example YUY2, YUYV, YVYU, UYVY, AYUV, Y41P, Y411, Y211, IF09, IYUV, YV12, YVU9, YUV411 or YUV420, where different types correspond to different arrangement orders of the Y, U and V components.
For example, Fig. 1D is a schematic diagram of a YUV format image according to an exemplary embodiment of this specification, taking NV12 as the YUV format. The first w*h bytes are the Y components; each component occupies one byte and represents the Y component of the (i, j)-th pixel. The subsequent w*h/2 bytes represent the UV components, and each row is stored as alternating UVUVUV.

Taking a 720×480 image as an example, the storage is divided into the following three parts:

Y component: 720×480 bytes
U component: 720×480/2 bytes
V component: 720×480/2 bytes

Each of the three parts is stored row by row internally; the three parts are arranged with the Y component first, followed by U and V stored alternately. That is, bytes 0 to 720×480 of the YUV data are Y component values, and the subsequent bytes are U and V stored alternately.

Taking YUV format images stored in other formats as a further example, in the storage format of Y41P (and Y411) the YUV components are arranged in the following order: U0 Y0 V0 Y1 U4 Y2 V4 Y3 Y4 Y5 Y6 Y8 ….

Based on this, the graphics processor can compute the YUV component values from the RGBA values of the pixels. Considering that the RGBA format and the YUV format are stored differently, in an optional implementation this may comprise:

computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

As an example, the conversion algorithm from the RGBA format to the YUV format can be expressed by the following formulas:

Y = 0.299×R + 0.587×G + 0.114×B + 0
U = -0.169×R - 0.331×G + 0.499×B + 0.5
V = 0.499×R - 0.418×G - 0.0813×B + 0.5

In this way, for the YUV component arrangement order of the pixels in the YUV format image, the graphics processor can read the RGBA values of the pixels in the RGBA image and compute the YUV component values according to the conversion algorithm. After the computation is completed, the result stored in the memory region of the graphics processor is the YUV component values of the pixels of the YUV format image, that is, the YUV format image itself.

In practical applications, the standards for general-purpose GPU computing include OpenCL, CUDA, ATI STREAM and so on. These standards can be understood as the application programming interfaces provided by the underlying GPU, so the solution of this specification can be implemented flexibly according to the type of programming interface a given GPU provides. OpenCL (Open Computing Language), for instance, is the first open, royalty-free standard for general-purpose parallel programming of heterogeneous systems and a unified programming environment that makes it convenient for software developers to write efficient and portable code for high-performance computing servers, desktop computing systems and handheld devices.

The following description takes the OpenGL interface as an example. OpenGL uses fragments as the basic storage unit; in some examples, when OpenGL processes an RGBA format image, one fragment corresponds to the RGBA value of one pixel. Still taking the conversion of the RGBA format image of Fig. 1C into the YUV format image of Fig. 1D as an example, Fig. 1E shows a schematic diagram of the process of computing the Y components; in Fig. 1E, the left side is the RGBA format and the right side is the YUV format. The embodiments of this specification take one fragment corresponding to the RGBA value of one pixel as an example. Before conversion, the RGBA values stored in the memory region can be shown by the following table, where for convenience of illustration only 3 pixels (pixel (0,0), pixel (0,1) and pixel (0,2)) are used as an example:

Figure 02_image001 (table: the R, G, B and A values of the example pixels (0,0), (0,1) and (0,2), each stored in its own original fragment)
For the corresponding YUV format image, the YUV component values stored in its memory region can be shown by the following table:
Figure 02_image003 (table: the YUV component values of the example pixels as laid out in the target fragments)
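As a concrete illustration of the NV12 layout behind these tables (a full-resolution Y plane followed by interleaved UV at quarter resolution), the following sketch packs separate Y, U and V planes into a single NV12 buffer. The helper name pack_nv12 is illustrative, not part of the specification.

```python
# Sketch of the NV12 byte layout described above: for a w x h image, the
# first w*h bytes are the Y plane (one byte per pixel, row by row), and
# the next w*h/2 bytes store U and V interleaved (UVUV...), one UV pair
# per 2x2 block of pixels.

def pack_nv12(y_plane, u_plane, v_plane, w, h):
    """Pack a full-resolution Y plane and quarter-resolution U/V planes
    into one NV12 buffer of w*h*3/2 bytes."""
    assert len(y_plane) == w * h
    assert len(u_plane) == len(v_plane) == (w // 2) * (h // 2)
    buf = bytearray(y_plane)            # Y plane comes first
    for u, v in zip(u_plane, v_plane):  # then alternating UVUV...
        buf.append(u)
        buf.append(v)
    return bytes(buf)
```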
The comparison above shows that, according to the YUV component arrangement order, fragment 1 of the converted YUV format image is expected to hold the Y components of 4 pixels, and the Y components of this one fragment (4 pixels) must be computed by reading the pixel values of four fragments (4 pixels) of the RGBA format image.

Based on this, to achieve fast conversion, in an optional implementation, computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format may comprise:

storing the RGBA value of each pixel of the RGBA format image in an original fragment;

configuring a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determining the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtaining the RGBA values from the original fragments, computing the YUV components according to the conversion algorithm, and storing them in the target fragment;

outputting the data stored in each target fragment of the storage area to obtain the YUV format image.

As an example, Fig. 1E is described again. During conversion, the GPU stores the RGBA value of each pixel of the RGBA format image in an original fragment, and a storage area for storing the YUV format image needs to be configured.

In Fig. 1E, taking an image of length w and width h as an example: for the computation of the Y components, the Y components need to be stored in the memory region of the first w*h bytes. In this embodiment the Y components of four pixels are viewed as one fragment, that is, these four Y components occupy one target fragment. This target fragment involves four pixels, so the corresponding four fragments (4 pixels) of the RGBA format image must be read for the computation. Specifically, because the width is w, one row has w/4 fragments. For each row, the w pixels on the left are drawn into the w/4 fragments on the right. So the fragment (Ya, Yb, Yc, Yd) at position (i, j) corresponds to (Pa, Pb, Pc, Pd) in the source coordinates: Pa corresponds to the pixel at position (m, n-1.5), Pb to the pixel at position (m, n-0.5), and so on. Here the distance between two adjacent pixels is 1, measured over the total width; if normalization is required in OpenGL, the pixel pitch can be normalized to 1/w, so Pa corresponds to (m, n-1.5*1/w). Therefore, the target fragments in the storage area can be determined according to the YUV format, the YUV components each target fragment is to store are determined according to the YUV component arrangement order, and the RGBA values in the original fragments are then obtained and the YUV components computed according to the conversion algorithm and stored in the target fragments.

According to the conversion algorithm from the RGBA format to the YUV format, for the fragment (i, j) of the YUV format image, the Y components it needs to store can be computed as:

Ya = 0.299 × Ra + 0.587 × Ga + 0.114 × Ba + 0
Yb = 0.299 × Rb + 0.587 × Gb + 0.114 × Bb + 0
Yc = 0.299 × Rc + 0.587 × Gc + 0.114 × Bc + 0
Yd = 0.299 × Rd + 0.587 × Gd + 0.114 × Bd + 0

According to the YUV component arrangement order, in the storage area storing the YUV format image the Y components start at the start of the storage area, with width w/4 and height h. As an optional implementation, the conversion computation of the Y components can be performed centrally by OpenGL: according to the OpenGL interface functions, the viewport can be set to glViewport(0, 0, w/4, h), and the Y components are then converted and stored centrally in this storage region using the above conversion formulas.

For the computation of the UV components, as shown in Fig. 1F, the UV components occupy a storage region of size w*h/2 immediately after the Y components. Here two UVs, i.e. UVUV, are viewed as one fragment, which likewise corresponds to the RGBA values in the source fragments. Each row again has w/4 fragments, but unlike the Y components the height is only half, because there is one UV for every 4 pixels. This can be understood as two rows of 2w pixels of the RGBA image being mapped to one row of w/4 fragments of the YUV format image. So the fragment (Ua, Va, Uc, Vc) at position (i, j) requires eight pixels of the RGBA format image, i.e. eight fragments, to be computed; for example, Pa corresponds to the pixel at position (m-0.5, n-1.5) and Pc to the pixel at position (m-0.5, n+0.5). To simplify understanding and computation, the abscissas of both a and c can be taken as m. If normalization is required, Pa is the pixel (m, n-1.5*1/w) and Pb is the pixel (m, n+0.5*1/w). According to the conversion formulas:

Ua = -0.169 × Ra - 0.331 × Ga + 0.499 × Ba + 0.5
Va = 0.499 × Ra - 0.418 × Ga - 0.0813 × Ba + 0.5
Ub = -0.169 × Rb - 0.331 × Gb + 0.499 × Bb + 0.5
Vb = 0.499 × Rb - 0.418 × Gb - 0.0813 × Bb + 0.5
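The Y pass and UV pass described above can be emulated on the CPU as two loops over the target fragments. This is a simplified sketch under stated assumptions: it samples the top-left pixel of each 2x2 block for UV rather than the half-pixel offsets used in the shader, and the function names are illustrative.

```python
# CPU emulation of the two OpenGL passes described above. The Y pass
# corresponds to a viewport of w/4 x h: each target fragment packs the Y
# values of four horizontally adjacent pixels. The UV pass corresponds
# to a viewport of w/4 x h/2: each fragment packs two UV pairs, one pair
# per 2x2 block of source pixels.

def y_pass(img, w, h):
    """img[j][i] is a normalized (r, g, b, a) pixel; returns h rows of
    w Y values, computed fragment by fragment (4 Y values per fragment)."""
    out = []
    for j in range(h):
        row = []
        for frag in range(w // 4):              # one fragment per 4 pixels
            for i in range(4 * frag, 4 * frag + 4):
                r, g, b, _ = img[j][i]
                row.append(0.299 * r + 0.587 * g + 0.114 * b)
        out.append(row)
    return out

def uv_pass(img, w, h):
    """One (U, V) pair per 2x2 block, interleaved UVUV along each row."""
    out = []
    for j in range(0, h, 2):
        row = []
        for i in range(0, w, 2):
            r, g, b, _ = img[j][i]              # sample top-left of block
            row.append(-0.169 * r - 0.331 * g + 0.499 * b + 0.5)   # U
            row.append(0.499 * r - 0.418 * g - 0.0813 * b + 0.5)   # V
        out.append(row)
    return out
```

In the real implementation these loops run as fragment-shader invocations over the two viewports, so all fragments of a pass are computed in parallel.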
According to the YUV component arrangement order, in the storage area storing the YUV format image the UV components start at the position after the Y components, likewise with width w/4 and with height h/2. As an optional implementation, the conversion computation of the UV components can be performed centrally by OpenGL: according to the OpenGL interface functions, the viewport can be set to glViewport(0, h, w/4, h/2), and the UV components are then converted and stored centrally in this storage region using the above conversion formulas.

Fig. 2 shows another image processing method according to an exemplary embodiment of this specification. The method comprises:

In step 202, an RGBA format image to be converted is acquired.

In step 204, a graphics processor is used to read the RGBA values of the pixels in the RGBA format image, and YUV component values are computed from the RGBA values of the pixels.

In step 206, the YUV component values computed by the graphics processor are read out to obtain a YUV format image.

Optionally, computing the YUV component values from the RGBA values of the pixels comprises:

computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; computing the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format comprises:

storing the RGBA value of each pixel of the RGBA format image in an original fragment;

configuring a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determining the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtaining the RGBA values from the original fragments, computing the YUV components according to the conversion algorithm, and storing them in the target fragment;

outputting the data stored in each target fragment of the storage area to obtain the YUV format image.

For the specific details of this embodiment, reference can be made to the embodiments shown in Figs. 1A to 1F, which are not repeated here.

Corresponding to the foregoing embodiments of the image processing method and of augmented reality-based image processing, this specification also provides embodiments of an image processing device, of an augmented reality-based image processing device, and of the electronic equipment to which they are applied.

The embodiments of the image processing device / augmented reality-based image processing device of this specification can be applied to electronic equipment. The device embodiments can be implemented by software, or by hardware or a combination of software and hardware. Taking a software implementation as an example, as a device in the logical sense, it is formed by the processor of the electronic equipment in which it resides reading the corresponding computer program instructions from non-volatile memory into memory and running them. In terms of hardware, Fig. 3 is a hardware structure diagram of the electronic equipment in which the image processing device / augmented reality-based image processing device of this specification resides. In addition to the processor 310, memory 330, network interface 320 and non-volatile memory 340 shown in Fig. 3, the electronic equipment in which the device 331 of the embodiment resides may generally also include other hardware according to the actual function of that electronic equipment, which is not described further here.

Fig. 4 is a block diagram of an augmented reality-based image processing device according to an exemplary embodiment of this specification. The device comprises:

an acquisition module 41 configured to acquire AR material, the AR material comprising an RGBA format image;

a calculation module 42 configured to read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels;

a reading module 43 configured to read out the YUV component values computed by the graphics processor, obtain a YUV format image, and perform AR processing with the YUV format image.

Optionally, the calculation module 42 is specifically configured to:

compute the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.

Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; the calculation module 42 is specifically configured to:

store the RGBA value of each pixel of the RGBA format image in an original fragment;

configure a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determine the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtain the RGBA values from the original fragments, compute the YUV components according to the conversion algorithm, and store them in the target fragment;

output the data stored in each target fragment of the storage area to obtain the YUV format image.

Fig. 5 is a block diagram of an image processing device according to an exemplary embodiment of this specification. The device comprises:

an acquisition module 51 configured to acquire an RGBA format image to be converted;

a calculation module 52 configured to read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels;

a reading module 53 configured to read out the YUV component values computed by the graphics processor to obtain a YUV format image.

Optionally, the calculation module 52 is specifically configured to:

compute the YUV component values from the RGBA values of the pixels according to the YUV component arrangement order specified by the YUV format and a conversion algorithm from the RGBA format to the YUV format.
Optionally, the graphics processor provides an OpenGL interface, and the OpenGL interface uses fragments as the basic storage unit; the calculation module 52 is specifically configured to:

store the RGBA value of each pixel of the RGBA format image in an original fragment;

configure a storage area for storing the YUV format image;

after determining the target fragments in the storage area according to the YUV format, determine the YUV components each target fragment is to store according to the YUV component arrangement order;

for the YUV components a target fragment is to store, obtain the RGBA values from the original fragments, compute the YUV components according to the conversion algorithm, and store them in the target fragment;

output the data stored in each target fragment of the storage area to obtain the YUV format image.

Correspondingly, this specification also provides an electronic device, comprising: a processor; and memory for storing processor-executable instructions; wherein the processor is configured to: acquire AR material, the AR material comprising an RGBA format image; read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels; and read out the YUV component values computed by the graphics processor, obtain a YUV format image, and perform AR processing with the YUV format image.

This specification also provides another electronic device, comprising: a processor; and memory for storing processor-executable instructions; wherein the processor is configured to: acquire an RGBA format image to be converted; read, with a graphics processor, the RGBA values of the pixels in the RGBA format image, and compute YUV component values from the RGBA values of the pixels; and read out the YUV component values computed by the graphics processor to obtain a YUV format image.

For the implementation process of the functions and roles of each module in the above image processing device / augmented reality-based image processing device, see the implementation process of the corresponding steps of the above methods, which is not repeated here.

Since the device embodiments basically correspond to the method embodiments, for relevant parts reference can be made to the partial description of the method embodiments. The device embodiments described above are merely illustrative: the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; that is, they may be located in one place or distributed over multiple network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this specification, which those of ordinary skill in the art can understand and implement without creative effort.

Specific embodiments of this specification have been described above. Other embodiments are within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In certain implementations, multitasking and parallel processing are also possible or may be advantageous.

Those skilled in the art will easily think of other embodiments of this specification after considering the specification and practicing the invention described here. This specification is intended to cover any variations, uses or adaptations of this specification that follow its general principles and include common knowledge or customary technical means in this technical field not described in this specification. The specification and embodiments are to be regarded as exemplary only, and the true scope and spirit of this specification are indicated by the following claims.

It should be understood that this specification is not limited to the precise structures that have been described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of this specification is limited only by the appended claims.

The above are merely preferred embodiments of this specification and are not intended to limit it. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of this specification shall be included within its scope of protection.
Figure 02_image001
For the corresponding YUV format image, the YUV component values stored in the memory area can be shown by the following table:
Figure 02_image003
The comparison above shows that, according to the YUV component order, fragment 1 in the converted YUV format image holds the Y components of four pixels, and computing this fragment's four Y components requires reading the pixel values of four fragments (four pixels) in the RGBA format image. Based on this, to achieve fast conversion, in an optional implementation, calculating the YUV component values from the pixels' RGBA values according to the YUV component order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format may include: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; determining the target fragments in the storage area according to the YUV format, and then determining, according to the YUV component order, which YUV components each target fragment needs to store; for the YUV components a target fragment needs to store, obtaining the RGBA values in the corresponding original fragments, computing the YUV components according to the conversion algorithm, and storing them in the target fragment; and reading out the data stored in each target fragment of the storage area to obtain the YUV format image. As an example, consider FIG. 1E again. When the GPU converts, the RGBA value of each pixel in the RGBA format image is stored in an original fragment, and a storage area for storing the YUV format image is configured. In FIG. 1E, with image width w and height h, the Y components are stored in the first w*h bytes of the memory area; in this embodiment, the Y components of four pixels are packed into one fragment, that is, four Y components occupy one target fragment.
Each such target fragment involves four pixels, so computing it requires reading the corresponding four fragments (four pixels) of the RGBA format image. Specifically, because the width is w, each row contains w/4 fragments: for each row, the w pixels on the left are mapped onto the w/4 fragments on the right. Thus, for the fragment (Ya, Yb, Yc, Yd) at position (i, j), the corresponding pixels are (Pa, Pb, Pc, Pd), where Pa corresponds to the pixel at position (m, n-1.5), Pb corresponds to the pixel at (m, n-0.5), and so on. The distance between two adjacent pixels is 1 when computed against the total width; if normalization is required in OpenGL, the pixel pitch can be normalized to 1/w, so Pa corresponds to (m, n-1.5×1/w). Therefore, the target fragments in the storage area can be determined according to the YUV format, the components each target fragment needs to store can be determined according to the YUV component order, and the RGBA values in the original fragments can then be obtained and converted, with the computed YUV components stored in the target fragments. Applying the conversion algorithm from the RGBA format to the YUV format, the fragment (i, j) in the YUV format image needs to store the Y components: Ya = 0.299×Ra + 0.587×Ga + 0.114×Ba + 0; Yb = 0.299×Rb + 0.587×Gb + 0.114×Bb + 0; Yc = 0.299×Rc + 0.587×Gc + 0.114×Bc + 0; Yd = 0.299×Rd + 0.587×Gd + 0.114×Bd + 0. According to the YUV component order, in the storage area holding the YUV format image, the Y components start at the beginning of the storage area, with width w/4 and height h.
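The Y-pass bookkeeping described above can be mirrored on the CPU. In the sketch below (the helper name is illustrative, not from the patent), target fragment (i, j) packs the Y components of four adjacent source pixels, so each row of the Y plane consists of w/4 fragments:

```python
def y_fragment(pixels, w, i, j):
    """Compute target fragment (i, j) of the Y plane.

    `pixels[j]` is row j of the RGBA image (normalized values); fragment
    (i, j) packs the Y components of the four source pixels 4*i .. 4*i+3,
    matching the w/4-fragments-per-row layout described above.
    """
    assert w % 4 == 0 and 0 <= 4 * i + 3 < w
    frag = []
    for k in range(4):                       # Pa, Pb, Pc, Pd
        r, g, b, _ = pixels[j][4 * i + k]
        frag.append(0.299 * r + 0.587 * g + 0.114 * b)
    return tuple(frag)                       # (Ya, Yb, Yc, Yd)
```

For a pure red, pure green, pure blue, and white pixel, the fragment comes out as roughly (0.299, 0.587, 0.114, 1.0), matching the Ya..Yd formulas above term by term.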
As an optional implementation, the conversion calculation of the Y components can be performed in a single pass by OpenGL: using the OpenGL interface functions, the viewport can be set with glViewport(0, 0, w/4, h), and the above conversion formulas can then be used to convert and store the Y components in the storage area in one batch. For the calculation of the UV components, as shown in FIG. 1F, the UV components occupy a storage area of size w*h/2 immediately after the Y components. Here, two UV pairs, that is, UVUV, are treated as one fragment, which likewise corresponds to RGBA fragments. There are again w/4 fragments per row, but unlike the Y component the height is only h/2, because every four pixels share one UV pair. It can therefore be understood that every two rows of 2w pixels in the RGBA image map to one row of w/4 fragments in the YUV format image. Thus the fragment (Ua, Va, Uc, Vc) at position (i, j) involves eight pixels, that is, eight fragments, of the RGBA format image; for example, Pa corresponds to the pixel at (m-0.5, n-1.5) and Pc to the pixel at (m-0.5, n+0.5). To simplify understanding and calculation, the abscissa of a and c can be taken as m; after normalization, Pa is the pixel (m, n-1.5×1/w) and Pc is the pixel (m, n+0.5×1/w). According to the conversion formulas: Ua = -0.169×Ra - 0.331×Ga + 0.499×Ba + 0.5; Va = 0.499×Ra - 0.418×Ga - 0.0813×Ba + 0.5; Uc = -0.169×Rc - 0.331×Gc + 0.499×Bc + 0.5; Vc = 0.499×Rc - 0.418×Gc - 0.0813×Bc + 0.5. According to the YUV component order, in the storage area holding the YUV format image, the UV components start immediately after the Y components, also with width w/4 and height h/2. As an optional implementation, the conversion calculation of the UV components can likewise be performed in a single pass by OpenGL.
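Putting the two passes together, a plain CPU reference of the conversion might look as follows. This is a sketch under stated assumptions (normalized [0, 1] values, 2×2 UV subsampling that samples each block's top-left pixel, no GPU involved), not the patent's shader implementation:

```python
def rgba_to_nv12_like(pixels, w, h):
    """Produce the storage area described above: a w*h Y plane followed by
    w*h/2 entries of interleaved UV pairs, one pair per 2x2 pixel block.
    `pixels[j][i]` holds the normalized (r, g, b, a) of pixel (i, j)."""
    out = [0.0] * (w * h * 3 // 2)
    # Y pass: mirrors the glViewport(0, 0, w/4, h) pass
    for j in range(h):
        for i in range(w):
            r, g, b, _ = pixels[j][i]
            out[j * w + i] = 0.299 * r + 0.587 * g + 0.114 * b
    # UV pass: mirrors the glViewport(0, h, w/4, h/2) pass
    for j in range(0, h, 2):
        for i in range(0, w, 2):
            r, g, b, _ = pixels[j][i]        # top-left pixel of the block
            base = w * h + (j // 2) * w + i  # UV row stride is w entries
            out[base] = -0.169 * r - 0.331 * g + 0.499 * b + 0.5
            out[base + 1] = 0.499 * r - 0.418 * g - 0.0813 * b + 0.5
    return out
```

For a 4×2 all-red image this yields eight Y entries of about 0.299 followed by two interleaved UV pairs, for a total buffer of w*h*3/2 = 12 entries, matching the plane sizes given in the text.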
Using the OpenGL interface functions, the viewport can be set with glViewport(0, h, w/4, h/2), and the above conversion formulas can then be used to convert and store the UV components in the storage area in one batch. As shown in FIG. 2, this specification shows another image processing method according to an exemplary embodiment. The method includes: in step 202, obtaining an RGBA format image to be converted; in step 204, using a graphics processor to read the RGBA values of the pixels in the RGBA format image and calculating the YUV component values from the pixels' RGBA values; and in step 206, reading out the YUV component values calculated by the graphics processor to obtain a YUV format image. Optionally, calculating the YUV component values from the pixels' RGBA values includes: calculating the YUV component values from the pixels' RGBA values according to the YUV component order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format. Optionally, the graphics processor is provided with an OpenGL interface, and the OpenGL interface uses fragments as its basic storage unit; calculating the YUV component values from the pixels' RGBA values according to the YUV component order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format includes: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; determining the target fragments in the storage area according to the YUV format, and then determining, according to the YUV component order, which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the corresponding original fragments.
The conversion algorithm then yields the YUV components, which are stored in the target fragments, and the data stored in each target fragment of the storage area is read out to obtain the YUV format image. For specific details of this embodiment, reference may be made to the embodiments shown in FIGS. 1A to 1F, which are not repeated here. Corresponding to the foregoing embodiments of the image processing method and the augmented reality-based image processing method, this specification also provides embodiments of an image processing apparatus, an augmented reality-based image processing apparatus, and the electronic device to which they are applied. The embodiments of the image processing apparatus/augmented reality-based image processing apparatus of this specification can be applied to electronic devices. The apparatus embodiments can be implemented by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, an apparatus in the logical sense is formed by the processor of the electronic device reading the corresponding computer program instructions from non-volatile memory into memory and running them. At the hardware level, FIG. 3 is a hardware structure diagram of the electronic device where the image processing apparatus/augmented reality-based image processing apparatus is located. In addition to the processor 310, memory 330, network interface 320, and non-volatile memory 340 shown in FIG. 3, the electronic device where the apparatus 331 is located in an embodiment usually includes other hardware according to the device's actual functions, which is not described again here. As shown in FIG. 4, FIG. 4 is a block diagram of an augmented reality-based image processing apparatus according to an exemplary embodiment of this specification.
The apparatus includes: an acquisition module 41, configured to acquire AR material, the AR material including an RGBA format image; a calculation module 42, configured to use a graphics processor to read the RGBA values of the pixels in the RGBA format image and calculate the YUV component values from the pixels' RGBA values; and a reading module 43, configured to read out the YUV component values calculated by the graphics processor, obtain a YUV format image, and perform AR processing using the YUV format image. Optionally, the calculation module 42 is specifically configured to calculate the YUV component values from the pixels' RGBA values according to the YUV component order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format. Optionally, the graphics processor is provided with an OpenGL interface, and the OpenGL interface uses fragments as its basic storage unit; the calculation module 42 is specifically configured to: store the RGBA value of each pixel of the RGBA format image in an original fragment; configure a storage area for storing the YUV format image; determine the target fragments in the storage area according to the YUV format, and then determine, according to the YUV component order, which YUV components each target fragment needs to store; for the YUV components a target fragment needs to store, obtain the RGBA values in the corresponding original fragments, compute the YUV components according to the conversion algorithm, and store them in the target fragment; and read out the data stored in each target fragment of the storage area to obtain the YUV format image. As shown in FIG. 5, FIG. 5 is a block diagram of an image processing apparatus according to an exemplary embodiment of this specification.
The apparatus includes: an acquisition module 51, configured to acquire an RGBA format image to be converted; a calculation module 52, configured to use a graphics processor to read the RGBA values of the pixels in the RGBA format image and calculate the YUV component values from the pixels' RGBA values; and a reading module 53, configured to read out the YUV component values calculated by the graphics processor to obtain a YUV format image. Optionally, the calculation module 52 is specifically configured to calculate the YUV component values from the pixels' RGBA values according to the YUV component order specified by the YUV format and the conversion algorithm from the RGBA format to the YUV format. Optionally, the graphics processor is provided with an OpenGL interface, and the OpenGL interface uses fragments as its basic storage unit; the calculation module 52 is specifically configured to: store the RGBA value of each pixel of the RGBA format image in an original fragment; configure a storage area for storing the YUV format image; determine the target fragments in the storage area according to the YUV format, and then determine, according to the YUV component order, which YUV components each target fragment needs to store; for the YUV components a target fragment needs to store, obtain the RGBA values in the corresponding original fragments, compute the YUV components according to the conversion algorithm, and store them in the target fragment; and read out the data stored in each target fragment of the storage area to obtain the YUV format image.
Correspondingly, this specification also provides an electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: acquire AR material, the AR material including an RGBA format image; use a graphics processor to read the RGBA values of the pixels in the RGBA format image and calculate the YUV component values from the pixels' RGBA values; and read out the YUV component values calculated by the graphics processor, obtain a YUV format image, and perform AR processing using the YUV format image. This specification also provides another electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: acquire an RGBA format image to be converted; use a graphics processor to read the RGBA values of the pixels in the RGBA format image and calculate the YUV component values from the pixels' RGBA values; and read out the YUV component values calculated by the graphics processor to obtain a YUV format image. The implementation processes of the functions of the modules in the image processing apparatus/augmented reality-based image processing apparatus are described in detail in the implementation processes of the corresponding steps of the above methods and are not repeated here. As for the apparatus embodiments, since they basically correspond to the method embodiments, the relevant parts may refer to the description of the method embodiments. The apparatus embodiments described above are only illustrative: the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; that is, they may be located in one place or distributed over multiple network modules.
Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this specification, which those of ordinary skill in the art can understand and implement without creative effort. The foregoing describes specific embodiments of this specification; other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results; in some embodiments, multitasking and parallel processing are also possible or may be advantageous. After considering the specification and practicing the invention disclosed herein, those skilled in the art will readily conceive of other embodiments of this specification. This specification is intended to cover any variations, uses, or adaptations that follow its general principles and include common general knowledge or customary technical means in the technical field not disclosed herein. The specification and examples are to be considered exemplary only, with the true scope and spirit of this specification indicated by the following claims. It should be understood that this specification is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope; the scope of this specification is limited only by the appended claims. The above are only preferred embodiments of this specification and are not intended to limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this specification shall be included within the scope of protection of this specification.

310‧‧‧processor
320‧‧‧network interface
330‧‧‧memory
331‧‧‧image processing apparatus/augmented reality-based image processing apparatus
340‧‧‧non-volatile memory
41‧‧‧acquisition module
42‧‧‧calculation module
43‧‧‧reading module
51‧‧‧acquisition module
52‧‧‧calculation module
53‧‧‧reading module

The drawings herein are incorporated into and constitute a part of this specification, show embodiments consistent with this specification, and are used together with the specification to explain its principles. FIG. 1A is an application scenario diagram of an augmented reality-based image processing method according to an exemplary embodiment of this specification. FIG. 1B is a flowchart of an augmented reality-based image processing method according to an exemplary embodiment of this specification. FIG. 1C is a schematic diagram of an RGBA format image according to an exemplary embodiment of this specification. FIG. 1D is a schematic diagram of a YUV format image according to an exemplary embodiment of this specification. FIG. 1E is a schematic diagram of the process of calculating the Y component according to an exemplary embodiment of this specification. FIG. 1F is a schematic diagram of the process of calculating the UV components according to an exemplary embodiment of this specification. FIG. 2 is a flowchart of an image processing method according to an exemplary embodiment of this specification. FIG. 3 is a hardware structure diagram of the electronic device where the image processing apparatus/augmented reality-based image processing apparatus of this specification is located. FIG. 4 is a block diagram of an augmented reality-based image processing apparatus according to an exemplary embodiment of this specification. FIG. 5 is a block diagram of an image processing apparatus according to an exemplary embodiment of this specification.

Claims (5)

An augmented reality-based image processing method, the method comprising: acquiring AR material, the AR material including an RGBA format image; using a graphics processor to read the RGBA values of pixels in the RGBA format image, the graphics processor using a conversion algorithm between the RGBA format and the YUV format to calculate YUV component values from the pixels' RGBA values, wherein the graphics processor is provided with an OpenGL interface and the OpenGL interface uses fragments as its basic storage unit, and the YUV component values are calculated as follows: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; after determining the target fragments in the storage area according to the YUV format, determining according to the YUV component order which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the original fragments and storing in the target fragment the YUV components calculated according to the conversion algorithm; and reading out the YUV component values stored in each target fragment of the storage area to obtain a YUV format image, and performing AR processing using the YUV format image.
An image processing method, the method comprising: acquiring an RGBA format image to be converted; using a graphics processor to read the RGBA values of pixels in the RGBA format image, the graphics processor using a conversion algorithm between the RGBA format and the YUV format to calculate YUV component values from the pixels' RGBA values, wherein the graphics processor is provided with an OpenGL interface and the OpenGL interface uses fragments as its basic storage unit, and the YUV component values are calculated as follows: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; after determining the target fragments in the storage area according to the YUV format, determining according to the YUV component order which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the original fragments and storing in the target fragment the YUV components calculated according to the conversion algorithm; and reading out the YUV component values stored in each target fragment of the storage area to obtain a YUV format image.
An augmented reality-based image processing apparatus, the apparatus comprising: an acquisition module, configured to acquire AR material, the AR material including an RGBA format image; a calculation module, configured to use a graphics processor to read the RGBA values of pixels in the RGBA format image, the graphics processor using a conversion algorithm between the RGBA format and the YUV format to calculate YUV component values from the pixels' RGBA values, wherein the graphics processor is provided with an OpenGL interface and the OpenGL interface uses fragments as its basic storage unit, and the YUV component values are calculated as follows: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; after determining the target fragments in the storage area according to the YUV format, determining according to the YUV component order which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the original fragments and storing in the target fragment the YUV components calculated according to the conversion algorithm; and a reading module, configured to read out the YUV component values stored in each target fragment of the storage area to obtain a YUV format image, and to perform AR processing using the YUV format image.
An image processing apparatus, the apparatus comprising: an acquisition module, configured to acquire an RGBA format image to be converted; a calculation module, configured to use a graphics processor to read the RGBA values of pixels in the RGBA format image, the graphics processor using a conversion algorithm between the RGBA format and the YUV format to calculate YUV component values from the pixels' RGBA values, wherein the graphics processor is provided with an OpenGL interface and the OpenGL interface uses fragments as its basic storage unit, and the YUV component values are calculated as follows: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; after determining the target fragments in the storage area according to the YUV format, determining according to the YUV component order which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the original fragments and storing in the target fragment the YUV components calculated according to the conversion algorithm; and a reading module, configured to read out the YUV component values stored in each target fragment of the storage area to obtain a YUV format image.
An electronic device, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: acquire AR material, the AR material including an RGBA format image; use a graphics processor to read the RGBA values of pixels in the RGBA format image, the graphics processor using a conversion algorithm between the RGBA format and the YUV format to calculate YUV component values from the pixels' RGBA values, wherein the graphics processor is provided with an OpenGL interface and the OpenGL interface uses fragments as its basic storage unit, and the YUV component values are calculated as follows: storing the RGBA value of each pixel of the RGBA format image in an original fragment; configuring a storage area for storing the YUV format image; after determining the target fragments in the storage area according to the YUV format, determining according to the YUV component order which YUV components each target fragment needs to store; and, for the YUV components a target fragment needs to store, obtaining the RGBA values in the original fragments and storing in the target fragment the YUV components calculated according to the conversion algorithm; and read out the YUV component values calculated by the graphics processor to obtain a YUV format image, and perform AR processing using the YUV format image.
TW107144780A 2018-01-24 2018-12-12 Image processing method, device and electronic equipment based on augmented reality TWI695295B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201810068563.2 2018-01-24
CN201810068563.2A CN108322722B (en) 2018-01-24 2018-01-24 Image processing method, device and electronic device based on augmented reality
??201810068563.2 2018-01-24

Publications (2)

Publication Number Publication Date
TW201933046A TW201933046A (en) 2019-08-16
TWI695295B true TWI695295B (en) 2020-06-01

Family

ID=62887604

Family Applications (1)

Application Number Title Priority Date Filing Date
TW107144780A TWI695295B (en) 2018-01-24 2018-12-12 Image processing method, device and electronic equipment based on augmented reality

Country Status (3)

Country Link
CN (1) CN108322722B (en)
TW (1) TWI695295B (en)
WO (1) WO2019144744A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108322722B (en) * 2018-01-24 2020-01-21 阿里巴巴集团控股有限公司 Image processing method, device and electronic device based on augmented reality
CN109410308A (en) * 2018-09-29 2019-03-01 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, computer readable storage medium
CN111093096A (en) * 2019-12-25 2020-05-01 广州酷狗计算机科技有限公司 Video encoding method and apparatus, and storage medium
CN111858022A (en) * 2020-02-27 2020-10-30 北京嘀嘀无限科技发展有限公司 Image recognition method, embedded terminal, electronic device and readable storage medium
CN113554721B (en) * 2021-07-23 2023-11-14 北京百度网讯科技有限公司 Image data format conversion method and device
CN114040246A (en) * 2021-11-08 2022-02-11 网易(杭州)网络有限公司 Image format conversion method, device, equipment and storage medium of graphic processor
CN118175157B (en) * 2024-05-09 2024-08-02 江苏北弓智能科技有限公司 Remote mobile cloud desktop acquisition method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201504990A (en) * 2013-04-22 2015-02-01 英特爾公司 Color buffer compression
US20150317085A1 (en) * 1998-11-09 2015-11-05 Broadcom Corporation Graphics display system with unified memory architecture
CN106231205A (en) * 2016-08-10 2016-12-14 苏州黑盒子智能科技有限公司 Augmented reality mobile terminal
CN107071516A (en) * 2017-04-08 2017-08-18 腾讯科技(深圳)有限公司 A method for image file processing
CN107274346A (en) * 2017-06-23 2017-10-20 中国科学技术大学 Real-time panoramic video splicing system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4568120B2 (en) * 2005-01-04 2010-10-27 株式会社東芝 Playback device
US8014615B2 (en) * 2005-09-02 2011-09-06 Adobe Systems Incorporated System and method for decompressing video data and alpha channel data using a single stream
CN102103463B (en) * 2011-02-17 2013-03-13 浙江宇视科技有限公司 Processing method and equipment of user interface information with transparency
CN106228581B (en) * 2016-08-01 2019-06-21 武汉斗鱼网络科技有限公司 Pixel format is converted to the method and system of NV12 by GPU by ARGB
CN109348226B (en) * 2017-04-08 2022-11-11 腾讯科技(深圳)有限公司 Picture file processing method and intelligent terminal
CN108322722B (en) * 2018-01-24 2020-01-21 阿里巴巴集团控股有限公司 Image processing method, device and electronic device based on augmented reality

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150317085A1 (en) * 1998-11-09 2015-11-05 Broadcom Corporation Graphics display system with unified memory architecture
TW201504990A (en) * 2013-04-22 2015-02-01 英特爾公司 Color buffer compression
CN106231205A (en) * 2016-08-10 2016-12-14 苏州黑盒子智能科技有限公司 Augmented reality mobile terminal
CN107071516A (en) * 2017-04-08 2017-08-18 腾讯科技(深圳)有限公司 A method for image file processing
CN107274346A (en) * 2017-06-23 2017-10-20 中国科学技术大学 Real-time panoramic video splicing system

Also Published As

Publication number Publication date
CN108322722A (en) 2018-07-24
TW201933046A (en) 2019-08-16
WO2019144744A1 (en) 2019-08-01
CN108322722B (en) 2020-01-21

Similar Documents

Publication Publication Date Title
TWI695295B (en) Image processing method, device and electronic equipment based on augmented reality
US12400393B2 (en) Method and system for rendering panoramic video
CN104971499B (en) Game providing server
JP6799017B2 (en) Terminal devices, systems, programs and methods
WO2019095830A1 (en) Video processing method and apparatus based on augmented reality, and electronic device
US11748911B2 (en) Shader function based pixel count determination
US10237563B2 (en) System and method for controlling video encoding using content information
CN108256072B (en) Album display method, apparatus, storage medium and electronic device
US9807315B1 (en) Lookup table interpolation in a film emulation camera system
CN118043842A (en) A rendering format selection method and related device
CN114549281A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114693894A (en) Method, system and device for converting pictures into building blocks in virtual world
WO2025020657A1 (en) Image display method for virtual scene, device, medium and program product
WO2024183469A1 (en) Game picture display method and apparatus, device, and computer-readable storage medium
CN117152171B (en) Image processing method, device, equipment, storage medium and program product
CN117689786A (en) Image generation method, device, non-volatile storage medium and computer equipment
TWM642706U (en) Surrounding scene generation system
CN110969674B (en) Method and device for generating winding drawing, terminal equipment and readable storage medium
HK1256770A1 (en) Image processing method and device based on augmented reality, and electronic equipment
HK1256770B (en) Image processing method and device based on augmented reality, and electronic equipment
CN114677464A (en) Image processing method, image processing device, computer equipment and storage medium
TWI723119B (en) Image preview method and device for camera application and camera application system
TWI828575B (en) Scenery generation system and control method
CN113706665B (en) Image processing method and device
KR102701292B1 (en) An image processing method that synthesizes images by distinguishing backgrounds and objects